Actuarial Systems Evolution
New possibilities emerge as hardware, software and methods of collaboration improve
Spring 2020
In this article, I provide a software engineer’s perspective on keeping pace with both the actuarial and software industries. Software engineering is a young industry compared to actuarial science, although the two lines of work share a common mathematical and scientific foundation.
One of the first users of the term “software engineer” was Margaret Hamilton, who worked at NASA on the Apollo space program. There is a beautiful story of how she re-engineered the software for the guidance computer. Her daughter caused a system crash while playing in the simulator, which made Margaret rethink asynchronous message processing. This change helped the astronauts land safely on the moon, as the computer was not overwhelmed by erroneous input and could focus on the task at hand.
I share this story because software engineers and actuaries have learned a great deal from each other. Margaret’s daughter prompted her to ask, “What if that did happen?”—a question actuaries strive to answer. Software engineering needed to grow up fast. It did, and in doing so it solved problems that benefit the way actuaries work today.
The actuarial platform I work on continues to evolve from its inception 25 years ago. Complexity has grown to support new types of insurance, riders and so on, yet the platform must remain resilient enough to support policies issued many years earlier. Over the years, regulations and the economic environment have changed in ways that couldn’t have been predicted. Black swans, in theory a 1-in-200-year event, have arrived more like once a decade. In that time, there also have been significant paradigm shifts in technology. The technology evolution has enabled much of the regulatory change, but it also caused some of the economic shifts. Technology has changed the way we live and the shape of many industries. The pace of disruption is increasing—a startup can become an incumbent and displace the status quo quickly; Uber and Airbnb are the most obvious examples.
There are two distinct aspects of the actuarial system evolution on which I would like to reflect. The first is what has become possible thanks to the evolution of hardware and software; the second is the evolution of how actuaries work together. In both cases, I see a fusion of ideas that gives me hope that the insurance industry will continue to evolve and adapt to serve the needs of a world that is very different from the one it originally set out to serve.
Computation
Models in the Desktop Era
The problem with the mainframe computer was that it was a shared resource. You had to be efficient and precise. There was nothing worse than coming to work in the morning to find an error message where your results should have been displayed. The personal computer (PC) was liberating—the actuary was free. As insurers started rolling out desktop computers to their staffs, actuaries were given a capable computing environment in a box under their desk. It didn’t take them long to start making that central processing unit (CPU) hot calculating reserves and pricing new products. And Moore’s Law,1 the observation that processing power doubles every two years, meant the humble PC kept up with the needs of actuaries.
The new challenge was that actuaries needed to learn to program. Vendors introduced domain-specific languages (DSLs), so actuaries could be more expressive and productive than they would be in general-purpose languages, such as C. Vendors implemented standard domain concepts, such as the double-entry accounting structure, and they delivered standard libraries to meet regulatory needs, further increasing the leverage of the individual actuary. This all reduced the barrier to entry and helped the actuarial department concentrate on innovation.
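To make that leverage concrete, here is a minimal sketch in Python of one such domain concept, a simple reserve roll-forward, that a vendor language would let an actuary express even more directly. The product, names and formula are hypothetical simplifications.

```python
# A minimal sketch, assuming a simple term product: the kind of domain
# concept a vendor DSL lets an actuary state directly. Names and the
# formula are hypothetical simplifications, not any vendor's language.

def reserve_projection(premium, benefit, mortality, interest):
    """Roll a net premium reserve forward one policy year at a time."""
    reserve = 0.0
    history = []
    for q in mortality:  # q = probability of death in the year
        # Accumulate with interest, collect the premium, release the
        # expected death benefit, then spread the rest over survivors.
        reserve = ((reserve + premium) * (1.0 + interest) - benefit * q) / (1.0 - q)
        history.append(round(reserve, 2))
    return history

print(reserve_projection(premium=120.0, benefit=10_000.0,
                         mortality=[0.010, 0.012, 0.015], interest=0.03))
```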
Moore Processing Power
It didn’t take long for the PC to start to run out of the horsepower needed to run models in a timely fashion, due in large part to increased product complexity, regulatory needs and more data. Thankfully, servers with multiple CPUs and more memory had become ubiquitous, so actuaries could move their models to servers.
The DSLs enabled vendors to leverage multi-CPU architectures without actuaries needing to change all of their business logic. It’s also relatively straightforward to parallelize an actuarial model. There were two embarrassingly parallel distribution mechanisms available, assuming you were happy with some limitations/simplifications. Distributing calculations by economic scenario meant the stochastic requirements could be met by running scenarios in parallel. It also was possible to parallelize a single scenario by liability/asset cell.
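As a sketch of the first mechanism, the Python example below values a toy liability under each scenario independently, so the scenarios can be farmed out across every available CPU. The model function and scenario format are hypothetical stand-ins for a real actuarial model.

```python
# A minimal sketch of scenario-level distribution, assuming a toy model:
# each economic scenario is valued independently, so the work farms out
# across CPUs with no coordination. run_scenario is a hypothetical
# stand-in for a real liability model.
from multiprocessing import Pool

def run_scenario(scenario):
    """Present value of a level 1,000-per-year cash flow under one rate path."""
    pv = 0.0
    for t, rate in enumerate(scenario, start=1):
        pv += 1000.0 / (1.0 + rate) ** t
    return pv

if __name__ == "__main__":
    # 1,000 stochastic interest-rate paths, 30 years each (flat for brevity).
    scenarios = [[0.02 + 0.0001 * s] * 30 for s in range(1000)]
    with Pool() as pool:                     # one worker per CPU core
        results = pool.map(run_scenario, scenarios)
    print(sum(results) / len(results))       # mean PV across scenarios
```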
The single server soon became the limiting factor. Fortunately, the vendors’ engineers could take advantage of the same distribution mechanics used to leverage multiple CPUs in a single server to scale out across multiple servers.
The interesting observation is that the actuarial modeling platform—at least the one for which I am responsible—started with the tagline: “Built by actuaries, for actuaries.” At this point, the needs of the actuarial department created a diverse product development group, one of actuaries working with software engineers. The addition of engineers accelerated the rate of innovation: It modernized the platform and led to a product built for actuaries with actuarial input.
Cloud
Despite the computational wins of the multiple-server solution, in a way, the actuarial department found itself back in the days of the mainframe. Models needed a shared compute infrastructure to execute, and teams were back to waiting in line to get an answer. It was not as bad as it once was—at least actuaries could test and debug models locally before running them—but it was still expensive and frustrating to wait for the answer. What if there were unlimited compute capacity and no more waiting in line?
The cloud is delivering this promise. As you lease servers by the second, it is possible to scale a grid to hundreds of thousands of servers in minutes. Owning this many servers simply would not be economical, and even if that capital investment were justified, utilization rates would be extremely low. The cloud has the economy of scale to make this investment worthwhile.
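A back-of-the-envelope calculation illustrates the utilization argument; every price and run profile in this sketch is hypothetical.

```python
# Back-of-the-envelope economics with hypothetical numbers: bursting a
# quarterly valuation run onto leased servers versus owning a grid sized
# for the peak. Every price and run profile here is illustrative only.
server_hours = 2000 * 8                      # 2,000 servers for an 8-hour run
cloud_rate = 0.10                            # $ per server-hour, leased on demand
cloud_cost = server_hours * cloud_rate * 4   # four quarter-end runs per year

owned_servers = 2000
owned_annual_cost = owned_servers * 1500     # $ per server-year: hardware, power, ops
utilization = (8 * 4) / (24 * 365)           # grid busy 8 hours, 4 times a year

print(f"cloud: ${cloud_cost:,.0f}/yr  owned: ${owned_annual_cost:,.0f}/yr  "
      f"owned grid utilization: {utilization:.2%}")
```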
This is similar to the evolution of software in other industries. RenderMan, the technology behind the Toy Story movies, has scaled out to meet the needs of modern photorealistic computer-animated movies like The Lion King. This technology is cloud-compatible, allowing anyone access to huge compute capacity to render their ideas into reality with very little upfront investment. Access to supercomputers is no longer a barrier to innovation.
Modeling Complexity and Controls
The Challenges of Model Development
In parallel to the technical evolution, the process around modeling—especially for financial reporting—has become more sophisticated. The need for a team of actuaries to collaborate during model development has increased, driven by the demand for specialized expertise and the sheer volume of model updates required to keep pace with emerging regulations and product innovation. There are various operating models, but the requirement for more collaboration on a single model created the significant challenge of organizing and merging changes from many contributors across the business.
Chief actuaries further raised the bar when they became frustrated by different sources of truth being used to answer different business questions. For example, answers from the forecasting models differed from valuation model answers. Delivering on these diverse modeling requirements demanded collaboration. At the same time, modeling teams needed to consolidate models to provide consistency across the business, a consolidation forced, in part, by principle-based regulations such as Valuation Manual 20 (VM-20).
Commodity vs. Specialized Compute
Actuarial modeling vendors have not yet been able to translate their existing domain-specific languages (DSLs) to execute efficiently on graphics processing unit (GPU) cards. GPUs are specialized microprocessors designed to render high-resolution images and video. This specialism can be leveraged to perform floating-point arithmetic at higher degrees of parallelism than a central processing unit (CPU). …
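To make the contrast concrete, here is a hedged sketch of the data-parallel style that GPUs reward: one floating-point recipe applied to thousands of scenarios at once. NumPy executes this on the CPU, and array libraries that mirror its interface, such as CuPy, can run the same program on a GPU.

```python
# A hedged sketch of the data-parallel style GPUs reward: one floating-point
# recipe applied to thousands of scenarios at once. NumPy executes this on
# the CPU; array libraries that mirror its interface (e.g., CuPy) can run
# the same program on a GPU. Figures are illustrative only.
import numpy as np

rng = np.random.default_rng(seed=1)
n_scenarios, n_years = 10_000, 30
rates = 0.02 + 0.005 * rng.standard_normal((n_scenarios, n_years))

# Path-wise discount factors, then a level 1,000-per-year cash flow,
# valued under all 10,000 scenarios in one vectorized pass.
discount = np.cumprod(1.0 / (1.0 + rates), axis=1)
pv = (1000.0 * discount).sum(axis=1)

print(pv.mean(), pv.std())
```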
Regulators have become increasingly interested in the rigor with which models are built and maintained, so actuarial development teams must demonstrate that controlled change processes are embedded and followed. Fortunately, leaning on engineering teams for diversity of thought (as well as leveraging centralized and shareable resources) provides an alternative way to work. There are easy wins in the workflows for updating models and running production models that actuaries can gain by adopting the practices engineering teams already use.
All this means that modeling software needed to:
- Evolve to be more open to concurrent editors.
- Simplify bringing that work together.
- Provide transparency into what has changed over time.
Collaborative and Auditable Model Development
The nice thing about version-control systems such as Git and GitHub (see sidebar) is that they enable additional use cases: collaboration, auditability and traceability. Collaboration becomes easier because these systems make it simple to combine changes and handle conflicts when several people change the same model, reducing the overhead of multiple contributors. The same mechanisms that track changes from one person to another create an immutable audit trail that allows users to see what changed between two points in time. That audit trail links requirements to changes and then to results, creating end-to-end traceability. This is an auditor’s dream, and it frees actuaries to innovate within a controlled framework.
Vendors have simplified the GitHub approach by making change control a first-class citizen in model development, much as SharePoint has done for business users collaborating on Microsoft Word documents. Rather than asking actuaries to learn Git—and externalize models in a source-control system—vendors have made these best practices simple and unobtrusive by integrating them into their tools. They also can provide semantic merge capabilities thanks to the simplicity of their DSLs, so resolving merge conflicts is less onerous than merging unfamiliar files.
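As a sketch of what semantic merge means in practice, the following Python example performs a three-way merge over named assumptions rather than lines of text. The parameters and merge rule are hypothetical simplifications of what a vendor tool might do.

```python
# A minimal sketch of semantic merge, assuming model assumptions are a set
# of named parameters rather than lines of text. The three-way rule below
# is a hypothetical simplification of what a vendor tool might do.
def semantic_merge(base, mine, theirs):
    """Three-way merge of {parameter: value} dictionaries."""
    merged, conflicts = {}, []
    for key in base.keys() | mine.keys() | theirs.keys():
        b, m, t = base.get(key), mine.get(key), theirs.get(key)
        if m == t:              # both sides agree (or neither changed)
            merged[key] = m
        elif m == b:            # only their side changed
            merged[key] = t
        elif t == b:            # only my side changed
            merged[key] = m
        else:                   # both changed differently: a real conflict
            conflicts.append(key)
    return merged, conflicts

base   = {"lapse_rate": 0.05, "mortality_table": "2015 VBT"}
mine   = {"lapse_rate": 0.06, "mortality_table": "2015 VBT"}
theirs = {"lapse_rate": 0.05, "mortality_table": "2017 CSO"}
print(semantic_merge(base, mine, theirs))
# Both edits land cleanly: lapse_rate 0.06 and the 2017 CSO table, no conflicts.
```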
GitHub
Git is a distributed version-control system that helps teams of developers manage concurrent changes while collaborating on source code. It provides a single source of truth for all code changes over time.
GitHub revolutionized how the open-source community collaborates on software projects. This, in turn, pushed software teams within established organizations to become more collaborative and open internally. GitHub made it easy for developers to host Git repositories and layered on new concepts, such as pull requests (for code reviews), that engaged individuals in collaborative problem-solving and created a community where people learn, share and work together to build software.
Producing Results
The model’s business logic and configuration are just one part of the end-to-end production process for creating actuarial results. Models typically reference externalized assumptions, which are set during a separate experience analysis process. There are also inputs that change from period to period—for example, assets and liabilities, which often undergo transformation from one format to another. And there are economic scenarios, either fed in or generated by integrated economic scenario generators (ESGs). There often is a mechanism, after model runs are completed, for including nonmodeled results and manual adjustments. New regulations, such as Long Duration Targeted Improvements (LDTI) and International Financial Reporting Standard (IFRS) 17, require historical modeled results and transactional data as inputs, which adds a new dimension of change management not previously in scope for actuarial reporting. To add even more pressure, regulators define tight reporting timelines, which requires insurers to be faster and more agile. This has driven vendors to enable automated processes that can easily evolve over time.
Providing traceability across all areas of change, including data lineage, is extremely powerful for change attribution analysis and essential for building confidence in the modeled result. Demonstrating Sarbanes-Oxley (SOX) controls during an audit is onerous if this process is not managed holistically, and now, with historical inputs, the causality chain spans several years. Managing this in a piecemeal way requires that every participant in the value chain provide a mechanism for end-to-end traceability.
Combining a controlled development process for models, data transformations and iterations with data (assets, liabilities, assumptions and economic scenarios) extends the traceability of the change control system to production results. Results are trustworthy thanks to an automated and locked-down execution environment that prevents tampering and human error. Distributed computing provides the means to deliver these results quickly, and the cloud reduces capital expense by scaling to meet the demands of increasing complexity and regulatory needs.
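One way to picture that traceability, as a minimal sketch with hypothetical inputs, is to fingerprint everything that feeds a run and store the manifest alongside the reported numbers, so any result can be traced back to exactly what produced it.

```python
# A minimal sketch of lineage, assuming content-addressed inputs: hash every
# input and the model version, and store the manifest with the reported
# numbers. The input names and payloads are hypothetical.
import hashlib
import json

def fingerprint(payload: bytes) -> str:
    """Content hash: identical inputs always receive identical IDs."""
    return hashlib.sha256(payload).hexdigest()[:12]

inputs = {
    "model_version": b"model logic v42",
    "assumptions": b"lapse=0.05; mortality=2017 CSO",
    "inforce": b"...policy extract bytes...",
    "scenarios": b"...ESG output bytes...",
}
manifest = {name: fingerprint(data) for name, data in inputs.items()}
# The result's ID is derived from its inputs, closing the causality chain.
manifest["result"] = fingerprint(json.dumps(manifest, sort_keys=True).encode())

print(json.dumps(manifest, indent=2))
```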
Conclusion
When I reflect on how the software product I helped build has evolved, and the use cases we now support, I feel very proud. The pride comes from the fact that we created a diverse team of people who empowered actuaries and helped them handle the increasing demands of their jobs. Our clients no longer have to choose between compliance and innovation—we provide tools that support the freedom to innovate within a controlled framework. These tools allow actuaries to collaborate, access unlimited cloud computing and more data, derive more insight, and reimagine the way they work for the better.
That partnership has meant the clients we serve are more equipped to build better insurance products. Without these products, people would not be able to retire or provide for their loved ones when they die. So I also feel a great sense of pride that the software I help deliver fulfills a huge need in society.
These efforts are critical, because insurance provides financial security against so many risks: premature death, disability, poor health, living longer and so on. In the United Kingdom, where I am from originally, 55 percent of Generation X are at high risk of not achieving a moderate level of income in retirement. A report by The Phoenix Group2 contrasts their situation with that of the prior generation: Members of Generation X lose £13,000 in state pension over their lifetimes, and they occupy rented accommodations at a rate 8 percent higher than baby boomers, leaving them with less disposable income and fewer assets.
As software has evolved, so have the needs of retirees. My goal is to continue to innovate in partnership with my actuarial friends and accelerate the transformation of the tools they use, so they are empowered to solve the vast array of issues related to financial security—including the transformation of insurance itself—so that everyone has access to the retirement they deserve.
References:
- 1. Moore, Gordon E. 1965. “Cramming More Components Onto Integrated Circuits.” Electronics 38, no. 8.
- 2. Phoenix Group. 2019. “Generation VeXed” Faces a Grey Retirement. Phoenix Group, November 7, 2019 (accessed January 28, 2020).
Copyright © 2020 by the Society of Actuaries, Chicago, Illinois.