Risks: Here Now, Closing Fast or Detected Just Over the Horizon
The evaluation of emerging risks requires a different mindset

December 2017/January 2018

“The thing that hath been, it is that which shall be; and that which is done is that which shall be done: and there is no new thing under the sun.”
— Ecclesiastes
“In the beginner’s mind there are many possibilities, but in the expert’s there are few.”
— Zen Mind, Beginner’s Mind
“The work of science is to substitute facts for appearances and demonstrations for impressions.”
— John Ruskin, Motto of the Society of Actuaries
As reported by The Economist, Stanislav Petrov died on Sept. 18, 2017, at the age of 77. After his graduation in 1972 from the Radio-Technical College in Kiev, he worked at the secret Serpukhov-15 early warning facility near Moscow. He monitored satellite surveillance of U.S. missile launch sites to detect a missile attack on the Soviet Union. On Sept. 26, 1983, shortly after midnight, the wall screen flashed a message: “START.” A missile launch had been detected. Petrov reported a fault in the system to his superiors amid the panic that had seized the staff. But the system detected more missiles, five in all, which almost certainly meant an attack had been launched. If there were an actual attack, ground radar would pick it up in 10 minutes; two minutes later the missiles would hit. Petrov continued to believe that the warning was due to a malfunction and again reported it as such to his bosses. An agonizing 15 minutes passed and there was no attack—the sun’s rays reflecting off clouds high above the American launch site had been misinterpreted as missiles on their way.
We should remember how Petrov’s courage under extreme duress prevented the deaths of potentially millions in a nuclear war. Had he simply reacted to the screen’s message, the Soviet Union would have launched what was, in reality, a first strike against the United States. Since there was no effective anti-missile defense in 1983, the United States likely would have counterattacked, and an all-out war could have unfolded in minutes. One man’s skepticism about what his machines were telling him saved the day.
But what if no human were involved in the process? On May 7, 2016, the first fatal crash of a Tesla under the control of its Autopilot system occurred in Florida. The Tesla was traveling at 74 mph along a highway when a truck moving in the opposite direction made a left turn across its path. According to Tesla, the Autopilot was unable to distinguish the white side of the truck against a bright sky and failed to apply the brakes.
The relatively primitive systems of the 1980s are being replaced by far more sophisticated designs, yet as people come to rely on them, disasters still occur.
Mark Birdsall, the contributing editor of this issue of The Actuary, has identified and enlisted authors who write about emerging risks from a variety of angles. These are not existential risks like nuclear war—at least not yet—but they can be serious, with substantial financial harm to insurers and policyholders. Some of the risks have been there all along (operational risks) and are only recently being recognized and factored into calculations. Others are beginning to be measured and monitored in the hope of detecting changes that could have a significant financial impact (the Actuaries Climate Index). In other articles, our authors scan the horizon for new risks whose potential dangers are still being identified and assessed. Several articles in this issue describe methodologies for quantifying risks and building safety margins into firm operations.
Not every risk that is identified will turn out to be dangerous (recall the Y2K alarm). Some calamities arise from a mixture of human complacency and greed with complex modern financial contracts (the panic of 2007–2008, aka the Financial Crisis). Some arise because of a change in the regulatory environment, and the formerly irrelevant or benign becomes a problem. The introduction of a new technology, such as genetic modification, artificial intelligence or the internet, triggers a cascade of effects. Demographic changes may shift the ground around us over a few decades.
The human mind struggles to make sense of changes in the fundamental framework of our lives. In one sense, it is an impossible task, but it is one that must be taken up. The evaluation of emerging risks requires a different mindset than the one used for normal actuarial work. You must leave behind the data that have been collected on known risks and temporarily put aside your experience studies. You should let your mind become receptive to different patterns. You might be well advised to read a few classic science fiction stories or novels to understand how a new world is formed as a consequence of a few fundamental changes.
Read all of the articles in this issue for the total experience! Don’t forget to thank Mark for putting the issue together, and Jacque Kirkwood of the Society of Actuaries for keeping everything on track.