
Jeremy S. Platt, Managing Director and US Cyber Specialty Practice Leader
In a digital world, cyber exposure evolves every day, making it one of the most dynamic emerging risks in the industry. Just as the housing boom along the US shoreline accelerated property losses, the technological sophistication and digital connectivity of the global economy have increased the cyber threat across all sectors. As large-scale breaches become more damaging and pervasive, the (re)insurance industry needs to continue to innovate to address potential systemic events and aggregations and to improve its modeling capabilities. As such, 2018 will be a year of product growth and new challenges. To advance this important market, we must develop a common analytical language, harness advanced modeling technologies and learn lessons from other lines of business.
To diversify the US cyber market beyond confidentiality and data breach, we must continue to expand into other areas, such as operational technology (OT) risk and data integrity. Growing supply chain dependencies, both between companies and across operating systems and internet services, greatly increase OT exposure, while over 80 percent of the value of the Standard & Poor's 500 Index is tied to information-based assets (1) that could be impaired in a systemic cyber-attack.
Forty years ago, the opposite was true. To develop solutions for both exposures, (re)insurers need a common cyber risk currency: a shared terminology and set of metrics that lets underwriters, actuaries, cat modelers, brokers and others measure companies' reliance on particular suppliers and systems, as well as the true, sometimes nebulous, value of data. As insured cyber loss experience continues to emerge and transparency around network and operating system supply chain dependencies improves, Guy Carpenter is helping make this a reality.
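To make the idea of a common risk currency concrete, the sketch below shows one possible shape such a shared exposure profile could take, expressed in Python. Every name, field and scale here is an illustrative assumption, not an industry standard or a Guy Carpenter specification.

    from dataclasses import dataclass, field

    # Illustrative sketch only: the field names and 0-to-1 scales below
    # are assumptions, not an industry-standard schema.
    @dataclass
    class CyberExposureProfile:
        company: str
        # Fraction of revenue-critical operations depending on each named
        # supplier or operating system (0.0 to 1.0).
        supplier_dependency: dict[str, float] = field(default_factory=dict)
        # Estimated reconstruction value of information assets, in USD.
        data_asset_value_usd: float = 0.0
        # Composite resilience score (0 = no controls, 1 = best practice).
        resilience_score: float = 0.5

    # A hypothetical insured with two shared dependencies.
    profile = CyberExposureProfile(
        company="ExampleCo",
        supplier_dependency={"WidgetOS": 0.7, "CloudHostX": 0.4},
        data_asset_value_usd=25_000_000,
        resilience_score=0.6,
    )

A shared schema of this kind is what would let two underwriters, or an underwriter and a cat modeler, mean the same thing when they compare dependency or resilience across accounts.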
While we have made improvements in modeling modern interconnections, it remains extremely challenging to model accurately how dependent any one company may be on a given supplier or system, or its level of enterprise risk management and cyber resilience. As seen with WannaCry and Petya earlier this year, reliance on one specific operating system determined the level of exposure many companies faced, and those with less rigorous risk mitigation strategies often fared the worst. But it is difficult to quantify that dependence and resilience well enough to develop an accurate picture of potential risk. And with growth in automation, cyber-physical systems, the Internet of Things, cloud computing and cognitive computing, manufacturing and infrastructure assets operating in an Industry 4.0 world are more exposed to cyber-attack than ever before. The systemic OT potential is much greater, yet current cyber models are not yet as credible as the property catastrophe models carriers use to define a given event's impact for a particular exposure zone and resiliency level.
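As a purely illustrative sketch of the aggregation problem, the snippet below reuses the hypothetical profile above and assumes, for simplicity, that loss scales linearly with supplier dependency and is reduced linearly by resilience. Real catastrophe models are far richer than this; the point is only to show which inputs drive the answer.

    def systemic_event_loss(profiles, affected_supplier):
        """Toy aggregation of losses from one systemic event hitting a
        shared supplier or operating system. Assumes, purely for
        illustration, linear scaling with dependency and linear
        mitigation by resilience."""
        total = 0.0
        for p in profiles:
            dependency = p.supplier_dependency.get(affected_supplier, 0.0)
            total += dependency * p.data_asset_value_usd * (1.0 - p.resilience_score)
        return total

    # For the hypothetical profile above, a systemic "WidgetOS" event:
    # 0.7 * 25,000,000 * (1 - 0.6) = 7,000,000 USD of modeled loss.
    print(systemic_event_loss([profile], "WidgetOS"))

Even this crude calculation turns entirely on the two inputs, dependency and resilience, that are precisely the hardest to observe, which is why better transparency into supply chain dependencies matters so much for model credibility.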
In the United States, and soon in Europe given the General Data Protection Regulation and plans for a new UK Data Protection Bill, regulation drives data confidentiality and breach solutions. While these solutions protect against data reconstruction costs, reconstructed data may be worthless if its integrity has been compromised. Data integrity and availability are as critical as data confidentiality, and threats to them are not always associated with a company-specific breach or malicious attack. Reliance on an infected or malfunctioning operating system or internet service may render a company's information assets useless without the company even knowing it. Without a common risk understanding that recognizes and defines the intrinsic value of data integrity, providing the most comprehensive coverage for the risk is difficult.
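As a generic illustration of what integrity loss means operationally (not a description of any insurer's methodology), silent corruption is typically detected by comparing data against a cryptographic baseline recorded while the data was known to be good:

    import hashlib

    def digest(data: bytes) -> str:
        # SHA-256 digest used as an integrity fingerprint.
        return hashlib.sha256(data).hexdigest()

    # Record a baseline while the data is known to be good ...
    baseline = digest(b"policyholder records, version 1")

    # ... then re-check after the data has passed through systems that
    # might silently alter it. A changed digest signals an integrity
    # loss even though the data still looks present and readable.
    current = digest(b"policyholder records, version 1")
    assert current == baseline, "data integrity compromised since baseline"

Without a baseline of this kind, a company relying on a malfunctioning upstream system has no way to know its information assets have already been impaired, which is exactly the gap the coverage discussion above describes.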
Note:
1) Annual Study of Intangible Asset Market Value from Ocean Tomo, LLC