NIGEL ALLEN | May 17, 2017
A New Way of Learning

EXPOSURE delves into the algorithmic depths of machine learning to better understand the data potential that it offers the insurance industry.

"Machine learning is similar to how you teach a child to differentiate between similar animals," explains Peter Hahn, head of predictive analytics at Zurich North America. "Instead of telling them the specific differences, we show them numerous different pictures of the animals, which are clearly tagged, again and again. Over time, they intuitively form a pattern recognition that allows them to tell a tiger from, say, a leopard. You can't predefine a set of rules to categorize every animal, but through pattern recognition you learn what the differences are."

In fact, pattern recognition is already part of how underwriters assess a risk, he continues. "Let's say an underwriter is evaluating a company's commercial auto exposures. Their decision-making process will obviously involve traditional, codified analytical processes, but it will also include sophisticated pattern recognition based on their experiences of similar companies operating in similar fields with similar constraints. They essentially know what this type of risk 'looks like' intuitively."

Tapping the Stream

At its core, then, machine learning is a mechanism to help us make better sense of data, and to learn from that data on an ongoing basis. Given the data-intrinsic nature of the industry, the potential it affords to support insurance endeavors is considerable.

"If you look at models, data is the fuel that powers them all," says Christos Mitas, vice president of model development at RMS. "We are now operating in a world where that data is expanding exponentially, and machine learning is one tool that will help us to harness that."

One area in which Mitas and his team have been looking at machine learning is in the field of cyber risk modeling. "Where it can play an important role here is in helping us tackle the complexity of this risk. Being able to collect and digest more effectively the immense volumes of data which have been harvested from numerous online sources and datasets will yield a significant advantage."

"MACHINE LEARNING CAN HELP US GREATLY EXPAND THE NUMBER OF EXPLANATORY VARIABLES WE MIGHT INCLUDE TO ADDRESS A PARTICULAR QUESTION"
CHRISTOS MITAS, RMS

He also sees it having a positive impact from an image processing perspective. "With developments in machine learning, for example, we might be able to introduce new data sources into our processing capabilities and make it a faster and more automated data management process to access images in the aftermath of a disaster. Further, we might be able to apply machine learning algorithms to analyze building damage post event to support speedier loss assessment processes."

"Advances in natural language processing could also help tremendously in claims processing and exposure management," he adds, "where you have to consume reams of reports, images and facts rather than structured data. That is where algorithms can really deliver a different scale of potential solutions."

At the underwriting coalface, Hahn believes a clear area where machine learning can be leveraged is in the assessment and quantification of risks. "In this process, we are looking at thousands of data elements to see which of these will give us a read on the risk quality of the potential insured. Analyzing that data based on manual processes, given the breadth and volume, is extremely difficult."

Looking Behind the Numbers

Mitas is, however, highly conscious of the need to establish how machine learning fits into the existing insurance ecosystem before trying to move too far ahead. "The technology is part of our evolution and offers us a new tool to support our endeavors. However, where our process as risk modelers starts is with a fundamental understanding of the scientific principles which underpin what we do."

[Sidebar: Making the Investment. Source: The Future of General Insurance Report, based on research conducted by Marketforce Business Media and the UK's Chartered Insurance Institute in August and September 2016, involving 843 senior figures from across the UK insurance sector.]

"It is true that machine learning can help us greatly expand the number of explanatory variables we might include to address a particular question, for example – but that does not necessarily mean that the answer will more easily emerge. What is more important is to fully grasp the dynamics of the process that led to the generation of the data in the first place."

He continues: "If you look at how a model is constructed, for example, you will have multiple different model components all coupled together in a highly nonlinear, complex system. Unless you understand these underlying structures and how they interconnect, it can be extremely difficult to derive real insight from just observing the resulting data."

"WE NEED TO ENSURE THAT WE CAN EXPLAIN THE RATIONALE BEHIND THE CONCLUSIONS"
PETER HAHN, ZURICH NORTH AMERICA

Hahn also highlights the potential 'black box' issue that can surround the use of machine learning. "End users of analytics want to know what drove the output," he explains, "and when dealing with algorithms that is not always easy. If, for example, we apply specific machine learning techniques to a particular risk and conclude that it is a poor risk, any experienced underwriter is immediately going to ask how you came to that conclusion. You can't simply say you are confident in your algorithms."

"We need to ensure that we can explain the rationale behind the conclusions that we reach," he continues. "That can be an ongoing challenge with some machine learning techniques."

There is no doubt that machine learning has a part to play in the ongoing evolution of the insurance industry. But as with any evolving technology, how it will be used, where and how extensively will be influenced by a multitude of factors. "Machine learning has a very broad scope of potential," concludes Hahn, "but of course we will only see this develop over time as people become more comfortable with the techniques and become better at applying the technology to different parts of their business."
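Hahn's "tagged pictures" analogy maps directly onto supervised learning. As a purely illustrative sketch (the feature names, synthetic data and model choice below are assumptions, not Zurich's or RMS' actual approach), a classifier can be trained on labeled historical submissions and then report both a risk-quality score and which data elements drove it:

```python
# A minimal sketch (illustrative only): supervised learning on hypothetical,
# already-labeled commercial auto submissions, in the spirit of the "tagged
# pictures" analogy above. Feature names and data are invented for illustration.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5_000
X = np.column_stack([
    rng.integers(1, 200, n),        # fleet_size
    rng.uniform(0, 25, n),          # average_vehicle_age_years
    rng.poisson(1.5, n),            # prior_claims_count
    rng.uniform(0, 1, n),           # share_of_urban_mileage
])
# Hypothetical label: 1 = poor risk, 0 = acceptable risk
y = (0.02 * X[:, 2] + 0.3 * X[:, 3] + rng.normal(0, 0.1, n) > 0.25).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)

# Probability that each held-out submission is a poor risk, plus which data
# elements drove the fit -- the "read on risk quality" described above.
poor_risk_prob = model.predict_proba(X_test)[:, 1]
print(dict(zip(["fleet_size", "vehicle_age", "prior_claims", "urban_share"],
               model.feature_importances_.round(3))))
```

In practice the "thousands of data elements" Hahn mentions would replace the four synthetic features here, and inspecting feature importances is one simple way to start answering the "black box" question he raises.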

HELEN YATES | March 17, 2017
The Day a Botnet Took Down the Internet

The Dyn distributed denial of service (DDoS) attack in October 2016 highlighted security flaws inherent in the Internet of Things (IoT). EXPOSURE asks what this means for businesses and insurers as the world becomes increasingly connected.

A decade ago, Internet connections were largely limited to desktop computers, laptops, tablets, and smartphones. Since then there has been an explosion of devices with IP addresses, including baby monitors, connected home appliances, motor vehicles, security cameras, webcams, 'Fitbits' and other wearables. Gartner predicts there will be 20.8 billion things connected to the Internet by 2020.

In a hyper-connected world, governments, corporates, insurers and banks need to better understand the potential for systemic and catastrophic risk arising from a cyber attack seeking to exploit IoT vulnerabilities. With few actual examples of how such attacks could play out, realistic disaster scenarios and cyber modeling are essential tools by which (re)insurers can manage their aggregate exposures and stress test their portfolios.

"IF MALICIOUS ACTORS WANTED TO, THEY WOULD ATTACK CORE SERVICES ON THE INTERNET AND I THINK WE'D BE SEEING A NEAR GLOBAL OUTAGE"
KEN MUNRO, PEN TEST PARTNERS

Many IoT devices currently on the market were not designed with strict IT security in mind. Ethical hackers have demonstrated how everything from cars to children's toys can be compromised. These connected devices are often an organization's weakest link. The cyber criminals responsible for the 2013 Target data breach are understood to have gained access to the retailer's systems and the credit card details of over 40 million customers via the organization's heating, ventilation and air conditioning (HVAC) system.

The assault on DNS hosting firm Dyn in October 2016, which brought down multiple websites including Twitter, Netflix, Amazon, Spotify, Reddit, and CNN in Europe and the U.S., was another wake-up call. The DDoS attack was perpetrated using the Mirai malware to compromise IoT systems. Like a parasite, the malware gained control of an estimated 100,000 devices, using them to bombard and overwhelm Dyn's infrastructure.

This is just the tip of the iceberg, according to Ken Munro, partner at Pen Test Partners. "My first thought [following the Dyn attack] was 'you ain't seen nothing yet'. That particular incident was probably using the top end of a terabyte of data per second, and that's nothing. We've already seen a botnet that is several orders of magnitude larger than that. If malicious actors wanted to, they would attack core services on the Internet and I think we'd be seeing a near global outage."

In the rush to bring new IoT devices to market, IT security has been somewhat of an afterthought, thinks Munro. The situation is starting to change, though, with consumer watchdogs in Norway, the Netherlands and the U.S. taking action. However, there is a significant legacy problem to overcome and it will be several years before current security weaknesses are tackled in a meaningful way. "I've still got our first baby monitor from 10 years ago," he points out. "The Mirai botnet should have been impossible, but it wasn't because a whole bunch of security camera manufacturers did a really cheap job. IT security wasn't on their radar. They were thinking about keeping people's homes secure without even considering that the device itself might actually be the problem."

In attempting to understand the future impact of such attacks, it is important to gain a better understanding of motivation. For cyber criminals, DDoS attacks using IoT botnets could be linked to extortion attempts or to diverting the attention of IT professionals away from other activities. For state-sponsored actors, the purpose could be more sinister, with the intent to cause widespread disruption, and potentially physical damage and bodily harm.

Insurers Stress-Test "Silent" Cyber

It is the latter scenario that is of growing concern to risk and insurance managers. Lloyd's, for instance, has asked syndicates to create at least three internal "plausible but extreme" cyber attack scenarios as stress-tests for cyber catastrophe losses. It has asked them to calculate their total gross aggregate exposure to each scenario across all classes, including "silent" cyber.

AIG is also considering how a major cyber attack could impact its book of business. "We are looking at it, not only from our own ERM perspective, but also to understand what probable maximum losses there could be as we start to introduce other products and are able to attach cyber to traditional property and casualty policies," explains Mark Camillo, head of cyber at AIG. "We look at different types of scenarios and how they would impact a book."

AIG and a number of Lloyd's insurers have expanded their cyber offerings to include cover for non-damage business interruption and physical damage and bodily harm arising from a cyber incident. Some carriers – including FM Global – are explicitly including cyber in their traditional suite of products. Others have yet to include explicit wording on how traditional products would respond to a cyber incident.

"WE HAVE RELEASED A NUMBER OF CYBER-PHYSICAL ATTACK SCENARIOS THAT CAUSE LOSSES TO TRADITIONAL PROPERTY INSURANCE"
ANDREW COBURN, RMS

"I don't know if the market will move towards exclusions or including affirmative cyber coverage within property and casualty to give insureds a choice as to how they want to purchase it," states Camillo. "What will change is that there is going to have to be some sort of due diligence to ensure cyber exposures are coded properly and carriers are taking that into consideration in capital requirements for these types of attacks."

In addition to markets such as Lloyd's, there is growing scrutiny from insurance industry regulators, including the Prudential Regulation Authority in the U.K., on how a major cyber event could impact the insurance industry and its capital buffers. They are putting pressure on those carriers that are currently silent on how their traditional products would respond, to make it clear whether cyber-triggered events would be covered under conventional policies.

"The reinsurance market is certainly concerned about, and constantly looking at the potential for, catastrophic events that could happen across a portfolio," says William Henriques, senior managing director and co-head of the Cyber Practice Group at Aon Benfield. "That has not stopped them from writing cyber reinsurance and there's enough capacity out there. But as the market grows and gets to US$10 billion, and reinsurers keep supporting that growth, they are going to be watching that accumulation and potential for catastrophic risk and managing that."

Catastrophic Cyber Scenarios

In December 2015 and again in December 2016, parts of Ukraine's power grid were taken down. WIRED magazine noted that many parts of the U.S. grid were less secure than Ukraine's and would take longer to reboot. It was eerily similar to a fictitious scenario published by Cambridge University's Centre for Risk Studies in partnership with Lloyd's in 2015. 'Business Blackout' considered the impact of a cyber attack on the U.S. power grid, estimating that the total economic impact from the 1-in-200 scenario would be US$243 billion, rising to US$1 trillion in its most extreme form.

It is not beyond the realms of possibility for a Mirai-style virus targeting smart thermostats to be used to achieve such a blackout, thinks Pen Test Partners' Ken Munro. "You could simultaneously turn them all on and off at the same time and create huge power spikes on the electricity grid. If you turn it on and off and on again quickly, you'll knock out the grid – then we would see some really serious consequences."

Smart thermostats could be compromised in other ways, for instance by targeting food and pharmaceutical facilities with the aim to spoil goods. There is a commonly held belief that the industrial control systems and supervisory control and data acquisition systems (ICS/SCADA) used by energy and utility companies are immune to cyber attacks because they are disconnected from the Internet, a protective measure known as "air gapping". Smart thermostats and other connected devices could render that defense obsolete.

In its Cyber Accumulation Management System (CAMS v2.0), RMS considered how silent cyber exposures could impact accumulation risk in the event of major cyber attacks on operations technology, using the Ukrainian power grid attack as an example. "We've released a number of cyber-physical attack scenarios that cause losses to traditional property insurance," explains Andrew Coburn, senior vice president at RMS and a founder and member of the executive team of the Cambridge Centre for Risk Studies. "We're working with our clients on trying to figure out what level of stress test should be running," he explains. "The CAMS system we've released is about running large numbers of scenarios and we have extended that to look at silent cover, things in conventional insurance policies that could potentially be triggered by a cyber attack, such as fires and explosions."

Multiple lines of business could be impacted by a cyber event, thinks Coburn, including nearly all property classes, as well as aviation and aerospace. "We have included some scenarios for marine and cargo insurance, offshore energy lines of business, industrial property, large numbers of general liability and professional lines, and, quite importantly, financial institutions professional indemnity, D&O and specialty lines."

"The IoT is a key element of the systemic potential of cyber attacks," he says. "Most of the systemic risk is about looking at your tail risk. Insurers need to look at how much capital they need to support each line of business, how much reinsurance they need to buy and how they structure their risk capital."

RMS CAMS v2.0 Scenarios

Cyber-Induced Fires in Commercial Office Buildings
Hackers exploit vulnerabilities in the smart battery management system of a common brand of laptop, sending their lithium-ion batteries into a thermal runaway state. The attack is coordinated to occur on one night. A small proportion of infected laptops that are left on charge overnight overheat and catch fire, and some unattended fires in commercial office buildings spread to cause major losses. Insurers face claims for a large number of fires in their commercial property and homeowners' portfolios.

Cyber-Enabled Marine Cargo Theft From Port
Cyber criminals gain access to a port management system in use at several major ports. They identify high-value cargo shipments and systematically switch and steal containers passing through the ports over many months. When the process of theft is finally discovered, the hackers scramble the data in the system, disabling the ports from operating for several days. Insurers face claims for cargo loss and business interruption in their marine lines.

ICS-Triggered Fires in Industrial Processing Plants
External saboteurs gain access to the process control network of large processing plants, and spoof the thermostats of the industrial control systems (ICS), causing heat-sensitive processes to overheat and ignite flammable materials in storage facilities. Insurers face sizeable claims for fire and explosions in a number of major industrial facilities in their large accounts and facultative portfolio.

PCS-Triggered Explosions on Oil Rigs
A disgruntled employee gains access to a Network Operations Centre (NOC) controlling a field of oil rigs, and manipulates several of the Platform Control Systems (PCS) to cause structural misalignment of well heads, damage to several rigs, oil and gas release, and fires. At least one platform has a catastrophic explosion. Insurers face significant claims to multiple production facilities in their offshore energy book.

Regional Power Outage From Cyber Attack on U.S. Power Generation
A well-resourced cyber team infiltrates malware into the control systems of U.S. power generating companies that creates desynchronization in certain types of generators. Sufficient generators are damaged to cause a cascading regional power outage that is complex to repair. Restoration of power to 90 percent of customers takes two weeks. Insurers face claims in many lines of business, including large commercial accounts, energy, homeowners and specialty lines. The scenario is published as a Lloyd's Emerging Risk Report, 'Business Blackout', by the Cambridge Centre for Risk Studies and was released in RMS CAMS v1.1.

Regional Power Outage From Cyber Attack on UK Power Distribution
A nation-state plants 'Trojan Horse' rogue hardware in electricity distribution substations, which is activated remotely to curtail power distribution and cause rolling blackouts intermittently over a multi-week campaign. Insurers face claims in many lines of business, including large commercial accounts, energy, homeowners and specialty lines. The scenario is published as 'Integrated Infrastructure' by the Cambridge Centre for Risk Studies, and was released in RMS CAMS v1.1.
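The Lloyd's request described above is, mechanically, an aggregation exercise: roll up the gross limits exposed to each scenario across every class of business and flag what would respond silently under conventional wordings. The sketch below is a minimal, hypothetical illustration of that roll-up; the scenario names echo the CAMS list above, but the figures, categories and level of granularity are invented and do not represent any carrier's actual methodology.

```python
# A minimal sketch (hypothetical data, not the Lloyd's or RMS CAMS methodology):
# rolling up gross aggregate exposure to each stress scenario across lines of
# business, including "silent" cyber picked up by conventional wordings.
from collections import defaultdict

# (scenario, line_of_business, cover_type) -> gross limit exposed, USD m
exposures = [
    ("Cyber-Induced Office Fires",   "Commercial Property", "silent",      120.0),
    ("Cyber-Induced Office Fires",   "Homeowners",          "silent",       35.0),
    ("Marine Cargo Theft From Port", "Marine Cargo",        "silent",       80.0),
    ("Regional U.S. Power Outage",   "Energy",              "affirmative",  60.0),
    ("Regional U.S. Power Outage",   "Large Commercial",    "silent",      210.0),
]

by_scenario = defaultdict(float)
by_scenario_and_cover = defaultdict(float)
for scenario, lob, cover_type, limit in exposures:
    by_scenario[scenario] += limit
    by_scenario_and_cover[(scenario, cover_type)] += limit

# Report each scenario's total gross aggregate exposure, largest first,
# with the portion that sits silently in conventional policies.
for scenario, total in sorted(by_scenario.items(), key=lambda kv: -kv[1]):
    silent = by_scenario_and_cover.get((scenario, "silent"), 0.0)
    print(f"{scenario}: total {total:.0f}m, of which silent {silent:.0f}m")
```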

NIGEL ALLEN | March 17, 2017
An Unparalleled View of Earthquake Risk

As RMS launches Version 17 of its North America Earthquake Models, EXPOSURE looks at the developments leading to the update and how distilling immense stores of high-resolution seismic data into the industry's most comprehensive earthquake models will empower firms to make better business decisions.

The launch of RMS' latest North America Earthquake Models marks a major step forward in the industry's ability to accurately analyze and assess the impacts of these catastrophic events, enabling firms to write risk with greater confidence thanks to the rigorous science and engineering that underpin the models. The value of the models to firms seeking new ways to differentiate and diversify their portfolios, as well as price risk more accurately, comes from a host of data and scientific updates. These include the incorporation of seismic source data from the U.S. Geological Survey (USGS) 2014 National Seismic Hazard Mapping Project, as well as a first groundwater map for liquefaction.

"Our goal was to provide clients with a seamless view of seismic hazards across the U.S., Canada and Mexico that encapsulates the latest data and scientific thinking — and we've achieved that and more," explains Renee Lee, head of earthquake model and data product management at RMS. "There have been multiple developments – research and event-driven – which have significantly enhanced understanding of earthquake hazards. It was therefore critical to factor these into our models to give our clients better precision and improved confidence in their pricing and underwriting decisions, and to meet the regulatory requirements that models must reflect the latest scientific understanding of seismic hazard."

Founded on Collaboration

Since the last RMS model update in 2009, the industry has witnessed the two largest seismic-related loss events in history – the New Zealand Canterbury Earthquake Sequence (2010-2011) and the Tohoku Earthquake (2011). "We worked very closely with the local markets in each of these affected regions," adds Lee, "collaborating with engineers and the scientific community, as well as sifting through billions of dollars of claims data, in an effort not only to understand the seismic behavior of these events, but also their direct impact on the industry itself."

A key learning from this work was the impact of catastrophic liquefaction. "We analyzed billions of dollars of claims data and reports to understand this phenomenon both in terms of the extent and severity of liquefaction and the different modes of failure caused to buildings," says Justin Moresco, senior model product manager at RMS. "That insight enabled us to develop a high-resolution approach to model liquefaction that we have been able to introduce into our new North America Earthquake Models."

An important observation from the Canterbury Earthquake Sequence was the severity of liquefaction, which varied over short distances. Two buildings, nearly side-by-side in some cases, experienced significantly different levels of hazard because of shifting geotechnical features. "Our more developed approach to modeling liquefaction captures this variation, but it's just one of the areas where the new models can differentiate risk at a higher resolution," says Moresco. "The updated models also do a better job of capturing where soft soils are located, which is essential for predicting the hot spots of amplified earthquake shaking."

"There is no doubt that RMS embeds more scientific data into its models than any other commercial risk modeler," Lee continues. "Throughout this development process, for example, we met regularly with USGS developers, having active discussions about the scientific decisions being made. In fact, our model development lead is on the agency's National Seismic Hazard and Risk Assessment Steering Committee, while two members of our team are authors associated with the NGA-West 2 ground motion prediction equations."

The North America Earthquake Models in Numbers
- 360,000: fault sources included in UCERF3, the USGS California seismic source model
- >3,800: unique U.S. vulnerability functions in RMS' 2017 North America Earthquake Models for building shake coverage, with the ability to further differentiate risk based on 21 secondary building characteristics
- >30: size of the team at RMS that worked on updating the latest model

Distilling the Data

While data is the foundation of all models, the challenge is to distil it down to its most business-critical form to give it value to clients. "We are dealing with data sets spanning millions of events," explains Lee, "for example, UCERF3 — the USGS California seismic source model — alone incorporates more than 360,000 fault sources. So, you have to condense that immense amount of data in such a way that it remains robust but our clients can run it within 'business hours'."

Since the release of the USGS data in 2014, RMS has had over 30 scientists and engineers working on how to take data generated by a supercomputer once every five to six years and apply it to a model that enables clients to use it dynamically to support their risk assessment in a systematic way. "You need to grasp the complexities within the USGS model and how the data has evolved," says Mohsen Rahnama, chief risk modeling officer and general manager of the RMS models and data business. "In the previous California seismic source model, for example, the USGS used 480 logic tree branches, while this time they use 1,440. You can't simply implement the data – you have to understand it. How do these faults interact? How does it impact ground motion attenuation? How can I model the risk systematically?"

As part of this process, RMS maintained regular contact with USGS, keeping them informed of how they were implementing the data and what distillation had taken place to help validate their approach.

Building Confidence

Demonstrating its commitment to transparency, RMS also provides clients with access to its scientists and engineers to help them drill down into the changes in the model. Further, it is publishing comprehensive documentation on the methodologies and validation processes that underpin the new version.

Expanding the Functionality
- Upgraded soil amplification methodology that empowers (re)insurers to enter a new era of high-resolution geotechnical hazard modeling, including the development of a Vs30 (average shear wave velocity in the top 30 meters at site) data layer spanning North America
- Advanced ground motion models leveraging thousands of historical earthquake recordings to accurately predict the attenuation of shaking from source to site
- New functionality enabling high and low representations of vulnerability and ground motion
- 3,800+ unique U.S. vulnerability functions for building shake coverage, with the ability to further differentiate risk based on 21 secondary building characteristics
- Latest modeling for very tall buildings (>40 stories), enabling more accurate underwriting of high-value assets
- New probabilistic liquefaction model leveraging data from the 2010-2011 Canterbury Earthquake Sequence in New Zealand
- Ability to evaluate secondary perils: tsunami, fire following earthquake and earthquake sprinkler leakage
- New risk calculation functionality based on an event set that includes induced seismicity
- Updated basin models for Seattle, the Mississippi Embayment, Mexico City and Los Angeles, plus a new basin model for Vancouver
- Latest historical earthquake catalog from the Geological Survey of Canada integrated, plus the latest research data on the Mexico Subduction Zone
- Seismic source data from the U.S. Geological Survey (USGS) 2014 National Seismic Hazard Mapping Project incorporated, which includes the third Uniform California Earthquake Rupture Forecast (UCERF3)
- Updated Alaska and Hawaii hazard model, which was not updated by USGS
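The distillation challenge Lee and Rahnama describe, condensing hundreds of thousands of sources so that clients can run the model within "business hours", is at heart about thinning an event set without distorting the statistics it encodes. The sketch below shows one generic, illustrative way to do that (keep the high-rate events, randomly thin the long tail of rare ones and reweight); the numbers are synthetic and this is not the RMS methodology.

```python
# A minimal sketch (illustrative only, not the RMS distillation methodology):
# thinning a very large stochastic event set while preserving, in expectation,
# its overall annual rate and average annual loss.
import numpy as np

rng = np.random.default_rng(42)
n_events = 360_000                                  # order of the UCERF3 source count
rate = rng.lognormal(-9.0, 1.5, n_events)           # hypothetical annual event rates
loss = rng.lognormal(16.0, 2.0, n_events)           # hypothetical mean losses (USD)

# Keep every "important" high-rate event; randomly thin the long tail of rare
# events and scale the kept tail rates up so expected totals are preserved.
threshold = np.quantile(rate, 0.90)
keep_all = rate >= threshold
tail = ~keep_all
q = 0.10                                            # keep 10% of tail events
keep_tail = tail & (rng.random(n_events) < q)

thinned_rate = np.concatenate([rate[keep_all], rate[keep_tail] / q])
thinned_loss = np.concatenate([loss[keep_all], loss[keep_tail]])

print(f"events: {n_events} -> {thinned_rate.size}")
print(f"average annual loss, full vs thinned: "
      f"{(rate * loss).sum():.3e} vs {(thinned_rate * thinned_loss).sum():.3e}")
```

Because the thinned tail events carry scaled-up rates, the average annual loss of the reduced set matches the full set in expectation, which is the kind of robustness check a real distillation exercise would need to pass at far finer granularity.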

HELEN YATES | March 17, 2017
Managing the Next Financial Shock

EXPOSURE reports on how a pilot project to stress test banks' exposure to drought could hold the key to future economic resilience.

There is a growing recognition that environmental stress testing is a crucial instrument to ensure a sustainable financial system. In December 2016, the Task Force on Climate-related Financial Disclosures (TCFD) released its recommendations for effective disclosure of climate-related financial risks.

"This represents an important effort by the private sector to improve transparency around climate-related financial risks and opportunities," said Michael Bloomberg, chair of the TCFD. "Climate change is not only an environmental problem, but a business one as well. We need business leaders to join us to help spread these recommendations across their industries in order to help make markets more efficient and economies more stable, resilient and sustainable."

Why Drought?

Drought is a significant potential source of shock to the global financial system. There is a common misconception that sustained lack of water is primarily a problem for agriculture and food production. In Europe alone, it is estimated that around 40 percent of total water extraction is used for industry and energy production (cooling in power plants) and 15 percent for public water supply. The main water consumption sectors are irrigation, utilities and manufacturing.

The macro-economic impact of a prolonged or systemic drought could therefore be severe, and is currently the focus of a joint project between RMS and a number of leading financial institutions and development agencies to stress test lending portfolios to see how they would respond to environmental risk.

"ONLY BY BRINGING TOGETHER MINISTERIAL LEVEL GOVERNMENT OFFICIALS WITH LEADERS IN COMMERCE CAN WE ADDRESS THE WORLD'S BIGGEST ISSUES"
DANIEL STANDER, RMS

"Practically every industry in the world has some reliance on water availability in some shape or form," states Stephen Moss, director, capital markets at RMS. "And, as we've seen, as environmental impacts become more frequent and severe, so there is a growing awareness that water — as a key future resource — is starting to become more acute."

"So the questions are: do we understand how a lack of water could impact specific industries and how that could then flow down the line to all the industrial activities that rely on the availability of water? And then how does that impact on the broader economy?" he continues. "We live in a very interconnected world and as a result, the impact of drought on one industry sector or one geographic region can have a material impact on adjacent industries or regions, regardless of whether they themselves are impacted by that phenomenon or not."

This interconnectivity is at the heart of why a hazard such as drought could become a major systemic threat for the global financial system, explains RMS scientist Dr. Navin Peiris. "You could have an event or drought occurring in the U.S. and any reduction in production of goods and services could impact global supply chains and draw in other regions due to the fact the world is so interconnected."

The ability to model how drought is likely to impact banks' loan default rates will enable financial institutions to accurately measure and control the risk. By adjusting their own risk management practices there should be a positive knock-on effect that ripples down if banks are motivated to encourage better water conservation behaviors amongst their corporate borrowers, explains Moss.

"The expectation would be that in the same way that an insurance company incorporates the risk of having to pay out on a large natural event, a bank should also be incorporating that into their overall risk assessment of a corporate when providing a loan – and including that incremental element in the pricing," he says. "And just as insureds are motivated to defend themselves against flood or to put sprinklers in the factories in return for a lower premium, if you could provide financial incentives to borrowers through lower loan costs, businesses would then be encouraged to improve their resilience to water shortage."

A Critical Stress Test

In May 2016, the Natural Capital Finance Alliance, which is made up of the Global Canopy Programme (GCP) and the United Nations Environment Programme Finance Initiative, teamed up with the Deutsche Gesellschaft für Internationale Zusammenarbeit (GIZ) GmbH Emerging Markets Dialogue on Finance (EMDF) and several leading financial institutions to launch a project to pilot scenario modeling.

"THERE IS A GROWING AWARENESS THAT WATER — AS A KEY FUTURE RESOURCE — IS STARTING TO BECOME MORE ACUTE"
STEPHEN MOSS, RMS

Funded by the German Federal Ministry for Economic Cooperation and Development (BMZ), RMS was appointed to develop a first-of-its-kind drought model. The aim is to help financial institutions and wider economies become more resilient to extreme droughts, as Yannick Motz, head of the emerging markets dialogue on finance at GIZ, explains. "GIZ has been working with financial institutions and regulators from G20 economies to integrate environmental indicators into lending and investment decisions, product development and risk management. Particularly in the past few years, we have experienced a growing awareness in the financial sector for climate-related risks."

[Image: The Dustbowl. The first distinct drought (1930-1931) in the 'dust bowl' years affected much of the north east and western U.S.]

"The lack of practicable methodologies and tools that adequately quantify, price and assess such risks, however, still impedes financial institutions in fully addressing and integrating them into their decision-making processes," he continues. "Striving to contribute to filling this gap, GIZ and NCFA initiated this pilot project with the objective to develop an open-source tool that allows banks to assess the potential impact of drought events on the performance of their corporate loan portfolio."

It is a groundbreaking project between key stakeholders across public and private sectors, according to RMS managing director Daniel Stander. "There are certain things in this world that you can only get done at a Davos level. You need to bring ministerial-level government officials and members of commerce together. It's only that kind of combination that is going to address the world's biggest issues. At RMS, experience has taught us that models don't just solve problems. With the right level of support, they can make markets and change behaviors as well. This initiative is a good example of that."

RMS adapted well-established frameworks from the insurance sector to build – in a consortium complemented by the Universities of Cambridge and Oxford – a tool for banks to stress test the impact of drought.
The model was built in close collaboration with several financial institutions, including the Industrial and Commercial Bank of China (ICBC), Caixa Econômica Federal, Itaú and Santander in Brazil, Banorte, Banamex and Trust Funds for Rural Development (FIRA) in Mexico, UBS in Switzerland and Citigroup in the US. “Some of the largest losses we saw in some of our scenarios were not necessarily as a result of an industry sector not having access to water, but because other industry sectors didn’t have access to water, so demand dropped significantly and those companies were therefore not able to sell their wares. This was particularly true for petrochemical businesses that are heavily reliant on the health of the broader economy,” explains Moss. “So, this model is a broad framework that incorporates domestic interconnectivity and trade, as well as global macroeconomic effects.” There is significant scope to apply this approach to modeling other major threats and potential sources of global economic shock, including natural, manmade and emerging perils. “The know-how we’ve applied on this project can be used to evaluate the potential impacts of other stresses,” explains Peiris. “Drought is just one environmental risk facing the financial services industry. This approach can be replicated to measure the potential impact of other systemic risks on macro and micro economic scales.”
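As a rough illustration of what stress testing a corporate loan book against drought can mean mechanically, the sketch below applies scenario-dependent uplifts to sector-level probabilities of default and recomputes expected credit loss (PD x LGD x EAD). Every sector, figure and multiplier is an assumption made up for illustration; the RMS/NCFA tool described above derives these effects from modeled drought scenarios, trade linkages and macroeconomic interconnectivity rather than flat multipliers.

```python
# A minimal sketch (hypothetical numbers, not the RMS/NCFA drought tool):
# stress a corporate loan book by scaling each sector's probability of default
# (PD) under a drought scenario, then compare expected credit loss (ECL),
# where ECL = PD x LGD x EAD for each loan.

loans = [  # sector, exposure at default (USD m), baseline PD, loss given default
    ("agriculture",      120.0, 0.030, 0.55),
    ("power_generation", 300.0, 0.012, 0.45),
    ("petrochemicals",   220.0, 0.015, 0.50),
    ("retail",           180.0, 0.020, 0.40),
]

# Hypothetical PD multipliers for a severe, multi-year drought, reflecting both
# direct water dependence and the knock-on demand effects described above.
drought_pd_multiplier = {
    "agriculture": 3.0,
    "power_generation": 1.8,
    "petrochemicals": 2.2,   # hit indirectly via weaker demand in the wider economy
    "retail": 1.3,
}

def expected_credit_loss(stressed: bool) -> float:
    total = 0.0
    for sector, ead, pd, lgd in loans:
        pd_used = min(1.0, pd * drought_pd_multiplier[sector]) if stressed else pd
        total += pd_used * lgd * ead
    return total

base, stressed = expected_credit_loss(False), expected_credit_loss(True)
print(f"baseline ECL: {base:.1f}m, drought-stressed ECL: {stressed:.1f}m "
      f"(+{100 * (stressed / base - 1):.0f}%)")
```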

EDITOR | March 17, 2017
The Analytics-Driven Organization

Over the past 15 years, revolutionary technological advances and an explosion of new digital data sources have expanded and reinvented the core disciplines of insurers. Today's advanced analytics for insurance push far beyond the boundaries of traditional actuarial science. The opportunity for the industry to gain transformational agility in analytics is within reach. EXPOSURE examines what can be learnt from other sectors to create more analytics-driven organizations and avoid 'DRIP'.

Many (re)insurers seeking a competitive edge look to big data and analytics (BD&A) to help address a myriad of challenges, such as the soft market, increasing regulatory pressures, and ongoing premium pressures. And yet amidst the buzz of BD&A, we see a lack of big data strategy specifically for evolving pricing, underwriting and risk selection, areas which provide huge potential gains for firms.

IMAGINE THIS LEVEL OF ANALYTICAL CAPABILITY PROVIDED IN REAL-TIME AT THE POINT OF UNDERWRITING; A UTOPIA MANY IN THE INDUSTRY ARE SEEKING

While there are many revolutionary technological advances to capture and store big data, organizations are suffering from 'DRIP' – they are data rich but information poor. This is due to the focus being on data capture, management, and structures, at the expense of creating usable insights that can be fed to the people at the point of impact – delivering the right information to the right person at the right time.

Other highly regulated industries have found ways to start addressing this, providing us with sound lessons on how to introduce more agility into our own industry using repeatable, scalable analytics.

Learning From Other Industries

When you look across organizations or industries that have got the BD&A recipe correct, three clear criteria are evident, giving good guidance for insurance executives building their own analytics-driven organizations.

Delivering Analytics to the Point of Impact

In the healthcare industry, the concept of the back-office analyst is not that common. The analyst is a frontline worker – the doctor, the nurse practitioner, the social worker – so solutions for healthcare are designed accordingly.

Let's look within our own industry at the complex role of the portfolio manager. This person is responsible for large, diverse sets of portfolios of risk that span multiple regions, perils and lines of business. And the role relies heavily on having visibility across their entire book of business.

A WILLIS TOWERS WATSON SURVEY REVEALS THAT LESS THAN 45 PERCENT OF U.S. PROPERTY AND CASUALTY INSURANCE EXECUTIVES ARE USING BIG DATA FOR EVOLVING PRICING, UNDERWRITING AND RISK SELECTION. THIS NUMBER IS EXPECTED TO JUMP TO 80 PERCENT IN TWO YEARS' TIME

Success comes from insights that give them a clear line of sight into the threats and opportunities of their portfolios – without having to rely on a team of technical analysts to get the information. They not only need the metrics and analytics at their disposal to make informed decisions, they also need to be able to interrogate and dive into the data, understand its underlying composition, and run scenarios so they can make the right investment choice. If for every analysis they needed a back-office analyst or IT supporter to get a data dump and then spend time configuring it for use, their business agility would be compromised. To truly become an analytics-driven organization, firms need to ensure the analytics solutions they implement provide the actual decision-maker with all the necessary insights to make informed decisions in a timely manner.

Ensuring Usability

Usability is not just about the user interface. Big data can be paralyzing. Having access to actionable insights in a format that provides context and underlying assumptions is important. Often, not only does the frontline worker need to manage multiple analytics solutions to get at insights, but even the user persona for these systems is not well defined. At this stage, the analytics must be highly workflow-driven, with due consideration given to the veracity of the data to reduce uncertainty.

Consider the analytics tools used by doctors when diagnosing a patient's condition. They input standard information – age, sex, weight, height, ethnicity, address – and the patient's symptoms, and are provided not with a defined prognosis but a set of potential diagnoses accompanied by a probability score and the sources.

Imagine this level of analytical capability provided in real-time at the point of underwriting; a Utopia many in the industry are seeking that has only truly been achieved by a few of the leading insurers. In this scenario, underwriters would receive a submission and understand exactly the composition of business they were taking on. They could quickly understand the hazards that could affect their exposures and the impact of taking on the business on their capacity – regardless of whether it was a probabilistically-modeled property portfolio or a marine book that was monitored in a deterministic way. They could also view multiple submissions and compare them, not only based on how much premium could be brought in by each, but also on how taking on a piece of business could diversify the group-level portfolio. The underwriter not only has access to the right set of analytics, they also have a clear understanding of other options and underlying assumptions.

Integration Into the Common Workflow

To achieve data nirvana, BD&A output needs to integrate naturally into daily business-as-usual operations. When analytics are embedded directly into the daily workflow, there is a far higher success rate of it being put to effective use. A good illustration is customer service technology. Historically, customer service agents had to access multiple systems to get information about a caller. Now all their systems are directly integrated into the customer service software – whether it is a customer rating and guidance on how best to handle the customer, or a ranking of the latest offers they might have a strong affinity for.

SKILLED UNDERWRITERS WANT ACCESS TO ANALYTICS THAT ALLOW THEM TO DERIVE INSIGHTS TO BE PART OF THE DAILY WORKFLOW FOR EVERY RISK THEY WRITE

It is the same principle in insurance. It is important to ensure that whatever system your underwriter, portfolio manager, or risk analyst is using is built and designed with an open architecture. This means it is designed to easily accept inputs from your legacy systems or your specific intellectual property-intensive processes.

Underwriting is an art. And while there are many risks and lines of business that can be automated, in specialty insurance there is still a need for human-led decision-making. Specialty underwriters combine the deep knowledge of the risks they write, historical loss data, and their own underwriting experience.
Having good access to analytics is key to them, and they need it at their fingertips – with little reliance on technical analysts. Skilled underwriters want access to analytics that allow them to derive insights to be part of the daily workflow for every risk they write. Waiting for quarterly board reports to be produced, which tell them how much capacity they have left, or having to wait for another group to run the reports they need, means it is not a business-as-usual process.

[Chart: How will insurers use big data? Survey of property and casualty insurance executives. Source: Willis Towers Watson]
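The submission comparison described above, weighing premium against the diversification effect on the group-level portfolio, can be made concrete with even a crude concentration metric. The sketch below uses a Herfindahl-style index over hypothetical peril regions purely for illustration; a production implementation would rely on modeled marginal metrics rather than simple exposure shares.

```python
# A minimal sketch (illustrative only): comparing incoming submissions not just
# on premium but on how they would change group-level portfolio concentration,
# so the decision-maker sees ranked options with the assumptions exposed.
portfolio = {"US_wind": 400.0, "EU_flood": 150.0, "JP_quake": 120.0}  # exposure, USD m

submissions = [
    # name, peril region, limit offered (USD m), premium (USD m)
    ("Submission A", "US_wind",  50.0, 4.0),
    ("Submission B", "EU_flood", 50.0, 3.2),
    ("Submission C", "JP_quake", 50.0, 3.0),
]

def herfindahl(book: dict) -> float:
    """Concentration index: sum of squared exposure shares (lower = more diverse)."""
    total = sum(book.values())
    return sum((v / total) ** 2 for v in book.values())

baseline = herfindahl(portfolio)
for name, region, limit, premium in submissions:
    candidate = dict(portfolio)
    candidate[region] = candidate.get(region, 0.0) + limit
    delta = herfindahl(candidate) - baseline
    # A negative delta means the submission diversifies the group-level book.
    print(f"{name}: premium {premium:.1f}m, concentration change {delta:+.4f}")
```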

EDITOR | March 17, 2017
What One Thing Would Help Close The Protection Gap?

In each edition of EXPOSURE, we ask three experts their opinion on how they would tackle a major risk and insurance challenge. This issue, we consider the protection gap, which can be defined as the gap between insured and economic losses in a particular region and/or type of exposure. As our experts John Seo, Kate Stillwell and Evan Glassman note, protection gaps are not just isolated to the developing world or catastrophe classes of business.

John Seo, co-founder and managing principal, Fermat Capital

The protection gap is often created by the terms of the existing insurance itself, and hence it could be closed by designing new, parametric products. Flood risk is excluded or sub-limited severely in traditional insurance coverage, for instance. So the insurance industry says "we cover flood", but they don't cover it adequately and are heavily guarded in the way they cover it.

A great example in the public domain was in 2015 in the Southern District Court of New York with New York University (NYU) versus FM Global. NYU filed a claim with FM Global for $1.45 billion in losses from Hurricane Sandy, and FM Global paid $40 million. FM Global's contention was that a flood clause in NYU's coverage had been triggered, and because it was in essence a flood event, their coverage was limited to $40 million.

Ten to 20 years down the line… we might find that we're actually naked on cyber

Ostensibly, on the surface NYU had $1.85 billion in coverage, but when it came to a flood event they really only had $40 million. So the protection gap is not just because there's absolutely no insurance coverage for these types of perils and risks in these geographies and locations, but because the terms of protection are severely sub-limited. And I would claim that's the case for cyber risk for sure. The industry is very enthusiastic about its growth, but I can see, 10 to 20 years down the line, with a significant national event on cyber, that we might find that we're actually naked on cyber, as NYU discovered with Sandy. You could have a Fortune 50 company in the U.S. thinking they have $1 billion of cyber coverage, and they're going to have an event that threatens their existence… but they'll get a check for $50 million in the post.

Kate Stillwell, founder and CEO, Jumpstart Recovery

My absolute fundamental goal is to get twice as many people covered for earthquake in California. That doesn't mean they're going to have the same kind of earthquake insurance product that's available now. What they will have is a product which doesn't fill the whole gap but does achieve the goal of immediate economic stimulus, and that creates a virtuous circle that gets other investment coming in. I wouldn't have founded Jumpstart if I didn't believe that a lump-sum earthquake-triggered cover for homeowners and renters would help to build resilience… and building resilience fundamentally means filling the protection gap. I am absolutely motivated to ensure that people who are impacted by natural catastrophes have financial protection and can recover from losses quickly.

Developing resources and financial products that tap into human optimism can fill this gap

And in my mind, if I had to choose only one thing to help close the protection gap, it would be to align the products (and the resources) that are available with human psychology. Human beings are not wired to process and consider low-probability, high-consequence catastrophe events. But if we can develop resources and financial products that tap into human optimism then potentially we can fill this protection gap. Providing a bit of money to jumpstart the post-earthquake recovery process will help to transform consumer thinking around earthquakes from, 'this is a really bad peril and I don't want to think about it' into, 'it won't be so bad because I will have a little bit of resource to bounce back'.

Evan Glassman, president and CEO, New Paradigm Underwriters

There's a big disconnect between the insured loss and economic loss when it comes to natural catastrophes such as U.S. windstorm and earthquake. From our perspective, parametric insurance becoming more mainstream and a common and widely-adopted vehicle to work alongside traditional insurance would help to close the protection gap. The insurance industry overall does a good job of providing an affordable large limit layer of indemnity protection. But the industry is only able to do that, and not go out of business after every event, as a result of attaching after a significant buffer layer of the most likely losses. Parametric insurance is designed to work in conjunction with traditional insurance to cover that gap. The tranche of deductibles in tier one wind-zones from the Gulf Coast to the Northeast has been estimated at $400 billion by RMS… and that's just the deductible tranche.

Parametric insurance is designed to work with traditional insurance to cover the gap

The parametric insurance space is growing but it hasn't reached a critical mass yet where it's a mainstream, widely-accepted practice, much like when people buy a property policy, they buy a liability policy and they buy a parametric policy. We're working towards that, and once the market gets there the protection gap will become a lot smaller. It's good for society and it's a significant opportunity for the industry as it's a very big, and currently very underserved, market. This model does have the potential to be used in underdeveloped insurance markets. However, I am aware there are certain areas where there are not yet established models that can provide the analytics for reinsurers and capital markets to be able to quantify and charge the appropriate price for the exposure.
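Glassman's point about parametric cover sitting beneath the traditional deductible layer comes down to paying on a measurable index rather than an adjusted loss. The payout schedule below is hypothetical and deliberately simple; real products define the trigger, the measurement source and the schedule contractually, and the difference between the index payout and the actual loss is the basis risk.

```python
# A minimal sketch (hypothetical trigger and payout, not any carrier's product):
# a parametric hurricane cover that pays a fixed amount based on measured wind
# speed at the insured location, sitting underneath a traditional deductible.
def parametric_payout(measured_wind_mph: float) -> float:
    """Step payout schedule keyed to a measurable index rather than an adjusted loss."""
    schedule = [  # (wind speed at or above, payout in USD)
        (150, 1_000_000),
        (130, 500_000),
        (110, 250_000),
    ]
    for trigger, payout in schedule:
        if measured_wind_mph >= trigger:
            return payout
    return 0.0

# The payout depends only on the measured index, so settlement needs no loss
# adjustment -- which is what makes it fast, and also what creates basis risk.
for wind in (95, 115, 140, 160):
    print(wind, "mph ->", parametric_payout(wind))
```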

HELEN YATES | March 17, 2017
The Future of (Re)Insurance: Evolution of the Insurer DNA

The (re)insurance industry is at a tipping point. Rapid technological change, disruption through new, more efficient forms of capital and an evolving risk landscape are challenging industry incumbents like never before. Inevitably, as EXPOSURE reports, the winners will be those who find ways to harmonize analytics, technology, industry innovation, and modeling.

There is much talk of disruptive innovation in the insurance industry. In personal lines insurance, disintermediation, the rise of aggregator websites and the Internet of Things (IoT) – such as connected car, home, and wearable devices – promise to transform traditional products and services. In the commercial insurance and reinsurance space, disruptive technological change has been less obvious, but behind the scenes the industry is undergoing some fundamental changes.

The Tipping Point

The 'Uber' moment has yet to arrive in reinsurance, according to Michael Steel, global head of business development at RMS. "The change we're seeing in the industry is constant. We're seeing disruption throughout the entire insurance journey. It's not the case that the industry is suffering from a short-term correction and then the market will go back to the way it has done business previously. The industry is under huge competitive pressures and the change we're seeing is permanent and it will be continuous over time."

Experts feel the industry is now at a tipping point. Huge competitive pressures, rising expense ratios, an evolving risk landscape and rapid technological advances are forcing change upon an industry that has traditionally been considered somewhat of a laggard. And the revolution, when it comes, will be a quick one, thinks Rupert Swallow, co-founder and CEO of Capsicum Re.

"WE'RE SEEING DISRUPTION THROUGHOUT THE ENTIRE INSURANCE JOURNEY"
MICHAEL STEEL, RMS

Other sectors have plenty of cautionary tales on what happens when businesses fail to adapt to a changing world, he explains. "Kodak was a business that in 1998 had 120,000 employees and printed 95 percent of the world's photographs. Two years later, that company was bankrupt as digital cameras built their presence in the marketplace. When the tipping point is reached, the change is radical and fast and fundamental."

While it is impossible to predict exactly how the industry will evolve going forward, it is clear that tomorrow's leading (re)insurance companies will share certain attributes. These include a strong appetite to harness data and invest in new technology and analytics capabilities, the drive to differentiate and design new products and services, and the ability to collaborate.

In particular, the goal of an analytics-driven organization is to leverage the right technologies to bring data, workflow and business analytics together to continuously drive more informed, timely and collaborative decision making across the enterprise. And while there are many choices with the rise of insurtech firms, history shows us that success is achieved only when the proper due diligence is done to really understand and assess how these technologies enable the longer-term business strategy, goals and objectives. One of the most important ingredients to success is the ability to effectively blend the right team of technologists, data scientists and domain experts who can work together to understand and deliver upon these key objectives.
The most successful companies will also look to attract and retain the best talent, with succession planning that puts a strong emphasis on bringing Millennials up through the ranks. "There is a huge difference between the way Millennials look at the workplace and live their lives, versus industry professionals born in the 1960s or 1970s – the two generations are completely different," says Swallow. "Those guys [Millennials] would no sooner write a cheque to pay for something than fly to the moon."

Case for Collaboration

If (re)insurers drag their heels in embracing and investing in new technology and analytics capabilities, disruption could well come from outside the industry. Back in 2015, Lloyd's CEO Inga Beale warned that insurers were in danger of being "Uber-ized" as technology allows companies from Google to Walmart to undermine the sector's role of managing risk. Her concerns are well founded, with Google launching a price comparison site in the U.S., and Rakuten and Alibaba, Japan and China's answers to Amazon respectively, selling a range of insurance products on their platforms.

"No area of the market is off-limits to well-organized technology companies that are increasingly encroaching everywhere," says Rob Procter, CEO of Securis Investment Partners. "Why wouldn't Google write insurance… particularly given what they are doing with autonomous vehicles? They may not be insurance experts but these technology firms are driving the advances in terms of volumes of data, data manipulation, and speed of data processing."

Procter makes the point that the reinsurance industry has already been disrupted by the influx of third-party capital into the ILS space over the past 10 to 15 years. Collateralized products such as catastrophe bonds, sidecars and non-traditional reinsurance have fundamentally altered the reinsurance cycle and exposed the industry's inefficiencies like never before.

"We've been innovators in this industry because we came in ten or 15 years ago, and we've changed the way the industry is structured and is capitalized and how the capital connects with the customer," he says. "But more change is required to bring down expenses and to take out what are massive friction costs, which in turn will allow reinsurance solutions to be priced competitively in situations where they are not currently.

"It's astounding that 70 percent of the world's catastrophe losses are still uninsured," he adds. "That statistic has remained unchanged for the last 20 years. If this industry was more efficient it would be able to deliver solutions that work to close that gap."

Collaboration is the key to leveraging technology – or insurtech – expertise and getting closer to the original risk. There are numerous examples of tie-ups between (re)insurance industry incumbents and tech firms. Others have set up innovation garages or bought their way into innovation, acquiring or backing niche start-up firms. Silicon Valley, Israel's Silicon Wadi, India's tech capital Bangalore and Shanghai in China are now among the favored destinations for scouting visits by insurance chief innovation officers.

One example of a strategic collaboration is the MGA Attune, set up last year by AIG, Hamilton Insurance Group, and affiliates of Two Sigma Investments. Through the partnership, AIG gained access to Two Sigma's vast technology and data-science capabilities to grow its market share in the U.S. small to mid-sized commercial insurance space.
“The challenge for the industry is to remain relevant to our customers,” says Steel. “Those that fail to adapt will get left behind. To succeed you’re going to need greater information about the underlying risk, the ability to package the risk in a different way, to select the appropriate risks, differentiate more, and construct better portfolios.” Investment in technology in and of itself is not the solution, thinks Swallow. He thinks there has been too much focus on process and not enough on product design. “Insurtech is an amazing opportunity but a lot of people seem to spend time looking at the fulfilment of the product – what ‘Chily’ [Swallow’s business partner and industry guru Grahame Chilton] would call ‘plumbing’. “In our industry, there is still so much attention on the ‘plumbing’ and the fact that the plumbing doesn’t work, that insurtech isn’t yet really focused on compliance, regulation of product, which is where all the real gains can be found, just as they have been in the capital markets,” adds Swallow. Taking out the Friction Blockchain however, states Swallow, is “plumbing on steroids”. “Blockchain is nothing but pure, unadulterated, disintermediation. My understanding is that if certain events happen at the beginning of the chain, then there is a defined outcome that actually happens without any human intervention at the other end of the chain.” In January, Aegon, Allianz, Munich Re, Swiss Re, and Zurich launched the Blockchain Insurance Industry Initiative, a “US$5 billion opportunity” according to PwC. The feasibility study will explore the potential of distributed ledger technologies to better serve clients through faster, more convenient and secure services. “BLOCKCHAIN FOR THE REINSURANCE SPACE IS AN EFFICIENCY TOOL. AND IF WE ALL GET MORE EFFICIENT, YOU ARE ABLE TO INCREASE INSURABILITY BECAUSE YOUR PRICES COME DOWN” KURT KARL SWISS RE Blockchain offers huge potential to reduce some of the significant administrative burdens in the industry, thinks Kurt Karl, chief economist at Swiss Re. “Blockchain for the reinsurance space is an efficiency tool. And if we all get more efficient, you are able to increase insurability because your prices come down, and you can have more affordable reinsurance and therefore more affordable insurance. So I think we all win if it’s a cost saving for the industry.” Collaboration will enable those with scale to behave like nimble start-ups, explains Karl. “We like scale. We’re large. I’ll be blunt about that,” he says. “For the reinsurance space, what we do is to leverage our size to differentiate ourselves. With size, we’re able to invest in all these new technologies and then understand them well enough to have a dialogue with our clients. The nimbleness doesn’t come from small insurers; the nimbleness comes from insurance tech start-ups.” He gives the example of Lemonade, the peer-to-peer start-up insurer that launched in 2016, selling discounted homeowners’ insurance in New York. Working off the premise that insurance customers lack trust in the industry, Lemonade’s business model is based around returning premium to customers when claims are not made. In its second round of capital raising, Lemonade secured funding from XL Group’s venture fund, also a reinsurance partner of the innovative new firm. The firm is also able to offer faster, more efficient, claims processing. “Lemonade’s [business model] is all about efficiency and the cost saving,” says Karl. 
“But it’s also clearly of benefit to the client, which is a lot more appealing than a long, drawn-out claims process.”

Tearing up the Rule Book

By collecting and utilizing data from customers and third parties, personal lines insurers are now able to offer more customized products and, in many circumstances, improve the underlying risk. Customers can win discounts for protecting their homes and other assets, maintaining a healthy lifestyle and driving safely. In a world where products are increasingly designed with the digital native in mind, drivers can pay as they go and property owners can access cheaper home insurance via peer-to-peer models.

Reinsurers may be one step removed from this seismic shift in how the original risk is perceived and underwritten, but just as personal lines insurers are tearing up the rule book, so too are their risk partners.

It is over 300 years since the first marine and fire insurance policies were written. In that time, (re)insurance has expanded significantly with a range of property, casualty and specialty products. However, the wordings contained in standard (re)insurance policies, the involvement of a broker in placing the business and the face-to-face transactional nature of the business – particularly within the London market – have not altered significantly over the past three centuries.

Some are questioning whether these traditional indemnity products are the right solution for all classes of risk. “We think people are often insuring cyber against the wrong things,” says Dane Douetil, group CEO of Minova Insurance. “They probably buy too much cover in some places and not nearly enough in areas where they don’t really understand they’ve got a risk. So we’re starting from the other way around, which is actually providing analysis about where their risks are and then creating the policy to cover it.”

“There has been far more innovation in intangible-type risks in the last five to ten years than people probably give credit for,” he continues. “Whether you’re talking about cyber, product recall, new forms of business interruption, intellectual property or the huge growth in merger and acquisition coverages against warranty and indemnity claims – there’s been a lot of development in all of those areas, and none of that existed ten years ago.”

Closing the Gap

Access to new data sources, along with the ability to interpret and utilize that information, will be a key instrument in improving the speed of settlement and offering products that are fit for purpose and reflect today’s risk landscape.

“We’ve been working on a product that just takes all the information available from airlines about delays and how often they happen,” says Karl. “And of course you can price off that; you don’t need the loss history, all you need is the probability of the loss – how often does the plane have a five-hour delay? All the travel underwriters then need to do is price it ‘X’, have a little margin built in, and then they’re able to offer a nice new product to consumers who get some compensation for the frustration of sitting there on the tarmac.”

With more esoteric lines of business such as cyber, parametric products could be one solution to providing meaningful coverage for a rapidly evolving corporate risk. “The corporates of course want indemnity protection, but that’s extremely difficult to do,” says Karl.
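Karl’s flight-delay product shows how simply a parametric cover can be priced once the probability of the trigger event is known. The short Python sketch below is purely illustrative – the delay probability, payout, trigger threshold and margin are hypothetical figures, and the functions are not drawn from any insurer’s actual rating model – but it captures the mechanics he describes: charge the expected payout plus a margin, and settle automatically on the index rather than on the insured’s actual loss.

```python
# Minimal, illustrative sketch of parametric pricing and settlement.
# All figures are hypothetical; this is not any insurer's rating model.

def parametric_premium(prob_trigger: float, payout: float, margin: float) -> float:
    """Expected payout plus a loading (the 'little margin built in')."""
    return prob_trigger * payout * (1.0 + margin)

def settle(observed_delay_hours: float, trigger_hours: float, payout: float) -> float:
    """Parametric settlement: pays a fixed amount if the index breaches the
    trigger, regardless of the policyholder's actual loss. The gap between
    actual loss and this payout is the basis risk discussed below."""
    return payout if observed_delay_hours >= trigger_hours else 0.0

if __name__ == "__main__":
    # Hypothetical inputs: 2% of flights on a route are delayed five hours
    # or more, the product pays a flat 300, and a 25% margin is added.
    premium = parametric_premium(prob_trigger=0.02, payout=300.0, margin=0.25)
    print(f"Premium per ticket: {premium:.2f}")                  # 7.50
    print(f"4.5-hour delay pays: {settle(4.5, 5.0, 300.0):.2f}")  # 0.00
    print(f"6-hour delay pays:   {settle(6.0, 5.0, 300.0):.2f}")  # 300.00
```

Because settlement depends only on the index, a traveler delayed four and a half hours receives nothing even if real costs were incurred – the basis risk raised below.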
“I think there will be some of that but also some parametric,” Karl continues, “because it’s often a fixed payout that’s capped and is dependent upon the metric, as opposed to indemnity, which could well end up being the full value of the company. Because you can potentially have a company destroyed by a cyber-attack at this point.”

One issue to overcome with parametric products is basis risk – the risk that an insured suffers a significant loss but its cover is not triggered. However, as data and risk management improve, the concerns surrounding basis risk should reduce.

Improving the Underlying Risk

The evolution of the cyber (re)insurance market also points to a new opportunity in a data-rich age: pre-loss services. By tapping into a wealth of claims and third-party data sources, successful (re)insurers of the future will be in an even stronger position to help their insureds become resilient and incident-ready. In cyber, these services are already part of the package and include security consultancy, breach-response services and simulated cyber attacks to test the fortitude of corporate networks and raise awareness among staff.

“WE DO A DISSERVICE TO OUR INDUSTRY BY SAYING THAT WE’RE NOT INNOVATORS, THAT WE’RE STUCK IN THE PAST”
DANE DOUETIL, MINOVA INSURANCE

IoT is not just an instrument for personal lines. Just as insurance companies are utilizing data collected from connected devices to analyze individual risks and feed back information to improve the risk, (re)insurers also have an opportunity to utilize third-party data. “GPS sensors on containers can allow insurers to monitor cargo as it flows around the world – there is a use for this technology to help mitigate and manage the risk on the front end of the business,” states Steel.

Information is only powerful if it is analyzed effectively and available in real time as transactional and pricing decisions are made, thinks RMS’ Steel. “The industry is getting better at using analytics and ensuring the output of analytics is fed directly into the hands of key business decision makers.

“It’s about using things like portfolio optimization, which even ten years ago would have been difficult,” he adds. “As you’re using the technologies that are available now, you’re creating more efficient capital structures and better, more efficient business models.”

Minova’s Douetil thinks the industry is stepping up to the plate. “Insurance is effectively the oil that lubricates the economy,” he says. “Without insurance, as we saw with the World Trade Center disaster and other catastrophes, the whole economy could come to a grinding halt pretty quickly if you take the ‘oil’ away.

“That oil has to continually adapt and be innovative in terms of being able to serve the wider economy,” he continues. “But I think we do a disservice to our industry by saying that we’re not innovators, that we’re stuck in the past. I just think about how much this business has changed over the years.

“It can change more, without a doubt, and there is no doubt that the communication capabilities that we have now mean there will be a shortening of the distribution chain,” he adds. “That’s already happening quite dramatically, and in the personal lines market obviously even more rapidly.”
