RMS has just completed a two-year exercise documenting the different types of insurance available in the market, together with a classification system for the assets they protect. The result is published as a data definitions document, v1.0, a standardized schema that gives insurance companies a consistent method of evaluating their exposure.
The project, carried out in collaboration with research partner the Centre for Risk Studies at the University of Cambridge and a steering committee of RMS clients, involved extensive interviews with 130 industry specialists and consultation with 38 insurance, analyst, and modeling organizations.
The project will enable insurance companies to monitor and report their exposure across many different classes of insurance, which globally today covers an estimated US$554 trillion of total insured value. The data standard will improve the interchange of data between market players, refining risk transfer to reinsurers and other risk partners, reporting to regulators, and the exchange of information for risk co-share, delegated authority, and bordereau activities.
A key aim of developing the data schema is to identify concentrations of exposure and to assess accumulation risk by enabling new types of loss models. Insurers are concerned about several ways that accumulations can occur: holding multiple insurance policies with the same policyholder, writing different lines of insurance with clusters of insured value in the same geographical location, and facing “clash” risk from underlying events that impact several classes of insurance in an insurer’s portfolio.
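To make the idea of accumulation monitoring concrete, here is a minimal Python sketch that aggregates a handful of hypothetical policy records by location and by policyholder and flags totals above a tolerance. The record fields, figures, and threshold are illustrative assumptions for this post, not fields taken from the v1.0 schema.

```python
from collections import defaultdict

# Hypothetical policy records; field names are illustrative, not the v1.0 schema.
policies = [
    {"policyholder": "Acme Energy", "line": "Property",  "location": "Gulf of Mexico", "insured_value": 450e6},
    {"policyholder": "Acme Energy", "line": "Marine",    "location": "Gulf of Mexico", "insured_value": 200e6},
    {"policyholder": "Beta Foods",  "line": "Liability", "location": "Singapore",      "insured_value": 120e6},
    {"policyholder": "Acme Energy", "line": "Liability", "location": "Houston",        "insured_value": 80e6},
]

def accumulate(records, key):
    """Sum insured value over a chosen dimension (e.g. location or policyholder)."""
    totals = defaultdict(float)
    for r in records:
        totals[r[key]] += r["insured_value"]
    return dict(totals)

by_location = accumulate(policies, "location")
by_holder = accumulate(policies, "policyholder")

# Flag concentrations above an arbitrary tolerance of $500m.
LIMIT = 500e6
for name, total in {**by_location, **by_holder}.items():
    if total > LIMIT:
        print(f"Concentration alert: {name} carries ${total/1e6:,.0f}m of insured value")
```

A standardized schema makes this kind of roll-up possible across business units and between trading partners, because every record describes the insured asset and line of business in the same way.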
The project has demonstrated the usefulness of the data definitions document as a framework for loss modeling by developing three catastrophe scenarios: a severe hurricane hitting the energy fields and marine installations in the Gulf of Mexico; an influenza pandemic that hits life and health insurers while also causing financial losses to the economy and stock markets; and a geopolitical conflict in Southeast Asia that triggers losses across all the major classes of insurance. Insurance companies are now assessing the clash risk from these scenarios for the portfolios of multiline exposure that they manage.
The data schema was officially launched at a conference in Cambridge in early September, where many representatives of insurance companies, regulators, market associations, rating agencies, modelers, and academics gathered.
The data definitions document v1.0 is published and available; it is currently being implemented internally by members of the project steering committee and will be made available by RMS in its platforms and products. The hope is that the data definitions document will enable a new generation of risk models and improve the insurance market's ability to manage multiline exposure risk. Feedback on the project outputs is welcomed.
Atlas of Global Insurance Exposure: An A2-size poster (42 x 59.4 centimeters / 16.5 x 23.4 inches) depicting the distribution of US$540 trillion of insured exposure worldwide across the major classes of insurance.
You May Also Like
September 19, 2014
Using Network Theory to Understand the Interconnectivity of Financial Risk
For today’s regulators, systemic risk remains a major issue. Tracing the connections between financial institutions and understanding how the different mechanisms of financial contagion might propagate through the system is complex.
Modern finance is the collective activity of tens of thousands of individual enterprises, all interacting in a “living” system. Today, nobody truly understands this system. It is organic and market-driven, but the fundamental processes that drive it occasionally collapse in a financial crisis that affects us all.
The increasing risk of contagion in the financial industry has triggered a new discipline of research – “network theory in financial risk management” – which is quickly gathering pace. These studies aim to identify and analyze the connections between financial institutions, and how their interconnectivity can contribute to crisis propagation.
Later this month, Risk.net will launch the Journal of Network Theory in Finance. This journal will compile the key papers of financial risk studies worldwide to provide industry participants with a balanced view of how network theory in finance can be applied to business.
Papers from the inaugural edition of the new journal will be showcased on September 23 at the Financial Risk & Network Theory conference, which is hosted by the Centre for Risk Studies at the University of Cambridge. I will be presenting a keynote on how catastrophe modeling methodologies can be applied to model financial risk contagion.
Our financial institutions are connected in a multitude of ways: they hold similar portfolios of investments, use common settlement mechanisms, own shares in each other’s companies, and lend to one another in the interbank market.
As the interconnectivity of the world’s financial institutions and markets deepens, financial risk managers and macro-economic planners need to know the likelihood and severity of potential future downturns, particularly the “tail” events of economic catastrophe. Companies must continually understand how they are exposed to the risk of contagion; many were surprised by how fast contagion spread through the financial system during the 2008 credit crunch.
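As a toy illustration of the kind of cascade these network models trace (a simplified sketch, not the methodology of the journal, the Centre, or any regulator), the Python snippet below propagates a default through a small set of hypothetical interbank exposures: when a bank fails, its creditors write down a fraction of what they are owed, and any bank whose losses exceed its capital fails in turn.

```python
# Hypothetical interbank exposures: lender -> {borrower: amount owed to the lender}.
exposures = {
    "Bank A": {"Bank B": 40, "Bank C": 25},
    "Bank B": {"Bank C": 60},
    "Bank C": {"Bank D": 30},
    "Bank D": {"Bank A": 20},
}
capital = {"Bank A": 50, "Bank B": 35, "Bank C": 45, "Bank D": 25}
LOSS_GIVEN_DEFAULT = 0.6  # assumed fraction of an exposure lost when a borrower fails

def cascade(initial_failure):
    """Return the set of banks that fail after an initial default."""
    failed = {initial_failure}
    losses = {bank: 0.0 for bank in capital}
    changed = True
    while changed:
        changed = False
        for lender, loans in exposures.items():
            if lender in failed:
                continue
            # Total write-down from all failed counterparties of this lender.
            losses[lender] = sum(LOSS_GIVEN_DEFAULT * amount
                                 for borrower, amount in loans.items()
                                 if borrower in failed)
            if losses[lender] >= capital[lender]:
                failed.add(lender)
                changed = True
    return failed

print(cascade("Bank C"))  # which banks are dragged down if Bank C fails?
```

Real models layer many more channels of transmission (common asset holdings, funding runs, settlement failures) on top of this simple credit-loss mechanism, but the cascading structure is the same.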
The regulator’s role in limiting the risk of future financial crises includes identifying Systemically Important Financial Institutions (SIFIs) and understanding what aspects of a SIFI’s business to monitor. Regulators have already pioneered network modeling to identify the core banks and to rank their systemic importance, and can now demand much higher standards of risk management from the SIFIs. Increasingly, similar models are being used by risk practitioners and investment managers.
Studies of network theory in financial risk management, such as those carried out by the Centre for Risk Studies, give all practitioners involved in managing financial risk a robust scientific foundation from which to understand, model and, ultimately, manage that risk effectively.
We are in the middle of a health awareness revolution.
Attitudes to fitness, health, diet, and social risk factors are changing more rapidly than at any time in history. This has fueled a massive increase in life expectancy, particularly in better-educated social groups. Actions by individuals taking responsibility for their own health have outstripped the benefits of modern medicine in driving recent mortality reduction.
It also appears that the appetite for health-risk information is outstripping the capability of medical science to provide it. This is problematic not only for the medical profession, but also for the financial services industry in funding our retirement provisions.
The recent furor involving the American Heart Association and the American College of Cardiology concerns the accuracy of the risk models behind new guidelines published last week. Risk models are used to help individuals make decisions about actions to improve their health; in this case, models were used to produce guidelines for taking statin drugs to reduce blood cholesterol, a leading risk factor for heart disease.
Medical-risk models take volumes of historical statistical data and try to disentangle the relative importance of a large number of variables. The human body is a very complex system – it is not a piece of engineering that can be easily subjected to analysis using the laws of physics. It has many interacting biological processes and interdependencies, and human bodies vary widely in their characteristics across any population.
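As a stylized example of the kind of calculation such risk calculators perform, the sketch below forms a weighted sum of risk factors and converts it into a probability with the logistic function. The coefficients and intercept are invented for illustration only; they are not taken from the AHA/ACC guidelines or any published model.

```python
import math

# Invented coefficients for illustration only; real models are fitted to
# decades of cohort data and are considerably more complex.
COEFFS = {
    "age_years": 0.065,
    "total_cholesterol_mmol_l": 0.30,
    "systolic_bp_mmHg": 0.018,
    "smoker": 0.70,          # 1 if current smoker, else 0
}
INTERCEPT = -11.0

def ten_year_risk(person):
    """Convert a weighted sum of risk factors into a probability via the logistic function."""
    score = INTERCEPT + sum(COEFFS[k] * person[k] for k in COEFFS)
    return 1.0 / (1.0 + math.exp(-score))

example = {"age_years": 58, "total_cholesterol_mmol_l": 6.2,
           "systolic_bp_mmHg": 140, "smoker": 1}
print(f"Illustrative 10-year risk: {ten_year_risk(example):.1%}")
```

The contentious part is not the arithmetic but the coefficients: they are only as good as the cohort data they were estimated from.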
Because risk models need large volumes of data to tease out all the different variables that apply to an individual person, the historical data must be collected over a long period. The newly released calculator is based on data from the 1990s, when many social habits and medical practices were very different from today's – for example, the gap between male and female mortality has narrowed significantly in the past 20 years. Leading cardiologists argue that these new guidelines have failed to keep up with, let alone anticipate, the recent changes in patterns of public health and life expectancy.
Past health patterns aren’t a great guide to the future.
Similar problems underpin the life expectancy estimates made by annuity providers and life insurers, who use past mortality data to project life expectancy in future decades for their retirees and pensioners. That most pension liabilities are underfunded is not news, yet solutions to ensure the future financial health of our elderly population are only as effective as the reliability of the underlying life expectancy projections.
Projections that fail to properly consider how the future may differ from the past, whether through lifestyle changes or biomedical advances, can lead to the wrong strategies. Medical risk models developed by organizations like the American Medical Association suggest that, even without future biotech advances, mortality rates could almost halve again from present rates if more people adopted highly healthy lifestyles. Models such as those developed by Risk Management Solutions incorporate the range of paths that future mortality trends might follow, and suggest that there is a 1-in-100 likelihood of a future mortality trend that would cause a trillion-dollar increase in annuity liabilities for the global pensions industry.
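To see why the trend assumption matters, here is a simplified worked example of how an annuity liability responds to the assumed rate of mortality improvement. The present value of a life annuity is the sum of discounted payments weighted by survival probability; the mortality, aging, and improvement figures below are chosen purely for illustration and are not RMS model outputs.

```python
def annuity_pv(age, payment, base_q, improvement, discount=0.03, max_age=110):
    """Present value of an annual life annuity under a mortality-improvement assumption.

    base_q      : baseline annual probability of death at the starting age
    improvement : assumed annual rate at which future mortality rates fall
    """
    pv, survival = 0.0, 1.0
    for t in range(1, max_age - age + 1):
        # Aging pushes mortality up (~9% a year, i.e. roughly doubling every
        # eight years); the assumed improvement trend pulls it back down.
        q = min(base_q * (1.09 ** t) * ((1 - improvement) ** t), 1.0)
        survival *= (1 - q)
        pv += payment * survival / (1 + discount) ** t
    return pv

# Illustrative figures only: a 65-year-old receiving $20,000 a year.
for imp in (0.00, 0.01, 0.02):
    pv = annuity_pv(age=65, payment=20_000, base_q=0.01, improvement=imp)
    print(f"Assumed mortality improvement {imp:.0%}: liability of roughly ${pv:,.0f}")
```

Even in this toy calculation, a percentage point or two of annual mortality improvement moves the liability materially; scaled across the global pensions industry, that sensitivity is what produces trillion-dollar tail outcomes.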
Improving medical risk models so that they capture the potential for changes in patterns of public health and life expectancy is a high priority for modern society, feeding into future healthcare planning, the provision of accurate advice for individual decision making, and the future financial health of our elderly population.
Dr. Andrew Coburn currently leads the cyber risk research at RMS, developing cyber risk scenarios and analytics. In his 20-year career at RMS, he has managed the innovation of many new risk models, ranging from natural catastrophes to terrorism, pandemics, longevity, and, most recently, cyber risk. Dr. Coburn is recognized as an authority on catastrophe risk modeling.
Andrew is also a founder and member of the executive team of the Centre for Risk Studies, University of Cambridge, where he directs research into the risk of catastrophic collapse of complex systems. He leads a research team coordinating a program of cyber risk research whose work has included the development of the cyber insurance exposure data schema, the production of the Lloyd’s Business Blackout scenario of a cyber attack on the U.S. power grid, now used as a Lloyd’s RDS, and the research that underpinned the decision by Pool Re to extend its cover to cyber terrorism.
Andrew is the co-author, with R.J.S. Spence, of "Earthquake Protection" (John Wiley & Sons, first edition 1998, second edition 2002), and co-author of the forthcoming book "Solving Cyber Risk", to be published by Wiley in 2019.