
There has been much industry focus on the value of digitization at the customer interface, but what is its role in risk management and portfolio optimization?

In recent years, the insurance industry's understanding of the value of digitization has been steadily refined on many fronts. It now serves a clear function in areas such as policy administration, customer interaction, policy distribution and claims processing, delivering tangible, measurable benefits.

However, the potential role of digitization in supporting the underwriting function, enhancing the risk management process and facilitating portfolio optimization is sometimes less clear. This is perhaps because risk assessment is by its very nature a more nebulous task, confined to only a few employees, which makes the direct benefits of digitization harder to pin down.

To grasp the potential of digitization, we must first acknowledge the limitations of existing platforms and processes, in particular the lack of joined-up data in a consistent format. But connecting data sets and being able to run analytics across them is just the start. There also needs to be clarity about the analytics an underwriter requires, including building or extending core business workflows to deliver insights at the point of impact.

Data Limitations

For Louise Day, director of operations at the International Underwriting Association (IUA), a major issue is that much of the data generated across the industry is held remotely from the underwriter.

“You have data being keyed in at numerous points and from multiple parties in the underwriting process. However, rather than being stored in a format accessible to the underwriter, it is simply transferred to a repository where it becomes part of a huge data lake with limited ability to stream that data back out.”

That data is entering the “lake” via multiple systems and in different formats. These amorphous pools severely limit the ability to extract information in a defined, risk-specific manner, to conduct impactful analytics, and to do so in a timeframe relevant to the underwriting decision-making process.

“The underwriter is often disconnected from critical risk data,” believes Shaheen Razzaq, senior product director at RMS. “This creates significant challenges when trying to accurately represent coverage, generate or access meaningful analysis of metrics and grasp the marginal impacts of any underwriting decisions on overall portfolio performance.

“Success lies not just in connecting the different data sources, but in doing so in a way that generates the right insight within the right context and gets it to the underwriter to support smarter decisions.”

Without the digital capabilities to connect the various data sets and deliver information to the underwriter in a digestible format, their view of risk can be severely restricted, particularly when server storage limits mean their data access extends only as far as current information. Many businesses suffer from DRIP: they are data rich but information poor, lacking the ability to transform their data into valuable insight.

“You need to be able to understand risk in its fullest context,” Razzaq says. “What is the precise location of the risk? What policy history information do we have? How has the risk performed? How have the modeled numbers changed? What other data sources can I tap? What are the wider portfolio implications of binding it? How will it impact my concentration risk? How can I test different contract structures to ensure the client has adequate cover but is still profitable business for me? These are all questions they need answers to in real time at the decision-making point, but often that’s simply not possible.”

Extrapolate this lack of data granularity up to the portfolio level and beyond, and the potential implications of poor risk management at the point of underwriting become extreme. With a high-resolution peril like U.S. flood, where two properties meters apart can have very different risk profiles, the ability to make accurate risk decisions is restricted without granular data at the point of impact. Roll that degree of inaccuracy up to the line of business and the portfolio level, and the ramifications are significant.

Looking beyond the organization and out to the wider flow of data through the underwriting ecosystem, the lack of format consistency is creating a major data blockage, according to Jamie Garratt, head of innovation at Talbot.

“You are talking about trying to transfer data, often in no consistent format, along a value chain that contains a huge number of different systems and counterparties,” he explains. “And the inability to convert that data quickly and inexpensively into a format that enables that flow is a major barrier to progress.

“You are looking at the formatting of policies, schedules and risk information being passed through a number of counterparties, all operating different systems. It then needs to integrate with pricing models, policy administration systems, exposure management systems, payment systems, et cetera. And when you consider this process replicated across a subscription market, the inefficiencies are extensive.”

A Functioning Ecosystem

There are numerous examples of sectors that have transitioned successfully to a digitized data ecosystem that the insurance industry can learn from. One such industry is health care, which over the last decade has successfully adopted digital processes across the value chain and overcome the data formatting challenge.

It can be argued that health care has a value chain similar to that of the insurance industry. Data is shared between various stakeholders — including competitors — to create the analytical backbone the sector needs to function effectively. Data is retained and shared at the individual level, combining multiple health perspectives to build a holistic view of the patient.

The sector has also overcome the data-consistency hurdle by collectively agreeing on a data standard, enabling the effective flow of information across all parties in the chain, from the health care facilities through to the services companies that support them.

Garratt draws attention to the way the broader financial markets function. “There are numerous parallels that can be drawn between the financial and the insurance markets, and much that we can learn from how that industry has evolved over the last 10 to 20 years.”

“As the capital markets become an increasingly prevalent part of the insurance sector,” he continues, “this will inevitably have a bearing on how we approach data and the need for greater digitization. If you look, for example, at the advances that have been made in how risk is transferred on the insurance-linked securities (ILS) front, what we now have is a fairly homogenous financial product where the potential for data exchange is more straightforward and transaction costs and speed have been greatly reduced.

“It is true that pure reinsurance transactions are more complex given the nature of the market, but there are lessons that can be learned to improve transaction execution and the binding of risks.”

For Razzaq, it’s also about rebalancing the data extrapolation versus data analysis equation. “By removing data silos and creating straight-through access to detailed, relevant, real-time data, you shift this equation on its axis. At present, some 70 to 80 percent of analysts’ time is spent sourcing data and converting it into a consistent format, with only 20 to 30 percent spent on the critical data analysis. An effective digital infrastructure can switch that equation around, greatly reducing the steps involved, and re-establishing analytics as the core function of the analytics team.”

The Analytical Backbone

So how does this concept of a functioning digital ecosystem map to the (re)insurance environment? The challenge, of course, is not only to create joined-up, real-time data processes at the organizational level, but also to look at how that unified infrastructure can extend out to support improved data interaction at the industry level.

An ideal digital scenario from a risk management perspective is where all parties operate on a single analytical framework or backbone built on the same rules, with the same data and using the same financial calculation engines, ensuring that on all risk fronts you are carrying out an ‘apples-to-apples’ comparison. That consistent approach would need to extend from the individual risk decision, to the portfolio, to the line of business, right up to the enterprise-wide level.

In the underwriting trenches, the aim is to enhance the decision-making process and to understand the portfolio-level implications of those decisions.

“A modern pricing and portfolio risk evaluation framework can reduce assessment times, providing direct access to relevant internal and external data in almost real time,” states Ben Canagaretna, managing director at Barbican Insurance Group. “Creating a data flow, designed specifically to support agile decision-making, allows underwriters to price complex business in a much shorter time period.”


“The feedback loop around decisions on overall reinsurance costs and investor capital exposure is paramount in order to maximize returns on capital for shareholders commensurate with risk appetite. At the heart of this is portfolio marginal impact analysis: the ability to assess the impact of each risk on the overall portfolio in terms of exceedance probability curves, realistic disaster scenarios and regional exposures. Integrated historical loss information is a must in order to quickly assess the profitability of relevant brokers, trade groups and specific policies.”
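Canagaretna's point about marginal impact can be made concrete. In catastrophe-modeling terms, the marginal impact of a candidate risk is the change it causes in a portfolio-level metric, such as the loss at a given return period on the exceedance probability (EP) curve, when it is added to the book. The sketch below is a minimal illustration of that calculation, assuming simulated year loss tables aligned on the same event years; the distributions, return period and function names are illustrative, not any vendor's implementation.

```python
import numpy as np

def ep_loss(year_losses: np.ndarray, return_period: float) -> float:
    """Loss at a given return period, read off a simulated year loss
    table (e.g., 10,000 simulated annual portfolio losses)."""
    exceedance_prob = 1.0 / return_period
    return float(np.quantile(year_losses, 1.0 - exceedance_prob))

def marginal_impact(portfolio_yl: np.ndarray, risk_yl: np.ndarray,
                    return_period: float = 100.0) -> float:
    """Change in the portfolio's return-period loss if the candidate
    risk is bound: EP(portfolio + risk) minus EP(portfolio)."""
    return (ep_loss(portfolio_yl + risk_yl, return_period)
            - ep_loss(portfolio_yl, return_period))

# Illustrative data only: annual losses for the current portfolio and a
# candidate risk, simulated over the same 10,000 event years so they
# can simply be added year by year.
rng = np.random.default_rng(42)
portfolio = rng.lognormal(mean=15.0, sigma=1.0, size=10_000)
candidate = rng.lognormal(mean=11.0, sigma=1.5, size=10_000)

print(f"Marginal 1-in-100 impact: {marginal_impact(portfolio, candidate):,.0f}")
```

A small marginal impact relative to the premium on offer suggests the risk diversifies the book; a large one signals concentration.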

There is, of course, the risk of data overload in such an environment, with multiple information streams threatening to swamp the process if not channeled effectively.

“It’s about giving the underwriter much better visibility of the risk,” says Garratt, “but to do that the information must be filtered precisely to ensure that the most relevant data is prioritized, so it can then inform underwriters about a specific risk or feed directly into pricing models.”

Making the Transition

Few, if any, organizations in today's (re)insurance market would fail to gain at least a marginal benefit from integrating digital capabilities into their current underwriting processes. And for those that have started down this route, tangible benefits are already emerging. Yet making the transition, particularly given the sheer scale of the challenge, is daunting.

“You can’t simply unplug all of your legacy systems and reconnect a new digital infrastructure,” says IUA’s Day. “You have to find a way of integrating current processes into a data ecosystem in a manageable and controlled manner. From a data-gathering perspective, that process could start with adopting a standard electronic template to collect quote data and storing that data in a way that can be easily accessed and transferred.”
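To illustrate Day's suggestion, a standard electronic quote template is ultimately just an agreed, structured record that every counterparty system can read and write. The sketch below shows one minimal form such a record might take; the field names are hypothetical, and a real market standard (an ACORD-style template, for example) would be agreed collectively.

```python
from dataclasses import dataclass, asdict
from datetime import date
import json

@dataclass
class QuoteRecord:
    # Hypothetical fields for illustration; a real market template
    # (e.g., an ACORD-style standard) would be agreed collectively.
    quote_id: str
    insured_name: str
    line_of_business: str
    inception_date: date
    currency: str
    limit: float
    deductible: float
    premium_quoted: float

quote = QuoteRecord(
    quote_id="Q-2019-00042",
    insured_name="Example Marine Co",
    line_of_business="marine_cargo",
    inception_date=date(2019, 1, 1),
    currency="USD",
    limit=5_000_000,
    deductible=50_000,
    premium_quoted=125_000,
)

# One consistent, machine-readable format that any counterparty system
# can store, validate and transfer.
print(json.dumps(asdict(quote), default=str, indent=2))
```

Once quote data is captured in a form like this, storing it in a way that can be easily accessed and transferred becomes a matter of routine serialization rather than rekeying.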

“There are tangible short-term benefits to making the transition,” adds Razzaq. “Starting small and focusing on certain entities within the group. Transferring certain use cases rather than all of them at once. Taking a steady, step-by-step approach rather than simply acknowledging the benefits but being overwhelmed by the potential scale of the challenge.”

There is no doubting, however, that the task is significant, particularly when it comes to integrating multiple data types into a single format. “We recognize that companies have source-data repositories and legacy systems, and the initial aim is not to ‘rip and replace’ those, but rather to create a path to a system that allows all of these data sets to move,” says Razzaq. “For RMS, we have the ability to connect these various data hubs via open APIs to our Risk Intelligence platform to create that information superhighway, with an analytics layer that can turn this data into actionable insights.”
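In spirit, that hub-connecting layer resembles the sketch below: each source system exposes an open API, and a thin integration layer assembles a single, consistently keyed view of a risk. The endpoints and payload shapes here are hypothetical placeholders, not the actual Risk Intelligence API.

```python
import requests

# Hypothetical endpoints standing in for an insurer's separate data
# hubs; these are placeholders, not a real vendor API.
HUBS = {
    "policy_admin": "https://policy.example.com/api/v1/policies",
    "claims": "https://claims.example.com/api/v1/claims",
    "exposure": "https://exposure.example.com/api/v1/locations",
}

def fetch_risk_view(policy_id: str) -> dict:
    """Pull the pieces of one risk from each hub over its open API and
    merge them into a single, consistently keyed record."""
    view = {"policy_id": policy_id}
    for name, base_url in HUBS.items():
        resp = requests.get(f"{base_url}/{policy_id}", timeout=10)
        resp.raise_for_status()
        view[name] = resp.json()
    return view
```

The analytics layer Razzaq describes would then sit on top of records like these, rather than on each silo separately.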

Talbot has ventured further down this path than many other organizations, and its pioneering spirit is already bearing fruit.

“We have looked at those areas,” explains Garratt, “where we believe it is more likely we can secure short-term benefits that demonstrate the value of our longer-term strategy. For example, we recently conducted a proof of concept using quite powerful natural-language processing supported by machine-learning capabilities to extract and then analyze historic data in the marine space, and already we are generating some really valuable insights.

“I don’t think the transition is reliant on having a clear idea of what the end state is going to look like, but rather taking those initial steps that start moving you in a particular direction. There also has to be an acceptance of the need to fail early and learn fast, which is hard to grasp in a risk-averse industry. Some initiatives will fail — you have to recognize that and be ready to pivot and move in a different direction if they do.”
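The kind of proof of concept Garratt describes, extracting insight from historic free-text marine records, can be prototyped with off-the-shelf tools. The snippet below is a toy illustration of supervised text classification on loss narratives; the corpus, labels and model choice are ours for illustration and not Talbot's actual implementation.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy corpus: free-text loss narratives from historic marine files,
# labeled with a hypothetical cause-of-loss class.
narratives = [
    "Container stack collapsed in heavy weather off Biscay",
    "Engine room fire contained, hull damage minor",
    "Cargo water damage following hatch cover failure",
    "Grounding on approach to port, salvage required",
]
causes = ["weather", "fire", "water_damage", "grounding"]

# Vectorize the text and fit a simple classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                      LogisticRegression())
model.fit(narratives, causes)

# Classify an unseen narrative.
print(model.predict(["Fire reported in cargo hold two"]))
```

Even a toy pipeline like this hints at why small, well-scoped experiments can demonstrate value quickly and, as Garratt suggests, fail early and cheaply when they don't.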
