Doing Business With a Better High Definition Flood Model: From Flood Re to Harvey
Robert Muir-Wood, September 26, 2017
When we set out to design the latest generation of RMS High Definition (HD) flood loss models, we identified five key challenges:
a) Resolution: Neighboring properties separated by a few feet in elevation can have dramatically different flood risk costs. However, the finer the model resolution, the bigger the hazard and loss files — and the slower the modeling. We needed an objective way of determining the best trade-offs around model resolution.
b) Duration: Once groundwater levels are raised, more precipitation can set off another round of flooding. The climatology can persist, bringing:
a succession of storms on the same path, or
a fixed, long-lasting atmospheric river, or
a procession of rain bands from a stalled circulation parked at the coast, as with Hurricane Harvey.
Flooding episodes can endure for days or even weeks, but reinsurers want a predetermined definition of event duration. The model needed to provide the flexibility to explore how event duration affects results.
c) Areal extent: Flooding can occur over wide regions, sometimes with several separate areas of inundation whose locus shifts through time. Meanwhile, a big river can carry a flood wave far beyond the region of high precipitation. The model needed to capture the full, complex geography of flooding.
d) Defenses: No barrier can keep out hurricane winds or earthquake shaking, but floods can be kept at bay with a wall. However, a wall may fail or be overtopped. The presence and performance of a flood defense is critical to loss outcomes, and needed to be something the model user can adjust, especially where the protection provided by the defense has to be inferred.
e) Loss variance: When plotted against water depth, flood percentage losses show a high degree of scatter: damage is rarely structural and instead reflects the immersion sensitivity of building materials and electrical equipment, as well as the level of precautions taken. For certain industries, business interruption can be very extensive after a flood. It is important to capture this scatter of losses, especially when faced with aggregate exposure data.
For both Europe and the U.S., how could we take advantage of the expanded capabilities of High Definition (HD) models to solve these five key challenges?
a) Resolution: File size and run time concerns have to be balanced against the accuracy of land elevations. We also need seamless elevation data straddling national frontiers. The highest resolution LIDAR data is still beset with artefacts, such as residual trees and buildings, especially where the data needs to be most accurate: in towns. After a programme of tests on the influence of digital terrain model (DTM) resolution on results, it was clear that DTMs principally derived from ground survey measurements were preferred. A 40-meter base resolution “bare earth” DTM was employed for the full European stochastic flood model, reverting to five-meter horizontal resolution data for outputting flood risk data.
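To give a feel for that trade-off, here is a back-of-the-envelope sketch; the domain area, bytes per cell, and the resulting figures are illustrative assumptions rather than RMS specifications:

# Back-of-the-envelope sketch (illustrative figures, not RMS specifications):
# how DTM cell size drives grid size and storage for a fixed model domain.

def grid_cells(domain_km2: float, cell_size_m: float) -> float:
    """Number of cells needed to tile the domain at a given resolution."""
    cells_per_km2 = (1000.0 / cell_size_m) ** 2
    return domain_km2 * cells_per_km2

DOMAIN_KM2 = 4_000_000     # hypothetical land area, Ireland to Hungary
BYTES_PER_CELL = 4         # e.g. one float32 elevation value per cell

for cell_size_m in (40, 5):
    n = grid_cells(DOMAIN_KM2, cell_size_m)
    gb = n * BYTES_PER_CELL / 1e9
    print(f"{cell_size_m:>2} m cells: {n:.2e} cells, ~{gb:,.0f} GB per elevation layer")

# Going from 40 m to 5 m cells multiplies the cell count (and, broadly, the
# hydraulic run time) by 64x, which is why the stochastic simulation can run on
# a coarser grid than the finer-resolution flood risk outputs.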
b) Duration: In a study of 383 past flood events in Europe, 10 percent were found to persist for more than four weeks. In 1993, the flooding on the Mississippi River lasted for months. Yet reinsurers need a predefined maximum event duration. The only solution is to store daily information on the extent and depth of flooding throughout the whole model domain. The RMS Europe Inland Flood HD Models are the first to allow the user to explore the sensitivity of results to any selected definition of event duration. The same procedure is an integral component of the new RMS U.S. HD Flood model.
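As a rough illustration of why the choice matters, the sketch below takes a hypothetical series of daily losses from one long flooding episode and reports the worst loss capturable under different maximum event duration definitions. The figures and the simple windowing rule are assumptions for illustration, not the model’s actual event logic:

# Illustrative sketch (not the RMS implementation): given daily losses from one
# long simulated flooding episode, report the worst loss capturable under
# different maximum event duration definitions, hours-clause style.

def worst_event_loss(daily_losses, max_days):
    """Largest total loss within any contiguous window of max_days days."""
    if max_days >= len(daily_losses):
        return sum(daily_losses)
    return max(sum(daily_losses[i:i + max_days])
               for i in range(len(daily_losses) - max_days + 1))

# Hypothetical daily losses (in $m) from a six-week episode with two flood peaks.
daily = [0, 2, 15, 40, 22, 8, 3, 1, 0, 0, 5, 30, 55, 18, 6] + [1] * 27

for max_days in (7, 14, 28):
    print(f"{max_days:>2}-day event definition: "
          f"worst single-event loss ${worst_event_loss(daily, max_days)}m")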
c) Extent: In the analysis of historic European floods, a quarter were found to have affected more than one country. Among this subset, the average number of countries impacted was between three and four. The RMS Europe Inland Flood HD Models are the first to contain a coherent stochastic precipitation, river flow and flooding event set extending across the whole of western and central Europe (from Ireland to Hungary). This capability acknowledges that, as multi-country floods are to be anticipated, (re)insurers should manage their portfolios across the whole region.
d) Defenses: In some countries and regions there are now detailed databases of river flood defense locations and design levels. However, in many catchments defense heights can only be inferred based on property “value at risk” assumptions. The new HD Flood Models are the first to enable the model user to test local results by varying the flood defense assumptions. This is important not only where defense heights have been inferred but also when underwriting industrial facilities with their own private defenses.
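A minimal sketch of why this lever matters so much, using entirely hypothetical water levels, crest heights and breach probability: the property sees no water unless the river exceeds the defense crest, or the defense fails below it.

# Minimal sketch with illustrative parameters (not RMS assumptions): water
# depth at a protected property is zero unless the river level exceeds the
# defense crest, or the defense breaches before being overtopped.

import random

def depth_behind_defense(river_level_m, ground_level_m, crest_m,
                         p_breach_below_crest=0.05):
    """Flood depth (m) at a property behind a simple levee."""
    overtopped = river_level_m > crest_m
    breached = random.random() < p_breach_below_crest
    if not (overtopped or breached):
        return 0.0
    return round(max(0.0, river_level_m - ground_level_m), 2)

random.seed(1)
event_levels = [4.2, 4.8, 5.1, 5.6, 6.3]    # hypothetical event river levels (m)
for crest_m in (5.0, 5.5, 6.0):             # user-adjustable defense assumption
    depths = [depth_behind_defense(lvl, ground_level_m=3.5, crest_m=crest_m)
              for lvl in event_levels]
    print(f"crest {crest_m} m: depths {depths}")

In this toy setup, shifting the assumed crest by half a meter flips individual events between no loss and more than a meter of water at the property, which is exactly the sensitivity the model user needs to be able to test.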
e) Loss Variance: Introduction of the four-component beta function to represent the loss distribution acknowledges the potential for weights to be applied to both zero losses and 100 percent losses. Among many improvements in financial modeling, this enables the full treatment of flood deductibles. When only aggregate exposure data is available, the HD model automatically samples multiple residential and commercial/industrial exposure locations within the postcode (based on a building-level exposure dataset) to gain a full appreciation of the uncertainty in modelled losses.
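One possible reading of such a mixed distribution, sketched below with entirely hypothetical weights and beta parameters (this is not the RMS parameterization): point masses at zero and at 100 percent damage, with a beta distribution for the scatter of partial losses in between. Sampling the whole distribution, rather than working with a mean damage ratio, is what allows a deductible to be applied loss by loss.

# Illustrative sketch only; parameter values are hypothetical and the exact
# form of the four-component distribution is not reproduced here.

import random

def sample_damage_ratio(p_zero=0.30, p_total=0.05, alpha=2.0, beta=5.0):
    """Draw one building damage ratio in [0, 1] from the mixed distribution."""
    u = random.random()
    if u < p_zero:
        return 0.0                            # in the footprint, but no loss
    if u < p_zero + p_total:
        return 1.0                            # total (full limit) loss
    return random.betavariate(alpha, beta)    # partial loss with wide scatter

random.seed(0)
samples = [sample_damage_ratio() for _ in range(100_000)]
deductible = 0.10   # 10 percent of value, applied sample by sample
net = [max(0.0, s - deductible) for s in samples]
print(f"mean gross damage ratio: {sum(samples) / len(samples):.3f}")
print(f"mean net-of-deductible:  {sum(net) / len(net):.3f}")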
The flooding from Hurricane Harvey has reminded us of the potential for massive losses from inland flooding, when a stalled circulation sends a succession of rain bands over a coastal city. However, the total rainfall from Harvey is exceeded by many extreme events in the new RMS U.S. HD Flood Model. Harvey also illustrates one configuration in which wind, surge, wave, and inland flood losses can all be generated by the same event. The new RMS U.S. HD Flood Model will be the first to contain these complex interlinkages across the perils, so that it is possible to measure both the peril-specific and the combined losses of hurricanes.
Validation and Calibration
The flood model presents multiple opportunities to compare and test the hazard and loss models, whether it is the probabilities of extreme daily rainfalls or river flows, monitored throughout the model domain, or even the seasonality of peak flows. For validation and calibration, it is also possible to test whether flood defenses achieved their intended design level of protection and to compare modeled loss results with previous historical flood losses normalized for current exposures.
However, such validations are becoming increasingly challenged by urbanization and climate change. Higher sea surface temperatures can lead to higher coastal precipitation — as seen in Harvey’s rainfall. Unchecked paving of local catchments ensures a higher proportion of rainfall turns into rapid surface runoff than in the original bare earth landscape. All of this means we cannot simply use the statistics of past extreme river flows or water levels to determine the return periods of the latest flooding. The era when engineers could confidently assume stationarity in catchment runoff, rainfall extremes and river channels is over. This presents a major challenge in determining the return periods of significant floods and flood losses, and in assessing how much the odds have shifted.
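The point can be made with a few lines of synthetic data: under a trend, the empirical return period assigned to the same peak flow depends strongly on how much of the record is treated as representative of today’s climate and land use. The numbers below are invented purely for illustration.

# Sketch with synthetic data (illustrative only): the same peak-flow threshold
# has a very different empirical return period depending on whether the full
# historical record or only recent decades are used, once a trend is present.

import random

random.seed(42)
years = list(range(1918, 2018))
# Synthetic annual-maximum flows with a gradual upward trend (a stand-in for
# urbanization and warming): baseline 100 units, +0.3 per year, plus noise.
annual_max = [100 + 0.3 * (y - years[0]) + random.gauss(0, 15) for y in years]

def empirical_return_period(series, threshold):
    """Return period in years implied by simple exceedance counting."""
    exceedances = sum(x > threshold for x in series)
    return float("inf") if exceedances == 0 else len(series) / exceedances

THRESHOLD = 140.0
print("full 100-year record:", empirical_return_period(annual_max, THRESHOLD))
print("most recent 30 years:", empirical_return_period(annual_max[-30:], THRESHOLD))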
The new European Inland Flood HD Models were designed to solve an expanded range of business questions around flood risk. For example, in the U.K. insurers can now pass on to Flood Re their highest-risk household properties, for which they are confident the risk costs exceed what they are permitted to charge for flood cover. Insurers need the best data to identify the worst risks. Meanwhile, Flood Re itself is eager to have intelligence on the risks being passed into the pool, while watching the degree to which portfolio correlation will then require the purchase of more reinsurance. The new HD model has been designed to help optimize these kinds of combined single-location and portfolio-wide business decisions.
Providing all this HD capability requires expanded computing resources, generally beyond what any individual organisation will have on-site. Delivering the full modelling capability is therefore only possible within the Cloud. For many insurers, the new Europe and U.S. HD Flood models may provide the first opportunity to experience the advantages the Cloud can bring to risk modeling.
Robert Muir-Wood works to enhance approaches to natural catastrophe modeling, identify models for new areas of risk, and explore expanded applications for catastrophe modeling. Robert has more than 25 years of experience developing probabilistic catastrophe models. He was lead author for the 2007 IPCC Fourth Assessment Report and 2011 IPCC Special Report on Extremes, and is Chair of the OECD panel on the Financial Consequences of Large Scale Catastrophes.
He is the author of seven books, most recently: ‘The Cure for Catastrophe: How we can Stop Manufacturing Natural Disasters’. He has also written numerous research papers and articles in scientific and industry publications as well as frequent blogs. He holds a degree in natural sciences and a PhD both from Cambridge University and is a Visiting Professor at the Institute for Risk and Disaster Reduction at University College London.