Spatial data is a catalyst for understanding risks

Iain Willis
Product Manager, EQECAT

The insurance industry’s need for accuracy means geocoding and GIS are key tools in this business, says Iain Willis, Product Manager, EQECAT

How is EQECAT using geospatial technology as part of its workflow?
EQECAT was established in 1994, and from the start it was apparent that location is fundamental to catastrophe modelling. Whether you are modelling forest fires, earthquakes, hurricanes or tornados, the common denominator is that location dictates the extent to which you are at risk from these hazards. For instance, just a few metres can separate a property in a flood plain from one on safe, elevated ground, yet for our clients the insured risk potential of those two locations is entirely different.
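
To make that distinction concrete, here is a minimal sketch of the kind of point-in-polygon test that separates an exposure inside a flood plain from one just outside it, written in Python with the shapely library. The flood-plain outline and property coordinates are invented for illustration and are not EQECAT data or methodology.

```python
from shapely.geometry import Point, Polygon

# Hypothetical flood-plain footprint (lon/lat vertices) -- illustrative only.
flood_plain = Polygon([(0.00, 0.00), (0.05, 0.00), (0.05, 0.03), (0.00, 0.03)])

# Two made-up properties only a short distance apart.
properties = {
    "riverside_house": Point(0.02, 0.01),  # inside the flood plain
    "hillside_house": Point(0.02, 0.04),   # just outside, on higher ground
}

for name, location in properties.items():
    at_risk = flood_plain.contains(location)
    print(f"{name}: {'in flood plain' if at_risk else 'outside flood plain'}")
```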

Geospatial technologies are harnessed at every step at EQECAT. Our clients’ need for accuracy means geocoding and GIS are key tools in our business. In developing our European Windstorm Model, for example, we used multiple data sources to build our hazard components: global land-use information, digital elevation models, and discrete vector layers such as windspeed recorded at thousands of locations. The only way to take such disparate information and turn it into a meaningful hazard model is with software tools like GIS. Whether it is overlaying these geographic sources as map layers or using spatial interpolation to produce a smoothed hazard surface, GIS is an integral part of our model development.
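
As an illustration of the interpolation step mentioned here, the following Python sketch grids hypothetical station windspeeds into a smoothed surface using inverse-distance weighting. IDW stands in for whichever interpolation method the model actually uses, and the station coordinates and speeds are made up.

```python
import numpy as np

# (x, y) station locations and observed peak gust speeds (m/s) -- illustrative.
stations = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
speeds = np.array([32.0, 41.0, 28.0, 37.0, 45.0])

def idw(grid_points, sample_points, values, power=2.0, eps=1e-10):
    """Inverse-distance-weighted estimate of `values` at each grid point."""
    # Pairwise distances, shape (n_grid, n_samples).
    d = np.linalg.norm(grid_points[:, None, :] - sample_points[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power
    return (w @ values) / w.sum(axis=1)

# Regular grid covering the study area.
xs, ys = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
grid = np.column_stack([xs.ravel(), ys.ravel()])

hazard_surface = idw(grid, stations, speeds).reshape(xs.shape)
print(hazard_surface.shape)  # (50, 50) smoothed windspeed surface
```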

EQECAT’s report Catastrophe Watch, or CatWatch, was out within a week of the devastating Tohoku earthquake in Japan in March 2011. How did EQECAT manage that in such a short period of time?
We really worked hard to get our CatWatch reports out as quickly as possible. Our subscribers rely on us to deliver a timely, well-informed narrative of a developing disaster, so it is important we draw on a wide range of sources. The Tohoku event is a good example. As it unfolded, we used USGS data, regional news websites, market releases, and the scientific resources available within the company. Because we have several experienced seismologists on staff, we were able to publish a CatWatch report that captured the humanitarian and economic scale of the crisis, as well as its scientific context: Tohoku was the largest recorded earthquake in the Japan trench and something of a game-changer in seismology.

The report included digital maps showing tsunami flood zones, aftershock locations, and damaged areas, including roads and population centres in the affected region. Of course, a huge element of the reports is visual representation. The GIS-derived seismic maps, images and conceptual diagrams are key tools in helping convey the story to our clients. The CatWatch visuals show where the disaster occurred, its magnitude and the likely insurance exposure. These aspects are of immediate concern to our clients, who will ultimately be impacted by these events and therefore need to react to the market quickly. Spatial data provides a catalyst for this understanding.

How does RQE, EQECAT’s catastrophe risk modelling platform released in 2013, use spatial technology to aid the insurance industry?
There is a great deal of uncertainty in the insurance industry around location data. Many insurers, reinsurers and brokers keep data at highly aggregated levels of geography (zip code, county), which poses a big problem for modellers when it comes to accurately locating these exposures. As in our previous platform, RQE employs a technique called ‘disaggregation’, whereby we take aggregate exposure data and map that risk onto the most likely locations within the zip code or county. Based on local demography, we can make informed decisions about the placement of risk and essentially downscale a client’s portfolio to a level suitable for risk analysis. Such techniques are critical to many of our clients because they help them manage risk that is otherwise difficult to locate.
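
A highly simplified sketch of the disaggregation idea follows: an exposure value reported only at zip-code level is spread across candidate locations in proportion to local population. The sites, populations and values here are invented, and the production placement logic is of course far richer than this.

```python
aggregate_exposure = 10_000_000  # total insured value reported for one zip code

# Hypothetical candidate locations inside the zip code, with population counts.
candidates = [
    {"site": "downtown_block", "population": 5_000},
    {"site": "suburb_north", "population": 3_000},
    {"site": "industrial_park", "population": 2_000},
]

# Allocate the aggregate exposure in proportion to population.
total_pop = sum(c["population"] for c in candidates)
for c in candidates:
    c["allocated_exposure"] = aggregate_exposure * c["population"] / total_pop
    print(f"{c['site']}: {c['allocated_exposure']:,.0f}")
```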

Until a few years ago, geoinformation systems were used largely by reinsurers and modelling firms to handle property insurance risks, but today they serve a much broader spectrum of users, including primary insurers. Your comments?
Definitely. There is no doubt that the use of geographic information systems among primary insurers has continued to grow over the last few years, and I think a number of factors are driving this. Firstly, the rise of geo-information is reaching every aspect of society, not just the insurance industry. Whether it is using your smartphone’s geo-mobile technology to find the nearest Starbucks, using Google Maps to plan a car journey, or GPS for a Facebook check-in, all of us are becoming more aware of location. This widespread availability of geospatial solutions, combined with underwriters’ and chief risk officers’ increasing awareness of the benefits of these technologies, explains the rapid adoption by primary insurers.

Which are the future areas of application of geospatial technology you foresee for the insurance industry?
The key areas of application I see growing in the future are data precision and browser-based geo-visualisation.

Also, although street-level geocoding solutions exist for numerous countries across the world, they are still lacking in many places. I see huge growth potential there, and filling those gaps will be greatly welcomed by the insurance industry, because it will mean that exposure in emerging markets can be located more precisely.
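
By way of illustration, the snippet below resolves a street address to coordinates using the open-source geopy library with its Nominatim (OpenStreetMap) backend. This is just one freely available example of the kind of geocoding service discussed here, not a tool the interview mentions, and coverage gaps show up exactly as unmatched lookups.

```python
from geopy.geocoders import Nominatim

# Nominatim requires a descriptive user agent; the name here is arbitrary.
geolocator = Nominatim(user_agent="exposure-geocoding-demo")

# Resolve a street address to coordinates; returns None when there is no
# street-level match -- the coverage gap discussed above.
location = geolocator.geocode("10 Downing Street, London, UK")
if location is not None:
    print(location.latitude, location.longitude)
else:
    print("No street-level match available for this address")
```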

Likewise, another growth area is lightweight geo-visualisation tools. Although desktop GIS tools will continue to do the ‘heavy lifting’ for analytical work, I foresee an increase in the number of spatially enabled dashboard reports. With browser plugins and web map APIs, the capability to produce these reports already exists and is starting to be widely used. I see this trend continuing in the near future.
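
As a sketch of what such a lightweight, browser-based report can look like, the following uses folium, a Python wrapper around the Leaflet web map API, to write a self-contained HTML map. The coordinates and exposure figures are illustrative only.

```python
import folium

# Base map centred on an illustrative region of interest.
m = folium.Map(location=[38.3, 142.4], zoom_start=6)

# Hypothetical portfolio exposures, sized by total insured value (millions).
exposures = [
    {"name": "Portfolio A", "lat": 38.26, "lon": 140.87, "tiv_m": 120},
    {"name": "Portfolio B", "lat": 37.40, "lon": 141.03, "tiv_m": 45},
]
for e in exposures:
    folium.CircleMarker(
        location=[e["lat"], e["lon"]],
        radius=e["tiv_m"] / 10,  # crude size-by-value encoding
        popup=f"{e['name']}: {e['tiv_m']}M insured value",
    ).add_to(m)

# A single HTML file, viewable in any browser -- no desktop GIS required.
m.save("exposure_dashboard.html")
```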