Why Poor Data is Costing the Specialty Insurance Market Millions

The specialty insurance market has a long-standing data problem, and it’s costing P&C carriers millions. As a market that began some 300 years ago, specialty insurance presents some unique challenges in its data flow that are worth exploring and understanding.

We’re talking Oil Rigs, not Automobiles

Each year, over $400Bn worth of specialty insurance revenue flows into the US, London, Bermudian and Asian markets from around the globe. There are well over 200 classes of business, from Aviation to Marine Cargo and Fine Art, each with its own data set, and, unlike personal/SME lines, the sheer volume and variety of that data is significant.

For example, insuring a personal automobile involves around 15 data items or fewer, with much of it – like registration numbers – available via standard public lookups. By contrast, an oil platform in the Gulf of Mexico may require hundreds of data items buried in accompanying spreadsheets.

The Holy Grail of globally adopted standards and straight-through processing continues to elude this market, and so this data is manually “wrangled” from the assured through a human chain involving multiple intermediaries, services and systems.

An Inconvenient Truth

With such a hand-cranked approach, it’s no surprise that this data is of highly variable quality, but what is more shocking is the cost. Research estimates that this lack of data hygiene can account for up to 2% of a carrier’s expense ratio.

This represents a huge opportunity, because much of this cost has long since become normalised into carrier operations. Carriers are usually aware of the issues but have been unwilling to “lift the rock”, because what lies beneath is far from pretty. Suggesting that “it’s the way we’ve always done it” is no longer good enough as the industry struggles to digitally transform itself to meet the needs of a data-first world.

If you feel your company hasn’t done enough in this area, here are three hard-hitting facts to support your case:

  1. Bad data compounds in the back office, horribly so – An incorrect broker commission at source impacts multiple units downstream: Actuarial, Claims, Finance, regulatory reporting and, often, the financial close process. A £100 error in Underwriting Operations can result in a 5X multiple in total effort and cost to fix. Early identification upstream, at the point at which data enters the organisation, is key.

  2. Missing data adversely impacts pricing and reserving – Whilst there’s a stream of useful AI emerging to help underwriters better assess and price risk, these algorithms rely on a key pre-requisite: a steady flow of comprehensive, high-quality data. Many carriers have been seduced by the AI dream without first doing the foundational work on their data and data culture. This is a major reason why many of the first AI pilots have quietly failed to date. Not sure? Just ask a data scientist.

  3. A lack of real-time data controls creates latent operating risk – It’s an open secret amongst insiders that the specialty market is littered with examples of operational malpractice and underwriting error. It could be misuse of a US Trust Fund designed to protect claims payouts to US policyholders. Or underwriters writing risks above their limits in places they shouldn’t (a minimal example of such a control is sketched after this list). The consequences are severe: reputational damage, a fine from the regulator or simply a huge hike in the carrier’s E&O premiums when errors are declared.
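To make “real-time data control” concrete, here is a minimal sketch of an automated authority-limit check run at the point a risk is booked. It’s written in Python; the field names, underwriter IDs and limit values are invented for illustration, not a real rule set.

```python
# Hypothetical illustration of a real-time authority control.
# The limit table and record fields are invented for this sketch;
# a production rules engine would load market-specific rules from config.

AUTHORITY_LIMITS = {
    # underwriter id -> maximum gross line (USD) they may write
    "UW-001": 5_000_000,
    "UW-002": 10_000_000,
}

def check_authority(risk: dict) -> list[str]:
    """Return any issues found on a single risk record."""
    issues = []
    uw = risk.get("underwriter_id")
    line = risk.get("gross_line_usd")

    if uw not in AUTHORITY_LIMITS:
        issues.append(f"Unknown underwriter '{uw}': cannot verify authority")
    elif line is None:
        issues.append("Missing gross line: cannot verify authority")
    elif line > AUTHORITY_LIMITS[uw]:
        issues.append(f"Gross line {line:,} exceeds {uw}'s limit of {AUTHORITY_LIMITS[uw]:,}")
    return issues

# Checked as the data enters the organisation, not at month end:
for issue in check_authority({"underwriter_id": "UW-001", "gross_line_usd": 7_500_000}):
    print(issue)  # Gross line 7,500,000 exceeds UW-001's limit of 5,000,000
```

The point is not the code itself but where it runs: upstream, at the moment the data is captured, so the error never reaches Claims, Finance or the regulator.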

Are you an Ostrich, a Selfie, Shelfware… or Someone Else’s Problem?

Despite good intentions, carriers have long been dealing badly with the problem of how to check and fix their data, and for the first time we can group those efforts into four types:

– The Ostrich

Head in the sand? It’s hard to believe, but in the face of overwhelming evidence there are still carriers who believe they don’t have a material problem with checking their data. They are happy to work as they always have, because the hidden cost of handling issues has become normalised. The potential for a data disaster is high, but if no one has noticed, why take any action?

– The Selfie (DIY)

More forward-thinking specialty carriers have attempted an in-house solution, often using internal IT resource. Typically, these efforts are “issue notification only”, clunky and Excel- or report-based. More advanced efforts might use email as a simple workflow, or even a basic web UI to “suppress” issues. These systems are the most dangerous, as they are often thought of as “enough”. They require ongoing in-house knowledge and support, and lack the critical end-user engagement features like business workflow, market SLAs to drive accountability, comprehensive fix tracking and issue trend analysis. They also scale badly, especially where the specialty carrier has a global footprint.

– Someone Else’s Problem (Outsourcing)

A wise man once said, “never outsource a problem”. Nowhere does this hold more true than with actual outsourcing. Outsourcing data entry and QA is initially appealing from a cost perspective, but the lack of native business knowledge means many issues are routed back to the carrier or dealt with badly. Similarly, most carriers “check the checkers” to ensure adherence to the contract – a crazy and inefficient duplication. And unless adding human headcount is the future, this scales badly too.

– Shelfware for Specialty (Technical DQ Tools)

Technical DQ toolsets can be powerful profiling tools in the right hands, for data discovery or where data structures are unknown. Their major Achilles heel? Their technical heritage means they fail to engage regular business users. Often brought in on a whim to “do the right thing”, usage typically declines, rather than increases, over time, leaving the product gathering dust and contracts shelved. They are general-purpose tools that lack market-specific features, are often expensive and require specialist training. Need some real-life examples? Ask us and we’ll put you in touch!

A Blueprint for the Future

So, what can be done? When we set out to solve this problem, we re-imagined the future. We knew the global market required an easy-to-integrate, scalable solution which continuously monitors carrier data upstream at source, using flexible, market-specific rules and standards.

A solution which accurately triages issues and routes them to the right users at the right time. Which engages non-technical, everyday business users with an intuitive UI, assigning SLAs and driving accountability.

Which:

  • Pro-actively sorts and escalates issues that are not dealt with in time, before cost and risk are incurred
  • Automatically monitors and tracks human resolution for 100% visibility
  • Identifies the root cause to drive immediate action or conversation with the relevant parties (see the sketch below)
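As a sketch of how such a monitor, triage and route flow might hang together, here is a short Python illustration. The severity tiers, owning teams and SLA windows are assumptions made for this example, not DQPro’s actual configuration or API.

```python
# Hypothetical sketch of the monitor -> triage -> route flow described above.
# Severity tiers, team names and SLA windows are illustrative assumptions.

from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Illustrative routing table: severity -> (owning team, time allowed to fix)
ROUTING = {
    "critical": ("Underwriting Operations", timedelta(hours=4)),
    "warning":  ("Data Quality Team",       timedelta(days=2)),
}

@dataclass
class Issue:
    rule_id: str
    severity: str
    record_ref: str
    raised_at: datetime = field(default_factory=datetime.utcnow)

    def route(self) -> tuple[str, datetime]:
        """Assign the issue to the right team with an SLA deadline."""
        team, sla = ROUTING[self.severity]
        return team, self.raised_at + sla

    def breached(self, now: datetime) -> bool:
        """Past-deadline issues escalate instead of quietly lingering."""
        _, deadline = self.route()
        return now > deadline

issue = Issue("BROKER-COMMISSION-RANGE", "critical", "policy/2024/0042")
team, deadline = issue.route()
print(f"Routed to {team}; resolve by {deadline:%Y-%m-%d %H:%M} UTC")
```

The design choice that matters here is the SLA: every issue carries an owner and a deadline, so resolution is tracked and escalated rather than assumed.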

With DQPro, we’re on a mission to give specialty carriers, MGAs and brokers continuous data confidence to equip them for a digital, data-first future.

Interested?  Then get in touch. We’d love to hear from you.
