Lack of Internal Data Ownership Could Be Costing Carriers $Millions

Nick Mair, co-founder and CEO of bootstrapped insurtech start-up Atticus DQPro, based in London, England, exposes the hidden costs of dealing with insurance data issues and asks why so many traditional data quality initiatives fail to make a difference.

From Facebook founder Mark Zuckerberg’s historic US Senate grilling to the General Data Protection Regulation coming into force in Europe this month, data ownership and accountability are part of the zeitgeist, critical issues for every modern business.

In our industry, getting data right is not only a reputational issue; it also has far-reaching implications for carrier cost and efficiency. In fact, a lack of internal data ownership is already costing carriers millions, and significantly hampering efforts to streamline market processes.

Manual data entry and duplication

There is no ‘one size fits all’ approach to data quality, particularly where large carriers handle business flowing in from all over the world and through different third parties. The result is that multiple underwriting operations teams often spend time manually keying and rekeying data into legacy systems. Indeed, insurance companies the world over are often guilty of a ‘people and spreadsheets’ approach to fixing quality issues, leaving significant scope for errors to creep in.

Data quality issues are like pollution in the headwaters of a river system, affecting manifold areas downstream. A small exchange rate error can feed into multiple calculations, forecasts and reports before it is discovered, by which point the damage is already done.

It’s expensive, and no one would call it efficient. But whilst most of us are aware of the myriad data issues, the problem has often been viewed as too complex to solve, and so it has become normalised as part of doing business in the market. The net result? A huge hidden, avoidable daily cost of dealing with data issues.

Data-hungry insurtech

The current debate in insurtech centres on how data generated by emergent technologies such as Artificial Intelligence and the Internet of Things can be harnessed alongside existing carrier data to produce new insights on risk, pricing and customer engagement.

However, a clear prerequisite for this is quality data that cost-effectively helps carriers meet regulatory needs while also supporting emerging innovation initiatives.

Emerging technology, data sharing and contributory databases help insurers to better understand risks, settle claims and reduce workload – quite simply, data is a key resource both for individual businesses and for collaboration across the market as a whole.

The role of data and data confidence in insurance will only grow in significance as insurtech enters the mainstream. A culture of data quality and accountability must be the foundation of the modern insurance market, and it is imperative that each party takes ownership of, and responsibility for, the data they bring to the table.

Data errors – a US$3 trillion issue

According to the Harvard Business Review, just a 1% error rate can double the cost of a transaction. Employees in data-related positions typically spend 50% of their time correcting data issues, while data scientists can spend up to 60% of their time cleaning and preparing data. The estimated cost to the US economy in 2016 of all this data inefficiency? A mere US$3 trillion!

But despite the obvious waste and costs, many traditional data quality initiatives aimed at reducing the impact of these issues ultimately fail. Why is this?

Many large US carriers use heavyweight data quality tools which often boast powerful capabilities but are limited by overly complicated user interfaces. The upshot is that regular business users fail to engage, and the tool – whatever its potential efficacy – is seen as too technical, ‘back office only’, and ends up underused across the organisation.

In fact, generic data quality tools aimed solely at technical users will almost inevitably fail to engage front office, business-side users. Time and again, carriers invest in an initial tranche of licences which are then never fully used – an expensive waste of time and resources.

For a data quality platform to be truly effective and user friendly, it should show increased usage and growing licence demand over time – surely the sign of a successful product! And it should not just empower back office technicians; data integrity must be driven upstream to the business level, where it belongs.

So how best to engage these business users and drive change? The answer is twofold. Firstly, data quality technology should be simple: quick, easy-to-use tech embedded into the workflow. Secondly, it must be visible, tracking data issue resolutions against company-set Service Level Agreements (SLAs).
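
By way of illustration, here is a minimal sketch of what that visibility could look like – not DQPro’s actual implementation, and the severity levels and SLA hours are hypothetical – in which every flagged issue carries a resolution deadline derived from a company-set policy, so breaches surface automatically instead of hiding in a spreadsheet:

    from __future__ import annotations

    from dataclasses import dataclass, field
    from datetime import datetime, timedelta

    # Hypothetical company-set SLA policy: hours allowed to resolve an
    # issue, keyed by severity. In practice this would be configuration.
    SLA_HOURS = {"critical": 4, "high": 24, "standard": 72}

    @dataclass
    class DataIssue:
        record_id: str
        description: str
        severity: str = "standard"
        raised_at: datetime = field(default_factory=datetime.utcnow)
        resolved_at: datetime | None = None

        @property
        def deadline(self) -> datetime:
            # The agreed resolution deadline for this severity.
            return self.raised_at + timedelta(hours=SLA_HOURS[self.severity])

        @property
        def sla_breached(self) -> bool:
            # Breached if resolved late, or still open past the deadline.
            checkpoint = self.resolved_at or datetime.utcnow()
            return checkpoint > self.deadline

    issue = DataIssue("POL-123", "Exchange rate outside expected range", "high")
    print(issue.deadline, issue.sla_breached)

The detail matters less than the principle: an issue is only truly managed when its owner, deadline and status are visible to the business, not just to IT.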


Practical applications

In practical terms, this means replacing ad-hoc manual checks with automated monitoring that flags incorrect data as soon as it appears. Issues should then be smartly routed directly to the relevant business owner or team, to be resolved within agreed timelines.
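
As a rough sketch of how such monitoring might work under the hood – the rules, field names and owning teams below are illustrative assumptions, not a description of any particular product – each incoming record is checked against simple validation rules, and any failure is routed to the accountable team the moment it appears:

    # Illustrative only: a tiny rule-based monitor that validates records
    # as they arrive and routes failures to the accountable business team.
    RULES = [
        # (field, validation check, expectation, owning team)
        ("currency", lambda v: isinstance(v, str) and len(v) == 3,
         "three-letter currency code", "underwriting-ops"),
        ("premium", lambda v: isinstance(v, (int, float)) and v > 0,
         "positive premium", "underwriting-ops"),
        ("inception_date", lambda v: bool(v),
         "inception date present", "data-governance"),
    ]

    def monitor(record: dict) -> list[dict]:
        """Check one record against all rules; route any failures."""
        issues = []
        for field_name, is_valid, expectation, owner in RULES:
            if not is_valid(record.get(field_name)):
                issues.append({
                    "record_id": record.get("id", "<unknown>"),
                    "field": field_name,
                    "expected": expectation,
                    "routed_to": owner,  # straight to the accountable team
                })
        return issues

    # A record keyed with a bad currency code and a zero premium is
    # flagged immediately, before it pollutes downstream reports.
    print(monitor({"id": "POL-123", "currency": "USDX", "premium": 0}))

Caught at the point of entry, the mistyped field lands in the right team’s queue within the agreed timeline, rather than surfacing weeks later in a downstream report.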

It is completely within carriers’ control to make these changes with minimal cost and disruption. When business users feel empowered to own and improve the data for which they are accountable, and are given the tools to manage it visibly and efficiently, they will do so. Which means less work and less cost downstream in the back office.

New tech allows this to happen in the simplest of ways – with happy users on both sides of the office, and ownership of data pushed to the business end of the company, where it rightly belongs.

Surely it is time we questioned the sense of using manual, people-intensive methods and spreadsheets to check for data errors that were often created by other people and spreadsheets?

Issues surrounding data ownership and accountability are significantly impacting businesses and making headlines around the world. The insurance industry needs to make 2018 the year it brings effective data governance back into the heart of the market.

www.atticus-associates.com
Follow Atticus DQPro on Twitter and LinkedIn
