With contributors from: Carl Lambert, Vice President, National P&C Business Intelligence at The Co-operators and Hashmat Rohian, AVP, R&D, Innovation at Aviva

Data analysis has emerged as a key differentiator for insurance in Canada, expanding beyond the actuary and improving performance across all business units. Today’s insurers are pinning much of their future on strong platforms that deliver insights into customers, risks and opportunities, all of which require serious investment in data and its processing.

Expanding analytics platforms is the market’s next step, though insurers will have to work through bureaucratic burdens to ensure that data governance is properly aligned with legislation. The mix of national and regional laws governing insurers and general data use presents an opportunity for data analytics to flex its muscle, provided architectures are properly constructed.

While the promise of Big Data is exceptional for insurers, the consequences of failing to adequately prepare for and adopt it are just as dire.

“Without the ability to execute on analytics properly, companies will find themselves in big problems. I wouldn’t go so far as to say they’d find themselves bankrupt, but we’re looking at problems in that order of magnitude,” said Carl Lambert, Vice President, National P&C Business Intelligence at The Co-operators.

TODAY’S DATA NEEDS

Data analytics in insurance is in a unique position because of the importance data has always held in the industry.

“If you look at the maturity curve of any organization, it starts with data governance and the curve ends with predictive analytics and prescriptive analytics. When I started working as an actuary in 1995, we already had predictive analytics,” notes Lambert. This means the analytics maturity of the insurance industry doesn’t fit with the models of other industries. In the 90s and early 2000s, insurance was ahead of the market in data science and business applications because actuaries were the ones doing analytics.

Insurers have managed predictive analytics for decades, but some have fallen behind in the modern application of the technology backing analysis. The technological needs around data at the turn of the century were focused on warehousing and monitoring, and insurers invested heavily. What many haven’t kept up with is the need to introduce speed into systems and give individuals fuller control over data to provide insights. The actuaries who once led the use of analytics bear some responsibility for this lag, suggests Lambert. For many, this is because the new data tools and methodologies are being produced by outside sources.

The emergence of data scientists, statisticians and predictive modelling tools has allowed other industries to catch up. In some cases, these industries grew faster due to a management perspective that was quick to adopt new tools and positions, viewing analytics as a new area rather than an expansion of an existing business unit.

EXPANDING HORIZONS

The most immediate culture shift resulting from these new tools and operational expansions is the ability to move from a macro view of the business to a micro view, generating ever-smaller segmentations of clients.

“Predictive personalization is the new norm,” notes Hashmat Rohian, AVP, R&D, Innovation at Aviva. “Traditionally, insurance companies would focus on location or demographics alone and market products to all prospects in that segment. Customers expect more now. We need to factor in structured social data, prospect demographics and behaviours to create community and personality profiles. Using these, the aim is to then predict customer behaviour, needs or wants using propensity modelling – and tailor offers and communications very precisely to them. This makes granular, relevant and channel-aware up-sell and cross-sell recommendations key to winning customers over.”
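To make the propensity-modelling approach concrete, here is a minimal sketch that scores prospects for a single cross-sell offer with a logistic regression. It is an illustration only: the file name, the feature columns and the accepted_offer outcome flag are hypothetical, not any insurer’s actual schema.

# A minimal propensity-modelling sketch over a flat table of prospect
# attributes (all names below are hypothetical).
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Each row is a prospect; accepted_offer records past cross-sell outcomes.
prospects = pd.read_csv("prospects.csv")

categorical = ["region", "preferred_channel", "community_profile"]
numeric = ["age", "tenure_years", "policies_held", "web_sessions_30d"]

preprocess = ColumnTransformer([
    ("cat", OneHotEncoder(handle_unknown="ignore"), categorical),
    ("num", StandardScaler(), numeric),
])
model = make_pipeline(preprocess, LogisticRegression(max_iter=1000))

X = prospects[categorical + numeric]
y = prospects["accepted_offer"]
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)
model.fit(X_train, y_train)

# Propensity score: the modelled probability that a prospect accepts the offer.
prospects.loc[X_test.index, "propensity"] = model.predict_proba(X_test)[:, 1]

# Target the most receptive prospects through the channel each one prefers.
top = prospects.loc[X_test.index].nlargest(100, "propensity")
print(top[["preferred_channel", "propensity"]].head())

In practice, scores like these would feed channel-aware recommendation rules so the up-sell or cross-sell offer reaches each prospect through their preferred contact method.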

Data tracking and modelling allows insurers – and data scientists in many other industries – to identify the individual customer and treat them as exactly who they are. This allows the marketing side of the house to send the right push through the right channel, while other units can provide information in the contact method the user prefers each time.

This comes from the direct data and digital footprint present in today’s online world. The trail of personal-information breadcrumbs is now easier to follow, so insurers simply need to work on transitioning to a granular view of data that operates well at the micro level.

ANALYTICS AND CONSUMER EXPECTATIONS

“Personalization as a trend has evolved significantly, driven by customers who want to be treated as individuals and not just as a profile. They expect us to have a holistic view of them and look at both internal and external data to get to know the next best action for them. Behavior-based (a.k.a. wisdom of the crowds) and collaboration- or graph-based techniques have emerged that allow more differentiated customer experiences. The customer also has become more willing to share the data to enable that personalized treatment,” said Rohian.

The majority of insurance customers will not regard analytics as having any impact on their policies or regular interactions with their providers, despite common industry knowledge that analytics plays a role in pricing and fraud detection.

One area where consumer expectation should drive analytics, however, is in the overall experience that each customer has. The Internet is no longer a secondary business channel; it requires our full attention and must be an experience in line with in-person or phone interactions between customers and their agent or insurer.

The industry must realize that expectations of the insurance experience don’t come from past interactions with insurance providers. “If I buy something, whether it’s in a nearby store or on their website, I can get the same product at the same price; it’s the same everything. The expectations of the client are not based on insurers but based on other industries,” said Lambert.

Surprisingly, this expectation plays a direct role in how data governance is applied and experienced. Data collection must be the same or very similar at every touch-point, whether the agent is entering information in an office or a consumer has navigated to a website for a quote. As a result, websites and apps have become core services, just with a different access method.

Governance needs to dictate how the data is collected and tagged, making a consistent user interface necessary. This UI must also be similar to other experiences the customer has, so that they’re familiar with the data fields and less likely to provide false information.

Insurance does have the benefit of an industry standard of non-immediate response for many actions. Consumers don’t expect their insurance provider to operate on the same timeframe as their stock broker.

“Our real-time used to be weeks, days, and hours, instead of minutes, seconds and milliseconds,” said Rohian. “We’re now closer to real-time and we’re much more integrated with our customers and partners to make better decisions with them. As in most things, the law of diminishing returns applies to real-time analytics as you get closer to real-time.”

QUALITY CONTROL UPON ENTRY

By providing similar portals to enter information for both the customer and their agents, insurers can add a smart layer of protection that does not interrupt analytics.

The current set of industry best practices includes a call for data scientists to receive unaltered, unadjusted data when they perform broad analytics. This means that the warehousing structure shouldn’t be applying a filter or protocol to make information uniform in the same way it would for an output to a dashboard.

The quality control must then take place at the point of data entry. Management upon entry, thanks to proper governance, not only helps to prevent user error but may also serve as a fraud deterrent.

Insurers currently apply tools to make sure data is entered correctly by an agent, and these can often be extended to online services, which is especially important for a customer’s address. Data management that cleans up information at the warehouse level may detect that a customer has entered two or more different, but very similar, addresses and will automatically correct these. This can let some fraud slip through because no auditor or data scientist has the opportunity to see that multiple addresses were used.

“For this fraud, the client is hoping the discrepancies aren’t caught. If the data is being corrected, we won’t see the changes and may in fact help the fraud go through,” said Lambert. “It should be presented in a way that is always the same, using some external tools to verify that address.”

So, applying a governance tool to each input option would allow an insurer to immediately check the address against existing policies and only allow a correct input. The fraud becomes harder thanks to a governance policy that changes where discrepancies are sought.
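As a simplified sketch of that governance policy, the check below normalizes an entered address and compares it against the addresses already attached to the client’s policies, flagging discrepancies for review rather than silently correcting them. The policy store and field names are illustrative assumptions; a production system would also verify the address with an external validation tool, as Lambert suggests.

import re

def normalize(address: str) -> str:
    # Uppercase, strip punctuation and collapse whitespace so that
    # trivially different spellings compare equal.
    cleaned = re.sub(r"[^\w\s]", "", address.upper())
    return re.sub(r"\s+", " ", cleaned).strip()

def check_address(client_id: str, entered: str, policy_store: dict) -> dict:
    # Compare the entered address with addresses on existing policies.
    # Discrepancies are flagged, never auto-corrected, so an auditor or
    # model can still see that multiple addresses were used.
    entered_norm = normalize(entered)
    on_file = {normalize(a) for a in policy_store.get(client_id, [])}

    if not on_file or entered_norm in on_file:
        return {"accepted": True, "flag": None}
    return {
        "accepted": False,
        "flag": "ADDRESS_MISMATCH",
        "entered": entered_norm,
        "on_file": sorted(on_file),
    }

# Example: the same client quotes with a slightly different address.
policies = {"C-1001": ["12 Main St., Toronto ON"]}
print(check_address("C-1001", "12 Maine St Toronto ON", policies))
# {'accepted': False, 'flag': 'ADDRESS_MISMATCH', ...}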

NEW STRUCTURES IN DATA MANAGEMENT

Not only are the processes around data being revamped, but insurers are also finding new ways to structure the entire warehousing system to make significant gains.

One of the newest models gaining favor is a system similar to the hub-and-spoke model that allows for feedback and limited two-way control options. Think of spokes as data farms that can continually feed information back to the hub to enrich the entire operation.

“You allow data analytics to merge with the hub and feed information back as in a distributed system. If you use the hub as the central point to pull all data from, it can become a bottleneck. That’s what the industry has learned,” said Rohian. He said that this federated model allows the hub to focus on best practices and governance, while the spokes work on the actual analytics.

The federated model also avoids major costs previously associated with minimal changes. Using the hub as storage and the spokes as processing points, insurers are less likely to see new data integration cost several million dollars and months of work for each new column.

This model combines some of the benefits of in-house teams and consulting models because the hub can serve as a consultation arm, bringing coaching and structure to deployments in each business unit within an organization. It also facilitates the use of compliance tools at data entry points, allowing applications such as the address check to be implemented with ease and proper guidance.
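A toy sketch of that division of labour might look like the following: the hub owns governance metadata and shared storage, while a spoke pulls only the fields it is permitted to see, runs its own analytics and publishes the derived result back to the hub. The class names, permissions and claim figures are purely illustrative.

from dataclasses import dataclass, field

@dataclass
class Hub:
    # Governance: which fields each business domain is allowed to read.
    permitted_fields: dict = field(default_factory=dict)
    shared_data: dict = field(default_factory=dict)
    enriched: dict = field(default_factory=dict)

    def extract(self, domain: str) -> list:
        # Serve only governed fields to a spoke.
        allowed = self.permitted_fields[domain]
        return [{k: row[k] for k in allowed} for row in self.shared_data[domain]]

    def publish(self, domain: str, results: dict) -> None:
        # Spokes feed derived insights back to enrich the whole operation.
        self.enriched[domain] = results

class ClaimsSpoke:
    # A spoke does the processing, keeping the hub from becoming a bottleneck.
    def __init__(self, hub: Hub):
        self.hub = hub

    def run_analytics(self) -> None:
        rows = self.hub.extract("claims")
        avg = sum(r["amount"] for r in rows) / len(rows)
        self.hub.publish("claims", {"avg_claim_amount": avg})

hub = Hub(
    permitted_fields={"claims": ["claim_id", "amount"]},
    shared_data={"claims": [
        {"claim_id": 1, "amount": 1200.0, "insured_name": "withheld"},
        {"claim_id": 2, "amount": 800.0, "insured_name": "withheld"},
    ]},
)
ClaimsSpoke(hub).run_analytics()
print(hub.enriched)  # {'claims': {'avg_claim_amount': 1000.0}}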

To make this successful, needs must be shared across the company and executive sponsorship is vital, but the model has some of the greatest potential to deliver insights for day-to-day business processes and decisions.

NEW DATA SOURCES

Data management practices and changing governance models have made it easier for many insurers to integrate new data sources. However, impulses to add vast quantities of new data must be controlled until data quality is verified and business cases are found.

“Technological innovations such as cloud computing, distributed file systems and ensemble-based machine learning techniques, to name a few, have absolutely allowed us to look at more data, more relevant data and faster data and insights than ever before,” notes Rohian. “More and new data is great, but can be dangerous. If you have bad data and good models, but only at a small scale, you can only make small bad decisions. If you have bad data quality in big data, you make big bad decisions.”

The warehousing concerns, regardless of using a traditional model or adopting a new federated approach, are immense. According to both Rohian and Lambert, problems tend to be exponential; when newly introduced data makes the whole collection twice as bad, decisions become four times worse.

The speed of decision making and the ability to integrate more data thanks to innovation and cloud platforms make evaluation and testing an absolute must before adding data to any machine learning models. Data must be clean and delivered as-is, not packaged or pre-cut by the vendor or your own warehousing, to ensure it is at its most useful.
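One way to enforce that evaluation and testing step is a data-quality gate that every new source must pass before it feeds a model. The sketch below checks for empty extracts, null rates and duplicate keys; the thresholds and column names are illustrative assumptions, not an industry standard.

import pandas as pd

def quality_gate(df: pd.DataFrame, key: str,
                 max_null_rate: float = 0.05,
                 max_dup_rate: float = 0.01) -> dict:
    # A source failing any check stays out of the feature set until it is
    # fixed or a business case justifies an exception.
    null_rate = df.isna().mean().max() if len(df) else 1.0   # worst column
    dup_rate = df.duplicated(subset=key).mean() if len(df) else 1.0
    checks = {
        "non_empty": len(df) > 0,
        "null_rate_ok": bool(null_rate <= max_null_rate),
        "duplicates_ok": bool(dup_rate <= max_dup_rate),
    }
    checks["passed"] = all(checks.values())
    return checks

new_source = pd.DataFrame({
    "policy_id": [1, 2, 2, 4],
    "telematics_score": [0.7, None, 0.4, 0.9],
})
print(quality_gate(new_source, key="policy_id"))
# One null in four rows and a duplicate key: this source fails the gate.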