Why Aren’t Insurers Doing More With Their Data Strategy?

Jeff Goldberg

While the business intelligence space has matured greatly in the last decade, it has been and remains an area where insurers need to work with a variety of platforms and tools to build out their capabilities. This requires a mix of technical skillsets, business expertise, and vendor relationships. While few efforts at an insurer are more complex or time-consuming than a core system replacement, a major BI initiative will eventually touch all aspects of an organization. I will be presenting more on this topic in a webinar on December 1, 2015.

Today there are more vendors that provide a full business intelligence suite, including the data layer, industry-specific data models, presentation and visualization tools, and pre-built insurance reports and dashboards. But these suites still need to be tailored to and integrated into the rest of the insurer’s environment. Some policy administration system vendors are either pre-integrating with or releasing their own business intelligence suites. This does simplify deployment, but it adds another variable to the “suite vs. best-of-breed” decision, and until these options have more production exposure most insurers are still opting for best-of-breed.

For now, most of these approaches don’t provide some of the ancillary but very important data and analytics areas, such as predictive modeling tools and the models themselves, the use of aggregated third-party data sources, or the emerging area of big data. And no matter what approach an insurer takes, it is a near-universal condition that there will be silos of historical data that need to be considered with or migrated into the new BI solution, and that will take time and effort.

So despite the plethora of vendor options and the general acknowledgement across the industry that good business intelligence is key to ongoing success, why aren’t more insurers further along in their data strategy?

1. Most insurers struggle with multiple legacy systems and silos of disparate data, and they are still at the first steps of bringing that data together.

2. The data that does exist in the organization is of variable quality or completeness. New systems don’t immediately solve that problem without a broader plan in place.

3. Insurers and core systems have traditionally looked at data from a policy view, complicating the effort to move to new models that tend to take a 360-degree customer view.

4. Many insurers still have no formal data governance in place and lack organizational agreement on data definitions.

A good vendor partner can help put the framework and some tools in place to solve the above four roadblocks, but it requires more than just technology. It requires process and cultural change, which can’t be driven solely by IT.

Many insurers are still looking for a data champion to help push a strategy across the organization and get buy-in from other business leaders. As organizations mature, this data champion role is often formalized as a Chief Data Officer (CDO), and that person typically has a strong data background. But for insurers who are still looking to get a data strategy off the ground, it’s most important to find a leader who is respected in the organization and who is passionate about the value that good business intelligence can bring, even if they have little data experience themselves.

Related Reports:

  • Business and Technology Trends: Business Intelligence
  • Novarica Market Navigators: Business Intelligence Solutions

    Workers’ Comp Insurers Look to Analytics and Core Systems

    Jeff Goldberg

    In our most recent Novarica Council Special Interest Group meeting, several Workers’ Compensation CIOs discussed core systems replacement strategies and long-term visions, as well as emerging uses of mobile, analytics, and end-user-facing technologies.

    All the attendees were at various stages of core system replacement—ranging from just completed to just beginning—so they were eager to learn from others’ experiences and to gain perspective on their own challenges. Everyone agreed that a flexible, modern core system was at this point table stakes, hence the flurry of transformation activity. A minority of companies are changing their appetite, but the vast majority of P/C insurers are looking to grow by moving into new territories. To do that, a modern, flexible system is absolutely necessary.

    The shrinking lifespan and growing price of core systems was another area of concern. Everyone agreed that new core systems are increasingly costly to implement and that they must be replaced more frequently than older legacy systems. Gone are the days when a core system lasted for 40 years. Some participants also noted that many strategic business initiatives—like new product deployment—must be put on hold during a multi-year implementation project, increasing the indirect costs of the implementation.

    As an antidote to this gloom and doom, the CIOs in attendance were confident in their strategy to overcome these obstacles. Core systems today are much more flexible than legacy systems, relying on componentized architectures and configurable logic, meaning that the next round of replacements (and possibly even conversions) should be easier. More importantly, past lessons have been learned, and both insurers and vendors know how important it is to avoid custom coding and to stick with a vendor’s upgrade path. If those rules are followed, ten or fifteen years down the road the insurer’s system will be “new” even if they’ve stayed with the same vendor and system all along! It’s critical to choose a vendor who acts as a long-term partner and not just a one-time purveyor of a technology.

    Of all the strategic considerations discussed, one of the most important was a concrete plan for data conversion and sunsetting old systems. One participant shared that if he could go back in time, he’d focus much more on a transition plan, so as not to lose the project’s momentum after go-live. Other attendees described the challenges of data conversion and new data warehouses, and the legal and data-integrity risks of fully sunsetting old components.

    Attracting and retaining good talent was another concern for many of the attendees. One insurer reported being well ahead of their Guidewire implementation schedule, due to a concerted effort to focus talented IT and other business unit resources on the project. Several attendees noted that the structure of their projects—agile, waterfall, or a combination of the two—was much less important than the staffing and communication strategies of those projects. When agile first started becoming prevalent in the insurance industry, carriers all over the country were told it might be the answer to all their project/logistical problems. But that’s not how software works. Everyone in attendance was reminded that there are rarely, if ever, silver bullets for these huge problems.

    Related Research:
    Business and Technology Trends in Workers Compensation

    An Honest Look at the State of Big Data in Insurance

    Jeff Goldberg

    With the recent publication of Novarica’s Analytics and Big Data at Insurers report, it’s time to take an honest look at the state of big data in the industry. One of the most telling, and disappointing, charts showed that of the insurers working with big data, seventy percent were using traditional computing, storage, database, and analytics technology, meaning they’re working with SQL databases in their existing environment. Of all the other technology options (Hadoop, NoSQL, columnar databases, etc.), only a small percentage of insurers use them, and almost no insurer uses them extensively.

    Compare that to the percentage of insurers who say they are using big data sources, which is significantly higher than the percentage of insurers using big data technology. This includes third-party consumer or business data, geospatial data, and weather data at the top of the list, with audio/video data, telematics, social-media content, and internet clickstreams lagging behind. But what’s really happening when those big data sources are loaded into a traditional structured database? Most likely, a few key elements are pulled from the data and the rest is ignored, allowing the so-called big data to be treated like “small data,” processed along with all the policy, claims, and customer data already being stored. For example, while a weather feed might arrive with minute-by-minute updates, most insurers are probably just pulling regional conditions along with daily temperature highs and lows, filtering the feed down to a subset that stores easily. While I’m not saying such reduced data doesn’t augment an insurer’s understanding of incoming claims (for example), it’s far from what we think about when we imagine how a big data analysis of the weather might impact our understanding of insurance claim patterns.
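
    To make that reduction concrete, here is a deliberately simplified Python sketch of “filtering down to a subset that stores easily.” The feed fields and schema are hypothetical, not any particular vendor’s format:

        from datetime import datetime

        def reduce_weather_feed(readings):
            """Collapse a minute-by-minute weather feed into the 'small data'
            subset described above: one row per region per day with the
            condition and temperature extremes. Everything else in the
            feed is simply discarded."""
            daily = {}
            for r in readings:  # r: {"region", "timestamp", "temp_f", "condition"}
                day = datetime.fromisoformat(r["timestamp"]).date()
                row = daily.setdefault((r["region"], day),
                                       {"high": r["temp_f"], "low": r["temp_f"],
                                        "condition": r["condition"]})
                row["high"] = max(row["high"], r["temp_f"])
                row["low"] = min(row["low"], r["temp_f"])
            return daily  # thousands of readings reduced to a handful of rows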

    There’s no denying that there are a few exciting use cases of insurers truly leveraging big data, but it’s still extremely limited. The industry as a whole is getting much better at data analysis and predictive modeling in general, where the business benefits are very clear. But the use cases for true big data analysis are still ambiguous at best. Part of the allure of big data is that insurers will be able to discover new trends and new risk patterns when those sources are fully leveraged, but “new discoveries” is another way of saying “we’re not yet sure what those discoveries will be.” And for many insurers, that’s not a compelling enough rationale for investing in an unfamiliar area.

    And that investment is the second problem. The biggest insurers may have the budget to hire and train IT staff to build out a Hadoop cluster or set up MongoDB servers, but small and mid-size insurers are already stretched to their limits. Even insurers who dedicate a portion of their IT budgets to innovation and exploration are focusing on more reliable areas.

    What this means is that insurers, no matter how many surveys show they anticipate its adoption, will likely not significantly increase their use of big data technology. However, that doesn’t mean the industry will let big data pass it by. Instead, much of the technology innovation around big data will need to come from the vendor community.

    We’re already seeing a growing number of third-party vendors that provide tools and tech to do better analysis and get deeper understanding from big data, a second generation of big data startups. Most of these vendors, however, expect that the insurer will already have a Hadoop cluster or big data servers in place, and (as we know) that’s exceedingly rare. Instead, vendors need to start thinking about offering insurers a “big data in a box” approach. This could mean SaaS options that host big data in the cloud, appliances that offer both the analysis and the infrastructure, or even just a mix of professional services and software sales to build and manage the insurer’s big data environment on which the licensed software will run.

    We’ll also see insurance core system vendors begin to incorporate these technologies into their own offerings. The same thing has happened for traditional data analytics, with many top policy admin vendors acquiring or integrating with business intelligence and analysis tools. Eventually they’ll take a similar approach to big data.

    And finally, some third-party vendors will move the entire big data process outside of the insurer entirely, instead selling access to the results of the analysis. We’re already seeing vendors like Verisk and LexisNexis utilize their cross-industry databases to take on more and more of the task of risk and pricing assessment. Lookups like driver ratings, property risk, and experience-based underwriting scores will become as common as credit checks and vehicle history. These third-party players will be in a better position to augment their existing business with big data sources, leveraging industry-wide information and not just a single book of business. This means that smaller insurers can skip building out their own big data approach and instead get the benefits without the technology investment, and they can compete against bigger players even if their own store of data is relatively limited.

    So while the numbers in Novarica’s latest survey may look low and the year-on-year trend may show slow growth, that doesn’t mean big data won’t transform the insurance industry. It just means that transformative change will come from many different directions.

    On Tuesday, July 14th at 2 pm I’ll be hosting a Business Intelligence and Analytics webinar, which will review the recent report, go into more detail on big data, and cover how insurance is being transformed by the growth in available data and information both inside and outside the enterprise. For more information, visit:

    Top Technology Priorities for Personal Lines Carriers

    Jeff Goldberg

    Today’s personal lines marketplace is more competitive than ever due to slow growth, intense price competition, and rising customer acquisition costs.

    Personal lines insurers have always been leaders in insurance technology innovation, and conversations with CIOs and research in the space show that trend will continue, with technology playing an ever-larger role in insurers’ ability to attract, retain, and profitably serve clients. Across the industry, insurers continue to make investments across the Novarica Insurance Core Systems Map.

    Novarica Insurance Core Systems Map: Personal Lines

    In a market with very competitive conditions and intense profitability pressures, personal lines carriers are focusing on growth strategies, expense reduction, and improving underwriting results. Below I have listed four technology priorities CIOs and business executives should consider to remain competitive.

    Business Intelligence
    A data quality initiative, which examines data warehousing, operational data stores, and appropriate data marts, is key before undertaking more advanced business analytics initiatives. Once data quality is ensured, carriers can then overlay business intelligence tools. Predictive analytics tools are becoming more popular among carriers with sufficient data. Small carriers should look at working with an organization that can provide pooled data and insights. All carriers can use models to improve underwriting insights, to apply pricing more consistently, and to improve claims activities. In addition, third-party big data sources are going to become more and more prevalent for personal lines insurers. Companies that take advantage of this first will have an edge in pricing and retaining business.

    Policy Administration Systems
    Upgrades to policy admin systems will help carriers gain operational efficiencies and the flexibility to add data. Using business rules to manage workflow and predictive analytics to build pricing models can improve risk selection and pricing and reduce operating expenses. Carriers should look for highly configurable solutions with product configurators, simple rules, and tools for launching new rating algorithms. They should also look for the ability for business units to make their own modifications, though practical experience with configurable systems reveals that IT often still ends up managing most changes. As long as the time and cost of such work for IT is reduced, that’s still a big value.
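
    As a rough illustration of the “configuration over code” idea, consider rating factors kept in data rather than hard-coded logic. The factor names and numbers below are invented for illustration, not any product’s actual rating plan:

        # Hypothetical rating factors expressed as configuration, so business
        # units can adjust them without a custom development cycle.
        RATING_FACTORS = {
            "base_rate": 500.0,
            "multi_car_discount": 0.90,
            "prior_claims_surcharge": {0: 1.00, 1: 1.15, 2: 1.35},
        }

        def rate_policy(risk):
            """Apply the configured factors to a risk. Launching a new rating
            algorithm means editing the table above, not rewriting this code."""
            premium = RATING_FACTORS["base_rate"]
            if risk.get("multi_car"):
                premium *= RATING_FACTORS["multi_car_discount"]
            claims = min(risk.get("prior_claims", 0), 2)
            premium *= RATING_FACTORS["prior_claims_surcharge"][claims]
            return round(premium, 2)

        print(rate_policy({"multi_car": True, "prior_claims": 1}))  # 517.5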

    Agent Connectivity
    Extending functionality to agents continues to rise in importance. It’s less and less about differentiation and more and more simply the price of admission. At this point, most personal lines insurers have built an agent portal, and are often quite proud of the results. As a next step, both agents and carriers would prefer to receive and provide information electronically and process that information with as little human touch as possible, eliminating double entry. Real-time upload, download, and data translation deliver tangible benefits, including reduced handling costs, improved data quality, and improved turnaround time.

    Claims Management
    Streamlining claims management by automating processes improves customer service: it speeds up claims handling, applies consistent and fair best practices to all customers, and delivers personal service. On top of these customer benefits, insurers who have implemented modern claims systems report tangible speed-to-market benefits. If a carrier hasn’t already begun to upgrade its claims administration system, now is the time to start. Carriers using modern systems are rapidly gaining competitive advantages through more efficient claims handling and better data, leading to better outcome management. In addition, better claims processing has become a significant part of how personal lines insurers market themselves to consumers and how consumers select an insurer.

    These technology priorities are based on the expertise of Novarica’s staff, conversations with members of the Novarica Insurance Technology Research Council, and a review of secondary published resources. For more information, download a free preview of our new Business and Technology Trends: Personal Lines report at: http://novarica.com/business-and-technology-trends-personal-lines/ or email me to set up a complimentary 30 minute consultation.

    Related Reports

    Enterprise Architecture and Digitalization

    Mitch Wein

    At our 8th annual Novarica Insurance Technology Research Council Meeting, discussion at the Enterprise Architecture and Digitalization breakout session began with a definition of “digital.” Novarica stresses that digitalization encompasses the entire customer interaction lifecycle, from front-end to back-end. A common theme was that of silos: a given process might be automated, but still require manual elements to be fully executed. An example would be creating and exposing a form through a portal but requiring the form to be mailed in by the agent or customer and scanned in by the carrier back office. In part, this is a function of change management and budget: an agent portal project has appeal as a quick win compared to a more exhaustive transformation.

    We explored how prepared carriers were for digital, given the definition above. Clearly, key emerging technology trends like IoT, social media, big data, and cloud were pushing firms to reconsider what they needed to do to remain competitive. While some firms were doing regular Strengths, Weaknesses, Opportunities, Threats (SWOT) reviews, many were not. All agreed that a SWOT analysis should be an ongoing effort.

    As part of their digital transformation, many carriers were concerned about data quality (and thus resisting automation and the subsequent “exposure”). They were also underinvesting in areas that will be critical to their digital success; many carriers do not have business and technology architects involved with their digital efforts, and some don’t even have a business or enterprise architecture group.

    We briefly touched on security and the importance of making sure security risks are mitigated to avoid reputational, regulatory, or legal exposure that could impact a digital insurer. Existing cyber security frameworks, like the NIST framework released in 2014, can save carriers time and effort by providing metrics and a framework that regulators are familiar with. The security exposure of vendor practices and products was also flagged as a key concern for carriers. We discussed requiring vendors to conduct security audits and report on the results, which can serve as a lever in negotiations.

    Finally, we talked about the need to introduce a test-and-learn culture into carriers, allowing projects to “fail fast” if they do not work well. We related this to the notion of Fast IT and how it compares with Core IT. Core IT will often use traditional waterfall methodologies and take a long time to evolve systems. In the new digital world, Fast IT will behave differently: it will evolve solutions through Agile processes and allow solutions to be thrown away if they are not working out. We discussed examples like smartphone portal front-end apps for agents, brokers, or customers. All of the participating carriers felt that the Fast IT concept was new to their firms and would probably be difficult to adjust to from a process and funding perspective.

    After digital, we moved on to enterprise architecture. We discussed how a reference architecture can provide consistency across projects as well as help with project prioritization. This allows carriers to be proactive rather than chasing the hot new technology and feeling constantly behind. Linking architecture governance, aligned to a reference architecture, to funding governance through logical gates was being done informally in some carriers and not at all in others.

    If you’d like to discuss enterprise architecture or digitalization further, please feel free to email me to set up a complimentary 30 minute consultation.

    Will Big Data Replace the Enterprise Data Warehouse?

    Jeff Goldberg

    At our 8th annual Novarica Insurance Technology Research Council Meeting, we had some great conversations about data, both as part of the formal event agenda and also via informal discussions during breaks or meals. Many of the trending or future technologies being discussed (Internet of Things, wearable devices, social media) from an insurer perspective are really just more and bigger channels of data. And just in case you think insurance data is lacking in thrills: my favorite story of the day was about auto insurers looking for fraud or stolen vehicles by integrating data feeds from the license plate scanners used by bounty hunters.

    At the analytics breakout session I had the opportunity to sit around a table with CIOs from a variety of insurers of different sizes and lines. One thing that struck me was how much big data and big data technology are part of the conversation. Novarica typically urges caution when asked about big data, stressing that insurers shouldn’t go looking for a use case to fit a technology. But it seems that more insurers are finding use cases that lead them toward a big data approach, and that raises questions about the future of the enterprise data warehouse.

    Building a data warehouse with a single data model that supports the whole business is notoriously difficult, with a success rate that drops as the amount of data and the number of lines of business grow, until it becomes a near impossibility. A big data approach allows bringing together many different data sources with different structures without having to go through the normalizing and cleansing process that often derails EDW projects. While both approaches have value, larger insurers are discovering the big data route may be their only option for cross-business analysis. And some smaller insurers see this as a way to rapidly gather and review data as well, often running side-by-side with an existing data warehouse. In fact, in at least one case, an insurer’s big data lake has become the first step in the data workflow, with a data extraction from it feeding the older data warehouse as a way to maintain legacy processes.
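
    In that lake-first pattern, the big data environment lands the raw feeds and a narrow extract keeps the legacy warehouse fed. Here is a minimal PySpark sketch of the idea, with hypothetical paths and table names:

        from pyspark.sql import SparkSession, functions as F

        spark = SparkSession.builder.appName("lake_to_edw").getOrCreate()

        # Raw, loosely structured claims data lands in the lake first...
        claims = spark.read.json("s3://insurer-data-lake/claims/raw/")  # hypothetical path

        # ...and a narrow, cleansed extract feeds the older warehouse,
        # preserving the reports and processes that already depend on it.
        extract = (claims
                   .select("claim_id", "policy_id", "loss_date", "paid_amount")
                   .where(F.col("paid_amount").isNotNull()))

        extract.write.jdbc(url="jdbc:postgresql://edw-host/warehouse",  # hypothetical EDW
                           table="staging.claims_extract", mode="append")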

    We’re currently reaching out to the Novarica Insurance Technology Research Council with a survey about analytics and big data, so we will be reporting soon on just how many insurers have taken a big data approach. But even though we expect the percentages to remain small, it’s clear the growth will continue. The tools and options around big data are increasing, and more vendors are providing services that make it easier for insurers to leverage big data tech without having to build it from scratch themselves. In time, I expect to see more insurers running a big data platform alongside an enterprise data warehouse, with the big data platform in some cases taking over as the hub of the insurer’s data.

    If you’re interested in talking more about big data and data technology, please feel free to email me to set up a complimentary 30 minute consultation.

    Enterprise Data Initiatives Now Taking Center Stage as Insurers Look to Improve their Digital and Analytical Capabilities

    Mitch Wein

    With insurance carriers of all sizes and lines of business focusing on improving their digital and analytical capabilities to meet customer expectations, the importance of enterprise data has never been greater.

    Over the last 3-5 years, data has once again become a key focus of the insurance industry. Data is now seen as a key enabler of the insurance industry’s evolution into a fully digitized provider of risk services focused on the customer, not the product.

    When data is collected from either internal or external sources, the CIO becomes responsible for its storage, management, and use. This becomes highly complex in today’s world, since data is both structured and unstructured and is controlled by various legal and regulatory considerations, as well as internal security and risk policies. Based on the direct experience of Novarica’s senior team and its Council members, the following best practices should be considered when initiating a master data management (MDM) initiative.

    • Create policies around how data should be managed and controlled.
    • Establish an organizational structure for data.
    • Link MDM architecture to business goals and objectives.
    • Consider supplementing internal data in an MDM infrastructure with enriched data and external big data.
    • Determine what system or process creates the official data of record.
    • Determine the senior-level project sponsor and who pays.
    • Deploy multi-year MDM programs in an incremental fashion.

    MDM is more than a technology. It is a program of work involving an assessment of business needs, a data sourcing strategy, a data cleansing strategy to address quality, an architecture and integration initiative, data documentation and classification, as well as an organizational evolution for data governance and ownership. The goal is to create a single view of the master data which can then be referenced by all systems, reporting, and business processes.
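
    The “single view of master data” goal is easiest to see in miniature. Below is a deliberately naive Python sketch of merging duplicate customer records into one golden record; the survivorship rule here (most recently updated non-empty value wins) is a simple stand-in for the governance decisions a real MDM program must make explicitly:

        def build_golden_record(records):
            """Merge duplicate customer records from several source systems
            into a single master view. Later-updated, non-empty values win."""
            golden = {}
            for rec in sorted(records, key=lambda r: r["updated_at"]):
                for field, value in rec.items():
                    if field != "updated_at" and value not in (None, ""):
                        golden[field] = value  # later sources overwrite earlier ones
            return golden

        sources = [
            {"name": "J. Smith", "phone": "", "updated_at": "2014-03-01"},            # policy admin
            {"name": "Jane Smith", "phone": "555-0101", "updated_at": "2015-01-15"},  # claims
        ]
        print(build_golden_record(sources))  # {'name': 'Jane Smith', 'phone': '555-0101'}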

    Novarica’s experience has shown that the data governance and ownership dimension is often the hardest aspect of this program after identifying a business sponsor and developing a solid business case. Who manages the data, who owns the data, and who is ultimately accountable are difficult challenges that must be addressed.

    If your organization is in the process of implementing an MDM program or if your efforts have stalled, please send me an email to set up a complimentary 30 minute consultation.

    Recent Novarica CIO Checklist Briefs

  • Master Data Management: A CIO Checklist
  • Architectural Governance: A CIO Checklist
  • Preparing for Digital Transformation: A CIO Checklist

    The Usage-Based Insurance (UBI) Short Cut: Developing a usage-based insurance program has now gotten easier

    Thuy Osman

    I’ve been following telematics in insurance since 2012, and the main thing I’ve noticed is that the application of telematics by insurers moves at a sluggish pace. While telematics service providers (TSPs) continually improve their platforms and services, offering insurers multiple ways to collect data (connected car, OBD plug-in, mobile app), numerous data points to analyze (speed, braking, cornering, road type, etc.), and various applications for the analyzed data (from underwriting to claims), many insurers are still stuck in the initial phase of designing and developing a UBI program.

    Aside from the issue of patents, the concern I hear most from carriers who are in the process of developing a UBI program is the data approach. Deciding what data to collect, how to collect it, and how to store and analyze it is a huge obstacle. The main point of the data process is to arrive at some sort of driver risk score. Instead of throwing time and resources at coming up with that score, carriers can now partner with an external analytics vendor who will collect, analyze, and synthesize the data into a risk score that can then be used by underwriters to determine eligibility for discounts or rewards.

    Partnering with an analytics vendor and using their program, such as Verisk’s Telematics Safety Scoring program (which, it was recently announced, is now approved in 41 states), essentially allows carriers to skip over the data process and jump straight to developing insurance products based on risk scores. Sure, carriers will still have to determine which driving behaviors constitute risk to their company. But checking off behaviors on a list for the analytics provider is far more efficient than planning out the hardware to collect the data, storing the data, and, in particular, wading through pools of data to make some sense of it all.

    A few years ago at a summit for telematics in insurance, the presenters were talking about the possibility of deriving a driver risk score from the telematics data collected from a vehicle. At that time, only a small number of vendors had data sets large enough to be representative of a segment of the customer driver pool. Today, analytics vendors and telematics service providers alike have collected enough miles and other driving data to develop an algorithm (however simple or complex) to calculate a driver risk score. In fact, providing carriers with a driver score is now one of a TSP’s standard services.
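
    For a sense of what such an algorithm reduces to at its simplest, here is a naive per-mile scoring sketch in Python. The event weights are invented for illustration; a real TSP model would be calibrated against loss experience across a large driver pool:

        def driver_risk_score(events, miles_driven):
            """Turn raw telematics event counts into a 0-100 score
            (higher is safer). Weights are purely illustrative."""
            weights = {"hard_brake": 3.0, "rapid_accel": 2.0,
                       "speeding": 4.0, "late_night_trip": 1.5}
            raw = sum(weights.get(e, 0) for e in events)
            per_100_miles = raw / max(miles_driven, 1) * 100
            return max(0, round(100 - per_100_miles))

        print(driver_risk_score(["hard_brake", "speeding", "hard_brake"], miles_driven=420))  # 98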

    For carriers who are looking to develop a UBI program, or even for carriers who are already entrenched in this process, partnering with an external vendor can save time and decrease the risk involved in launching such a program. As carriers see the benefit of this partnership and begin to outsource more of the data process to the data experts, maybe the application of telematics in insurance will pick up.

    If you’re interested in discussing this topic further, please contact me via email to arrange a call.

    Related Research

  • Telematics in Insurance: Key Issues and Trends 2014

    The Evolving Data Analytics Group: Applying Lessons from IT Organizations of the Past

    Jeff Goldberg

    I’ve been working with a lot of insurance companies that are struggling to find the right home for data analytics within their organizations, and I’m struck by how closely these questions parallel the more general evolution of IT organizations over the last two decades. Matt Josefowicz pointed out this connection in a blog post, and I’d like to examine the phenomenon in more detail.

    When I call this a “struggle” I don’t mean that in a negative way. Often companies try out different approaches with an updated technology, a shift in process, or (in this case) an entirely new group because they believe in its value to the business, but industry best practices haven’t been worked out yet. This struggle is part of the growth and leadership process. That’s definitely the case now with data analytics and was (and still is!) the case with IT. There won’t be general agreement about best practices for a data analytics group within an insurance company for a long time yet (if ever), but we can shortcut some of the learning curve by looking at the similar evolution of the IT organization.

    The most common question: where does the role of data analytics fit? Should insurers create a new, separate data analytics group, or should different data analytics resources report directly to their respective business units? Insurers are trying it both ways right now, and there’s no sure right answer. This is the same centralized vs. federated (or horizontal vs. vertical) debate that has gone on for decades with regard to IT resources.

    Centralizing data analytics resources in their own corporate unit (1) allows a sharing of skills, tools, and best practices, (2) avoids redundancy and rework, and (3) leads to an easier adoption of a corporate mission for insight and business intelligence. Federating resources out to each operating business unit (1) allows very tight alignment of business goals with the assigned data analytics expert, (2) helps that expert gain a depth of understanding about the business that they might otherwise lack, and (3) better promotes the mission of data analytics to business users.

    These drivers are similar, if not identical, to the drivers in IT, and just like IT there are benefits to both approaches. In fact, for IT alignment, many organizations have found that shifting between a horizontal and a vertical approach is itself valuable, giving employees the opportunity to spread what they’ve learned to others, either in terms of business insight or best practices. So insurers trying different approaches to organizing data analytics can rest assured that multiple approaches all have value.

    The second similarity between data analytics now and IT organizations of the past is about recruiting talent. These days colleges offer a variety of information technology and computer science degrees, creating a pipeline of potential employees. But that wasn’t always the case, and insurance companies (and companies in other industries) had to staff their IT departments by hiring out of other engineering programs or by finding people who had a technical aptitude and training them in computer programming.

    With data analytics, there’s a similar lack of clarity about whom to hire. Some insurers are recruiting PhDs and creating teams of data scientists; others are looking internally for technical staff who have a knack for data insight and exploration and can be cross-trained. But as demand grows, more universities will offer data analysis coursework at the undergraduate or master’s level, increasing the availability of trained hires. Of course, just like IT, insurers will be competing against specialized companies to recruit those graduates, and will need to figure out how to attract them to our industry.

    If you’ve been struggling with the role of data analytics within your organization or are interested in benchmarking your company’s BI approach against your peers, please feel free to reach out to me. To send me a note or set up a complimentary 1 hour consultation, contact me via email.

    Mitigation versus Prevention: Three Questions Insurers Must Answer In Order to Evolve in an IoT and Big Data World

    Jeff Goldberg

    The insurance industry is only just beginning to deal with a host of emerging technologies, including Big Data, the Internet of Things (IoT), and a vast wealth of sensor data from sources such as automotive telematics, smart watches, and fitness trackers. In 2015 it’s likely that the industry as a whole will take some small steps (and maybe a few big ones) towards capturing this plethora of data and applying it towards better underwriting decisions and risk management. But there’s more at stake than just incremental improvements to the existing business model.

    There are three key questions that need to be answered in order to move towards fundamental change:

    1. Will insurers be able to develop the technology (Big Data or otherwise) to capture and use this growing amount of customer data?
    2. Even if insurers can build this technology, will customers be willing to share their growing streams of personal data with insurers?
    3. How can insurers use this customer data to do more than just update their current process?

    For the most part, the industry has been fixated on the first question. Insurers have been experimenting with Big Data technologies and looking into hiring Hadoop experts, and while there are a few examples of big-impact projects, the majority of insurers haven’t made much headway. Doing more with traditional data technologies has been sufficient, partially because, despite all the hype, the bulk of the business still deals with “small data.” Everyone knows how some auto insurers gather vehicle driving data from participating customers, but outside of personal auto, examples are rare.

    But what happens when all new vehicles in the marketplace capture driving data, not just those where customers have plugged in insurer-provided dongles? And what happens when a maturing IoT means that all customer homes are gathering data about safety and security? And what about when a significant percentage of people are wearing devices that monitor their health and well-being?

    This leads us to the second question: Will insurers even be able to get access to this data?

    Right now many consumers seem to be willing to share personal data with a host of third party companies in return for convenience and cool features. But will consumers be as open to sharing this data with their insurance company? Many fear that their data will be used against them, resulting in higher rates or cancelled policies. That’s a hurdle the industry needs to overcome.

    In terms of true goal alignment, the insurance industry is actually much better positioned than the many tech companies that control so much of the current personal data flow. The main goal of these tech companies is to use personal data to monetize the consumer. Companies like Google and many other less-trustworthy third parties want access to the customer’s data in order to position advertisements. Companies like Apple and other high-tech device makers want access to the customer’s data in order to sell them ever more gadgets. But insurance companies want access to the customer’s data to manage the customer’s risk, not to advertise or continually sell more to them.

    But how does an insurer convince a consumer that they will use their data to help them rather than to sell or advertise to them? With automotive telematics we’ve learned that the first step is monetary: provide discounts. But the second step is making use of that data for more than just underwriting and risk management.

    This brings us to the third question: How can insurers use this customer data to do more than just update their current process?

    As the amount of real-time customer data expands and an insurer’s ability to process and understand that data grows, insurers will find themselves able to make informed insights about a customer earlier in the process. Instead of responding to a burst pipe after a winter freeze, an insurer monitoring home sensors will be able to alert a customer before weather damage occurs, saving both the customer and the insurer time and money. Instead of paying claims after an auto accident, an insurer will be able to make recommendations to a customer about patterns that will help them avoid accidents altogether. A workers comp insurer tracking RFID badges will be able to help reduce worksite mishaps and liability.
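
    The underlying mechanics can start as simply as threshold rules over streaming sensor readings. Here is a minimal Python sketch, with a hypothetical sensor schema and illustrative thresholds:

        FREEZE_THRESHOLD_F = 34   # a few degrees of margin above freezing
        LEAK_FLOW_GPM = 5         # sustained flow suggesting a leak

        def check_home_sensors(readings):
            """Turn home sensor readings into preventive alerts, shifting the
            insurer's role from paying the burst-pipe claim to heading it off."""
            alerts = []
            for r in readings:  # r: {"home_id", "sensor", "value"}
                if r["sensor"] == "pipe_temp_f" and r["value"] <= FREEZE_THRESHOLD_F:
                    alerts.append((r["home_id"], "pipe near freezing: burst risk"))
                elif r["sensor"] == "water_flow_gpm" and r["value"] > LEAK_FLOW_GPM:
                    alerts.append((r["home_id"], "unusual water flow: possible leak"))
            return alerts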

    Insurers will be in the unique position of looking out for the customer’s well-being, helping to prevent accidents, theft, and loss. Unlike most industries, the insurer wants exactly what the consumer wants, and both are happier when claims never need to be filed.

    While some insurance companies might lead this charge, it’s unlikely that all insurers will develop the technology to capture and analyze the full range of data. Instead, third party companies will probably emerge that serve as consumer data hubs and begin to monitor and make suggestions to their clients. Insurers who don’t build this technology themselves will have the opportunity to partner with these third party companies, offering discounts and possibly covering the cost of the services for their customers.

    The new year is a good time for thinking about longer term change. As multiple technologies converge, insurers should be planning for their role to evolve, migrating from one of just risk mitigation to include risk prevention as well.

    As always I welcome your feedback. To send me a note or set up a complimentary 1 hour consultation, contact me via email.

    Related Reports

  • Big Data Technologies for Insurers
  • Big Data Lessons for Insurers from Other Industries (Executive Brief)
  • Insurer Digital Capabilities and Strategies
  • Wearable Technologies and Insurance
  • Benchmarking the “New Normal”: 50 Advanced Capabilities for P&C Insurers
  • Telematics in Insurance: Key Issues and Trends in 2014