Reading KPCB’s Annual Internet Trends Report

Matthew Josefowicz


As always, the KPCB Annual Internet Trends Report, published yesterday by Mary Meeker and her team, makes for some fascinating reading.

Insurers may want to pay particularly close attention to the following sections:

  • Pages 61 and following, on how retailers like Amazon have used their massive data collection capabilities to launch private-label products in far-flung categories like outdoor furniture and fashion.

  • Pages 66 and following, on how companies are micro-segmenting their target markets and designing niche products for them (see our “Quick Quote” thoughts on how It’s All Program Business Now).

  • Pages 102 and following, on the evolution of new messaging platforms to support business conversations. While financial services examples are in Asia today, they’ll be in North America over the next decade.

  • Pages 151 and following, on changes in the auto industry and usage trends, of special interest to Personal Lines insurers.

  • Pages 188 and following, on how non-tech incumbents have increased their purchase of tech new entrants by 2.6x since 2012. Insurers aren’t the only ones buying innovation.

The data-as-a-platform and cybercrime sections are also interesting, although those sections are heavily focused on what seem to be KPCB portfolio companies.

Update from the CIO Insurance Summit

Rob McIsaac

It is amazing to think that we’re already into the second quarter of 2016. The year is moving fast, and with it, how carriers are responding to the urgency of a “new normal” is coming into clearer focus. I had the good fortune to serve as “Master of Ceremonies” for the CIO Insurance Summit in NYC on April 5th, which provided clear insight into what is on the minds of participating carriers. The pace of activity is notably picking up across many lines of business although, as one of my mentors once shared, it is important to avoid confusing activity with progress. Real progress appears to be somewhat elusive, but the quality of the dialogue is definitely elevated. 2016 promises to be a very interesting time.

From our sessions in NYC a number of clear themes emerged that are worth sharing.

Cloud deployments are getting more real even as concerns remain in some quarters. We had a wide-ranging discussion related to the opportunities and concerns that seem to emerge concurrently around the use of cloud-based options for carrying mission-critical workloads. In many organizations it appears that “security” and “compliance-related risks” remain at the forefront of the barriers to faster adoption. As we explored this, I discovered that many companies still haven’t made the connection that efforts to move CRM (Salesforce) and e-mail (Office 365) have already broken through a barrier that appeared daunting to carriers in the very recent past. Having effectively put these installations to the test, carriers with these implementations are increasingly willing to acknowledge that the security models are as good as, if not better than, what they are able to implement for their own environments. As additional mission-critical workloads migrate toward this type of deployment (e.g., Workday for HR and financials), it helps to push organizations to articulate what the real concerns are and how best to address them. The reality is that this is frequently not so much a technology issue as one linked to emotional, political, or organizational issues that need to be addressed before the discussion turns to the selection of a hosting service. For CIOs and their teams, getting in front of the issue to educate business partners, and developing a point of view on which cloud-based providers are best positioned to be part of their tool kit (they are not all created equal), can help build momentum and organizational support. Going “full to bright” in a short timeframe may be too much for many organizations to accept, which runs the risk of triggering an enterprise immune system reaction that can be painful.

Data governance remains a significant concern. With a myriad of business units, products, core record-keeping systems, and rules of engagement that may conflict from one line of business to another, this remains an area of notable concern – and investment – at the carriers we spoke to. Most carriers still lag far behind the banking world in terms of their ability to understand their business from the outside looking in (rather than from the inside looking out). However, there is growing recognition both that there is value in gaining a full view of customers and producers, and that the investment in technology and business process required to achieve informational insights from internal data can be quite high. A common theme among participants was the desire to construct a 360-degree view of customers, but breaking through some of the organizational barriers within companies can be daunting. For carriers considering this challenge, investing time and money to build a robust data governance facility can be highly valuable. Even some of the compliance-related effort for Know Your Customer (KYC) initiatives can create operational and marketing benefits if used correctly. Once again, however, one of the challenges that can be most difficult to overcome is the “human” one. If line-of-business executives and managers are focused on optimization at the business unit level, while data analytics efforts around customers are focused at the enterprise level, the inherent conflict can substantially mute any resulting benefits to the organization. Being clear that this is not purely a technology issue is key to achieving desired outcomes. One key reality becomes clear as companies talk about their desire to use better analysis of data to improve a range of business outcomes: while many talk about Big Data, struggles remain with managing Small Data in quantity.

Definitive plans to address Millennial needs remain elusive although awareness is elevated. We had a lively discussion about this issue, focused on a number of key challenges facing carriers. At a time when 10,000 Baby Boomers retire every day (and, concurrently, 9,000 children are born to Millennial parents each day), the opportunities associated with getting positioned to take better advantage of market dynamics would appear to be very clear. That said, most carriers acknowledge that they have not yet “cracked the code” on how to best position themselves in terms of both products and service models that will effectively speak to a new generation of potential consumers. We discussed some of the efforts being put forward by companies to better understand these dynamics (e.g., MassMutual’s coffee shops), but the reality is that the answers to changing market needs will require some level of experimentation and testing of hypotheses, an approach which may well be counter-cultural for the very organizations whose long-term success depends on their ability to start thinking differently. Avoiding a “Kodak Moment” can be a function of how well carriers deal with a range of challenges, including the demographics of their agency forces; the average age of an agent in the USA is now 59, with an estimated 25% of today’s agents potentially leaving the business by 2018. Concurrently, a number of carriers noted that their own HR policies and procedures do not appear to be adjusting appropriately to deal with the increased velocity of voluntary turnover associated with the emerging Millennials, who will represent 50% of the USA labor force by 2020. Mentoring programs, efforts to create more varied experiences that allow for expanded horizontal movement within organizations, and increased flexibility related to geographic location have proven to be effective “tools of the trade” for some organizations as they’ve moved to adjust to a new reality, but the broader trend remains clear. The emerging generation of employees will have a very different relationship with employers in the foreseeable future than their parents or older siblings did. Implementing changes to everything from employment procedures to knowledge management will be important to operational effectiveness.

The increased urgency at carriers is timely. With cycle times across many facets of the business being reduced, user tolerance for poor experiences being driven down and the competitive threats from many quarters being elevated, the current planning horizon is moving with surprising speed. Welcome to the future!

If you’d like to get a copy of the presentation materials used in NYC, please let me know by sending me a note at RMcIsaac@Novarica.com.

The Parallel Paths of Data Strategy

Jeff Goldberg

Data strategy at an organization follows multiple paths:

  • Data Governance and Definitions
  • Data Process, Quality, and Augmentation
  • Data Warehousing
  • Reports, Dashboards, and BI
  • Predictive Analytics
  • Big Data and External Data Sources

When I talk to insurers about their data strategy, I like to assess how far along they’ve progressed on the above paths. The exact breakdown (or naming) of these data strategy paths can vary from company to company depending on priorities and opinion. But the key is that they all matter and, just as importantly, that they happen in parallel rather than in series.
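To make this assessment concrete, here is a minimal sketch of how the maturity of each path might be scored and compared side by side. The 0-5 scale, the scoring function, and the example insurer are illustrative assumptions, not a formal Novarica framework.

```python
# A minimal sketch of assessing progress on each parallel data strategy path.
# The 0-5 maturity scale and example scores are illustrative assumptions only.

PATHS = [
    "Data Governance and Definitions",
    "Data Process, Quality, and Augmentation",
    "Data Warehousing",
    "Reports, Dashboards, and BI",
    "Predictive Analytics",
    "Big Data and External Data Sources",
]

def lagging_paths(scores: dict[str, int]) -> list[str]:
    """Flag paths sitting well below the insurer's average maturity
    (the 'ignored path' problem described below)."""
    missing = [p for p in PATHS if p not in scores]
    if missing:
        raise ValueError(f"No score given for: {missing}")
    avg = sum(scores.values()) / len(scores)
    return [p for p in PATHS if scores[p] < avg - 1]

if __name__ == "__main__":
    # Hypothetical insurer: strong warehousing and BI, weak governance and big data.
    example = {
        "Data Governance and Definitions": 1,
        "Data Process, Quality, and Augmentation": 2,
        "Data Warehousing": 4,
        "Reports, Dashboards, and BI": 4,
        "Predictive Analytics": 2,
        "Big Data and External Data Sources": 1,
    }
    print(lagging_paths(example))
    # -> ['Data Governance and Definitions', 'Big Data and External Data Sources']
```

The point of the sketch is simply that each path is evaluated independently, so a weak score on one path does not prevent progress on another.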

There are two problems I find when diving into the strategy details: either (a) some of the critical paths of data strategy have been ignored, or, the opposite issue, (b) some of the incomplete paths are being treated as roadblocks to other progress.

The first problem is pretty easy to understand. If an insurer focuses on just data warehousing and reporting (one of the most common scenarios), the data will never really represent a single source of the truth, the reports and other BI will always be in contention, and there will be a lot of value left on the table. For another example, if an insurer puts all their effort into predictive modeling, those models will never be as deep or precise as they could be with better data for analysis. It’s not a surprise, though, that this kind of uneven approach happens all the time; a balanced data strategy is difficult, and few insurers have the resources or skill in all areas. Different paths require different technological expertise, and some require political will as well.

The second problem, on the other hand, requires rethinking how these different data strategy paths interact. Up above I’ve lined them up in what seems like a natural order: first you need some kind of governance group that agrees on what the data means, then you need a process to clean and flow the data through the different systems, then you aggregate the data into a warehouse, then you report on it, then you analyze it and build predictive models, and only then do you think about bringing big data into the picture. It makes logical sense. But it’s also wrong.

The reality is that an insurer can work on all of those paths in any order and/or simultaneously. You don’t need a perfect data warehouse before you start thinking about predictive modeling (in fact, there are plenty of third-party vendors who help you skip right to the predictive models by using industry data). You can run reports directly off your claims system even if that data sits in isolation. Nowhere is there more proof of this than the fact that most insurers hardly have any data governance in place but have still moved forward in other aspects of their data strategy. That doesn’t mean a company should ignore the other paths (that leads to the first problem), but it does mean progress can be made in multiple areas at once.

What’s important to understand is that all these different data strategy paths enhance each other. The further an insurer is down all of them, the stronger each one will be, leading to the best decision making and data-driven actions.

So it’s always good to step back and take a look at the data across an organization, assessing each of these paths individually and seeing what can be done to move each forward. A good data strategy has a plan to advance each path, but also recognizes that no path needs to block another depending on current priorities.

Why Aren’t Insurers Doing More With Their Data Strategy?

Jeff Goldberg

While the business intelligence space has matured greatly in the last decade, it has been and remains an area where insurers need to work with a variety of platforms and tools to build out their capabilities. This requires a mix of technical skillsets, business expertise, and vendor relationships. And while few efforts at an insurer are as complex or time-consuming as a core system replacement, a major BI initiative will eventually touch all aspects of an organization. I will be presenting more on this topic in a webinar on December 1, 2015.

Today there are more vendors that provide a full business intelligence suite, including the data layer, industry-specific data models, presentation and visualization tools, and pre-built insurance reports and dashboards. But these suites still need to be tailored to and integrated into the rest of the insurer’s environment. Some policy administration system vendors are either pre-integrating with or releasing their own business intelligence suites. This simplifies deployment but adds another variable to the “suite vs. best-of-breed” decision, and until these options have more production exposure, most insurers are still opting for best-of-breed.

For now, most of these approaches don’t cover some of the ancillary but very important data and analytics areas such as predictive modeling tools and the models themselves, the use of aggregated third-party data sources, or the emerging area of big data. And no matter what approach an insurer takes, it is a near-universal condition that there will be silos of historical data that will need to be considered alongside or migrated into the new BI solution, and that will take time and effort.

So despite the plethora of vendor options and the general acknowledgement across the industry that good business intelligence is key to ongoing success, why aren’t more insurers further along in their data strategy?

1. Most insurers struggle with multiple legacy systems and silos of disparate data, and they are still at the first steps of bringing that data together.

2. The data that does exist in the organization is of variable quality or completeness. New systems don’t immediately solve that problem without a broader plan in place.

3. Insurers and core systems have traditionally looked at data from a policy view, complicating the effort to move to new models that tend to take a 360-degree customer view.

4. Many insurers still have no formal data governance in place and lack organizational agreement on data definitions.

A good vendor partner can help put the framework and some tools in place to solve the above four roadblocks, but it requires more than just technology. It requires process and cultural change, which can’t be driven solely by IT.

Many insurers are still looking for a data champion to help push a strategy across the organization and get buy-in from other business leaders. As organizations mature, this data champion role is often formalized as a Chief Data Officer (CDO), and that person typically has a strong data background. But for insurers who are still looking to get a data strategy off the ground, it’s most important to find a leader who is respected in the organization and who is passionate about the value that good business intelligence can bring, even if they have little data experience themselves.

Related Reports:

  • Business and Technology Trends: Business Intelligence
  • Novarica Market Navigators: Business Intelligence Solutions
  • Workers’ Comp Insurers Look to Analytics and Core Systems

    Jeff Goldberg

    In our most recent Novarica Council Special Interest Group meeting, several Workers’ Compensation CIOs discussed core systems replacement strategies and long-term visions, as well as emerging uses of mobile, analytics, and end-user-facing technologies.

    All the attendees were in various stages of core system replacement—ranging from just-completed to the initial stages—so they were eager to learn from others’ experiences, and to gain perspective on their own challenges. Everyone agreed that a flexible, modern core system was at this point table stakes, hence the flurry of transformation activity. A minority of companies are changing appetite, but the vast majority of P/C insurers are looking to grow by moving into new territories. To do that a modern, flexible system is absolutely necessary.

    The shrinking lifespan and growing price of core systems were another area of concern. Everyone agreed that new core systems are increasingly costly to implement, and that they must be replaced more frequently than older legacy systems. Gone are the days when a core system lasted for 40 years. Some participants also noted that many strategic business initiatives—like new product deployment—must be put on hold during a multi-year implementation project, increasing the indirect costs of the implementation.

    As an antidote to this gloom and doom, the CIOs in attendance were confident in their strategy to overcome these obstacles. Core systems today are much more flexible than legacy systems, relying on componentized architectures and configurable logic, meaning that the next round of replacements (and possibly even conversions) should be easier. More importantly, past lessons have been learned, and both insurers and vendors know how important it is to avoid custom coding and to stick with a vendor’s upgrade path. If those rules are followed, ten or fifteen years down the road the insurer’s system will be “new” even if they’ve stayed with the same vendor and system all along! It’s critical to choose a vendor who acts as a long-term partner and not just a one-time purveyor of a technology.

    Of all the strategic considerations discussed, one of the most important was a concrete plan for data conversion and sun-setting old systems. One participant shared that if he could go back in time, he’d focus much more on a transition plan, so as not to lose the project’s momentum after go-live. Other attendees described the challenges of data conversion and new data warehouses, and the legal and data integrity-related risks of fully sun-setting old components.

    Attracting and retaining good talent was another concern for many of the attendees. One insurer reported being well ahead of their Guidewire implementation schedule, due to a concerted effort to focus talented IT and other business unit resources on the project. Several attendees noted that the structure of their projects—agile, waterfall, or a combination of the two—was much less important than the staffing and communication strategies of those projects. When agile first started becoming prevalent in the insurance industry, carriers all over the country were told it might be the answer to all their project/logistical problems. But that’s not how software works. Everyone in attendance was reminded that there are rarely, if ever, silver bullets for these huge problems.

    Related Research:
    Business and Technology Trends in Workers Compensation

    An Honest Look at the State of Big Data in Insurance

    Jeff Goldberg

    With the recent publication of Novarica’s Analytics and Big Data at Insurers report, it’s time to take an honest look at the state of big data in the industry. One of the most telling (and disappointing) charts showed that of the insurers working with big data, seventy percent were using traditional computing, storage, database, and analytics technology, meaning they’re working with SQL databases in their existing environment. Of all the other technology options (Hadoop, NoSQL, columnar databases, etc.), only a small percentage of insurers use them, and almost no insurer uses them extensively.

    [Chart: Big data technology adoption at insurers, 2015]

    Compare that to the percentage of insurers who say they are using big data sources, which is significantly higher than the percentage of insurers using big data technology. This includes third-party consumer or business data, geospatial data, and weather data at the top of the list, with audio/video data, telematics, social-media content, and internet clickstreams lagging behind. But what’s really happening when those big data sources are loaded into a traditional structured database? Most likely, a few key elements are pulled from the data and the rest is ignored, allowing the so-called big data to be treated like “small data,” processed along with all the policy, claims, and customer data already being stored. For example, while a weather feed might arrive with minute-by-minute updates, most insurers are probably just pulling regional conditions along with daily temperature highs and lows, filtering it down to a subset that stores easily. While I’m not saying such reduced data doesn’t augment an insurer’s understanding of incoming claims (for example), it’s far from what we think about when we imagine how a big data analysis of the weather might impact our understanding of insurance claim patterns.
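As a rough illustration of that reduction, the sketch below collapses a hypothetical minute-by-minute weather feed into daily highs, lows, and a latest condition per region, i.e., exactly the kind of “small data” subset that fits comfortably in a traditional relational store. The field names are invented for illustration and do not correspond to any particular vendor’s feed.

```python
# Illustrative sketch of the "big data treated as small data" pattern:
# a minute-by-minute weather feed reduced to one summary row per region and day
# before being stored relationally. Field names are hypothetical.

def reduce_weather_feed(readings):
    """readings: iterable of dicts like
    {"region": "NY-Metro", "date": "2016-04-05", "temp_f": 61.2, "condition": "rain"}
    Returns one summary row per (region, date)."""
    summary = {}
    for r in readings:
        key = (r["region"], r["date"])
        row = summary.setdefault(key, {"high": r["temp_f"], "low": r["temp_f"],
                                       "condition": r["condition"]})
        row["high"] = max(row["high"], r["temp_f"])
        row["low"] = min(row["low"], r["temp_f"])
        row["condition"] = r["condition"]  # only the most recent condition survives
    return summary
```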

    There’s no denying that there are a few exciting use cases of insurers truly leveraging big data, but it’s still extremely limited. The industry as a whole is getting much better at data analysis and predictive modeling in general, where the business benefits are very clear. But the use cases for true big data analysis are still ambiguous at best. Part of the allure of big data is that insurers will be able to discover new trends and new risk patterns when those sources are fully leveraged, but “new discoveries” is another way of saying “we’re not yet sure what those discoveries will be.” And for many insurers, that’s not a compelling enough rationale for investing in an unfamiliar area.

    And that investment is the second problem. The biggest insurers may have the budget to hire and train IT staff to work on building out a Hadoop cluster or set up several MongoDB servers, but small to mid-size insurers are already stretched to their limits. Even insurers who dedicate a portion of IT budgets to innovation and exploration are focusing on more reliable areas.

    What this means is that insurers (no matter how many surveys show they anticipate its adoption) will likely not significantly increase their use of big data technology in-house. However, that doesn’t mean the industry will let big data pass it by. Instead, much of the technology innovation around big data will need to come from the vendor community.

    We’re already seeing a growing number of third-party vendors that provide tools and tech to do better analysis and get deeper understanding from big data, a second generation of big data startups. Most of these vendors, however, expect that the insurer will already have a Hadoop cluster or big data servers in place, and (as we know) that’s exceedingly rare. Instead, vendors need to start thinking about offering insurers a “big data in a box” approach. This could mean SaaS options that host big data in the cloud, appliances that offer both the analysis and the infrastructure, or even just a mix of professional services and software sales to build and manage the insurer’s big data environment on which the licensed software will run.

    We’ll also see insurance core system vendors begin to incorporate these technologies into their own offerings. The same thing has happened for traditional data analytics, with many top policy admin vendors acquiring or integrating with business intelligence and analysis tools. Eventually they’ll take a similar approach to big data.

    And finally, some third-party vendors will move the entire big data process outside of the insurer, instead selling access to the results of the analysis. We’re already seeing vendors like Verisk and LexisNexis utilize their cross-industry databases to take on more and more of the task of risk and pricing assessment. Lookups like driver ratings, property risk, and experience-based underwriting scores will become as common as credit checks and vehicle history. These third-party players will be in a better position to gather and augment their existing business with big data sources, leveraging industry-wide information and not just a single book of business. This means that smaller insurers can skip building out their own big data approach and instead get the benefits without the technology investment, and they can compete against bigger players even if their own store of data is relatively limited.
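To picture the “buy the results, not the infrastructure” model, here is a hedged sketch of a smaller insurer pulling a third-party risk score at quote time. The endpoint, parameters, and response fields are hypothetical; this is not a real Verisk or LexisNexis API.

```python
import requests

def fetch_property_risk_score(address: str, api_key: str) -> dict:
    """Call a hypothetical third-party property risk scoring service."""
    resp = requests.get(
        "https://api.example-risk-vendor.com/v1/property-risk",  # hypothetical endpoint
        params={"address": address},
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    resp.raise_for_status()
    # e.g., {"flood_score": 72, "wildfire_score": 15, "overall_grade": "B"} (illustrative)
    return resp.json()
```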

    So while the numbers in Novarica’s latest survey may look low and the year-on-year trend may show slow growth, that doesn’t mean big data won’t transform the insurance industry. It just means that transformative change will come from many different directions.

    On Tuesday, July 14th at 2 pm I’ll be hosting a Business Intelligence and Analytics webinar, which will review the recent report, go into more detail on big data, and cover how insurance is being transformed by the growth in available data and information both inside and outside the enterprise. For more information, visit:

    https://attendee.gotowebinar.com/register/1692064099558513922


    Top Technology Priorities for Personal Lines Carriers

    Jeff Goldberg

    Today’s personal lines marketplace is more competitive than ever due to slow growth, intense price competition, and rising customer acquisition costs.

    Personal Lines insurers have always been leaders in insurance technology innovation, and conversations with CIOs and research in the space show that this trend will continue, with technology playing an ever-larger role in insurers’ ability to attract, retain, and profitably serve clients. Across the industry, insurers continue to make investments across the Novarica Insurance Core Systems Map.

    Novarica Insurance Core Systems Map: Personal Lines


    In a market with very competitive conditions and intense profitability pressures, personal lines carriers are focusing on growth strategies, expense reduction, and improving underwriting results. Below I have listed four technology priorities CIOs and business executives should consider to remain competitive.

    Business Intelligence
    A data quality initiative, which examines data warehousing, operational data stores, and appropriate data marts, is key before undertaking more advanced business analytics initiatives. Once data quality is ensured, carriers can then overlay business intelligence tools. Predictive analytics tools for carriers with sufficient data are becoming more popular. Small carriers should look at working with an organization that can provide pooled data and insights. All carriers can use models to improve underwriting insights, to more consistently apply pricing, and to improve claims activities. In addition, third party big-data sources are going to become more and more prevalent for personal lines insurers. Companies who take advantage of this first will have an edge in pricing and retaining business.

    Policy Administration Systems
    Upgrades to policy admin systems will help carriers gain operational efficiencies and flexibility in the ability to add data. Using business rules to manage workflow and predictive analytics to build pricing models can improve risk selection, risk pricing, and reduce operating expenses. Carriers should look for highly configurable solutions with product configurators, simple rules, and tools for launching new rating algorithms. They should also look for the ability for business units to make their own modifications, though practical experience with configurable systems reveals IT often still ends up managing most changes. As long as the time and cost of such work for IT is reduced, that’s still a big value.
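As an illustration of configuration over custom code, the sketch below treats each underwriting rule as plain data that a business unit could maintain while IT’s engine simply evaluates it. The rule fields, thresholds, and actions are hypothetical and do not reflect any specific policy administration vendor’s rule model.

```python
# Each rule is configuration data, not code; the engine below only evaluates it.
RULES = [
    {"name": "refer_high_value_home", "field": "dwelling_value",
     "op": "gt", "value": 1_000_000, "action": "refer_to_underwriter"},
    {"name": "decline_vacant_property", "field": "occupancy",
     "op": "eq", "value": "vacant", "action": "decline"},
]

OPS = {"gt": lambda a, b: a > b, "eq": lambda a, b: a == b}

def evaluate(submission: dict) -> list[str]:
    """Return the actions triggered for a submission; rules with missing fields are skipped."""
    return [rule["action"] for rule in RULES
            if rule["field"] in submission
            and OPS[rule["op"]](submission[rule["field"]], rule["value"])]

if __name__ == "__main__":
    print(evaluate({"dwelling_value": 1_250_000, "occupancy": "owner_occupied"}))
    # -> ['refer_to_underwriter']
```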

    Agent Connectivity
    Extending functionality to the agents continues to rise in importance. It’s less and less about differentiation and more and more simply the price to pay to be in the game. At this point in time, most personal lines insurers have built an agent portal, and are often quite proud of the results. As a next step, both agents and carriers would prefer to receive and provide information electronically and process that information with as little human touch as possible, eliminating double entry. Real-time upload, download, and data translation deliver tangible benefits including reduced costs of handling, improved data quality, and improved turnaround time.

    Claims Management
    Streamlining claims management by automating processes improves customer service by speeding up claims service, providing consistent and fair best practices to all customers, and delivering personal service. On top of these customer benefits, insurers who have implemented modern claims systems report tangible speed-to-market benefits. If a carrier hasn’t already begun to upgrade their claims administration system, now is the time to start. Carriers who are using modern systems are rapidly gaining competitive advantages by improved efficiencies in claims handling and improved data leading to better outcome management. In addition, better claims processing has become a significant part of how personal lines insurers market themselves to consumers and how consumers select an insurer.

    These technology priorities are based on the expertise of Novarica’s staff, conversations with members of the Novarica Insurance Technology Research Council, and a review of secondary published resources. For more information, download a free preview of our new Business and Technology Trends: Personal Lines report at: http://novarica.com/business-and-technology-trends-personal-lines/ or email me to set up a complimentary 30 minute consultation.


    Enterprise Architecture and Digitalization

    Mitch Wein

    At our 8th annual Novarica Insurance Technology Research Council Meeting, discussion at the Enterprise Architecture and Digitalization breakout session began with a definition of “digital.” Novarica stresses that digitalization encompasses the entire customer interaction lifecycle, from front-end to back-end. A common theme was that of silos: a given process might be automated, but still require manual elements to be fully executed. An example of this would be creating and exposing a form through a portal but requiring the form to be mailed in by the agent or customer and scanned in by the carrier back office. In part, this is a function of change management and budget – an agent portal project as a quick win has appeal compared to a more exhaustive transformation.

    We explored how prepared carriers were for digital given the definition above. Clearly, key emerging technology trends like IoT, social media, big data, and cloud were pushing firms to reconsider what they needed to do to remain competitive. While some firms were doing regular Strengths, Weaknesses, Opportunities, Threats (SWOT) reviews, many were not. All agreed that a SWOT analysis should be an ongoing effort.

    As part of their digital transformation, many carriers were concerned about data quality (and thus resisting automation and subsequent “exposure”). They were also underinvesting in areas that will be critical to their digital success; many carriers do not have business and technology architects involved with their digital efforts, and some don’t even have a business or enterprise architecture group.

    We briefly touched on security and the importance of making sure security risks are mitigated to avoid reputational, regulatory, or legal exposure that could impact a digital insurer. Existing cyber security frameworks like the NIST framework released in 2014 can save carriers time and effort by providing metrics and a framework that regulators are familiar with. The security exposure created by vendor practices and products was also mentioned as a key concern for carriers. We discussed requiring vendors to conduct security audits and report on the results, which can serve as a lever for negotiations.

    Finally, we talked about the need to introduce a test-and-learn culture into carriers, allowing projects to “fail fast” if they do not work well. We related this to the notion of Fast IT and how it compares with Core IT. Core IT will often use traditional waterfall methodologies and take a long time to evolve systems. In the new digital world, Fast IT will behave differently. Fast IT will evolve solutions through Agile processes and allow for solutions to be “thrown away” if they are not working out. We discussed examples like smartphone portal front-end apps for agents, brokers, or customers. All of the carriers participating felt that the Fast IT concept was new to their firms and will probably be difficult to adjust to from a process and funding perspective.

    After digital, we moved on to enterprise architecture. We discussed how a reference architecture can provide consistency across projects as well as help with project prioritization. This allows carriers to be proactive rather than just chasing the hot new technology and feeling constantly behind. Linking architecture governance, aligned to a reference architecture, to funding governance through logical gates was being done in an informal way at some carriers and not at all at others.

    If you’d like to discuss enterprise architecture or digitalization further, please feel free to email me to set up a complimentary 30 minute consultation.


    Will Big Data Replace the Enterprise Data Warehouse?

    Jeff Goldberg

    At our 8th annual Novarica Insurance Technology Research Council Meeting, we had some great conversations about data, both as part of the formal event agenda and also via informal discussions during breaks or meals. Many of the trending or future technologies being discussed (Internet of Things, wearable devices, social media) from an insurer perspective are really just more and bigger channels of data. And just in case you think insurance data is lacking in thrills: my favorite story of the day was about auto insurers looking for fraud or stolen vehicles by integrating data feeds from the license plate scanners used by bounty hunters.

    At the analytics breakout session I had the opportunity to sit around a table with CIOs from a variety of insurers of different sizes and lines. One thing that struck me was how much big data and big data technology is part of the conversation. Novarica typically urges caution when asked about big data, stressing that insurers shouldn’t go looking for a use case to fit a technology. But it seems that more insurers are finding the use case which leads them towards a big data approach, and it raises questions about the future of the enterprise data warehouse.

    Building a data warehouse with a single data model that supports the whole business is notoriously difficult, with a success rate that drops as the amount of data and number of lines of business get larger, until becoming a near impossibility. A big data approach allows bringing together many different data sources with different structures without having to go through the normalizing and cleansing process that often derails EDW projects. While both approaches have value, larger insurers are discovering the big data route may be their only option for cross-business analysis. And some smaller insurers see this as a way to rapidly gather and review data as well, often running side-by-side with an existing data warehouse. In fact, in at least one case, an insurer’s big data lake has become the first step in the data workflow, with a data extraction from it feeding the older data warehouse as a way to maintain legacy processes.
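A rough sketch of that last pattern appears below: the lake holds raw, schema-light claim files, and a curated extract feeds a staging table in the existing warehouse. The directory layout and column names are invented, and SQLite stands in for the warehouse purely for illustration.

```python
import json
import sqlite3
from pathlib import Path

def load_claims_extract(lake_dir: str, warehouse_db: str) -> int:
    """Pull a curated subset of raw lake records into a warehouse staging table."""
    conn = sqlite3.connect(warehouse_db)
    conn.execute("""CREATE TABLE IF NOT EXISTS stg_claims
                    (claim_id TEXT PRIMARY KEY, loss_date TEXT, paid_amount REAL)""")
    rows = 0
    for path in Path(lake_dir).glob("claims/*.jsonl"):  # raw, schema-light lake files
        for line in path.read_text().splitlines():
            rec = json.loads(line)
            conn.execute("INSERT OR REPLACE INTO stg_claims VALUES (?, ?, ?)",
                         (rec["claim_id"], rec["loss_date"], rec.get("paid_amount", 0.0)))
            rows += 1
    conn.commit()
    conn.close()
    return rows
```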

    We’re currently reaching out to the Novarica Insurance Technology Research Council with a survey about analytics and big data, so we will be reporting soon on just how many insurers have taken a big data approach. But even though we expect the percentages to remain small, it’s clear the growth will continue. The tools and options around big data are increasing, and more vendors are providing services that make it easier for insurers to leverage big data tech without having to build it from scratch themselves. In time, I expect to see more insurers running a big data project alongside an enterprise data warehouse, with the big data environment, in some cases, taking over as the hub of that insurer’s data.

    If you’re interested in talking more about big data and data technology, please feel free to email me to set up a complimentary 30 minute consultation.


    Enterprise Data Initiatives Now Taking Center Stage as Insurers Look to Improve their Digital and Analytical Capabilities

    Mitch Wein

    With insurance carriers of all sizes and lines of business focusing on improving their digital and analytical capabilities to meet customer expectations, the importance of enterprise data has never been greater.

    Over the last 3-5 years data has once again become a key focus of the insurance industry. Data is now seen as a key enabler of the insurance industry’s evolution into a fully digitized provider of risk services focused on customer, not product.

    When data is collected from either internal or external sources, the CIO becomes responsible for its storage, management, and use. This becomes highly complex in today’s world since data is both structured and unstructured and is controlled by various legal and regulatory considerations, as well as internal security and risk policies. Based on the direct experience of Novarica’s senior team and its Council members, the following best practices should be considered when initiating a master data management (MDM) initiative.

    • Create policies around data and how the data should be managed and controlled.
    • Establish an organizational structure for data.
    • Link MDM architecture to business goals and objectives.
    • Consider supplementing internal data in an MDM infrastructure with enriched data and external big data.
    • Determine what system or process creates the official data of record.
    • Determine the senior-level project sponsor and who pays.
    • Deploy multi-year MDM programs in an incremental fashion.

    MDM is more than a technology. It is a program of work involving an assessment of business needs, a data sourcing strategy, a data cleansing strategy to address quality, an architecture and integration initiative, data documentation and classification, as well as an organizational evolution for data governance and ownership. The goal is to create a single view of the master data which can then be referenced by all systems, reporting, and business processes.
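As a simple illustration of that “single view” goal, the sketch below consolidates duplicate customer rows from two source systems into one golden record using a basic survivorship rule (most recently updated, non-empty value wins). The matching key, field names, and rule are illustrative assumptions only.

```python
def build_golden_record(records: list[dict]) -> dict:
    """Consolidate customer rows that share a match key into one master record.
    Survivorship rule: the most recently updated non-empty value wins."""
    golden = {}
    for rec in sorted(records, key=lambda r: r["last_updated"]):  # oldest first
        for field, value in rec.items():
            if value not in (None, ""):  # later, non-empty values overwrite earlier ones
                golden[field] = value
    return golden

if __name__ == "__main__":
    sources = [
        {"customer_id": "C100", "email": "", "phone": "555-0100",
         "last_updated": "2014-03-01", "source_system": "policy_admin"},
        {"customer_id": "C100", "email": "jane@example.com", "phone": "",
         "last_updated": "2015-06-15", "source_system": "claims"},
    ]
    print(build_golden_record(sources))
    # -> {'customer_id': 'C100', 'email': 'jane@example.com', 'phone': '555-0100',
    #     'last_updated': '2015-06-15', 'source_system': 'claims'}
```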

    Novarica’s experience has shown that the data governance and ownership dimension is often the hardest aspect of this program after identifying a business sponsor and developing a solid business case. Who manages the data, who owns the data, and who is ultimately accountable are difficult challenges that must be addressed.

    If your organization is in the process of implementing an MDM program or if your efforts have stalled, please send me an email to set up a complimentary 30 minute consultation.

    Recent Novarica CIO Checklist Briefs

  • Master Data Management: A CIO Checklist
  • Architectural Governance: A CIO Checklist
  • Preparing for Digital Transformation: A CIO Checklist