News and View Roundup: Penn Mutual and Vantis Life Merger, Root, the DDoS Attack

Rob McIsaac

Rob McIsaac on the recent merger between Penn Mutual and Vantis Life

Steven Kaye

Steven Kaye on what auto insurer startup Root can teach insurers about customer service

Tom Benton

Tom Benton on the recent DDoS attack and how insurers should be adjusting security to better protect consumer data.

Group Insurance Special Interest Group Session Highlighted Opportunities and Challenges

Rob McIsaac

Recently, Novarica facilitated a Special Interest Group session in Wellesley, MA, focused on the issues, challenges, and opportunities facing carriers in the Group and Voluntary Benefits space. Sixteen carrier participants made the trip to join a session that provided a wide-ranging discussion, one that was particularly timely inasmuch as many carriers are now in the final stages of framing their plans and budgets for 2017. Earlier last week, Novarica published our latest annual survey of IT Budgets and Plans; the material featured prominently in some of the discussions we explored through the course of the day. The session was hosted by Sun Life, and we very much appreciate their making both their team and facilities available for the event. Some of the key discussion areas included:

The world is going “digital” and group carriers need to accelerate their efforts to keep up. Across the entire insurance industry, irrespective of line of business, one of the key topic areas is the evolution of digital strategies. Agreeing on what exactly is included in such a “digital” definition remains somewhat elusive, with varying interpretations of the term reflected in Novarica research. Interestingly, however, no one seems to want to discuss “analog” strategies, which would seem to be the natural corollary to digital capabilities, validating the general importance of the digital push. Carriers focused on the Group / Voluntary benefits space are certainly feeling the pressure.

From the session, it was absolutely clear that Group carriers are aware of the rapidly changing “customer,” which has serious implications for the creation of valid and engaging experiences. “Customer” has multiple possible definitions in Group insurance. Clearly, plan sponsors are a focal point, and providing solution sets and experiences that retain key players in the value chain (e.g., benefits managers and HR departments) is essential, but similar needs exist for distribution partners and plan members. Drawing on other lines of business, there was a good discussion regarding the need to think about tailored experiences for different demographic cohorts, as well as the need to consider how “customer” expectations are being influenced by other parts of their lives, including banking and retail engagements. Being out of step with this creates added pressure on carriers that may have historically tried to optimize on “one size fits all” solutions; these approaches, however, risk becoming “one size fits none” answers. For example, mobile capabilities become increasingly important across the value chain, but as a complement to other tools rather than as a replacement for other engagement points. A number of carriers noted that they are also considering how omni-channel capabilities might impact them in the future, which led to a key discussion in this space: effective omni-channel solutions reflect the ability for a customer to decide when and how they want to move between channels to get things done, rather than treating channels as parallel and non-intersecting runways.

Data management is an increasingly important issue for carriers. One other thing that is very clear from all of this is that data, and the ability to turn it into meaningful and actionable insights, will become increasingly critical in this line of business. That creates challenges both in extracting data from existing legacy environments and in attracting and retaining the talent needed to work with emerging tools and platforms. Most carriers continue to work with legacy solutions whose underlying operational data is contained in silos, which can be difficult to use alongside other assets such as unstructured data. As we explored issues related to data, the notion of governance increasingly became a focal point for the discussion. One key question here was, in effect, who owns the data? Varying approaches were described by different carriers, suggesting that there is no single right answer. It was clear, however, that success depends on having someone own the data and on investing in effective governance that encompasses both defining the data and creating rules of the road for how it is used. The notion that discrepancies in the interpretation of data should somehow be resolved in the CEO’s office was clearly understood to be a “bad idea” to be avoided at all costs. A range of approaches to creating a data governance structure were discussed, which led to a dialogue about finding the talent that can actually work with new and emerging tools. Several challenges exist here, including the fact that insurance companies may not be seen by graduating students as attractive places to embark on careers. There are also concurrent challenges associated with where carriers are physically located, in terms of being in markets that can attract the right kind of talent.
A number of approaches to addressing this were discussed, including internship programs, efforts to engage with local universities to influence curricula, and the relocation of key functional areas to places that naturally attract top talent. Examples of companies positioning themselves to tap into talent pools in areas such as Amherst (MA) and Chapel Hill (NC) were also discussed, further reflecting the need to be flexible and adaptable in a rapidly changing world.

Properly armed with data, part of the next challenge becomes turning that raw material into actionable insights. Some businesses, such as health insurance, have been increasingly aggressive about creating more frequent and more bi-directional engagement, giving them the ability to weave themselves more tightly into consumers’ lives (e.g., via wearable devices). The unanswered question is how Group life and DI carriers can do the same. One avenue explored was leveraging technology to become better and more effective at managing claims for disability and absence management, an approach that has also shown signs of success in the workers’ compensation space. It remains clear that data and analytics is a space where there is no such thing as being “done.”

The lack of standards for the exchange of data is an increasingly troublesome problem with no “silver bullets” in sight. An ongoing area of concern for Group and Voluntary Benefit carriers is the lack of standards for exchanging data between enrollment platforms, distributors, carriers, and others in the value chain. This line of business lags significantly behind the progress that has been made in the P&C world, for example. While carriers have long understood the issues and challenges resulting from the lack of standards, there has been remarkably little progress in this space for years. A number of industry bodies, including ACORD, LIMRA, and the Open HR Standards group, have tried, but the results to date have been characterized as “disappointing.” There are several efforts underway now, but none appear to have the kind of immediate traction or sense of urgency that would change circumstances in the near future. One of the challenges is that this business line has a different set of “players” in the value chain than you see in places like individual life insurance and annuities. As the Group and Voluntary Benefits arena has become more competitive and faces increased margin pressure, new entrants in the value chain have moved to disintermediate carriers and distributors alike. Specifically, enrollment platforms that view the employer as their customer have taken on an increasingly important place in the delivery of these products. For them, the employer is the key client relationship, and they have very little interest in helping address issues specific to the insurance carriers providing products and services behind the scenes. As much as carriers would like to influence what the enrollment providers do, the reality to this point is that they have very little influence on decision-making at that level.
The key to creating a more effective ecosystem for this line of business will be figuring out the “what’s in it for me” question for the enrollment platform vendors. Short of that, carriers and the associations they belong to may have relatively little leverage in this space. Some carriers, recognizing the immediate challenge, have moved to create a series of technical tools to ease the ingestion of enrollment data. While this is neither easy nor inexpensive, it can be effective in creating an internal mechanism that mitigates the challenge of data arriving in different formats from the myriad platform providers a carrier works with.
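The ingestion tooling described above typically amounts to a set of adapters that map each platform provider’s format onto one canonical internal record. A minimal sketch of that pattern follows; the vendor names, field layouts, and tier codes are illustrative assumptions, not real enrollment platform schemas.

```python
# Hypothetical sketch: normalizing enrollment records that arrive in
# vendor-specific formats into one canonical internal shape.
# Vendor formats and field names here are invented for illustration.

from dataclasses import dataclass


@dataclass
class EnrollmentRecord:
    employee_id: str
    plan_code: str
    coverage_tier: str  # e.g., "EE", "EE+SP", "FAM"


def from_vendor_a(row: dict) -> EnrollmentRecord:
    # Vendor A sends {"emp": ..., "plan": ..., "tier": ...} already using our codes.
    return EnrollmentRecord(row["emp"], row["plan"], row["tier"])


def from_vendor_b(row: dict) -> EnrollmentRecord:
    # Vendor B uses different keys and spells out coverage tiers in words.
    tier_map = {"employee_only": "EE", "employee_spouse": "EE+SP", "family": "FAM"}
    return EnrollmentRecord(
        row["employee_number"], row["product_id"], tier_map[row["coverage"]]
    )


# One adapter per platform provider; adding a provider means adding one entry.
ADAPTERS = {"vendor_a": from_vendor_a, "vendor_b": from_vendor_b}


def ingest(source: str, rows: list[dict]) -> list[EnrollmentRecord]:
    # Look up the adapter for this provider and normalize every row.
    adapter = ADAPTERS[source]
    return [adapter(r) for r in rows]
```

The value of the pattern is that everything downstream of `ingest` sees a single format, so the cost of each new platform provider is contained to one adapter function.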

A number of carriers mentioned that one major insurance trade association, ACORD, believes it is making some progress in this space. The details remain confidential until later in the year, but it seems clear that any effective solution must address a broad set of issues for both insurance carriers and enrollment platform providers. Until then, carriers will likely continue to play a relatively weak hand in this space, particularly for Group insurance benefits that employers see as part of a suite of solutions made available to plan participants.

Core systems remain a major concern, but the pace of transformation remains modest. For all carriers at this meeting, the future direction of core systems remains a “hot topic.” The nature of the business is that few have embarked on sweeping core system replacements to date, although many expressed interest in seeing how the vendor community anticipates supporting this need in the future. A more common approach for most carriers now is to address some of their peripheral system needs, such as claims, compensation, or underwriting, leaving the heart of their systems environment to be addressed another day. For many carriers, part of the issue remains that the low interest rate environment continues to be an impediment to new investments. There also remains a strong desire to see platforms successfully deployed at other carriers before some are willing to take the plunge themselves. Given the risks associated with these types of projects, and the lack of the kinds of success patterns that have colored the property and casualty space in recent years, some of this reluctance is not surprising. There are, however, emerging success stories in areas like underwriting and claims that carriers should, in our view, be mindful of. One question came up regarding new entrants in this space. While there have been some newer technology vendors in recent years, the number is actually very small; this situation is unlikely to change significantly, since the Group and Voluntary benefits segment is highly concentrated with relatively few buying units. That generally keeps the number of new players down, particularly since part of the decision criteria carriers use for purchasing these systems is some form of track record they can count on, creating a classic “Catch-22.” That isn’t to say, however, that there are not good deployment examples for carriers to be aware of, particularly in support of the Voluntary benefits lines.
The session gave us an opportunity to discuss a number of these. We also expect that 2017 will continue to show more progress by existing software vendors attracted to this space. Given the challenges associated with current platforms at Group carriers, including significant technical debt, we expect that the latter part of the decade will see a significant ramping up of investment.

Cloud-based solutions are increasingly viable … with some caveats. For many carriers, the use of cloud-based solutions across the technology stack is an option that looks increasingly attractive. In addition to allowing carriers to transition from Cap-Ex to Op-Ex spending, which can more closely align IT spending with business cycles, this can be a way to avoid the long depreciation periods associated with traditional deployments, which risk having solutions become obsolete before the depreciation schedules expire. That can leave IT organizations in the unfortunate position of writing off impaired assets with significant residual value before new replacement projects can be implemented. As a result, carriers have increasingly turned to cloud options for peripheral systems such as e-mail, Human Resources platforms, CRM, and Finance capabilities. As many of the carriers at the SIG noted, these solutions can be both flexible and highly functional for critical workloads. Historically, cloud capabilities had raised security-related concerns for carriers. Today, many carriers have accepted that top-rated solution providers such as Google, Amazon, and Microsoft (to name three) can probably provide better security than they deliver internally. This allows carriers to consider hosted solutions as viable alternatives for functions at the core of their operations, including claims and underwriting. We also discussed how cloud-based capabilities have become more commonly used for core functions in other parts of financial services, notably in banking and P&C insurance. One special challenge group carriers may face, however, is that plan sponsors may establish special requirements for data management and control.
Employers, potentially concerned about a range of issues associated with employees’ personal and confidential information, appear to have created hurdles that group carriers continue to struggle to get past. As a general construct, group carriers appear to understand the issues much as their individual insurance counterparts do, but the concerns and restrictions raised by plan sponsors appear to be creating added headwinds for cloud adoption on core capabilities. It also appears likely that this is a temporary, rather than a permanent, situation. For one thing, many carriers note that key infrastructure providers (including Microsoft and Oracle) are making it increasingly difficult for companies to acquire their technology for deployment outside of cloud options. This is something that employers / plan sponsors will likely be facing themselves.

Agile is increasingly important, but there are some clear challenges to achieving the desired outcomes. All carriers at the session noted that they are actively moving to Agile as the preferred approach to software development. The degree of deployment varies significantly between carriers, however, with some having already moved to fundamentally change their work environments to support the collaboration needs associated with the evolving SDLC methodology. As much as carriers are looking forward to the potential benefits of Agile, there are some problems. For one thing, vendors that have themselves moved toward this development approach may have very different definitions of Minimum Viable Product than the carriers do. This makes it difficult to ingest work done by vendors and integrate it with development efforts carriers undertake on their own. This type of integration issue was mentioned by several carriers, highlighting the need to think expansively as project plans are developed and implemented. With the move to Agile, many carriers appear to be moving away from traditional office spaces toward a “hoteling” approach in which work spaces are available on more of a “first come / first served” basis. This appears to have many advantages for carriers, but it is accepted at differing paces by employees from different demographic cohorts. Others noted that Agile needs to be a joint IT and business undertaking, not an IT-only one. The change management efforts required to address both of these issues are crucial to the ultimate success of these initiatives.

The level of engagement during the session was terrific although there were clearly more issue areas than we could discuss in a single day meeting. The group agreed that we should continue these sessions in the future and we’re pleased to be able to report that The Hartford Insurance Group will be our host for the sessions we plan to conduct in 2017. As always, if you’d like to discuss any of this in more detail, please let us know. Insurance technology is becoming ever more interesting … and critical to the success of the carriers focused on this line of business.

News and Views: Data and the Small Carrier, Customer Experience in the C-Suite, and Cyber-Risk Scoring

Novarica’s team comments on recent insurance and technology news

GuideOne’s CEO says their exit from personal lines was driven in part by an inability to keep up with the data demands.

Jeff Goldberg

Novarica Comment by Jeff Goldberg, VP of Research and Consulting: “GuideOne’s CEO attributed their decision to exit personal auto, in part, to trouble keeping up with big data needs. Any insurer with a small book of business understands the difficulty of trying to use that limited data set to extract the same kind of insight as larger competitors. Some utilize third-party data providers who aggregate contributory databases across many insurers, allowing participants to position against those Tier 1s. But access to broader data doesn’t automatically translate into the right skills for big data insight. Novarica often refers to data technology as an arms race: as long as your competitors aren’t doing it, you don’t have to do it either. But as soon as someone else adopts a new approach to make better pricing/underwriting decisions, you need to follow or else face adverse selection. Big data technology hasn’t always been common enough to cause these problems, but now, especially in personal lines auto, big data and big data tech are becoming so prevalent that insurers feel the pain if they ignore them. It’s another example of insurers needing to ‘Adapt or Decline.’”

USAA has named a chief technology and digital officer (CTDO), a newly created position responsible for information technology, digital strategy & operations, and experience design.

Chuck Ruzicka

Novarica Comment by Chuck Ruzicka, VP of Research and Consulting: “USAA, which has long been a technology leader and innovator in the insurance industry, made a significant statement today by appointing a C-level executive with an explicit mandate to oversee and improve the overall customer experience. This ‘voice of the customer’ has long been lacking in the C-suite, and while companies in other industries have started to bring that voice into the room, insurers are only now beginning to do so. USAA is clearly banking on the idea that offering a world-class customer experience will enable it to thrive in an increasingly challenging marketplace.” Novarica will be publishing a report on best practices in User Experience later this month.

A new startup is grading Fortune 500 companies on cyber risk “credit scores” 

Mitch Wein

Novarica Comment by Mitch Wein, VP of Research and Consulting: “UpGuard’s cyber risk score factors in public, web-accessible reviews of a company’s IT environment (strong ciphers, up-to-date software, valid certificate authorities, phishing protections, etc.) to create a FICO-like score. This score can then be used by an underwriter to help price cyber risk. The problem of how to price cyber risk is very real, in part because demand for this type of coverage has only recently emerged, loss history over time is lacking, and third parties can introduce cyber risk into the company requesting the coverage. The problem is that this type of score, while a good preliminary indicator, oversimplifies the issue. Human behavior is a primary cause of security risk (putting a thumb drive into a computer, clicking a phishing link in an email, letting someone they don’t know into the office). This score will not be able to factor that in.”
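The mechanics of such a score can be sketched simply: each externally observable check carries a weight, and the passed weights are scaled onto a FICO-style range. The checks, weights, and 300–850 range below are assumptions for illustration only, not UpGuard’s actual methodology.

```python
# Illustrative sketch of a FICO-like cyber risk score built from externally
# observable checks. Checks and weights are invented for illustration;
# they do not reflect any real scoring vendor's methodology.

CHECKS = {
    "strong_ciphers": 25,
    "software_up_to_date": 30,
    "valid_certificate_authority": 20,
    "phishing_protections": 25,
}


def cyber_score(observations: dict) -> int:
    """Scale the weights of passed checks onto a 300-850 range, FICO-style."""
    earned = sum(w for check, w in CHECKS.items() if observations.get(check))
    total = sum(CHECKS.values())
    return 300 + round(550 * earned / total)
```

The sketch also makes the comment’s criticism concrete: the score is a pure function of what is visible from the outside, so behavioral risks like a clicked phishing link or a stray thumb drive never appear among its inputs.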

Less than 4 months before many firms are going live with the new DOL regulations, more than half of advisors surveyed are unaware of their firms’ plans for compliance

Rob McIsaac

Novarica Comment by Rob McIsaac, SVP of Research and Consulting: “As we approach the 4th quarter of 2016, efforts to prepare for the implementation of the DOL Fiduciary Rule changes are clearly accelerating across the value chain. While the “real” effective date for key elements of the rules isn’t until April of next year, there is a growing consensus that distributors and manufacturers alike anticipate having key changes implemented by the end of 2016 to avoid a “split year” scenario that would be difficult to manage. However, many producers are still unclear on what this brave new world will mean for them. Most importantly, there’s a lack of clarity on how elements like the Best Interest Contract Exemption will be implemented and how their sales practices will be altered by the changes.

While the study is interesting, it doesn’t address demographic differences in the distributor force. With the average agent age now 59, understanding how the more mature producers react may be a key to future state planning. It’s plausible that older producers will simply turn their attention to other product sets to simplify their own business models if the changes appear overly daunting. Carriers may find this yet another area where they need to think bi-modally in order to maximize their chances of success.

Effective communication, support mechanisms, and a focus on being “easy to do business with” will be critical for carriers hoping to preserve product shelf space with key distribution partners. We fully expect the pace of activity to sharply accelerate as carriers recognize there’s little more than four months left before their ‘go live’ date.” More from Novarica on the DOL ruling.


Reading KPCB’s Annual Internet Trends Report

Matthew Josefowicz

As always, the KPCB Annual Internet Trends Report, published yesterday by Mary Meeker and her team makes for some fascinating reading.

Insurers may want to pay particularly close attention to the following sections:

  • Pages 61 and following, on how retailers like Amazon and others have used their massive data collection capabilities to launch private-label products in far-flung categories like outdoor furniture and fashion.

  • Pages 66 and following, on how companies are micro-segmenting their target markets and designing niche products for them (see our “Quick Quote” thoughts on how It’s All Program Business Now)

  • Pages 102 and following, on the evolution of new messaging platforms to support business conversations. While financial services examples are in Asia today, they’ll be in North America over the next decade.

  • Pages 151 and following, on changes in the auto industry and usage trends, of special interest to Personal Lines insurers.

  • Pages 188 and following, on how non-tech incumbents have increased their purchase of tech new entrants by 2.6x since 2012. Insurers aren’t the only ones buying innovation.

The data-as-a-platform and cybercrime sections are also interesting, although those sections are heavily focused on what seem to be KPCB portfolio companies.

Update from the CIO Insurance Summit

Rob McIsaac

It is amazing to think that we’re already into the second quarter of 2016. The year is moving fast, and with it the opportunity to see how carriers are responding to the urgency of a “new normal” is coming into clearer focus. I had the good fortune to be able to be the “Master of Ceremonies” for the CIO Insurance Summit in NYC on April 5th, which clearly provided insight into what is on the mind of participant carriers. The pace of activity is notably picking up for many lines of business although, as one of my mentors once shared, it is important to avoid confusing activity with progress. Real progress appears to be somewhat elusive but the quality of the dialogue definitively appears to be elevated. 2016 promises to be a very interesting time.

From our sessions in NYC a number of clear themes emerged that are worth sharing.

Cloud deployments are getting more real even as concerns remain in some quarters. We had a wide-ranging discussion of the opportunities and concerns that seem to emerge concurrently around using cloud-based options to carry mission-critical workloads. In many organizations, “security” and “compliance-related risks” remain the leading barriers to faster adoption. As we explored this, I discovered that many companies still haven’t made the connection that efforts to move CRM (Salesforce) and e-mail (Office 365) to the cloud have already broken through a barrier that appeared daunting to carriers in the very recent past. Having effectively put these installations to the test, carriers with these implementations are increasingly willing to acknowledge that the security models are as good as, if not better than, what they are able to implement in their own environments. As additional mission-critical workloads migrate toward this type of deployment (e.g., Workday for HR and financials), organizations are pushed to articulate what the real concerns are and how best to address them. The reality is that this is frequently not so much a technology issue as one linked to emotional, political, or organizational issues that need to be addressed before the discussion turns to the selection of a hosting service. For CIOs and their teams, getting in front of the issue to educate business partners, and developing a point of view on which cloud-based providers are best positioned to be part of their tool kit (they are not all created equal), can help build momentum and organizational support. Going “full to bright” in a short timeframe may be too much for many organizations to accept, which runs the risk of triggering an enterprise immune system reaction that can be painful.

Data governance remains a significant concern. With a myriad of business units, products, core record-keeping systems, and rules of engagement that may conflict from one line of business to another, this remains an area of notable concern – and investment – at carriers we spoke to. Most carriers still lag far behind the banking world in their ability to understand their business from the outside looking in (rather than from the inside looking out). However, there is growing recognition that a full view of both customers and producers has real value, and that the investment in both technology and business process required to achieve informational insights from internal data can be quite high. A common theme among participants is the desire to construct a 360-degree view of customers, though breaking through some of the organizational barriers within companies can be daunting. For carriers considering this challenge, investing time and money to build a robust data governance facility can be highly valuable. Even compliance-related efforts such as Know Your Customer (KYC) initiatives can create operational and marketing benefits if used correctly. Once again, however, one of the most difficult challenges to overcome is the “human” one. If line-of-business executives and managers are focused on optimization at the business unit level, while data analytics efforts around customers are focused at the enterprise level, the inherent conflict can substantially mute any resulting benefits to the organization. Being clear that this is not purely a technology issue is key to achieving desired outcomes. One key reality becomes clear as companies talk about their desire to use better analysis of data to improve a range of business outcomes: while many talk about Big Data, struggles remain with managing Small Data in quantity.
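At its simplest, the 360-degree view discussed above is a merge of partial records from siloed administration systems on a shared customer key. The sketch below illustrates that idea under stated assumptions: the system names, keys, and fields are hypothetical, and real implementations typically need fuzzier record linkage than an exact key match.

```python
# Minimal sketch of assembling a 360-degree customer view by merging
# extracts from siloed systems on a shared customer key. System names
# and fields are hypothetical; real matching usually needs record linkage,
# not exact keys.

from collections import defaultdict


def build_customer_view(*system_extracts: dict) -> dict:
    """Merge per-system extracts (customer_id -> partial record) into one view."""
    view = defaultdict(dict)
    for extract in system_extracts:
        for customer_id, fields in extract.items():
            # Later extracts overwrite earlier ones on field-name collisions.
            view[customer_id].update(fields)
    return dict(view)


# Toy extracts from two hypothetical admin systems.
policy_admin = {"C1": {"policies": ["TERM-10"], "state": "MA"}}
claims_system = {"C1": {"open_claims": 0}, "C2": {"open_claims": 1}}

combined = build_customer_view(policy_admin, claims_system)
```

Even this toy version surfaces the governance questions raised in the session: which system wins on a field collision, and who owns the merged record, are policy decisions, not code.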

Definitive plans to address Millennial needs remain elusive although awareness is elevated. We had a lively discussion about this issue, focused on a number of key challenges facing carriers. At a time when 10,000 Baby Boomers retire every day (and concurrently 9,000 children are born to Millennial parents each day) the opportunities associated with getting positioned to take better advantage of market dynamics would appear to be very clear. That said, most carriers acknowledge that they have not yet “cracked the code” on how to best position themselves in terms of both products and service models that will effectively speak to a new generation of potential consumers. We discussed some of the efforts being put forward by companies to better understand these dynamics (e.g., MassMutual’s coffee shops) but a reality is that the answers to changing market needs will require some level of experimentation and testing of hypotheses, an approach which may well be counter-cultural for the very organizations whose long term success is impacted by their ability to start thinking differently. Avoiding a “Kodak Moment” can be a function of how well carriers deal with a range of challenges including the demographics of their agency forces; the average age of an agent in the USA is now 59 with an estimated 25% of today’s agents potentially leaving the business by 2018. Concurrently, a number of carriers noted that their own HR policies and procedures do not appear to be adjusting appropriately to deal with the increased velocity of voluntary turnover associated with the emerging Millennials who will represent 50% of the USA labor force by 2020. Mentoring programs, efforts to create more varied experiences that allow for expanded horizontal movement within organizations, and increased flexibility related to geographic location have proven to be effective “tools of the trade” for some organizations as they’ve moved to adjust to a new reality, but the broader trend remains clear. 
The emerging generation of employees will have a very different relationship with employers in the foreseeable future than their parents or older siblings did. Implementing procedural changes for everything from employment procedures to knowledge management will be important to operational effectiveness.

The increased urgency at carriers is timely. With cycle times across many facets of the business being reduced, user tolerance for poor experiences being driven down and the competitive threats from many quarters being elevated, the current planning horizon is moving with surprising speed. Welcome to the future!

If you’d like to get a copy of the presentation materials used in NYC, please let me know by sending me a note at

The Parallel Paths of Data Strategy

Jeff Goldberg

Data strategy at an organization follows multiple paths:

> Data Governance and Definitions
> Data Process, Quality, and Augmentation
> Data Warehousing
> Reports, Dashboards, and BI
> Predictive Analytics
> Big Data and External Data Sources

When I talk to insurers about their data strategy, I like to assess how far along they’ve progressed on the above paths. The exact breakdown (or naming) of these data strategy paths can vary from company to company depending on priorities and opinion. But the key is that they all matter and, just as importantly, that they happen in parallel rather than in series.

There are two problems I find when diving into the strategy details. Either (a) some of the critical paths of data strategy have been ignored or, the opposite issue, (b) some of the incomplete paths are being treated like roadblocks to other progress.

The first problem is pretty easy to understand. If an insurer focuses on just data warehousing and reporting (one of the most common scenarios), the data will never really represent a single source of the truth, the reports and other BI will always be in contention, and a great deal of value will be left on the table. For another example, if an insurer puts all their effort into predictive modeling, those models will never be as deep or precise as they could be with better data behind the analysis. It’s not a surprise, though, that this kind of uneven approach happens all the time; a balanced data strategy is difficult, and few insurers have the resources or skill in all areas. Some paths require particular technological expertise; others require political will.

The second problem, on the other hand, requires rethinking how these different data strategy paths interact. Up above I’ve lined them up in what seems like a natural order: first you need a governance group that agrees on what the data means, then a process to clean the data and flow it through the different systems, then you aggregate the data into a warehouse, then you report on it, then you analyze it and build predictive models, and only then do you think about bringing big data into the picture. It makes logical sense. But it’s also wrong.

The reality is that an insurer can work on all of those paths in any order and/or simultaneously. You don’t need a perfect data warehouse before you start thinking about predictive modeling (in fact, there are plenty of third-party vendors who help you skip right to the predictive models by using industry data). You can run reports directly off your claims system even if its data sits in isolation. Nowhere is there more proof of this than the fact that most insurers have hardly any data governance in place but have still moved forward in other aspects of their data strategy. That doesn’t mean a company should ignore the other paths (that leads to the first problem), but it does mean progress can be made in multiple areas at once.
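
As a small illustration of that point, a useful report can come straight from a claims store with no warehouse in between. This is a hypothetical sketch: the table, columns, and figures are invented, and an in-memory SQLite database stands in for the claims system.

```python
# Sketch: a report run directly against a claims store, with no warehouse
# in between - one data strategy path need not block another.
# Table, column names, and figures are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the claims system's database
conn.execute("CREATE TABLE claims (claim_id TEXT, state TEXT, paid REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [("C1", "MA", 1200.0), ("C2", "MA", 300.0), ("C3", "NY", 950.0)],
)

# A simple operational report: total paid by state.
report = conn.execute(
    "SELECT state, SUM(paid) FROM claims GROUP BY state ORDER BY state"
).fetchall()
print(report)  # [('MA', 1500.0), ('NY', 950.0)]
```

A report like this is obviously limited to one system's view of the world, but it delivers value today while governance and warehousing efforts proceed on their own timelines.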

What’s important to understand is that all these different data strategy paths enhance each other. The further an insurer is down all of them, the stronger each one will be, leading to the best decision making and data-driven actions.

So it’s always good to step back and take a look at the data across an organization, assessing each of these paths individually and seeing what can be done to move each forward. A good data strategy has a plan to advance each path, but also recognizes that no path needs to block another, whatever the current priorities.

Why Aren’t Insurers Doing More With Their Data Strategy?

Jeff Goldberg

While the business intelligence space has matured greatly in the last decade, it has been and remains an area where insurers need to work with a variety of platforms and tools to build out their capabilities. This requires a mix of technical skillsets, business expertise, and vendor relationships. While few efforts at an insurer are more complex or time consuming than a core system replacement, a major BI initiative will eventually touch all aspects of an organization. I will be presenting more on this topic in a webinar on December 1, 2015.

Today there are more vendors that provide a full business intelligence suite, one that includes the data layer, the industry-specific data models, the presentation and visualization tools, and pre-built insurance reports and dashboards. But these suites still need to be tailored to and integrated into the rest of the insurer’s environment. Some policy administration system vendors are either pre-integrating with or releasing their own business intelligence suites. This does simplify deployment but adds another variable to the “suite vs. best-of-breed” decision, and until these options have more production exposure, most insurers are still opting for best-of-breed.

For now, most of these approaches don’t provide some of the ancillary but very important data and analytics areas such as predictive modeling tools and the models themselves, the use of aggregated third-party data sources, or the emerging area of big data. And no matter what approach an insurer takes, it is a near-universal condition that there will be siloes of historical data that need to be considered with or migrated into the new BI solution, and that will take time and effort.

So despite the plethora of vendor options and the general acknowledgement across the industry that good business intelligence is key to ongoing success, why aren’t more insurers further along in their data strategy?

1. Most insurers struggle with multiple legacy systems and siloes of disparate data, and they are still at the first steps of bringing that data together.

2. The data that does exist in the organization is of variable quality or completeness. New systems don’t immediately solve that problem without a broader plan in place.

3. Insurers and core systems have traditionally looked at data from a policy view, complicating the effort to move to new models that tend to take a 360-degree customer view.

4. Many insurers still have no formal data governance in place and lack organizational agreement on data definitions.
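
Roadblock #2, at least, can be measured before it is solved. The sketch below profiles field completeness across a handful of records; the field names and sample records are hypothetical.

```python
# Sketch: profile completeness of policy records before loading them into a
# warehouse. Field names and sample records are hypothetical.

def completeness(records, fields):
    """Percentage of records with a non-empty value for each field."""
    total = len(records)
    return {
        f: round(100 * sum(1 for r in records if r.get(f) not in (None, "")) / total, 1)
        for f in fields
    }

policies = [
    {"policy_id": "P-1001", "zip": "02481", "dob": "1975-03-02"},
    {"policy_id": "P-1002", "zip": "", "dob": None},
    {"policy_id": "P-1003", "zip": "10001", "dob": "1988-11-17"},
]

report = completeness(policies, ["policy_id", "zip", "dob"])
print(report)  # policy_id is fully populated; zip and dob each have gaps
```

A simple profile like this turns "our data is of variable quality" from a vague complaint into a prioritized list of fields to remediate as part of the broader plan.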

A good vendor partner can help put the framework and some tools in place to solve the above four roadblocks, but it requires more than just technology. It requires process and cultural change, which can’t be driven solely by IT.

Many insurers are still looking for a data champion to help push a strategy across the organization and get buy-in from other business leaders. As organizations mature, this data champion role is often formalized as a Chief Data Officer (CDO), and that person typically has a strong data background. But for insurers who are still looking to get a data strategy off the ground, it’s most important to find a leader who is respected in the organization and who is passionate about the value that good business intelligence can bring, even if they have little data experience themselves.

Related Reports:

  • Business and Technology Trends: Business Intelligence
  • Novarica Market Navigators: Business Intelligence Solutions
  • Workers’ Comp Insurers Look to Analytics and Core Systems

    Jeff Goldberg

    In our most recent Novarica Council Special Interest Group meeting, several Workers’ Compensation CIOs discussed core systems replacement strategies and long-term visions, as well as emerging uses of mobile, analytics, and end-user-facing technologies.

    All the attendees were at various stages of core system replacement—ranging from just-completed to the initial stages—so they were eager to learn from others’ experiences and to gain perspective on their own challenges. Everyone agreed that a flexible, modern core system was at this point table stakes, hence the flurry of transformation activity. A minority of companies are changing appetite, but the vast majority of P/C insurers are looking to grow by moving into new territories. To do that, a modern, flexible system is absolutely necessary.

    The shrinking lifespan and growing price of core systems was another area of concern. Everyone agreed that new core systems are increasingly costly to implement, and that they must be replaced more frequently than older legacy systems. Gone are the days when a core system lasted for 40 years. Some participants also noted that many strategic business initiatives—like new product deployment—must be put on hold during a multi-year implementation project, increasing the indirect costs of the implementation.

    As an antidote to this gloom and doom, the CIOs in attendance were confident in their strategy to overcome these obstacles. Core systems today are much more flexible than legacy systems, relying on componentized architectures and configurable logic, meaning that the next round of replacements (and possibly even conversions) should be easier. More importantly, past lessons have been learned, and both insurers and vendors know how important it is to avoid custom coding and to stick with a vendor’s upgrade path. If those rules are followed, ten or fifteen years down the road the insurer’s system will be “new” even if they’ve stayed with the same vendor and system all along! It’s critical to choose a vendor who acts as a long-term partner and not just a one-time purveyor of a technology.

    Of all the strategic considerations discussed, one of the most important was a concrete plan for data conversion and sun-setting old systems. One participant shared that if he could go back in time, he’d focus much more on a transition plan, so as not to lose the project’s momentum after go-live. Other attendees described the challenges of data conversion and new data warehouses, and the legal and data integrity-related risks of fully sun-setting old components.

    Attracting and retaining good talent was another concern for many of the attendees. One insurer reported being well ahead of their Guidewire implementation schedule, due to a concerted effort to focus talented IT and other business unit resources on the project. Several attendees noted that the structure of their projects—agile, waterfall, or a combination of the two—was much less important than the staffing and communication strategies of those projects. When agile first started becoming prevalent in the insurance industry, carriers all over the country were told it might be the answer to all their project/logistical problems. But that’s not how software works. Everyone in attendance was reminded that there are rarely, if ever, silver bullets for these huge problems.

    Related Research:
    Business and Technology Trends in Workers Compensation

    An Honest Look at the State of Big Data in Insurance

    Jeff Goldberg

    With the recent publication of Novarica’s Analytics and Big Data at Insurers report, it’s time to take an honest look at the state of big data in the industry. One of the most telling–and disappointing–charts showed that of the insurers working with big data, seventy percent were using traditional computing, storage, database, and analytics technology, meaning they’re working with SQL databases in their existing environment. Of all the other technology options (Hadoop, NoSQL, columnar databases, etc.), only a small percentage of insurers use them, and almost no insurer uses them extensively.


    Compare that to the percentage of insurers who say they are using big data sources, which is significantly higher than the percentage using big data technology. This includes third-party consumer or business data, geospatial data, and weather data at the top of the list, with audio/video data, telematics, social-media content, and internet clickstreams lagging behind. But what’s really happening when those big data sources are loaded into a traditional structured database? Most likely, a few key elements are pulled from the data and the rest is ignored, allowing the so-called big data to be treated like “small data,” processed along with all the policy, claims, and customer data already being stored. For example, while a weather feed might arrive with minute-by-minute updates, most insurers are probably just pulling regional conditions along with daily temperature highs and lows, filtering it down to a subset that stores easily. While I’m not saying such reduced data doesn’t augment an insurer’s understanding of incoming claims (for example), it’s far from what we imagine when we think about how a big data analysis of the weather might impact our understanding of insurance claim patterns.
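
    That "small data" reduction can be made concrete. In the sketch below, a minute-by-minute weather feed (the feed format, regions, and readings are all hypothetical) is collapsed to one row per region per day, keeping only the daily high and low:

```python
# Sketch: reducing a high-volume weather feed to "small data" that fits a
# traditional structured database. Feed format and values are hypothetical.
from collections import defaultdict

readings = [
    # (region, timestamp, temp_f) - a tiny stand-in for a minute-by-minute feed
    ("NE", "2016-10-01T00:01", 51.2),
    ("NE", "2016-10-01T12:30", 63.8),
    ("NE", "2016-10-01T23:59", 48.0),
    ("SE", "2016-10-01T12:30", 81.5),
]

def reduce_to_daily(readings):
    daily = defaultdict(list)
    for region, ts, temp in readings:
        day = ts.split("T")[0]          # keep only the date portion
        daily[(region, day)].append(temp)
    # Everything except the daily min and max is discarded - the filtering
    # described in the paragraph above.
    return {key: (min(temps), max(temps)) for key, temps in daily.items()}

print(reduce_to_daily(readings))
```

    The intraday pattern (every reading between the high and the low) is exactly what gets thrown away, which is why this kind of pipeline, however useful, is not a big data analysis in any meaningful sense.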

    There’s no denying that there are a few exciting use cases of insurers truly leveraging big data, but it’s still extremely limited. The industry as a whole is getting much better at data analysis and predictive modeling in general, where the business benefits are very clear. But the use cases for true big data analysis are still ambiguous at best. Part of the allure of big data is that insurers will be able to discover new trends and new risk patterns when those sources are fully leveraged, but “new discoveries” is another way of saying “we’re not yet sure what those discoveries will be.” And for many insurers, that’s not a compelling enough rationale for investing in an unfamiliar area.

    And that investment is the second problem. The biggest insurers may have the budget to hire and train IT staff to work on building out a Hadoop cluster or set up several MongoDB servers, but small to mid-size insurers are already stretched to their limits. Even insurers who dedicate a portion of IT budgets to innovation and exploration are focusing on more reliable areas.

    What this means is that insurers–no matter how many surveys show they anticipate its adoption–will likely not significantly increase their use of big data technology. However, that doesn’t mean the industry will let big data pass it by. Instead, much of the technology innovation around big data will need to come from the vendor community.

    We’re already seeing a growing number of third-party vendors that provide tools and tech to do better analysis and get deeper understanding from big data, a second generation of big data startups. Most of these vendors, however, expect that the insurer will already have a Hadoop cluster or big data servers in place, and (as we know) that’s exceedingly rare. Instead, vendors need to start thinking about offering insurers a “big data in a box” approach. This could mean SaaS options that host big data in the cloud, appliances that offer both the analysis and the infrastructure, or even just a mix of professional services and software sales to build and manage the insurer’s big data environment on which the licensed software will run.

    We’ll also begin to see insurance core system vendors begin to incorporate these technologies into their own offerings. The same thing has happened for traditional data analytics, with many top policy admin vendors acquiring or integrating with business intelligence and analysis tools. Eventually they’ll take a similar approach to big data.

    And finally, some third-party vendors will move the entire big data process outside of the insurers entirely, instead selling them access to the results of the analysis. We’re already seeing vendors like Verisk and LexisNexis utilize their cross-industry databases to take on more and more of the task of risk and pricing assessment. Lookups like driver ratings, property risk, and experience-based underwriting scores will become as common as credit checks and vehicle history. These third-party players will be in a better position to gather and augment their existing business with big data sources, leveraging industry-wide information and not just a single book of business. This means that smaller insurers can skip building out their own big data approach and instead get the benefits without the technology investment, and they can compete against bigger players even if their own store of data is relatively limited.

    So while the numbers in Novarica’s latest survey may look low and the year-on-year trend may show slow growth, that doesn’t mean big data won’t transform the insurance industry. It just means that transformative change will come from many different directions.

    On Tuesday, July 14th at 2 pm I’ll be hosting a Business Intelligence and Analytics webinar, which will review the recent report, go into more detail on big data, and cover how insurance is being transformed by the growth in available data and information both inside and outside the enterprise. For more information, visit:

    Related Report

    Top Technology Priorities for Personal Lines Carriers

    Jeff Goldberg

    Today’s personal lines marketplace is more competitive than ever due to slow growth, intense price competition, and rising customer acquisition costs.

    Personal lines insurers have always been leaders in insurance technology innovation, and conversations with CIOs and research in the space show that trend will continue, with technology playing an ever-larger role in insurers’ ability to attract, retain, and profitably serve clients. Across the industry, insurers continue to make investments across the Novarica Insurance Core Systems Map.

    Novarica Insurance Core Systems Map: Personal Lines


    In a market with very competitive conditions and intense profitability pressures, personal lines carriers are focusing on growth strategies, expense reduction, and improving underwriting results. Below I have listed four technology priorities CIOs and business executives should consider to remain competitive.

    Business Intelligence
    A data quality initiative, which examines data warehousing, operational data stores, and appropriate data marts, is key before undertaking more advanced business analytics initiatives. Once data quality is ensured, carriers can then overlay business intelligence tools. Predictive analytics tools are becoming more popular among carriers with sufficient data. Small carriers should look at working with an organization that can provide pooled data and insights. All carriers can use models to improve underwriting insights, to apply pricing more consistently, and to improve claims activities. In addition, third-party big data sources are going to become more and more prevalent for personal lines insurers. Companies that take advantage of these first will have an edge in pricing and retaining business.

    Policy Administration Systems
    Upgrades to policy admin systems will help carriers gain operational efficiencies and flexibility in adding data. Using business rules to manage workflow and predictive analytics to build pricing models can improve risk selection and risk pricing and reduce operating expenses. Carriers should look for highly configurable solutions with product configurators, simple rules, and tools for launching new rating algorithms. They should also look for the ability for business units to make their own modifications, though practical experience with configurable systems reveals that IT often still ends up managing most changes. As long as the time and cost of such work for IT is reduced, that’s still a big value.
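
    The configurability described here can be illustrated with a toy rules table: the rating logic lives in data rather than code, so in principle a business unit could adjust it without a release. Everything in this sketch is hypothetical, including the base rate, rule names, and factors.

```python
# Sketch: a rating algorithm driven by a configurable rules table rather
# than hard-coded logic. Base rate, rule names, and factors are hypothetical.

BASE_RATE = 500.0

# Each rule: (description, predicate on the risk, premium multiplier).
# In a configurable policy admin system, a table like this would live in
# configuration, editable without a code release.
RATING_RULES = [
    ("young driver surcharge", lambda r: r["driver_age"] < 25, 1.40),
    ("multi-policy discount",  lambda r: r["has_home_policy"], 0.90),
    ("prior claims surcharge", lambda r: r["claims_3yr"] > 0,  1.25),
]

def rate(risk: dict) -> float:
    """Apply every matching rule's multiplier to the base rate."""
    premium = BASE_RATE
    for _desc, applies, factor in RATING_RULES:
        if applies(risk):
            premium *= factor
    return round(premium, 2)

quote = rate({"driver_age": 23, "has_home_policy": True, "claims_3yr": 0})
print(quote)  # 500 * 1.40 * 0.90 = 630.0
```

    Launching a new rating algorithm then means appending a row to the table, which is the kind of change a product configurator can safely expose to the business, while anything structural still falls to IT.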

    Agent Connectivity
    Extending functionality to the agents continues to rise in importance. It’s less and less about differentiation and more and more simply the price to pay to be in the game. At this point in time, most personal lines insurers have built an agent portal, and are often quite proud of the results. As a next step, both agents and carriers would prefer to receive and provide information electronically and process that information with as little human touch as possible, eliminating double entry. Real-time upload, download, and data translation deliver tangible benefits including reduced costs of handling, improved data quality, and improved turnaround time.

    Claims Management
    Streamlining claims management by automating processes improves customer service by speeding up claims service, providing consistent and fair best practices to all customers, and delivering personal service. On top of these customer benefits, insurers who have implemented modern claims systems report tangible speed-to-market benefits. If a carrier hasn’t already begun to upgrade their claims administration system, now is the time to start. Carriers who are using modern systems are rapidly gaining competitive advantages by improved efficiencies in claims handling and improved data leading to better outcome management. In addition, better claims processing has become a significant part of how personal lines insurers market themselves to consumers and how consumers select an insurer.

    These technology priorities are based on the expertise of Novarica’s staff, conversations with members of the Novarica Insurance Technology Research Council, and a review of secondary published resources. For more information, download a free preview of our new Business and Technology Trends: Personal Lines report at: or email me to set up a complimentary 30-minute consultation.

    Related Reports