The Parallel Paths of Data Strategy

Jeff Goldberg

Data strategy at an organization follows multiple paths:

  • Data Governance and Definitions
  • Data Process, Quality, and Augmentation
  • Data Warehousing
  • Reports, Dashboards, and BI
  • Predictive Analytics
  • Big Data and External Data Sources

When I talk to insurers about their data strategy, I like to assess how far along they’ve progressed on the above paths. The exact breakdown (or naming) of these data strategy paths can vary from company to company depending on priorities and opinion. But the key is that they all matter and, just as importantly, that they happen in parallel rather than in series.

There are two problems I find when diving into the strategy details. Either (a) some of the critical paths of data strategy have been ignored or, the opposite issue, (b) some of the incomplete paths are being treated like roadblocks to other progress.

The first problem is pretty easy to understand. If an insurer focuses on just data warehousing and reporting (one of the most common scenarios), the data will never truly represent a single source of the truth, the reports and other BI will always be in contention, and much greater value will be left on the table. For another example, if an insurer puts all its effort into predictive modeling, those models will never be as deep or precise as they could be with better data for analysis. It's no surprise, though, that this kind of uneven approach happens all the time; a balanced data strategy is difficult, and few insurers have the resources or skill in all areas. The different paths require different kinds of technological expertise, and some require political will as well.

The second problem, on the other hand, requires rethinking how these different data strategy paths interact. Up above I've lined them up in what seems like a natural order: first you need to have some kind of governance group that agrees on what the data means, then you need to have a process to clean and flow the data through the different systems, then you aggregate the data into a warehouse, then you report on it, then you analyze it and build predictive models, and only then do you think about bringing big data into the picture. It makes logical sense. But it's also wrong.

The reality is that an insurer can work on all of those paths in any order and/or simultaneously. You don't need a perfect data warehouse before you start thinking about predictive modeling (in fact, there are plenty of third-party vendors who help you skip right to the predictive models by using industry data). You can run reports directly off your claims system even if its data sits in isolation. Nowhere is there more proof of this than the fact that most insurers hardly have any data governance in place but have still moved forward in other aspects of their data strategy. That doesn't mean a company should ignore the other paths (that leads to the first problem), but it does mean progress can be made in multiple areas at once.

What’s important to understand is that all these different data strategy paths enhance each other. The further an insurer is down all of them, the stronger each one will be, leading to the best decision making and data-driven actions.

So it's always good to step back and take a look at the data across an organization, assessing each of these paths individually and seeing what can be done to move each forward. A good data strategy has a plan to advance each path, but also recognizes that, whatever the current priorities, no path needs to block another.

Managing Chaos in the Reinsurance Industry

Jeff Goldberg

While insurers talk about simplifying products to appeal to consumers and maximize online channels, reinsurers are going the other way. As insurers decrease complexity for customers, they increase the complexity of their reinsurance relationships. Reinsurers are also facing severe pressure from legacy issues like mold, asbestos, and other mass torts, as well as professional liability/malpractice issues. Emerging hazards such as terrorism, climate change, pandemics, and nanotechnology create further challenges.

These difficult conditions force reinsurers to adapt and come up with creative solutions. For most, this means investing in BI, data analytics, and sophisticated modeling tools. Once some reinsurers begin to revamp their processes, the others must follow. Failure to do so is likely to result in adverse risk selection and loss of market position.

The technology solutions in place at reinsurance companies tend to lag behind other areas of the insurance industry. Millions of dollars of business are often still managed via Excel, often on individual laptops. Before they worry about next-generation capabilities, reinsurers need stable and robust systems that can support their current business.

Related Research

To Move Forward, Insurer CIOs Must Look Outward

Chuck Ruzicka

In our recent report on IT Budgets and Projects for 2016, CIOs cited their top challenges as IT operations, talent availability, and budgetary restrictions. How can CIOs address and move forward on these issues? They indicate that they are looking outward, and that they will increasingly use cloud-based services and outsourcing to address these challenges.

Larger carriers project only modest increases in outsourcing. We have noticed over the past few years that insurers are moving to embrace blended/variable staffing rather than expanding traditional Application Development and Maintenance contracts. Outsourced infrastructure management and support services are waning in popularity as more insurers move applications to the cloud. Midsize carriers indicate that they are increasing their use of outsourcing. While these arrangements may seem challenging to the uninitiated, these services have matured and are not just for pioneers.

Carriers need to find the optimal level of outsourcing for themselves. Balancing cost and capacity benefits against their own abilities to manage external providers is crucial. Insurers who require more flexible cost structures may also consider moving towards variable or blended staffing models. This transfers some risk to the services provider but can reduce administrative overhead for both parties and establish more effective strategic partnerships.

Carriers should be evaluating strategic sourcing and outsourcing services on a regular basis. Remember to look outside for best practices and lessons learned to ease the transition.


Analytics: A Route to Growth for Insurers

Steven Kaye

The idea of using analytics beyond claims and underwriting fraud detection isn't new to carriers. Progressive's growth was due in part to being able to profitably underwrite customer segments that had been dismissed by others, such as motorcycle riders. Analytics-driven products can aid carriers in pursuing growth, rather than just trading market share back and forth or raising rates. Or, as I once titled a blog post, in finding gold in a tough market.

Two factors holding back the uptake of cyber liability insurance are the difficulty of underwriting and the resulting wide variation in coverages and in pricing. The news that AIR Worldwide, Lloyd's, and RMS are agreeing on common data requirements, as covered in several outlets, may lead to convergence in underwriting, and therefore fewer gaps in coverage and less variation in pricing. The standardization of policies in turn may lead to increased uptake – the demand is certainly there already.


For Safe Driving, the Future is Arriving

Jeff Goldberg

Access to cheap, connected sensors and smart devices means companies can leverage data in innovative and meaningful ways. This new use of the Internet of Things will lead to changes in how and what insurance products are sold. In the automotive industry, services analyzing telematics data to recommend safer driving routes and encourage better driving behavior will become the norm. Liberty Mutual recently debuted RightTrack, an in-vehicle mobile application developed in collaboration with Subaru. The application provides real-time feedback on driver performance in order to educate and correct, and as a result make insureds safer.

In the future, consumers will buy insurance from insurers not just for the indemnification of risk but for additional risk mitigation services. Risk will never go away completely, but in different areas it will be greatly reduced (with automotive being the most obvious example). Claims and corresponding losses will go down, and rates will need to follow. This shift from indemnification to mitigation is a move away from how insurers have done business for hundreds of years, but it's also in the best interest of the insured and a great opportunity to provide more and better service.

Restrictions Lead to Consolidation in Insurance: Who Would Have Guessed?

Mitch Wein

Insurance Journal reported recently that over a dozen state attorneys general have joined the federal government’s review of two mergers that would reduce the number of national health insurers from five to three.

Remember about five years ago when the federal government mandated an 80% medical loss ratio for health insurers, thus capping their maximum profitability? It turns out that if they can't save money by reducing loss payments, health insurers will try to save money in other places. First, they'll reduce commissions. Then they'll scramble to merge and attempt to realize benefits of scale.
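The arithmetic behind that cap is simple to sketch. The figures below are hypothetical, chosen only to show how an 80% minimum medical loss ratio bounds what's left for administration and profit:

```python
# Illustrative arithmetic for a mandated 80% minimum medical loss ratio (MLR).
# All dollar figures are hypothetical.

premium = 100_000_000   # annual premium collected
min_mlr = 0.80          # mandated minimum share of premium paid out in claims

min_claims_spend = premium * min_mlr            # at least this much goes to medical claims
max_admin_and_profit = premium - min_claims_spend  # everything else is capped at 20%

print(min_claims_spend)       # 80000000.0
print(max_admin_and_profit)   # 20000000.0
```

With the loss side of the ledger fixed by regulation, the only levers left are the expense side (commissions, administration) and scale, which is exactly the behavior described above.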

It’s hard to stimulate aggressive competition in a market with firmly limited upside. Given the number of states that are busily outlawing pricing optimization and the use of various underwriting factors, property/casualty insurers should keep an eye on how things play out in health. As Matt said five years ago, restrictions accepted by one segment of the industry may well spread to others.

Insurance Tech Trends 2016: SaaS & Mobile Go Mainstream, Big Data Tech Grows (for “Medium Data” challenges)

Matthew Josefowicz

We recently published our 3rd annual look at “Hot Topics” in insurance IT: social, mobile, analytics, cloud, big data, and digital.
I’ll be presenting this study in a webinar next week on January 20th. You can pre-register here to attend.

These six areas share two main characteristics:

  1. They enable potentially disruptive changes in one or more areas of the insurance value chain or traditional company operating models
  2. They are still discussed more than they are embraced or understood

Other than that, they have little in common, despite the fact that they are often lumped together. They cross consumer media and communications (Social), technology platforms (Mobile and Cloud), tools (Analytics and some parts of Big Data), and strategy (Digital). While they are sometimes called "emerging" technology areas, many of them are fairly well-established, at least in other industries.

As the chart below shows, there has been significant growth in several of these areas since our first study in 2014.

Of particular note is the growth in use of Big Data technologies. But rather than using these technologies to analyze external big data sets, many insurers are leveraging their capabilities to analyze their own enterprise data, or what is now being called “medium data.”

The report paints a clear picture of the emergence of haves and have-nots in usage of modern technologies. These areas are rapidly losing their specialness and becoming mainstream. But a significant number of insurers have yet to apply them to their businesses.


Consider the Future when Evaluating Today’s Insurance Systems

Chuck Ruzicka

Over the past few years, I’ve worked with many insurers on core policy administration vendor selection, and I’m excited to have joined a team with so much existing experience in this area.

While most selection processes focus heavily on features, configuration tools, architecture and implementation methodology, it is equally important to understand the vendor’s upgrade process and history. Since business needs and technology continue to change rapidly, a key selection factor for a vendor partner is their ability to evolve to support tomorrow’s needs, and not just today’s.

One predictor of the future is the vendor’s record for helping customers migrate to newer versions of their software. The decision to upgrade can be impacted by factors outside the control of the solution provider such as:

  • Business priorities like entering new states, deploying new products or an acquisition
  • Budget constraints due to poor underwriting results
  • Complexity of multi-year implementation of business transformation projects

Regardless of these external factors, upgrades are most likely to depend on the vendor itself. In Novarica's most recent Property/Casualty PAS Market Navigator report, one key difference that emerged between solution providers was the percentage of current clients on versions of systems that were three years old or older. The reasons for these differences should be investigated and assessed as part of the software evaluation process. Prospective clients should consider the following:

  1. Value of roadmap items. Does the vendor’s solution roadmap suggest enough new capabilities or improvements with each upgrade to warrant implementation? Well-architected and mature systems may have much of the desired functionality and integration capabilities such that upgrading becomes less important. Even if this is the case, this difference is still relevant due to changes in infrastructure components and the risk of technological obsolescence. Ease in upgrading is more critical if a vendor is catching up or building out new competitive capabilities.
  2. Solution provider release strategy. Some solution providers incorporate all minor releases into the scope of their implementation projects. Others hold back release updates until the system is fully implemented. What is the impact of delaying the incorporation of new functionality during the implementation? Competitors who have already implemented a new system may be leveraging those new roadmap features before you complete your initial implementation.
  3. Velocity of change. Frequent and material releases will drive a higher need for related regression testing capabilities and possible infrastructure upgrades. How will your organization respond to those demands? How will your QA organization manage release (regression) testing along with functionality testing?
  4. Upgrade accelerators. Does the solution provider offer adequate guidance regarding how to configure and manage the system to facilitate future upgrades? Do they have the tools and a proven methodology to make this process easier and less painful?
  5. Upgrade barriers. Potential clients should ask longer-term clients either why they haven't upgraded versions in several years or, conversely, what it took to implement the most recent upgrade. Project team size, duration, challenges, and available assistance should be understood. While it is always useful to talk to new clients about methodology, new features, and configuration tools, existing clients should also be contacted.
  6. Impact of configuration upgrades. If solution providers have improvements to their configuration tools in their roadmaps, be sure to ask if products implemented in prior versions will automatically roll forward or if a conversion effort is required. You may be surprised by the answer.
  7. Value of more frequent upgrades in a managed service agreement. An additional consideration is whether or not you will get earlier benefits realization from a managed service agreement. SaaS installations tend to be on more current versions of the software. If your organization does not have a disciplined program for upgrading software components, you may benefit from a SaaS implementation.
  8. Selection criteria weights. Given all that you learned by considering the prior seven items, should your selection criteria weighting be altered? Does it give enough weight to differences in upgradeability?

No one wants to buy a system that will become out of date and require a large-scale replacement project in the near future. One way to avoid this is to purchase a system from a solution provider that invests in the future and has a clear and cost-effective methodology for managing software upgrades.

Please don’t hesitate to contact me at cruzicka@novarica.com to learn more about our policy administration and other core system vendor selection services.

Divisional CIOs: A Guide to Serving Two Masters

Chuck Ruzicka

Often standardization of common services or platforms does not yield improvements in functionality or capabilities in the short term. Divisional CIOs need to balance the corporate oversight desire for centralization of services and standardization with their business unit needs to improve agility and capabilities. Failing to do both can cost them their jobs. So how do successful CIOs navigate these often conflicting priorities?

  • Build business cases aligned with business unit strategies – Providing a cost benefit to justify an initiative is important. However, creating linkage to key business strategies takes it one step further. This helps functional leaders advocate the initiative within their own corporate governance structure.
  • Choose battles wisely – All CIOs need to be strategic in their sourcing strategies. The same thought applies to the service offerings available from corporate organizations. If corporate can provide competitive service at competitive prices, why not migrate to these service models? It is important to understand which items are non-negotiable. One of the worst ways to start a new relationship is to raise issues already resolved by prior management that have previously been a sore point.
  • Articulate needs in terms of service – By stating requirements in terms of service levels, the dialogue moves from a turf issue to a discussion of capabilities and needs. This goes a long way towards ensuring that the needs of a business unit are understood and met.
  • Measure services provided – Centralized service groups need objective feedback on their performance. Divisional CIOs have a fiduciary responsibility to hold service providers accountable. Anecdotal evidence is not adequate. Put the proper measures in place.
  • Understand cost allocations – Expect and demand transparency. If a CIO doesn’t know how and what a business unit is being charged, a realistic understanding of the perceived value of IT is not possible. No one likes surprises.
  • Over-communicate – Working from a plan is better than reacting to needs as they arise. The best way to engage Corporate to meet needs is by reviewing the application portfolio plan with Corporate CIOs, indicating where gaps in services exist and where the most compelling needs are from a business unit perspective. Making sure that a business unit understands the plan, the service levels, the capabilities that are currently provided and how needs are addressed is also critical. Divisional presidents can partner to drive approval of divisional initiatives.

  • When in doubt, don't do what people want you to do; do what you think is right for your business unit. Divisional CIOs must provide leadership, leveraging their intimate knowledge of their business unit's needs along with their knowledge of technologies and vendor capabilities.

To discuss this topic further, please email Chuck Ruzicka at cruzicka@novarica.com

Related Reports:

  • CIO Checklist: Multi-Divisional IT Strategy
  • Centralized and Federated IT Models
Why Aren’t Insurers Doing More With Their Data Strategy?

Jeff Goldberg

While the business intelligence space has matured greatly in the last decade, it has been and remains an area where insurers need to work with a variety of platforms and tools to build out their capabilities. This requires a mix of technical skillsets, business expertise, and vendor relationships. While few efforts at an insurer are more complex or time-consuming than a core system replacement, a major BI initiative will eventually touch all aspects of an organization. I will be presenting more on this topic in a webinar on December 1, 2015.

Today there are more vendors that provide a full business intelligence suite, which includes the data layer, the industry-specific data models, the presentation and visualization tools, and the pre-built insurance reports and dashboards. But these suites still need to be tailored to and integrated into the rest of the insurer’s environment. Some policy administration system vendors are either pre-integrating with or releasing their own business intelligence suites. This does simplify the deployment but adds another variable to the “suite vs best-of-breed” decision, and until these options have more production exposure most insurers are still opting for best-of-breed.

For now, most of these approaches don’t provide some of the ancillary but very important data and analytics areas such as predictive modeling tools and the models themselves, the use of aggregated third-party data sources, or the emerging area of big data. And no matter what approach an insurer takes, it is a near-universal condition that there will be siloes of historical data that need to be considered with or migrated into the new BI solution, and that will take time and effort.

So despite the plethora of vendor options and the general acknowledgement across the industry that good business intelligence is key to ongoing success, why aren’t more insurers further along in their data strategy?

  1. Most insurers struggle with multiple legacy systems and siloes of disparate data, and they are still at the first steps of bringing that data together.

  2. The data that does exist in the organization is of variable quality or completeness. New systems don’t immediately solve that problem without a broader plan in place.

  3. Insurers and core systems have traditionally looked at data from a policy view, complicating the effort to move to new models that tend to take a 360-degree customer view.

  4. Many insurers still have no formal data governance in place and lack organizational agreement on data definitions.

A good vendor partner can help put the framework and some tools in place to solve the above four roadblocks, but it requires more than just technology. It requires process and cultural change, which can’t be driven solely by IT.

Many insurers are still looking for a data champion to help push a strategy across the organization and get buy-in from other business leaders. As organizations mature, this data champion role is often formalized as a Chief Data Officer (CDO), and that person typically has a strong data background. But for insurers who are still looking to get a data strategy off the ground, it’s most important to find a leader who is respected in the organization and who is passionate about the value that good business intelligence can bring, even if they have little data experience themselves.

Related Reports:

  • Business and Technology Trends: Business Intelligence
  • Novarica Market Navigators: Business Intelligence Solutions