Enterprise Architecture and Digitalization

Mitch Wein

At our 8th annual Novarica Insurance Technology Research Council Meeting, discussion began with a definition of “digital” at the Enterprise Architecture and Digitization breakout session. Novarica stresses that digitalization encompasses the entire customer interaction lifecycle, from front-end to back-end. A common theme was that of silos: a given process might be automated but still require manual elements to be fully executed. An example would be creating and exposing a form through a portal but requiring the form to be mailed in by the agent or customer and scanned in by the carrier’s back office. In part, this is a function of change management and budget: an agent portal project has appeal as a quick win compared to a more exhaustive transformation.

We explored how prepared carriers were for digital given the definition above. Clearly, key emerging technology trends like IoT, social media, big data, and cloud were pushing firms to reconsider what they needed to do to remain competitive. While some firms were doing regular Strengths, Weaknesses, Opportunities, Threats (SWOT) reviews, many were not. All agreed that a SWOT analysis should be an ongoing effort.

As part of their digital transformation, many carriers were concerned about data quality (and thus resisting automation and the subsequent “exposure” it brings). They were also underinvesting in areas that will be critical to their digital success; many carriers do not have business and technology architects involved in their digital efforts, and some do not even have a business or enterprise architecture group.

We briefly touched on security and the importance of mitigating security risks to avoid the reputational, regulatory, or legal exposure that could impact a digital insurer. Existing cyber security frameworks, like the NIST framework released in 2014, can save carriers time and effort by providing metrics and a framework that regulators are familiar with. The security exposure of vendor practices and products was also cited as a key concern for carriers. We discussed requiring vendors to conduct security audits and report on the results, which can also serve as a lever in negotiations.

Finally, we talked about the need to introduce a test-and-learn culture at carriers, allowing projects to “fail fast” if they do not work well. We related this to the notion of Fast IT and how it compares with Core IT. Core IT often uses traditional waterfall methodologies and takes a long time to evolve systems. In the new digital world, Fast IT will behave differently: it will evolve solutions through Agile processes and allow solutions to be “thrown away” if they are not working out. We discussed examples like smartphone portal front-end apps for agents, brokers, or customers. All of the participating carriers felt that the Fast IT concept was new to their firms and would probably be difficult to adjust to from a process and funding perspective.

After digital, we moved on to enterprise architecture. We discussed how a reference architecture can provide consistency across projects as well as help with project prioritization. This allows carriers to be proactive rather than chasing each hot new technology and feeling constantly behind. Linking architecture governance, aligned to a reference architecture, to funding governance through logical gates was being done informally at some carriers and not at all at others.

If you’d like to discuss enterprise architecture or digitalization further, please feel free to email me to set up a complimentary 30 minute consultation.

Related Reports

Related Blogs

Upcoming Novarica Insurance Technology Research Council Meetings
The Novarica Insurance Technology Research Council is a free, moderated knowledge-sharing community of nearly 400 insurer CIO members. Members represent a cross-section of property/casualty, life/annuity, and health insurers, and range from the very small to the largest companies in the industry. Some of our upcoming 2015 events include:

Will Big Data Replace the Enterprise Data Warehouse?

Jeff Goldberg

At our 8th annual Novarica Insurance Technology Research Council Meeting, we had some great conversations about data, both as part of the formal event agenda and also via informal discussions during breaks or meals. Many of the trending or future technologies being discussed (Internet of Things, wearable devices, social media) from an insurer perspective are really just more and bigger channels of data. And just in case you think insurance data is lacking in thrills: my favorite story of the day was about auto insurers looking for fraud or stolen vehicles by integrating data feeds from the license plate scanners used by bounty hunters.

At the analytics breakout session I had the opportunity to sit around a table with CIOs from a variety of insurers of different sizes and lines. One thing that struck me was how much big data and big data technology are part of the conversation. Novarica typically urges caution when asked about big data, stressing that insurers shouldn’t go looking for a use case to fit a technology. But it seems that more insurers are finding use cases that lead them toward a big data approach, which raises questions about the future of the enterprise data warehouse.

Building a data warehouse with a single data model that supports the whole business is notoriously difficult, with a success rate that drops as the amount of data and number of lines of business get larger, until becoming a near impossibility. A big data approach allows bringing together many different data sources with different structures without having to go through the normalizing and cleansing process that often derails EDW projects. While both approaches have value, larger insurers are discovering the big data route may be their only option for cross-business analysis. And some smaller insurers see this as a way to rapidly gather and review data as well, often running side-by-side with an existing data warehouse. In fact, in at least one case, an insurer’s big data lake has become the first step in the data workflow, with a data extraction from it feeding the older data warehouse as a way to maintain legacy processes.
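To make the contrast concrete, here is a minimal Python sketch of the schema-on-read idea described above: records from two differently structured feeds land in a “lake” as-is, and structure is applied only at analysis time. The feed names, fields, and records are all hypothetical, invented purely for illustration.

```python
# Two hypothetical feeds with different structures: no shared schema.
policy_feed = [{"policy_id": "P1", "premium": 1200, "line": "auto"}]
claims_feed = [{"claim_no": "C9", "policy": "P1", "paid": 450, "notes": "hail damage"}]

# A warehouse-style load would force both into one normalized model up front.
# A lake-style load stores each record as-is, tagging only its source system.
lake = []
for rec in policy_feed:
    lake.append({"source": "policy", **rec})
for rec in claims_feed:
    lake.append({"source": "claims", **rec})

# Structure is applied at read time: aggregate claim payments per policy.
paid_by_policy = {}
for rec in lake:
    if rec["source"] == "claims":
        paid_by_policy[rec["policy"]] = paid_by_policy.get(rec["policy"], 0) + rec["paid"]

print(paid_by_policy)  # {'P1': 450}
```

The point of the sketch is that the cleansing and normalization work does not disappear; it is simply deferred to each analysis rather than blocking the initial load, which is why EDW projects stall where lake projects can start small.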

We’re currently reaching out to the Novarica Insurance Technology Research Council with a survey about analytics and big data, so we will be reporting soon on just how many insurers have taken a big data approach. Even though we expect the percentages to remain small, it’s clear the growth will continue. The tools and options around big data are increasing, and more vendors are providing services that make it easier for insurers to leverage big data technology without having to build it from scratch themselves. In time, I expect to see more insurers running a big data platform alongside an enterprise data warehouse, with the big data platform in some cases taking over as the hub of the insurer’s data.

If you’re interested in talking more about big data and data technology, please feel free to email me to set up a complimentary 30 minute consultation.


Enterprise Data Initiatives Now Taking Center Stage as Insurers Look to Improve their Digital and Analytical Capabilities

Mitch Wein

With insurance carriers of all sizes and lines of business focusing on improving their digital and analytical capabilities to meet customer expectations, the importance of enterprise data has never been greater.

Over the last three to five years, data has once again become a key focus of the insurance industry. Data is now seen as a key enabler of the industry’s evolution into a fully digitized provider of risk services focused on the customer, not the product.

When data is collected from either internal or external sources, the CIO becomes responsible for its storage, management, and use. This is highly complex in today’s world, since data is both structured and unstructured and is governed by various legal and regulatory considerations, as well as internal security and risk policies. Based on the direct experience of Novarica’s senior team and its Council members, the following best practices should be considered when initiating a master data management (MDM) initiative.

  • Create policies around data and how the data should be managed and controlled.
  • Establish organizational structure for data.
  • Link MDM architecture to business goals and objectives.
  • Consider supplementing internal data in an MDM infrastructure with enriched data and external big data.
  • Determine what system or process creates the official data of record.
  • Determine the senior level project sponsor and who pays.
  • Deploy multi-year MDM programs in an incremental fashion.
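As an illustration of the “official data of record” bullet above, here is a minimal Python sketch of a survivorship rule: when the same customer appears in multiple source systems, the golden record takes each field from the most authoritative source that has it. The source systems, priority ranking, and fields are hypothetical; real MDM tooling applies far richer matching, cleansing, and stewardship logic.

```python
# Hypothetical source systems, ranked by authority (lower = more trusted).
SOURCE_PRIORITY = {"policy_admin": 0, "claims": 1, "marketing": 2}

records = [
    {"source": "marketing",    "customer_id": "C42", "email": "old@example.com", "phone": None},
    {"source": "policy_admin", "customer_id": "C42", "email": "new@example.com", "phone": None},
    {"source": "claims",       "customer_id": "C42", "email": None, "phone": "555-0100"},
]

def golden_record(records):
    """Build the master record: each field comes from the most trusted source that populates it."""
    ordered = sorted(records, key=lambda r: SOURCE_PRIORITY[r["source"]])
    merged = {}
    for rec in ordered:
        for field, value in rec.items():
            if field == "source":
                continue
            if value is not None and field not in merged:
                merged[field] = value
    return merged

print(golden_record(records))
# {'customer_id': 'C42', 'email': 'new@example.com', 'phone': '555-0100'}
```

Even in this toy form, the sketch shows why governance is the hard part: someone has to decide, and keep deciding, which system outranks which for each field.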

MDM is more than a technology. It is a program of work involving an assessment of business needs, a data sourcing strategy, a data cleansing strategy to address quality, an architecture and integration initiative, data documentation and classification, as well as an organizational evolution for data governance and ownership. The goal is to create a single view of the master data which can then be referenced by all systems, reporting, and business processes.

Novarica’s experience has shown that the data governance and ownership dimension is often the hardest aspect of this program after identifying a business sponsor and developing a solid business case. Who manages the data, who owns the data, and who is ultimately accountable are difficult challenges that must be addressed.

If your organization is in the process of implementing an MDM program or if your efforts have stalled, please send me an email to set up a complimentary 30 minute consultation.

Recent Novarica CIO Checklist Briefs

  • Master Data Management: A CIO Checklist
  • Architectural Governance: A CIO Checklist
  • Preparing for Digital Transformation: A CIO Checklist
    The Usage-Based Insurance (UBI) Short Cut: Developing a usage-based insurance program has now gotten easier

    Thuy Osman

    I’ve been following telematics in insurance since 2012 and the main thing I’ve noticed is that the application of telematics by insurers moves at a sluggish pace. While telematics service providers continually improve their platforms and services, offering insurers multiple ways to collect data (connected car, OBD plug in, mobile app), numerous data points to analyze (speed, braking, cornering, road type, etc.), and various applications for the analyzed data (from underwriting to claims), many insurers are still stuck in the initial phase of designing and developing a UBI program.

    Aside from the issue of patents, the concern I hear most from carriers in the process of developing a UBI program is the data approach. Deciding what data to collect, how to collect it, and how to store and analyze it is a huge obstacle. The main point of the data process is to arrive at some sort of driver risk score. Instead of throwing time and resources at coming up with that score, carriers can now partner with an external analytics vendor who will collect, analyze, and synthesize the data into a risk score that underwriters can then use to determine eligibility for discounts or rewards.
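To illustrate what “synthesizing the data into a risk score” might look like in miniature (and emphatically not any vendor’s actual model), here is a naive sketch: weighted telematics event counts, normalized per 100 miles driven. The event types and weights are invented for illustration.

```python
# Illustrative weights only; a real model would be calibrated against loss data.
EVENT_WEIGHTS = {"hard_brake": 3.0, "hard_corner": 2.0, "speeding": 4.0}

def risk_score(events, miles):
    """Higher = riskier. `events` maps event type -> count observed over `miles` driven."""
    raw = sum(EVENT_WEIGHTS.get(kind, 0.0) * count for kind, count in events.items())
    return round(raw / miles * 100, 1)  # normalize per 100 miles

# A driver with 5 hard brakes and 2 speeding events over 500 miles:
print(risk_score({"hard_brake": 5, "speeding": 2}, miles=500))  # 4.6
```

Even this toy version makes the division of labor clear: the hard work the analytics vendor absorbs is not the arithmetic but collecting enough miles of data to choose defensible weights.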

    Partnering with an analytics vendor and using their program, such as the Verisk Telematics Safety Scoring program (which, it was recently announced, is now approved in 41 states), essentially allows carriers to skip over the data process and jump straight to developing insurance products based on risk scores. Sure, carriers will still have to determine which driving behaviors constitute risk to their company. But checking off behaviors on a list for the analytics provider is far more efficient than planning out the hardware to collect the data, storing the data, and, in particular, wading through pools of data to make some sense of it all.

    A few years ago at a summit for telematics in insurance, the presenters were talking about the possibility of deriving a driver risk score from the telematics data collected from a vehicle. At that time, only a small number of vendors had data sets large enough to be representative of a segment of the customer driver pool. Today, analytics vendors and telematics service providers alike have collected enough miles and other driving data to develop an algorithm (however simple or complex) to calculate a driver risk score. In fact, providing carriers with a driver score is now one of the standard services a telematics service provider (TSP) offers.

    For carriers who are looking to develop a UBI program, or even for carriers who are already entrenched in this process, partnering with an external vendor can save time and decrease the risk involved in launching such a program. As carriers see the benefit of this partnership and begin to outsource more of the data process to the data experts, maybe the application of telematics in insurance will pick up.

    If you’re interested in discussing this topic further, please contact me via email to arrange a call.

    Related Research

  • Telematics in Insurance: Key Issues and Trends 2014
    The Evolving Data Analytics Group: Applying Lessons from IT Organizations of the Past

    Jeff Goldberg

    I’ve been working with a lot of insurance companies who are struggling to find the right home for data analytics within their organization and I’m struck by the similarity these questions have to the more general evolution of IT organizations over the last two decades. Matt Josefowicz pointed this connection out in a blog post and I’d like to examine the phenomenon in more detail.

    When I call this a “struggle” I don’t mean that in a negative way. Often companies try out different approaches with an updated technology or a shift in process or–in this case–an entirely new group because they believe in its value to the business but industry best practices haven’t been worked out yet. This struggle is part of the growth and leadership process. That’s definitely the case now with data analytics and was (and is still!) the case with IT. There won’t be a general agreement about the best practices for a data analytics group within an insurance company for a long time yet (if ever) but we can shortcut some of the learning curve by looking at a similar evolution of the IT organization.

    The most common question: where does the role of data analytics fit? Should insurers create a new, separate data analytics group, or should data analytics resources report directly to their respective business units? Insurers are trying it both ways right now, and there’s no sure right answer. This is the same centralized vs. federated (or horizontal vs. vertical) debate that has gone on for decades with regard to IT resources.

    Centralizing data analytics resources in their own corporate unit (1) allows sharing of skills, tools, and best practices, (2) avoids redundancy and rework, and (3) leads to easier adoption of a corporate mission for insight and business intelligence. Federating resources out to each operating business unit (1) allows very tight alignment between business goals and the assigned data analytics expert, (2) helps that expert gain a depth of understanding about the business that they might otherwise lack, and (3) better promotes the mission of data analytics to business users.

    These drivers are similar–if not identical–to those in IT, and just like IT there are benefits to both approaches. In fact, for IT alignment, many organizations have found that the shift between a horizontal and a vertical approach is itself valuable, giving employees the opportunity to spread what they’ve learned to others, either in terms of business insight or best practices. So insurers trying different approaches to organizing data analytics can rest assured that multiple approaches all have value.

    The second similarity between data analytics now and IT organizations of the past is about recruiting talent. These days colleges offer a variety of information technology and computer science degrees, creating a pipeline of potential employees. But that wasn’t always the case, and insurance companies (and companies in other industries) had to staff their IT departments by hiring out of other engineering programs or find people who had a technical aptitude and train them in computer programming.

    With data analytics, there’s a similar lack of clarity about who to hire. Some insurers are recruiting PhDs and creating teams of data scientists, others are looking internally for technical staff who have a knack for data insight and exploration and can be cross-trained. But as demand grows, more universities will offer data analysis coursework at an undergraduate or masters level, increasing the availability of trained hires. Of course, just like IT, insurers will be competing against specialized companies to recruit those graduates, and will need to figure out how to attract them to our industry.

    If you’ve been struggling with the role of data analytics within your organization or are interested in benchmarking your company’s BI approach against your peers, please feel free to reach out to me. To send me a note or set up a complimentary 1 hour consultation, contact me via email.

    Mitigation versus Prevention: Three Questions Insurers Must Answer in Order to Evolve in an IoT and Big Data World

    Jeff Goldberg

    The insurance industry is only just beginning to deal with a host of emerging technologies, including Big Data, the Internet of Things (IoT), and a vast wealth of sensor data such as automotive telematics, smart watches, and fitness trackers. In 2015 it’s likely that the industry as a whole will take some small steps (and maybe a few big ones) towards capturing this plethora of data and applying it towards better underwriting decisions and risk management. But there’s more at stake than just incremental improvements to the existing business model.

    There are three key questions that need to be answered in order to move towards fundamental change:

    1. Will insurers be able to develop the technology (Big Data or otherwise) to capture and use this growing amount of customer data?
    2. Even if insurers can build this technology, will customers be willing to share their growing streams of personal data with insurers?
    3. How can insurers use this customer data to do more than just update their current process?

    For the most part, the industry has been fixated on the first question. Insurers have been experimenting with Big Data technologies and looking into hiring Hadoop experts, and while there are a few examples of big impact projects, the majority of insurers haven’t made much headway. Doing more with traditional data technologies has been sufficient, partially because–despite all the hype–the bulk of the business is still dealing with “small data.” Everyone knows about how some auto insurers gather vehicle driving data from participating customers, but outside of personal auto examples are rare.

    But what happens when all new vehicles in the marketplace capture driving data, not just those where customers have plugged in insurer-provided dongles? And what happens when a maturing IoT means that all customer homes are gathering data about safety and security? And what about when a significant percentage of people are wearing devices that monitor their health and well being?

    This leads us to the second question: Will insurers even be able to get access to this data?

    Right now many consumers seem to be willing to share personal data with a host of third party companies in return for convenience and cool features. But will consumers be as open to sharing this data with their insurance company? Many fear that their data will be used against them, resulting in higher rates or cancelled policies. That’s a hurdle the industry needs to overcome.

    In terms of true goal alignment, the insurance industry is actually much better positioned than many of the tech companies that currently control so much of the personal data flow. The main goal of these tech companies is to use personal data to monetize the consumer. Companies like Google and many other less-trustworthy third parties want access to the customer’s data in order to properly position advertisements. Companies like Apple and high-tech device makers want access to the customer’s data in order to sell them ever more gadgets. But insurance companies want access to the customer’s data to manage the customer’s risk, not to advertise or continually sell more to them.

    But how does an insurer convince a consumer that they will use their data to help them rather than to sell or advertise to them? With automotive telematics we’ve learned that the first step is monetary: provide discounts. But the second step is making use of that data for more than just underwriting and risk management.

    This brings us to the third question: How can insurers use this customer data to do more than just update their current process?

    As the amount of real-time customer data expands and an insurer’s ability to process and understand that data grows, insurers will find themselves able to make informed insights about a customer earlier in the process. Instead of responding to a burst pipe after a winter freeze, an insurer monitoring home sensors will be able to alert a customer before weather damage occurs, saving both the customer and the insurer time and money. Instead of paying claims after an auto accident, an insurer will be able to make recommendations to a customer about patterns that will help them avoid accidents altogether. A workers comp insurer tracking RFID badges will be able to help reduce worksite mishaps and liability.
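The burst-pipe example above can be sketched as a simple rule over incoming sensor readings. The sensor names and the temperature threshold here are hypothetical, and a production system would combine many more signals, but the shape of the prevention logic is the same: act on the data before the loss occurs.

```python
# Illustrative "alert before the loss" rule; threshold is hypothetical.
FREEZE_RISK_F = 38  # warn while pipes are still intact, not after they burst

def check_readings(readings):
    """readings: list of (sensor_id, temp_f) tuples. Returns alerts to push to the customer."""
    alerts = []
    for sensor_id, temp_f in readings:
        if temp_f <= FREEZE_RISK_F:
            alerts.append(f"{sensor_id}: {temp_f}F - freeze risk, check heating and pipes")
    return alerts

print(check_readings([("basement", 36), ("kitchen", 68)]))
# ['basement: 36F - freeze risk, check heating and pipes']
```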

    Insurers will be in the unique position of looking out for the customer’s well being, helping to prevent accidents, theft, and loss. Unlike most industries, the insurer wants exactly what the consumer wants, and both are happier when claims never need to be filed.

    While some insurance companies might lead this charge, it’s unlikely that all insurers will develop the technology to capture and analyze the full range of data. Instead, third party companies will probably emerge that serve as consumer data hubs and begin to monitor and make suggestions to their clients. Insurers who don’t build this technology themselves will have the opportunity to partner with these third party companies, offering discounts and possibly covering the cost of the services for their customers.

    The new year is a good time for thinking about longer term change. As multiple technologies converge, insurers should be planning for their role to evolve, migrating from one of just risk mitigation to include risk prevention as well.

    As always I welcome your feedback. To send me a note or set up a complimentary 1 hour consultation, contact me via email.

    Related Reports

  • Big Data Technologies for Insurers
  • Big Data Lessons for Insurers from Other Industries (Executive Brief)
  • Insurer Digital Capabilities and Strategies
  • Wearable Technologies and Insurance
  • Benchmarking the “New Normal”: 50 Advanced Capabilities for P&C Insurers
  • Telematics in Insurance: Key Issues and Trends in 2014
    Document Management and ECM – New Novarica Market Navigator Report

    Tom Benton

    Insurers are showing increasing interest in improving workflow and customer experience.  This often includes providing multiple communication channels, such as mobile texting, social media and video, along with traditional paper and e-mail.  The growing amount of unstructured data from these communications brings challenges for management, storage, workflow and distribution along with leveraging the data for analytics and reporting.

    Insurers are finding that legacy document management systems are not able to meet demands for customer experience and workflow initiatives.  Many find that replacement is necessary, and that current document management / ECM (Enterprise Content Management) systems have capabilities that are difficult to add to legacy systems. Updating can also provide opportunities for improved process flow along with new deployment options such as SaaS or hosted ECM solutions.

    Novarica has published an updated Market Navigator on Document Management and ECM Systems, available now.  This report presents an overview of the current solution provider marketplace to assist insurers in drawing up their shortlists of potential providers based on vendor market position and offering details.


    What Insurers Should Really Feel Bad About

    Martina Conlon

    We work with lots of insurer IT departments that seem to feel bad about quite a few things. They are embarrassed that they have four batch policy systems in production, or that they don’t use fraud analytics, or that they order replacement parts for a few of their servers on eBay. The most common one I hear lately is that they don’t have a big data program, an understanding of big data, or a business community that would know what to do with it.

    Well, stop feeling bad – this is far more common in our industry than you think. There are about 600 P&C insurers, and many of them share these same or similar issues – even some of the top 20 players.

    What you should feel bad about is if you have not outlined the technology vision for your company to advance beyond these limitations and address business and IT needs in the longer term. Whether you have a transformation project in progress, on deck or already complete, it is critical to outline the to-be state of your technology and how you plan to get there. Work with the business to define a technology strategy that is driven by business goals and IT guiding principles. Document it, even on one page, and communicate it throughout the organization. A shared understanding of the to-be technology state will help govern technology projects over time, and will help develop better IT/business alignment.

    And if you have already defined your technology vision, even at the highest level, feel good.

    Research Council: Big Data’s Big Challenges

    Martina Conlon

    At last week’s Research Council meeting, I led a spirited discussion on big data and analytics in insurance. As in every big data discussion, we spent a healthy amount of time on the definition of big data; as a group of technology professionals, we obsessed over it, with a number of participants offering variations. However, volume, velocity, and variety were recurring terms in most of the definitions.

    We advised the group to think more broadly, to consider big data as the opportunity to leverage existing and new data in considerable volumes without the technological barriers that existed just a few years ago. We reviewed Novarica research on the adoption of big data and predictive analytics, and had an active discussion on what the attendees were implementing or planning in the space. Several carriers are focusing on telematics, working with partners in the planning or pilot phase. Quite a few are using web activity and click streams for customer service and marketing support, while others are assembling holistic customer views or pairing prospects to products by using matching models and historical underwriting data. One insurer shared their experience in piloting big data technology for management and analysis of system log files. Most smaller and midsize carrier attendees are still focusing on core system replacements and have yet to invest in big data.

    Regardless of the state of big data within participants’ organizations, most agreed that building the business case will continue to be a big challenge until the market leaders effectively communicate the benefits and make the initiatives table stakes rather than an optional R&D effort.

    HBR on Big Data and Data-Driven Leadership v. HIPPOs

    Matthew Josefowicz

    If you’re thinking about how Big Data and Analytics will impact your organization, the single biggest issue is whether your company’s leadership will embrace data-driven decision-making to drive change.

    As we put it in our recent Big Data and Analytics Report:

    Follow business need. While technologists and data specialists can get excited about the potential value of integrating new data sources and analytics capabilities, analytics initiatives that are not requested by the business will fail because the business will not take advantage of them. This was the ignominious end of many data warehousing projects in the late 1990s and early 2000s, and the scars are still fresh for many companies. The first step for insurance data and analytics specialists (beyond R&D initiatives like pilot programs or proof-of-concept experiments) must be to encourage business executives to think about the potential value of the outcome, and gauge the willingness of senior leaders to drive change.

    Today I came across a great blog post at Harvard Business Review by Andrew McAfee and Erik Brynjolfsson from MIT on the necessary changes to management culture that are required to thrive in a Big Data world.

    It contrasts data-driven leadership with traditional models of following the HIPPO (Highest Paid Person’s Opinion). I highly recommend this post, especially the discussion that starts on the bottom of page 3 and continues through the end. http://hbr.org/2012/10/big-data-the-management-revolution/ar/1

    If you’re unfamiliar with Brynjolfsson and McAfee’s work, I also recommend their recent book, Race Against the Machine.