Small commercial insurance moves online…because it’s too low margin to do offline?

Matthew Josefowicz

There was an interesting article today on PC360 about the state of small business insurance online. The article echoed many of the themes and issues we raised in our report last summer, Direct Online Small Commercial Insurance.

The article had some interesting quotes from direct players like Insureon and Hiscox, both of which were featured in our report. But it also contained this quote, which raises an issue we didn’t mention last summer:

“The Hartford is committed to a multichannel distribution model in its small commercial business and independent agents are at the center of the distribution strategy,” says Ray Sprague, senior vice president of small commercial insurance at The Hartford, but adds that the vast majority of small businesses operating in the U.S. today are often too small for many independent agents to profitably acquire and serve (emphasis added).

This resonates with several comments that small commercial CIOs made at a meeting I attended last week. There's a big SOHO (small office/home office) segment of the small commercial market that's too small for most agents to care about. I believe insurers will increasingly look to the direct channel to meet this market demand.


Accelerating Pace of Change Requires New IT Planning Paradigms

Rob McIsaac

One of the realities of IT in any industry is that "truth" related to technology is a fleeting thing. The best system or technology to deploy can evolve with surprising speed, making it important for CIOs and their organizations to determine with some precision what a roadmap toward a future state should look like. Increasingly, CIOs and their teams should also carefully consider just how long they think they will be in that future state. This has implications both for the technologies to be deployed and for the financial mechanics used to pay for them. Missing either of these key points can create the IT equivalent of "The Hangover." Unfortunately, aspirin alone won't cure this one!

There are parallels in other parts of our personal and professional lives. As a frugal-minded sort, my typical approach to cars was to buy them and drive them long after the warranty and the new-car smell were gone. While shapes and sizes changed like fashion statements, until recently the essential technology remained pretty stable. Parts evolved slowly over time and had surprisingly long useful lives. As a result, parts and skills remained in pretty consistent supply. A few years ago I finished restoring a 30-year-old BMW (ok, so being frugal has its limits), and the only limiting factors were time and money. Parts and skills could be bought, because essentially the same vehicle had been in production for nearly 15 years.

Try that trick with a new car. New cars are better in every way: faster, safer, better fuel economy, less maintenance. The list is long. But the challenge is that the technology used is fleeting. A two- or three-year-old vehicle may have embedded technology that looks nothing like what is in production now. When the parts run out, there may be no clear path forward. As a friend of mine said, "I don't think I could afford the risk of owning a new one when the warranty runs out!" Relatively small parts failures could lead to catastrophic financial events. Leasing starts to sound like a pretty decent idea: about the time problems begin to set in, give the keys back and start over again. It is an appliance, not an investment.

That’s hardly unique to cars. Is anyone paying real money to fix an iPhone 4?  Of course not. They were the height of cool a few years ago and helped to change the world we live in. Now they are disposable.

Large flat-screen TVs are the same way. When a circa-2008 model expired recently, it was cheaper to get a new one (that was far better) than it was to fix the old one. Turn them and burn them when they're done.

There’s a good chance my next car will be disposable too. I will lease it, use it for a specific period of time, then replace it on or around a known date. I won’t depreciate it, won’t fix it, won’t treasure it like a friend. I will consume it and move on.

The same should be true of future core systems at insurance carriers. The systems and their vendors will evolve quickly using the “best” available technology at a moment in time. Then they will move on. Rinse and repeat will be their model.

And while carriers have built, bought, modified, and embraced systems from the 1960s to the 2000s (a surprising number of 40-50 year old systems run major workloads every night), that's a model with a foreseeable end. Anyone pining for that "state of the art" 2009 platform now? Of course not; in 2009 we would have had a challenging time describing some of the things that would be key drivers for business success five years later. That will be even more true as we think about 2019 or 2024.

Rather than acquiring and depreciating systems over a protracted lifespan, implementing with an eye toward "replacing the replacement" appears to be a more viable and effective model. This impacts skill sets, depreciation schedules, and even future-state IT discussions. It may no longer be a "buy versus build" dialogue. For the future it may be "buy versus rent."
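To make the "buy versus rent" arithmetic concrete, here is a small, purely illustrative Python sketch comparing the cumulative cash outlay of a purchased-and-maintained system with a subscription alternative. Every figure (license fee, maintenance rate, subscription price) is an invented assumption for the example, not an estimate of real costs, and a real analysis would also layer in depreciation schedules, implementation costs, and upgrade cycles.

```python
# Purely illustrative sketch of the "buy versus rent" mechanics described
# above. Every figure here is an invented assumption, not a benchmark.

def buy_cumulative_cost(years, license_fee=5_000_000, maintenance_rate=0.20):
    """Cumulative cash outlay for a purchased system: upfront license
    plus annual maintenance as a fraction of the license fee."""
    return license_fee + years * license_fee * maintenance_rate

def rent_cumulative_cost(years, annual_subscription=1_800_000):
    """Cumulative cash outlay for a subscription ("rented") system."""
    return years * annual_subscription

if __name__ == "__main__":
    for year in range(1, 11):
        buy = buy_cumulative_cost(year)
        rent = rent_cumulative_cost(year)
        cheaper = "rent" if rent < buy else "buy"
        print(f"Year {year:2d}: buy ${buy/1e6:4.1f}M  rent ${rent/1e6:4.1f}M  -> {cheaper} is cheaper so far")
```

Under these made-up numbers the subscription wins for the first several years and the purchase wins later, which is exactly the kind of crossover that planned system lifespan should drive.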

A variety of factors have now come together to make this a viable option. If email for large / complex / highly regulated companies can live in the cloud, a host of other things like policy administration, claims, distribution management and financials can too. Pun intended.

I never thought I’d lease a car either, but we’ve crossed a risk / return tipping point that makes that a pretty attractive option. Of course I will keep my ’84 Bimmer for fun and pleasure. Sure wish the A/C worked better, however …

 

“Permission Space” and Transformation Initiatives

Catherine Stagg-Macey

It was news to me, but it turns out that the world's largest Wi-Fi installation is underground in London. Literally under my feet. Transport for London (TFL) has installed free Wi-Fi in all 137 stations on the London Underground. No small feat, given that most of the infrastructure was built in Victorian times.

In a recent presentation, Matt Griffin, head of business relationship and IT strategy at TFL, made the point that this was the first IT project to have a direct impact on customers; IT had previously been kept away from the end customer. TFL is an engineering organization that prioritises anything engineering-related over IT.

Triggered by the Olympics requirement that station staff better manage access for a large number of tourists, TFL had to tackle the challenge of better intra-station communication for staff.

IT had little credibility with the business community at TFL. As Griffin put it, they had limited permission space. Clever contracting meant the burden of the financial outlay resided with the mobile network carrier. Even with this reality, TFL IT was reminded rather strongly that it would not get any additional money should the project overrun.

The project went in on time and on budget. Wi-Fi capability provided much-needed information to effectively manage the huge increase in passengers during the Olympics. Customer feedback to TFL was that Underground journeys were more pleasant than usual during the Olympics. That was certainly my experience too.

Learning from this experience, TFL is now working to leverage this digital capability to create extra capacity in the network without new rail or train investments. Needless to say, the business is rather charmed by all of this, and IT now has rather a lot of permission space to propose new investments that can support TFL.

This is a great story about permission space. As CIO, it's important to get a good measure of IT's credibility with the business community. Large change initiatives should only be undertaken at the golden end of the permission-space continuum. If you aren't there, work on the small projects that will have visibility to end users and build up the department's reputation as one that knows what it's doing.

Click here for more on Novarica’s CIO Best Practices research.

Here’s the Problem… A Process Is Only As Good As Its Weakest Link!

Rob McIsaac

Recently, I decided that I needed to update my life insurance portfolio. With a range of life events taking place, and a 10-year term policy purchased in 2004 coming to a natural end, I was poised to take quick action. Suffering from mild OCD, I actually started the process a full eight weeks before the anniversary date. Little did I know that I was dancing on a razor's edge in terms of timing. This sort of "secret shopper" experience has been frustrating, humorous, and thought-provoking all at the same time. Does it really need to be this hard? If a process is only as good as its weakest link, this one sets a new standard on the low side of the scale.

In an era of instant access and nearly instantaneous gratification, I went online to start shopping at an aggregator site. To my surprise, it was less functional than the site I recalled from 10 years ago, but it did earn me a callback from a call center agent. After going through the medical questions, we landed on the need to "draw fluids," a process that could take 2-3 weeks to complete. Given the green light, the process started. It took 3-4 weeks just to schedule the blood draw.

Thinking the process had run amok, I went to a second carrier directly. After completing their application online, I was called back within minutes for the medical questions. Because of how I answered one question, I was cautioned that I wouldn't qualify for the super-preferred rate and that the agent had no idea what the premium might be. The thrust of the conversation was that it would likely be around 50% more, but that this was just a SWAG. The agent was clearly surprised that I wanted to proceed anyway (not a great trait in sales), and we again marched into the need to draw fluids, a process that, I was assured, could take 2-3 weeks.

The third carrier was a traditional agency company that I decided to test to see if its website channel worked. Although it took several business days for someone to respond, when they did, the agent was effective and knowledgeable. She was able to share different premium scenarios and suggest which products might best fit the need. While the low-end price was higher than the direct company's super-preferred rate, the "likely" rate based on the medical questions was lower. And, of course, drawing blood would take several weeks.

Several interesting points in the process:

  • For all practical purposes, the questions the carriers asked were exactly the same. The only variations seemed to be in looking at my driving record (3 years or 5) and my parents' issues with illness (before age 60 or 65). Other than that, it was cookie cutter.
  • All three companies declined the opportunity to get medical records (fluids, EKG, chest x-ray) directly from my doctor, despite the fact that they were available as part of an annual physical done two weeks earlier. All wanted their own chance to stick me with needles.
  • All three carriers used the exact same service to do the fluid draw and on-site visit. That created some efficiency for me, since I got a "three for one" deal on the fluids and the visit time. Each carrier wanted its own EKG original, so I had to sit through that multiple times, but only partially disrobe once.
  • The direct carriers are decidedly poor at staying in touch with process updates. With them, it feels like I've fallen into an abyss. The agency carrier seems far more engaged through my touch point.
  • Across the board, the process seems broken … or at least archaic. I became a little worried about coverage gaps with the '04 policy expiring, but I shouldn't have been concerned. The old carrier indicated that it takes a calendar quarter to actually lapse the contract … whose premium would triple on the next anniversary barring action on my part.

At a time when the balance of consumers' financial lives is so readily available through self-service and guided experiences, this seems like a trip back in time to a different world. Actions are measured in weeks and quarters rather than minutes and hours. Rather than full transparency on information and pricing, the process feels both secretive and ill-informed.

The process also seems to be intentionally inefficient. When my doctor did his version of the fluid analysis, we had results in 2-3 days. The paramedic firm used by all three carriers said it would take 2-3 weeks. How could that possibly be?

Left to its natural course, this process could run 6-10 weeks in total, by my estimation. At that point, I will be presented with "take it or leave it" offers from all the carriers involved. I will, of course, have a personal choice to make at that point, armed with full disclosure and valid pricing as inputs. In the end, it will have a happy outcome for some.

This got me thinking about my own children and their Gen Y peers. They would be highly unlikely to participate in an exercise as slow and as painful as this one. Baby Boomers like me may now be closing in on purchasing their last life insurance contracts as other life events loom.

For carriers, the time to think about the required paradigm shifts is coming quickly. Those footsteps you hear are future generations of prospects, but they may be running away from you rather than toward you!

Big data, mobile capabilities, access to a form of telematics, and other devices may all prove to be game changers sooner than we think. Remember what life was like before smartphones? I don't either…

Turning Insurance Outside-In

Matthew Josefowicz

Across the great formal presentations, panel discussions, and roundtables at our 7th Annual Council Meeting this week, one theme kept jumping out at me: the need for insurance to become more demand-led in market, operational, and technology strategies.

As an industry, we have a tendency to view the world from the inside out. We need to reverse that perspective and look at our industry and our operations from the outside in. We need to start from market and operational needs as we plan product, service, and technology strategies, rather than starting from our own understanding of our capacities.

Our keynote speaker, data and analytics expert Adam Braff, hit on this theme in his opening presentation on "Cooking with Big Data," with the first of his 5 guidelines: "Figure out what people want to eat before you go shopping." Too many analytics efforts start with gathering data rather than thinking about how insights might be operationalized to drive better business results. The supply of data and analytical capability is leading in too many cases, rather than the demand for insight.

My presentation on Trends in Information Technology and Insurance focused on how changes in the ability to access, communicate, and analyze information mean that buyer and distributor expectations about speed, flexibility, and even value propositions are diverging from insurers' own understanding of the world. The supply of risk analysis and distribution is leading in too many cases, rather than the demand for coverage.

In our CIO panel, a common theme of the panelists from AFLAC, The Hartford, Great American, and New York Life Investment Management was re-orienting IT organizations to be more focused on the creation of business value. This involves educating IT staff about business needs and goals as well as educating business leaders about the implications of their requests. The supply (and cost) of technology is leading in too many cases, rather than the demand for capabilities.

This will be a massive shift for the insurance industry, but one that is necessary to undertake. Access to information, communications technology, and analytical capability is democratizing the ability to price and sell risk. Insurers (and insurer operational and IT executives) that focus on the demand for coverage and capabilities will be better positioned to meet that demand. Those that don't may soon find themselves with much less demand for what they have to offer.

The 7th annual Novarica Insurance Technology Research Council Meeting was held in Providence, RI on April 30-May 1, and was attended by more than 70 insurer CIOs and senior IT executives. A report based on the discussions at the meeting will be published shortly.


Straight Through Chicago

Rob McIsaac

Last week's LIMRA/LOMA Retirement Conference in Chicago provided an interesting overview and update on what is happening in the industry today. Jim McCool from Charles Schwab noted the importance of carriers moving to establish trust with consumers, and the need to de-clutter and simplify products and business models. He highlighted the example of Apple as a company that has taken a potentially complex space and made it elegantly simple, with a terrific user experience that inspires trust and confidence.

This was a great build on a presentation I had an opportunity to deliver at the conference on Straight Through Processing.

The reality in the United States is that 10,000 Baby Boomers are now reaching retirement age every day, something that will persist for the foreseeable future. The opportunity for carriers to prepare for this is now. Further, with low interest rates and continued cost pressure, finding ways to reduce operational expenses while improving the experience for both agents and customers is critical.

Another reality is that customer experiences are increasingly being set by companies like Apple, Facebook, Google, and Amazon. They have perfected ways to make complex things simple, easy to use, innovative, and "delightful" to customers. With expectations set there, business practices that are dependent on paper and rooted in the 1950s are increasingly arcane and inaccessible to agents and customers alike. The need to drive toward electronic applications and electronic signatures is crucial for carriers across lines of business. It is both a crucial step toward better customer experience now … and a precursor to being able to deliver on meaningful mobile capabilities.
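To make the idea concrete, here is a minimal, purely illustrative Python sketch of the core mechanic behind an electronic application with an electronic signature: the application payload is serialized, hashed, and cryptographically bound to the signer so later tampering can be detected. The field names, functions, and HMAC-based signing are assumptions for the sketch only; a production implementation would rely on an e-signature platform or PKI-based signatures, consent capture, and full audit trails.

```python
# Minimal conceptual sketch (not a compliant e-signature implementation):
# an e-application payload is serialized, hashed, and bound to a signer
# with an HMAC so later tampering can be detected.
import hashlib
import hmac
import json

def sign_application(application: dict, signer_key: bytes) -> dict:
    """Return an envelope with the application, its hash, and a signature."""
    payload = json.dumps(application, sort_keys=True).encode()
    digest = hashlib.sha256(payload).hexdigest()
    signature = hmac.new(signer_key, payload, hashlib.sha256).hexdigest()
    return {"application": application, "sha256": digest, "signature": signature}

def verify_application(envelope: dict, signer_key: bytes) -> bool:
    """Check that the application has not changed since it was signed."""
    payload = json.dumps(envelope["application"], sort_keys=True).encode()
    expected = hmac.new(signer_key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, envelope["signature"])

if __name__ == "__main__":
    app = {"applicant": "Jane Doe", "product": "10-year term", "face_amount": 500_000}
    envelope = sign_application(app, signer_key=b"demo-only-secret")
    print(verify_application(envelope, b"demo-only-secret"))   # True: untouched
    envelope["application"]["face_amount"] = 1_000_000
    print(verify_application(envelope, b"demo-only-secret"))   # False: tampered
```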

This was an opportunity to highlight findings from a recent Electronic Signatures Executive Brief we published.

When asked if there is a potential crisis due to aging in the producer community, the executive panelists at the conference's main session noted that there is. Allianz, Schwab, and Wells Fargo all acknowledged the problem and highlighted approaches they are taking to prepare for a new generation of advisors.

In some places, the agent/advisor community is actually aging faster than the general population at large. This also highlights the importance of creating better and more compelling user experiences for both producers and end clients. Simplifying business processes, allowing for the electronic execution of transactions, and "going mobile" are all key to this. Carriers will continue to need to compete for advisor "mind share," which will require experiences that are compelling to multiple generations of users at once. All of this, of course, ties back to the Hot Topics we see for insurers in the near future.

The Apple analogy continues to resonate, particularly if carriers want to remain truly relevant in a highly competitive environment.

While there are certainly complexities inherent to the life insurance, annuity, and retirement plan segments of financial services, the future is clear: STP is moving from being innovative to becoming a "cost of doing business." Hope is not a strategy, and indecision is not a winning game plan.

New Brief: Wearable Technology and Insurance

Tom Benton

Over the last two years, fitness tracking bands, smartwatches, and Google Glass have fueled the next wave of consumer electronics: wearable technology. Financial services firms and insurers are already starting to find innovative ways to use wearables. In my new brief, Wearable Technology and Insurance, I outline three key capabilities and some examples of how these enable innovative applications for insurers and financial services firms.

In some respects, "wearables" are not new – after all, the Dick Tracy comic strip introduced its iconic "wrist radio" just after World War II. What is new is that smartphone adoption and smaller, more efficient batteries are enabling new devices and applications.

I currently have two wearables on my wrist – a fitness tracking band (the Fitbit Flex that I have been wearing since June 2013) and a smartwatch (a Pebble – I was one of the 69,000 or so who backed the project on Kickstarter back in May 2012, but I started wearing it regularly earlier this year). I am seeing more and more people wearing these devices, and with the recent introduction of Android Wear, Google's extension of the Android operating system for wearable devices, we can expect 2014 to be the "year of the wearable."

As wearables gain adoption by consumers, innovative insurers will find ways to use them to engage customers. Others should consider how wearables will fit into mobile and customer communication strategies. Wearables are on the way – how can you leverage them for customer interactions? Read the brief and let me know your ideas.
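As one purely hypothetical illustration of the kind of customer interaction a wearable could enable, the short Python sketch below maps the daily step counts a fitness band might report into a wellness-program premium discount tier. The thresholds, discount levels, and program design are invented for the example and are not drawn from the brief or from any carrier's actual program.

```python
# Hypothetical illustration only: mapping the daily step counts a fitness
# band might report into a wellness-program premium discount tier. The
# thresholds and discount levels are invented for this sketch.
from statistics import mean

def wellness_discount(daily_steps):
    """Return a premium discount (0.0-0.10) based on average daily steps."""
    avg = mean(daily_steps)
    if avg >= 10_000:
        return 0.10
    if avg >= 7_500:
        return 0.05
    if avg >= 5_000:
        return 0.02
    return 0.0

if __name__ == "__main__":
    week = [8_200, 10_400, 6_900, 12_000, 9_100, 7_800, 11_300]
    print(f"Average steps: {mean(week):.0f}, discount earned: {wellness_discount(week):.0%}")
```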

Crowdsourcing Predictive Models: Who Wins?

Martina Conlon

Analysis of data, creation of predictive models, and the ability to take action based on the outcome of those models have always been at the core of the insurance industry. However, there seems to be a spike in interest in the predictive modeling space right now among our research council and clients. Carriers are realizing how effective these scoring models can be across the insurance lifecycle and want in.

From our research (http://www.novarica.com/analytics_big_data_2013/) we know that predictive models can help marketing departments with lead development, cross-selling, and campaign targeting. R&D departments can use custom or standard predictive models as part of rate making, and distribution can use predictive models to better target prospective agents and geographies. Predictive risk models can improve underwriting consistency and transparency, automate segments of the underwriting process, and ensure that the right underwriter sees the right submissions, all while driving profitability. Using predictive models for claims triage and expert adjuster assignment can have a big impact on claims severity. Insurers also use scoring models to gain insight into which claims are candidates for fraud investigation, subrogation, litigation, and settlement, as well as for more accurate and automated loss reserving.
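For readers who want a concrete picture of what such a scoring model can look like, here is a toy Python sketch of a claims-triage classifier. The features, routing threshold, and synthetic data are invented purely for illustration (a real model would be trained on a carrier's own claims history), and the sketch assumes NumPy and scikit-learn are installed.

```python
# Toy sketch of a claims-triage scoring model. Features and synthetic data
# are invented for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 2_000
X = np.column_stack([
    rng.integers(0, 30, n),        # days from date of loss to report
    rng.uniform(500, 50_000, n),   # initial reserve estimate ($)
    rng.integers(0, 2, n),         # attorney involved (0/1)
])
# Synthetic "high severity" label, loosely tied to the features above
y = (((X[:, 0] > 14) & (X[:, 1] > 20_000)) | (X[:, 2] == 1)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X_train, y_train)

# Score held-out claims; route anything above a chosen threshold to a senior adjuster
scores = model.predict_proba(X_test)[:, 1]
print(f"Holdout accuracy: {model.score(X_test, y_test):.2f}")
print(f"Claims routed for expert review: {(scores > 0.7).sum()} of {len(scores)}")
```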

Although the opportunities abound, obstacles slow down adoption of predictive models, especially for small and mid-size insurers. Potentially high cost combined with uncertain return, priority given to other projects, limited internal data volume, the lack of data scientists, and the lack of business sponsorship are among the biggest challenges. Luckily, the vendor community serving the space is active and expanding, and it is eager to help insurers overcome these obstacles. A variety of insurance-specific data warehouses, analytics tools, third-party data sets, and predictive models can be purchased. Actuarial and specialized consulting firms offer data scientists with insurance domain experience to provide the expertise that may be lacking in house. These vendors are also communicating their successes to business stakeholders, and those stakeholders are paying attention.

And today a colleague asked me, "Have you heard of Kaggle?" Kaggle is a predictive analytics solution provider for the energy industry, but it also hosts a marketplace for data science competitions across all industries, along with data science forums and job posting boards. Allstate has an open competition with them to develop a predictive model of which quote/coverages will be purchased when a prospect is presented with several options. Data scientists from across the world are working on this right now, competing for $50,000 in prize money. Allstate conducted a similar competition last year around claims, with a much smaller prize, and gained substantial benefits and insights from the submitted models, feedback, and concepts. Many other Kaggle competitions have no cash prize, just recognition within the community or a job offer.

So one may think: here's an option to make predictive modeling more accessible to smaller and mid-size insurers. But to date, crowdsourcing of predictive models has been most successful at companies that already have strong analytics practices. According to industry press, Allstate's predictive modeling team felt that the infusion of new ideas and approaches was extremely valuable and enabled them to significantly improve their models. Unfortunately, Kaggle won't likely be a silver bullet for smaller insurers. Kaggle doesn't offer a solution to many of the obstacles mentioned above. However, it does offer one more way for small companies to gain access to a predictive modeling community and skilled data science resources – which may level the playing field just a little bit.

Goodbye, Old Friend

Rob McIsaac

Well, we are at the end of a mighty 13-year run. Microsoft will be pulling the plug on Windows XP life support early next month. All indications are that this is no April Fool's joke.

All indications also are that someone would have to be fooling themselves to think that continuing to use it now would be a good idea. I have a solitary machine running the venerable OS. It refuses to run Windows 7, and so the end has come. In a few weeks the network interface will be disabled and it will revert to being a glorified, isolated word processor. The sneaker network lives on via a hacker-proof thumb drive.

Of course, I'm fortunate. I only have one machine to worry about and no dedicated apps that are tied to antique software stack components. The Windows 8.1 machine I'm now running is great and wasn't much more expensive than two years' worth of extended support from Redmond. Most insurance carriers don't have such an easy set of choices.

The migration from Windows NT to XP was slow and painful, carrying with it some notable challenges and costs. But that journey wasn't engineered in the shadow of a financial crisis with a long, lingering hangover; it simply faced normal cost headwinds and technical challenges on non-portable code. The contrast between old and new was also stark, with improvements galore, which generally excited users.

This time around, the improvements are significant but harder to see from the UI. In fact, the UI is polarizing, so it alone does not create a big push toward adoption. Perhaps worse, given the long and successful run of XP, is the sheer number of applications that run on it natively and won't transition smoothly or cheaply. Reliance on old browser versions, old software, old databases, and other incompatibilities makes it daunting to explain why migration is a good idea. It also makes the transition expensive to execute.

Good luck making all those old Access databases, for example, work in a new environment.
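As a small, hedged illustration of what even the "easy" cases involve, the Python sketch below inventories the tables in an old Access .mdb file and exports them to CSV before the XP machine hosting it is retired. It assumes a Windows host with the Microsoft Access ODBC driver and the pyodbc package installed; the file path is hypothetical, and a real migration would also have to deal with queries, forms, macros, and the applications linked to the database.

```python
# Rough sketch: export every table from a legacy Access file to CSV.
# Assumes Windows, the Access ODBC driver, and pyodbc; the path is made up.
import csv
import pyodbc

MDB_PATH = r"C:\legacy\claims_tracking.mdb"
conn = pyodbc.connect(
    r"Driver={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=" + MDB_PATH
)

for table in conn.cursor().tables(tableType="TABLE"):
    name = table.table_name
    rows = conn.cursor().execute(f"SELECT * FROM [{name}]")
    columns = [col[0] for col in rows.description]
    with open(f"{name}.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(columns)           # header row
        writer.writerows(rows.fetchall())  # data rows
    print(f"Exported {name}")

conn.close()
```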

Of course, hand wringing won't be helpful. Developing and delivering on a migration plan, in concert with key vendors, is really the only viable path forward. A range of solutions is possible, including isolating equipment and virtualizing some "legacy" applications. There won't be a silver bullet for this, however. These and a range of other security-related concerns were highlighted in a recent executive brief (IT Security Issues Update) published by my colleague Tom Benton.

One thing for CIOs and their teams to consider, if they haven't done so already, is building an educational program around this issue. Making remediation part of a broader effort to improve functionality, reduce risk, and reduce support cost over time can also help win critical organizational and executive support. Mixing in some sugar may make it easier to swallow some strong medicine. This one is worth it, since failing to address it now could lead to a much bigger problem in the not-so-distant future.

Unintended Consequences

Rob McIsaac

An article in this week's Businessweek magazine reminded me of the impact that unintended consequences can have on projects and programs throughout financial services. Regardless of how one views the Affordable Care Act, one of the clear points of debate and discussion has related to the speed and breadth of program adoption. Following the decidedly flawed rollout of the federal exchanges late last year, it seemed that the pace of "uptake" would be considerably slower than program sponsors had anticipated.

A reality now, however, is that adoption is running along at a pretty good clip. One of the drivers helping this along is professional tax preparation companies like Jackson Hewitt and H&R Block (http://www.businessweek.com/articles/2014-02-20/obamacares-arms-length-allies-h-and-r-block-and-jackson-hewitt).

How can that be?

Well, it turns out that most of the information required to apply for health insurance is included in the tax return preparation process. Once the IRS forms are completed, it takes only about six minutes to apply for insurance, which these companies offer as a service to their customers. It isn't a political judgment on their end, just action driven by the economics of providing good service to their customers. As they are quick to point out, it is also part of the tax code, and their job is to get the best returns possible for their clients.

Didn't see that one coming!

Which raises the question: how many other good things that may be unintended consequences do we miss in managing technology for insurance carriers? Once, when deploying customer service capabilities for life and annuity clients at a major carrier, we also deployed the portal into call centers to allow CSRs to see the same screens that customers could. Nice idea. The unintended consequence, however, was that we dramatically improved our business continuity capabilities.

How? By taking a variety of systems on different platforms, with different user experiences, and browser-enabling them. It was a remarkable, if accidental, addition to our technology capabilities, one that created real and measurable long-term value.

Are there other unintended consequences that can appear as technology projects unfold? Absolutely. One of the challenges that CIOs and their teams need to keep in mind is how they can foster the appropriate level of situational awareness, and openness of mind, to recognize these opportunities when they emerge.

This isn't to say that all unintended consequences are "good things," of course. There are many instances where the result of change is poorly performing systems or capabilities that fall far short of the original objectives framed during the design phase of an initiative.

The key point remains, however: not all surprises have to be portents of bad news. As new technologies emerge and business processes evolve, there may be new opportunities to realize multiplied value for carriers making the appropriate investments in new and innovative ideas. This is potentially an area for IT organizations to add significant, strategic value.