Evolution in London Market Vendor Landscape

Catherine Stagg-Macey

The 325-year-old Lloyd’s market makes changes slowly. The complexity of the risk, the uniqueness of the marketplace, and the importance of relationships all factor into the pace of change.

However, one area of noticeable change in the last three years is the core solution offerings to Lloyd’s. Traditionally, a small number of vendors serving the local insurers, syndicates, and managing agents developed software for each client. These vendors were small, with management and development staff drawn from the Square Mile.

There was significant re-use of software across clients, but what was offered was some way off from being productized. As a result, the clients (insurers, syndicates, managing agents) faced an ongoing services bill to maintain what was essentially a bespoke core systems implementation. This had the advantage of being finely tuned to each client’s requirements, but at a cost.

The Lloyd’s vendor market was known mostly for over-promising and under-delivering. CIOs would make comments like “our vendor is the best of a bad lot.” The sector is notorious for difficult and painful implementations, and with only one or two new deals a year for vendors, it was a hard market in which to sustain continued investment in the solutions.

For years, the idiosyncrasies of the Lloyd’s market (messaging into the Bureau, peculiarities of business process, and the complexity of the risk) were significant barriers to entry for other vendors.

Then two trends converged to create a more interesting target sector for the non-traditional London market vendors looking for continued growth:

  1. The mainstream software market matured significantly, and insurers were able to partner with vendors with sophisticated partner programs and strong delivery records to take on some serious legacy system challenges.
  2. Lloyd’s/London market insurers started to expand globally and were open to looking at more mainstream solutions for their non-Lloyd’s business. Several made investments in Europe or the US for their regional businesses and successfully implemented new underwriting and claims solutions.

The result is that several of the big names in the core P&C systems market now count global Lloyd’s or London market insurers as clients, albeit for underwriting on their non-Lloyd’s books of business. Implementations are mostly in the US or in Europe.

The claims area is a different picture. Several vendors have made the necessary investment (with thanks to their charter clients) to be London market compliant and to support claims in the Lloyd’s market. This mostly involves modifying messaging to comply with ECF, the London market message standard.

So the next uncharted territory is for a mainstream vendor to partner with a London market insurer and invest in the localization requirements. Our view is that this is probably 18-24 months away.

It’s a good time to be an insurer in the London market: there is increasing choice from established, well-funded vendors with better implementation track records and real product management experience. It’s a challenging time for the incumbent traditional London market software players, as competition is going to hot up with these new market entrants. Time to buy ringside tickets.

New Reports: Policy Administration System Project Averages and Metrics

Tom Benton

This week, Novarica published reports on PAS project metrics and averages for L/H/A insurers and for P&C insurers, based on a recent survey of 33 P&C insurers and 11 L/H/A insurers that have completed projects in the last ten years or are currently implementing one. The survey included questions on project scope, timelines, resources, costs, and impacts – the key areas that carriers focus on when considering a PAS implementation.

Among the findings, the survey showed that a significant number of midsize P&C and L/H/A insurers are opting for SaaS or hosted PAS solutions. Also, compared with our 2012 study, large P&C insurer deployments are taking somewhat longer to complete, while midsize deployments are taking less time. For L/H/A carriers, most deployments were completed in less than three years.

Core systems replacement is a high priority for insurers, and the majority of insurers have either just completed a replacement, are in the midst of one, or are evaluating one. These reports should help those in the latter two categories set expectations for the level of effort and cost involved in these key projects.

See the reports for more information.


Business and Technology Trends: Individual Life

Rob McIsaac

This week, Novarica released the most recent of our Business and Technology Trends reports, focused on the Individual Life Insurance market segment. The report is available for immediate download from Novarica’s research library or directly via the link http://www.novarica.com/b_and_t_trends_individual_life_2014/

Five years after the official end of the Great Recession, this segment of the insurance industry continues to face significant challenges. Low interest rates and heightened competition continue to put economic pressure on companies in the sector. Slowing sales in 2013 exacerbated that pressure, even as demographic changes in the market highlight future trends that carriers can ill afford to ignore.

There are, however, opportunities ahead as carriers prepare concurrently for both the maturing of the Baby Boomer cohort and the rise of Millennials. Baby Boomers are now reaching retirement age at a pace of 10,000 individuals per day, which accelerates a range of changes in their financial needs. Millennials represent a larger, more diverse, and more technically astute generation that has seen relatively low penetration by traditional life carriers. This is a discerning demographic whose expectations for service and information transparency have been heavily influenced by experiences with companies like Google, Apple, Facebook, and Amazon. To take advantage of these opportunities, carriers should be carefully rethinking operational processes and the underlying technology investments that enable them.

This report explores the range of business trends, including changes in the regulatory environment, that are framing the market for Individual Life products. It also highlights the technology trends, with illustrative examples, behind the investment decisions that carriers are now making in preparation for the future.

This report confirms what has been seen in other Novarica research: in the current business climate, CIOs and their teams are being asked to do more without much more funding. This report can be used both to confirm existing priorities and to refine future investment plans.

Turning Insurance Outside-In

Matthew Josefowicz

Across the great formal presentations, panel discussions, and roundtables at our 7th Annual Council Meeting this week, one theme kept jumping out for me: the need for insurance to become more demand-led in market, operational, and technology strategies.

As an industry, we have a tendency to view the world from the inside out. We need to reverse that perspective and look at our industry and our operations from the outside in. We need to start from market and operational needs as we plan product, service, and technology strategies, rather than starting from our own understanding of our capacities.

Our keynote speaker, data and analytics expert Adam Braff, hit on this theme in his opening presentation on “Cooking with Big Data,” with the first of his 5 guidelines: “Figure out what people want to eat before you go shopping.” Too many analytics efforts start with gathering data rather than thinking about how insights might be operationalized to drive better business results. The supply of data and analytical capability is leading in too many cases, rather than the demand for insight.

My presentation on Trends in Information Technology and Insurance focused on how changes in the ability to access, communicate, and analyze information mean that buyer and distributor expectations about speed, flexibility, and even value propositions are diverging from insurers’ own understanding of the world. The supply of risk analysis and distribution is leading in too many cases, rather than the demand for coverage.

In our CIO panel, a common theme of the panelists from AFLAC, The Hartford, Great American, and New York Life Investment Management was re-orienting IT organizations to be more focused on the creation of business value. This involves educating IT staff about business needs and goals as well as educating business leaders about the implications of their requests. The supply (and cost) of technology is leading in too many cases, rather than the demand for capabilities.

This will be a massive shift for the insurance industry, but one that is necessary to undertake. Access to information, communications technology, and analytical capability is democratizing the ability to price and sell risk. Insurers (and insurer operational and IT executives) that focus on the demand for coverage and capabilities will be better positioned to meet that demand. Those that don’t may soon find themselves with much less demand for what they have to offer.

The 7th annual Novarica Insurance Technology Research Council Meeting was held in Providence, RI on April 30-May 1, and was attended by more than 70 insurer CIOs and senior IT executives. A report based on the discussions at the meeting will be published shortly.

Other recent Novarica reports address this theme as well.


Commercial/Specialty Underwriting Automation: Cui Bono?

Matthew Josefowicz

Good article recently in I&T on Commercial Insurers and Underwriting Automation, covering some recent studies by various industry analysts. Here’s a quote:

“Complex risks are still much more hand-underwritten and will be for the foreseeable future,” says Matt Josefowicz, managing director at Novarica. “It’s all about empowering those underwriters with more communications tools and more data. A lot of the tech investment for underwriting in the specialty and large commercial side involves bringing all the information needed to make decisions to the underwriter’s fingertips as quickly as possible.”

Complex-risk underwriters present a challenge when implementing new technologies, Josefowicz explains. “The individual underwriting desks have a lot of political power,” he says. When dealing with high-value cases, these experts have a great deal of specialized knowledge and tend to call the shots on which technologies they want to use.

One of the main questions in automating commercial and specialty underwriting is cui bono? – “to whose benefit?” As we discussed in our report on Centralized and Federated IT Models, it’s hard to drive IT strategy centrally when the political power in an organization is federated. Commercial and specialty CIOs need to work closely with their business leaders to make sure they are addressing their key data and technology issues. If the P&L heads can’t be convinced of the local value of an IT initiative, appeals to a weak central power are rarely successful.

For more on business and technology trends in Specialty Lines, see our recent report.

Straight Through Chicago

Rob McIsaac

Last week’s LIMRA/LOMA Retirement Conference in Chicago provided an interesting overview of and update on what is happening in the industry today. Jim McCool from Charles Schwab noted the importance of carriers moving to establish trust with consumers, and the need to de-clutter and simplify products and business models. He highlighted the example of Apple as a company that has taken a potentially complex space and made it elegantly simple, with a terrific user experience that inspires trust and confidence.

This was a great complement to the presentation on Straight Through Processing that I had the opportunity to deliver at the conference.

The reality in the United States is that 10,000 Baby Boomers are now reaching retirement every day, something that will persist for the foreseeable future. The opportunity for carriers to prepare for this is now. Further, with low interest rates and continued cost pressure, finding ways to reduce operational expenses while improving customer experience (for both agents and customers) is critical.

Another reality is that customer experience expectations are increasingly being set by companies like Apple, Facebook, Google, and Amazon. They have perfected ways to make complex things simple, easy to use, innovative, and “delightful” for customers. With expectations set there, business practices that are dependent on paper and rooted in the 1950s are increasingly arcane and inaccessible to agents and customers alike. The need to drive toward electronic applications and electronic signatures is crucial for carriers across lines of business. It is both a crucial step toward a better customer experience now … and a precursor to being able to deliver on meaningful mobile capabilities.

This was an opportunity to highlight findings from a recent Electronic Signatures Executive Brief we published.

When asked whether there is a potential crisis due to aging in the producer community, the executive panelists at the conference’s main session agreed that there is. Allianz, Schwab, and Wells Fargo all acknowledged the problem and highlighted approaches they are taking to prepare for a new generation of advisors.

In some places, the agent/advisor community is actually aging faster than the population at large. This also highlights the importance of creating better and more compelling user experiences for both producers and end clients. Simplifying business processes, allowing for the electronic execution of transactions, and “going mobile” are all key to this. Carriers will continue to need to compete for advisor “mind share,” which will require experiences that are compelling to multiple generations of users at once. All of this, of course, ties back to the Hot Topics we see for insurers in the near future.

The Apple analogy continues to resonate, particularly if carriers want to truly remain relevant in a highly competitive environment.

While there are certainly complexities inherent to the life insurance, annuity, and retirement plan segments of financial services, the future is clear: STP is moving from being innovative to becoming a “cost of doing business.” Hope is not a strategy, and indecision is not a winning game plan.

New Brief: Wearable Technology and Insurance

Tom Benton

Over the last two years, fitness tracking bands, smartwatches, and Google Glass have fueled the next wave of consumer electronics: wearable technology. Financial services firms and insurers are already starting to find innovative ways to use wearables. In my new brief, Wearable Technology and Insurance, I outline three key capabilities and some examples of how these enable innovative applications for insurers and financial services firms.

In some respects, “wearables” are not new – after all, the Dick Tracy comic strip introduced its iconic “wrist radio” just after World War II. What is new is that smartphone adoption and smaller, more efficient batteries are enabling new devices and applications.

I currently have two wearables on my wrist – a fitness tracking band (the Fitbit Flex that I have been wearing since June 2013) and a smartwatch (a Pebble – I was one of the 69,000 or so who backed the project on Kickstarter back in May 2012, but I started wearing it regularly earlier this year). I am seeing more and more people wearing these devices, and with the recent introduction of Android Wear, Google’s extension of the Android operating system for wearable devices, we can expect 2014 to be the “year of the wearable.”

As wearables gain adoption by consumers, innovative insurers will find ways to use them in engaging customers.  Others should consider how wearables will fit into mobile and customer communication strategies.  Wearables are on the way – how can you leverage them for customer interactions?  Read the brief and let me know your ideas.

Security Update: XP and Heartbleed

Tom Benton

Two interesting security items are in the news this week. One is Microsoft’s end of support for the XP operating system on April 8, an operating system introduced nearly 13 years ago and originally slated to lose support in 2007, when Windows Vista was introduced (see Rob McIsaac’s post on this here).

In several discussions I had earlier this year with insurers, few were concerned about the XP support issue, but some admitted they had XP clients running and no immediate plan to upgrade or replace them. However, CIOs may want to prioritize finding and resolving XP issues, since some XP machines may only still be running because a critical software application will not run on later versions of Windows. These machines need to be identified and their risks mitigated. Also, even though Microsoft has extended anti-malware support for XP until 2015, XP still presents a security risk due to old software (as far back as IE 8, for example) and will likely be targeted by auditors.
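For the identification step, a minimal sketch of the kind of script that can help is below. It assumes the desktop inventory can be exported to CSV with hypothetical column names ("hostname", "os_name", "owner"); a real asset-management tool will have its own export format, so treat this as illustrative rather than prescriptive.

```python
# Minimal sketch: flag Windows XP entries in an asset-inventory CSV export so
# they can be tracked for upgrade, isolation, or application remediation.
# Column names ("hostname", "os_name", "owner") are hypothetical.
import csv
import sys

def find_xp_hosts(inventory_path):
    """Yield (hostname, owner) for every row whose OS field mentions Windows XP."""
    with open(inventory_path, newline="") as f:
        for row in csv.DictReader(f):
            if "windows xp" in row.get("os_name", "").lower():
                yield row.get("hostname", "unknown"), row.get("owner", "unassigned")

if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "desktop_inventory.csv"
    flagged = list(find_xp_hosts(path))
    print(f"{len(flagged)} XP machines still in inventory:")
    for hostname, owner in flagged:
        print(f"  {hostname} (owner: {owner}) - confirm the blocking application, then plan isolation or upgrade")
```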

The other issue is the Heartbleed vulnerability in OpenSSL. Apparently this vulnerability has been in place for some time, and it threatens passwords and other sensitive information sent via Internet communications. This could include SSL-protected websites maintained by insurers, so CIOs need to work with CISOs and system administrators to ensure sites are updated to fix the issue.
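As one very rough first pass (a sketch, not a substitute for patching or a proper vulnerability scan): the affected OpenSSL releases were 1.0.1 through 1.0.1f, with the fix landing in 1.0.1g, so a short script run on each server can flag hosts whose linked OpenSSL reports a version in that range.

```python
# Minimal sketch: report whether the OpenSSL build linked by this host's Python
# falls in the Heartbleed-vulnerable range (1.0.1 through 1.0.1f; fixed in 1.0.1g).
# This is a rough local check only; distribution-backported patches and other
# services' OpenSSL copies still need to be verified separately.
import re
import ssl

VULNERABLE = re.compile(r"OpenSSL 1\.0\.1([a-f]?)(\s|$)")

def heartbleed_suspect(version_string):
    """Return True if the reported OpenSSL version string is in the vulnerable range."""
    return VULNERABLE.match(version_string) is not None

if __name__ == "__main__":
    version = ssl.OPENSSL_VERSION  # e.g. "OpenSSL 1.0.1e 11 Feb 2013"
    print("Linked OpenSSL:", version)
    if heartbleed_suspect(version):
        print("WARNING: version is in the Heartbleed range - patch, then reissue keys and certificates.")
    else:
        print("Version string is outside the 1.0.1-1.0.1f range (confirm patch level separately).")
```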

By now, insurance carrier CIOs should have asked whether OpenSSL is in use on any of their company-owned servers and by their service providers. They also should have put a plan in place to update affected sites, developed a communication plan for employees and for anyone accessing those sites externally, and clearly communicated the status to internal staff and external stakeholders. However, there are two more steps that should be taken at this time.

First, this is a good opportunity to educate employees about good IT security practices in general, including the need for strong passwords and care when providing information online. Note, however, that for this particular vulnerability users should not rush to change passwords on all websites, since doing so will not be effective until each site applies an update to OpenSSL. CNET is keeping a current list of the update status for 100 popular sites, and Mashable has a good article on what users should do to protect themselves from Heartbleed.

Next, take time to review IT security policies and procedures. As mentioned in my security update earlier this year, vulnerabilities and threats are constantly being identified, so CIOs and CISOs need to consider more frequent audits and outside help with securing networks and servers. Choosing to ignore or minimize the importance of the end of XP support or the Heartbleed issue may lead to unpleasant consequences and difficult conversations to explain them. Also, review your overall IT security plan to make sure it is up to date and that any weaknesses are being addressed.

Ironically, while the end of XP support was a long-known issue with plenty of time to prepare, Heartbleed existed quietly for a long time (reportedly two years) and received little notice until it was revealed this week. CIOs and CISOs need to be prepared to respond to many different types of threats, making solid IT security planning a key imperative for insurance IT leaders.

Crowdsourcing Predictive Models: Who Wins?

Martina Conlon

Analysis of data, creation of predictive models, and the ability to take action based on the outcomes of those models have always been at the core of the insurance industry. However, interest in predictive modeling from our research council and clients seems to be peaking right now. Carriers are realizing how effective these scoring models can be across the insurance lifecycle, and they want in.

From our research (http://www.novarica.com/analytics_big_data_2013/) we know that predictive models can help marketing departments with lead development, cross-selling, and campaign targeting. R&D departments can use custom or standard predictive models as part of rate making, and distribution can use predictive models to better target prospective agents and geographies. Predictive risk models can improve underwriting consistency and transparency, automate segments of the underwriting process, and ensure that the right underwriter sees the right submissions, all while driving profitability. Using predictive models for claims triage and expert adjuster assignment can have a big impact on claims severity. Insurers also use scoring models to gain insight into which claims are candidates for fraud investigation, subrogation, litigation, and settlement, as well as for more accurate and automated loss reserving.
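To make the idea concrete, here is a minimal sketch of the kind of scoring model described above: a claims-triage classifier built in Python with scikit-learn. The feature names and data are synthetic and purely illustrative; a real model would be trained on a carrier’s own claims history.

```python
# Minimal sketch of a claims-triage scoring model. Feature names are
# hypothetical and the data is synthetic; this illustrates the technique only.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5000

# Synthetic claim records; in practice these would come from the claims system.
claims = pd.DataFrame({
    "claimant_age": rng.integers(18, 85, n),
    "report_lag_days": rng.integers(0, 60, n),   # days from date of loss to report
    "prior_claims": rng.integers(0, 5, n),
    "initial_reserve": rng.gamma(2.0, 5000.0, n),
    "attorney_involved": rng.integers(0, 2, n),
})

# Synthetic label: 1 = claim developed into high severity. Real labels come from history.
logit = (0.02 * claims["report_lag_days"]
         + 0.6 * claims["attorney_involved"]
         + 0.00005 * claims["initial_reserve"]
         - 2.0)
claims["high_severity"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X = claims.drop(columns="high_severity")
y = claims["high_severity"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Train a gradient-boosted scoring model and check discrimination on a holdout set.
model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]
print("Holdout AUC:", round(roc_auc_score(y_test, scores), 3))

# Route the highest-scoring claims to senior adjusters for early intervention.
threshold = np.quantile(scores, 0.9)
print("Claims flagged for expert assignment:", int((scores >= threshold).sum()))
```

In practice, building a model like this is often the easy part; the value comes from wiring the score into the underwriting or claims workflow, which is where the obstacles discussed below tend to appear.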

Although the opportunities abound, obstacles slow down adoption, especially for small and midsize insurers. Potentially high costs combined with uncertain returns, priority given to other projects, limited internal data volume, the lack of data scientists, and the lack of business sponsorship are among the biggest challenges. Luckily, the vendor community serving the space is active and expanding, and these vendors are here to help insurers overcome the obstacles. A variety of insurance-specific data warehouses, analytics tools, third-party data sources, and predictive models can be purchased. Actuarial and specialized consulting firms offer data scientists with insurance domain experience to provide the expertise that carriers lack in house. These vendors are also communicating their successes to business stakeholders, and those stakeholders are paying attention.

And today a colleague asked me, “Have you heard of Kaggle?” Kaggle is a predictive analytics solution provider for the energy industry, but it also hosts a marketplace for data science competitions across industries, along with data science forums and job boards. Allstate has an open competition with Kaggle to develop a model that predicts which quote/coverages will be purchased when a prospect is presented with several options. Data scientists from across the world are working on this right now, competing for $50,000 in prize money. Allstate conducted a similar competition last year around claims, with a much smaller prize, and gained substantial benefits and insights from the submitted models, feedback, and concepts. Many other Kaggle competitions have no cash prize at all, just recognition within the community or a job offer.

So one might think: here’s an option to make predictive modeling more accessible to smaller and midsize insurers. But to date, crowdsourcing of predictive models has been most successful at companies that already have strong analytics practices. According to industry press, Allstate’s predictive modeling team felt that the infusion of new ideas and approaches was extremely valuable and enabled them to significantly improve their models. Unfortunately, Kaggle won’t likely be a silver bullet for smaller insurers, since it doesn’t offer solutions to many of the obstacles mentioned above. However, it does offer one more way for small companies to gain access to a predictive modeling community and skilled data science resources – which may level the playing field just a little bit.

Goodbye, Old Friend

Rob McIsaac

Well, we are at the end of a mighty 13-year run. Microsoft will be pulling the plug on Windows XP life support early next month. All indications are that this is no April Fools’ joke.

All indications also are that someone would have to be fooling themselves to think that continuing to use it now would be a good idea. I have a solitary machine running the venerable OS. It refuses to run Windows 7, and so the end has come. In a few weeks the network interface will be disabled and it will revert to being a glorified, isolated word processor. The sneakernet lives on via a hacker-proof thumb drive.

Of course, I’m fortunate. I only have one machine to worry about and no dedicated apps tied to antique software stack components. The Windows 8.1 machine I’m now running is great and wasn’t much more expensive than two years’ worth of extended support from Redmond. Most insurance carriers don’t have such an easy set of choices.

The migration from Windows NT to XP was slow and painful, carrying with it some notable challenges and costs. But that journey wasn’t engineered in the shadow of a financial crisis with a long, lingering hangover; it simply faced normal cost headwinds and the technical challenges of non-portable code. The contrast between old and new was also stark, with improvements galore, which generally excited users.

This time around, the improvements are significant but harder to see from the UI. In fact the UI is polarizing, so it alone does not create a big push toward adoption. Perhaps worse, given the long and successful run of XP, is the sheer number of applications that run on it natively and won’t transition smoothly or cheaply. Reliance on old browser versions, old software, old databases, and other incompatibilities makes it daunting to explain why migration is a good idea. It also makes the transition expensive to execute.

Good luck making all those old Access databases, for example, work in a new environment.

Of course, hand-wringing won’t be helpful. Developing and delivering on a migration plan, in concert with key vendors, is really the only viable path forward. A range of solutions is possible, including isolating equipment and virtualizing some “legacy” applications. There won’t be a silver bullet here, however. These issues, along with a range of other security-related concerns, were highlighted in a recent executive brief (IT Security Issues Update) published by my colleague Tom Benton.

One thing for CIOs and their teams to consider, if they haven’t done so already, is building an educational program around this issue. Making remediation part of a broader effort to improve functionality, reduce risk, and reduce support cost over time can also help win critical organizational and executive support. Mixing in some sugar may make it easier to swallow some strong medicine. This one is worth it, since failing to address it now could lead to a much bigger problem in the not-so-distant future.