Automation that enhances the agent experience and ultimately their selling and service capabilities is fundamental to enabling their success. But with so many potential areas to focus on, from portals to licensing and contracting to mobile, it can be challenging to know what to prioritize. Here are some best practices to help CIOs and business partners focus attention on the most value-added elements of agent automation:
- Know your agent: A successful agent is someone who puts the needs of their clients first, has a positive impact on the people and communities they serve, and takes pride in being counted on by their clients. Knowing the motivations of an agent provides a good base for understanding where you can get the most value from your investments.
- Create an inventory of opportunity: Given the complex array of systems that support the agent, a significant first step is creating an inventory. A mountain of touch points leaves a lasting impression on an agent. To fully realize areas of opportunity, CIOs and their teams should develop a process map covering the agent's interactions with all of these touch points during the course of a business day.
- Define metrics for success: The measures of success are likely to be a combination of things like improving the efficiency of the submission-to-commission process, straight-through processing, aggregating access via single sign-on, and reducing agent onboarding times. Whatever the ultimate metrics may be, the process of working with agents and business partners to develop them helps bring all parties together.
- Understand mobile use cases and value: Mobility for agents has grown beyond calendar and contacts to more of a full agent “mobile office.” Depending on the use case, agents may be looking to access all of their data and applications from any device, from anywhere, at any time. A well-thought-out and well-executed IT strategy can support mobile applications by leveraging an open, service-oriented architecture.
- Consider the impact of emerging technology: For most carriers, using today’s widely available digital technology and approaches will have the most immediate value in agent automation. But there are several emerging technologies, like gamification and wearables, whose potential values should be considered for inclusion on carriers’ roadmaps for the next few years.
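As a concrete illustration of the metrics bullet above, two of the named measures (straight-through processing rate and agent onboarding time) can be computed directly from operational data. The figures below are illustrative assumptions, not industry benchmarks:

```python
# Illustrative calculation of two agent-automation success metrics.
# All sample figures are assumptions for demonstration only.
submissions = 1200              # assumed total submissions in the period
straight_through = 930          # assumed count processed with no manual touch
onboarding_days = [14, 9, 21, 11, 8]  # assumed onboarding times for new agents

stp_rate = straight_through / submissions
avg_onboarding = sum(onboarding_days) / len(onboarding_days)

print(f"Straight-through processing rate: {stp_rate:.1%}")   # 77.5%
print(f"Average agent onboarding time: {avg_onboarding:.1f} days")  # 12.6 days
```

Tracking these numbers over time, rather than as one-off snapshots, is what lets CIOs and business partners tell whether an automation investment actually moved the needle.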
Agents are heavily dependent on automation for every aspect of the engagement and service functions that support their ability to sell to and service a client. The more CIOs can enable them through technology and process simplicity, the easier it is for them to do what they do best: develop relationships and provide peace of mind.
For more on this topic, see my recent CIO Checklist report: http://novarica.com/agent-automation/
Some technology trends are just that: trends. Others have the potential to change the landscape of the IT industry. A deep review and understanding of XaaS (“Anything as a Service”) places it alongside similar industry sea changes of the past, like the PC movement of the 80s, the web movement of the 90s, and the sourcing movement in the 00s. Here are our thoughts on best practices for CIOs moving forward with XaaS implementation:
- Review current business processes with a critical eye: Whenever a CIO embarks on replacing any major platform, the first caution is not to recreate what already exists in another system. (If the business is completely satisfied with the current platform, that raises the question: why move at all?) Assuming there is a need to move because the existing platform is overly complex, not scaling appropriately, doesn't support current compliance requirements, lacks modern security capabilities, costs too much to maintain, or any other similar reason, the first step should be to review what functions are being supported, what value is behind those functions, and whether those functions are generic to the industry.
- Define value-add processes and align them to benefit targets: It is important to define value-add processes and align a benefit target to each of them. This analysis needs to start with a top-level agreement between the CIO and COO on value benefits, the cost of non-standard processes, and success metrics before moving into detailed discussions and process planning.
- Implement a Rent vs. Buy vs. Build model: A question outlined in just about every IT strategy is the philosophical direction of buy versus build. XaaS adds a new dimension: should the function or service be rented? In other words, can the company pay per user, per customer, or per policy instead of making the significant investment to buy or build a platform?
- Prepare for an organizational shift, not just a technology shift: There is clearly a technology shift in moving to XaaS, with all the challenges and opportunities of implementing a new platform. What is less apparent is the need for an organizational shift from a focus on development and application maintenance to a focus on vendor and product management. Specific considerations might include QA focusing more on regression testing driven by business use cases rather than on feature testing, a shift from intra-data-center design to inter-data-center design, and architecture that emphasizes data and data management rather than interconnecting applications within the data center.
- Shift primary focus to data and analytics capabilities: Many IT shops spend most of their time and resources maintaining, developing, and servicing existing platforms, which leaves little ability to address the huge data frontier. By fully taking advantage of XaaS, IT shops can reallocate resources to focus on unleashing the power of data across the whole enterprise.
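The rent-vs.-buy-vs.-build question above lends itself to a simple break-even sketch: compare a per-policy rental fee against an amortized build cost. All figures here are illustrative assumptions, not vendor pricing:

```python
# Hypothetical break-even sketch for the rent vs. buy/build decision.
# Every figure below is an illustrative assumption, not a benchmark.

def annual_rent_cost(policies, fee_per_policy):
    """XaaS model: pay per policy in force, no upfront investment."""
    return policies * fee_per_policy

def annual_build_cost(upfront, annual_maintenance, amortization_years):
    """Build model: amortize the upfront investment, add annual run cost."""
    return upfront / amortization_years + annual_maintenance

policies = 50_000            # assumed book of business
fee = 12.0                   # assumed per-policy annual fee
upfront = 4_000_000.0        # assumed cost to build the platform
maintenance = 600_000.0      # assumed annual maintenance cost
years = 7                    # assumed amortization horizon

rent = annual_rent_cost(policies, fee)
build = annual_build_cost(upfront, maintenance, years)
print(f"Rent: ${rent:,.0f}/yr  Build: ${build:,.0f}/yr")

# The book size at which renting stops being cheaper than building
break_even = build / fee
print(f"Renting is cheaper below ~{break_even:,.0f} policies")
```

Real models would also weigh factors a one-line calculation can't capture (implementation risk, exit costs, control over the roadmap), but even a sketch like this helps frame the CIO/COO conversation in concrete terms.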
Lessons learned and experience from previous sea changes lead us to review XaaS as part of the IT strategy roadmap. XaaS is not simply a new technology but rather a clear move and opportunity that requires a full assimilation into IT shops. At a minimum, adopting XaaS should create the opportunity to bring IT and business teams closer together.
For more on this topic, see my recent CIO Checklist report: http://novarica.com/best-practices-for-xaas-strategy/
We have written previously about the ever-increasing importance of data in insurance. A related area of interest to insurers is the growth of predictive analytics. Modern predictive analytics solutions are capable of providing deep insight into a wide range of business areas such as underwriting risk, product profitability, and financial projections. However, maturity and adoption of predictive analytics solutions vary widely among insurers. As more carriers prioritize data strategy, usage of this potentially disruptive technology will grow rapidly.

Data is a major component of Novarica’s “Hot Topics” for insurers, which include social, mobile, analytics, big data, cloud, digital, and Internet of Things/drones. Data is being utilized to speed up underwriting using external third-party sources (e.g., prescription information, telematics driving data), improve actuarial models (e.g., data collected from drones and the National Weather Service), and help process claims (e.g., data generated from commercial vehicles and health devices). Over 25% of insurers ran big data programs last year in order to gain insights from large volumes of data with high variety (structured and unstructured) and velocity.

This article from the New York Times discusses the increasing concern of regulators, mostly in Europe and the UK, that access to large amounts of data may ultimately decrease competition by freezing out smaller firms that can’t access as much data as large firms like Amazon, Google, and Facebook. The article mentions the case of IBM, which is combining internal data with customer data in order to train its Watson AI software for a wide variety of tasks in fields ranging from medicine to finance. Some insurance carriers are working with IBM’s Watson software to develop underwriting, claims, and actuarial modeling capabilities. Data will continue to grow in importance even as it grows in volume.
It is inevitable that regulators will start looking more at data and access to it as we move forward into the 2020s.
An interesting article came out over the weekend that delves into the consolidation that has taken place among publicly traded life insurance companies, and contrasts this trend with the relatively stable number of mutual carriers that are in the market today. We are now the better part of two decades past the period when there was a significant demutualization effort which included notable, name-brand, national carriers. In that period, we have weathered multiple recessions, one of them the worst economic downturn since the 1930s, and emerged into a world that has experienced persistent low interest rates. Taken as a whole, these factors have produced a series of economic outcomes which were outside of the planning corridors that many carriers executed against. As the article suggests, carriers face some very interesting challenges going forward. For those with long tail liabilities such as life and annuity contracts, the conflicts associated with quarterly earnings reports and maximizing shareholder value appear to be particularly daunting.
There is more to this story, however, which may suggest some additional advantages for mutual carriers. Almost without exception, life carriers are grappling with aging technology platforms which may date as far back as the Kennedy administration. The blocks of business on these platforms are themselves old, and may be closed to new business. But because they were at the heart of these businesses over multiple decades they have become, through the magic of cost accounting, blocks of business which absorb significant overhead for carriers. For many companies, these platforms represent a significant drag in terms of being able to implement new products and services effectively. At the same time, however, these platforms, if they are walled off, can become quite stable and relatively inexpensive to operate. This can meaningfully influence both operational and financial outcomes for carriers.
We recently unearthed a 1995 chronicle from MIT which provides a fascinating view of the first 35 years of policy administration utilization in North America. The fact that many of the systems that were deemed to be aging in that 22-year-old report are still being used by carriers should give cause for concern to some!
In any case, as carriers plot their technology strategy for the future, addressing these old systems and blocks of business running on them will become increasingly critical. The investments and planning horizon required to make them successful may be easier for mutually owned companies to execute than it will be for their publicly traded competitors given their respective focus on long- versus short-term results.
Even as market competitive threats loom large, it is not just a technology challenge that many life insurance carriers face. There is an accounting and a reporting issue which carriers would be well advised to consider as they put their strategic plans in place.
The major tech players are all betting that smart home automation and digital assistants will be the next big thing for consumers. Grange is taking advantage of this emerging area with their recent announcement that Amazon’s voice-controlled Alexa can now help users learn about Grange insurance or find local agents. It’s clear that the insurance marketplace has not always adapted quickly to improve the customer experience, so this is a great example of an insurer working to serve consumers in whatever way they prefer. It also demonstrates the necessity for insurers to think to the future when they modernize their back-end systems. Will a new core system support future channels? Over the last five to ten years insurers have poured a lot of time and money into building web-based consumer portals. Those that didn’t build for future flexibility had to start from scratch in order to create mobile-ready sites. Will they have to begin again to leverage voice-based home assistants or some as-yet-unknown customer interaction? Insurers who are thinking in an omni-channel way will instead be architecting agile back-end systems that can support any number of channels and, just as importantly, can support transfers between channels when necessary.
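The omni-channel idea above can be sketched in a few lines: a core business capability lives behind channel-agnostic functions, and each channel (web portal, voice assistant, or whatever comes next) is only a thin adapter. The agency names, ZIP codes, and function names below are all hypothetical:

```python
# Minimal sketch of a channel-agnostic back end (all names/data illustrative).
# The core service knows nothing about channels; each channel is a thin adapter,
# so a new channel (e.g., a voice assistant) requires no core-system change.

AGENTS = {"43215": "Jane Smith Agency", "10001": "Midtown Insurance Group"}

def find_local_agent(zip_code):
    """Core business capability, shared unchanged by every channel."""
    return AGENTS.get(zip_code, "No local agent found")

def web_portal_handler(form):
    """Adapter for an HTML web portal."""
    return f"<p>Your agent: {find_local_agent(form['zip'])}</p>"

def voice_assistant_handler(slots):
    """Adapter for a voice assistant's spoken response."""
    return f"Your nearest agent is {find_local_agent(slots['zip'])}."

print(web_portal_handler({"zip": "43215"}))
print(voice_assistant_handler({"zip": "10001"}))
```

The point of the sketch is the shape, not the code: when the core capability is exposed once and rendered per channel, adding a voice channel is an adapter-sized project rather than a rebuild.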
2017 has barely begun and the 2016 trend towards core system consolidation in the insurance industry is showing signs of going strong for another year. Duck Creek’s recently announced acquisition of Yodil comes as no surprise: it reflects both Duck Creek’s continued willingness to invest in broadening the scope of its offering and a growing focus on data and business intelligence across the industry as a whole.
This is another example of a multi-year trend towards consolidation in the P&C core system space, a direct response to insurer preferences for suite providers as opposed to best-of-breed. Even when insurers do seek out standalone components, they show a strong inclination towards vendors who will be able to provide additional components at a later date. Over the course of their history, Duck Creek has continued to grow their offering to satisfy more of an insurer’s technology stack, both through development and acquisition.
What’s notable here is that Yodil isn’t what the industry has considered a standard insurer core component, like claims or billing, but instead is a business intelligence and data management offering. This move by Duck Creek is likely at least partially a response to Guidewire’s multiple acquisitions of data warehouse and business intelligence solutions, with EagleEye (now Guidewire Predictive Analytics) the most recent example. It’s safe to say that BI and data warehousing can now be considered a core part of the insurance suite just like any other business-process-focused system. Insurers are increasingly budgeting more money for the data arena, so other suite vendors will need to follow up with their own competitive BI and warehousing offerings, either through development or M&A.
Cybersecurity is back in the news this week, with Yahoo’s announcement that more than 1 billion user accounts, many of them containing sensitive information, were compromised in a 2013 cyber attack. Recently, Novarica held a Working Group on the new cybersecurity regulations that will go into force on January 1, 2017 in New York State. The regulation was drafted from the NAIC Cybersecurity Task Force’s Insurance Data Security Model Law but in many cases goes further than that draft did. The new standards will apply to insurers offering licensed products in New York State. While some proposed requirements are general best practices most insurers have already established, others will require carriers to implement significant changes. Although financial and insurance institutions have until June 2017 to comply, carriers are already considering the upcoming shifts in resources and strategies. The regulations will mandate:
- Annual submission of a written statement to the Department certifying compliance, with all supporting data, records and schedules maintained for five years.
- Regular cybersecurity awareness training for all personnel, updated to reflect the annual risk assessment.
- Appointing a Chief Information Security Officer.
- Documentation of “areas, systems, or processes that require material improvement, updating or redesign” along with planned and in-progress efforts toward remediation.
- Employment of cybersecurity personnel who must attend regular update and training sessions.
- Establishing cybersecurity policies to address areas like access controls and identity management, business continuity and disaster recovery, capacity and performance planning, customer data privacy, data governance and classification, incident response, information security, physical security and environmental controls, risk assessment, systems and application development and quality assurance, systems and network monitoring and security, and vendor and third-party service provider management.
- The policies must be reviewed by the board of directors or similar governing body, and approved by a senior officer.
- Establishing and maintaining cybersecurity programs to:
- detect incidents and identify internal and external risks
- implement defensive infrastructure, policies, and procedures
- respond to detected or identified incidents and mitigate their impact
- recover from incidents and restore normal operations
- fulfill regulatory reporting requirements
Most of the carriers present at the working group focused on the compliance expectations for vendors and third-party service providers. If partners do not comply with the regulations, the carriers manufacturing the products will be liable. It is unclear today whether carriers can recover penalties from MGAs, agents, and partners if a security breach was due to that agent’s or partner’s lack of compliance with the law.
Another area of focus was encryption. In the current draft of the legislation, carriers will have up to five years to implement encryption of nonpublic information both in transit and at rest. Many participants saw this as an onerous task, as PII data is already difficult to manage. Although the clause allows for “compensating controls” to stand in place of encryption leading up to the five-year mark, carriers are already apprehensive about the burden of such a large undertaking. In a similar context, multi-factor authentication will be required as well, though a one-year extension is being considered.
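For readers unfamiliar with the mechanics behind the multi-factor authentication requirement, one common second factor is a time-based one-time password (TOTP) per RFC 6238. A minimal standard-library sketch follows; the secret shown is the published RFC 6238 test secret, included only so the output can be verified, never a value to use in production:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, period=30):
    """RFC 6238 time-based one-time password (HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32, casefold=True)
    timestamp = time.time() if at is None else at
    counter = int(timestamp // period)           # 30-second time step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                   # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test secret ("12345678901234567890" in base32).
# Never hard-code real secrets in production code.
SECRET = "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"
print(totp(SECRET, at=59, digits=8))  # RFC 6238 test vector: 94287082
```

Production deployments would of course rely on vetted MFA products rather than hand-rolled code; the sketch is only meant to show that the mechanism regulators are mandating is a well-standardized, lightweight one.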
Some attending carriers with operations in Europe and the UK brought up concerns about how the cybersecurity legislation will affect international relationships. However, while there are some differences between the NYS regulation and the GDPR (General Data Protection Regulation), we don’t expect these differences to drastically impact carriers’ ongoing technology activities.
Many carriers discussed the security and reliability of the cloud. While some saw the cloud as an additional risk, others saw it as a faster, more seamless way to fortify cybersecurity. There was a general concern that because cloud providers’ data centers house multiple “tenants,” there is a risk of data being exposed. There was also discussion of the “accumulation risk” a shared cloud creates: a hack of the cloud provider could automatically trigger a security event for everyone in that cloud. However, other attendees suggested that because it is easier to add a security tool to a cloud solution, the risk of data exposure is mitigated.
Happy Holidays & Happy New Year!!!!
This week marked a very interesting development involving Amazon and two core systems vendors focused on PAS and adjacent capabilities supporting different lines of business, as both EIS and Guidewire were awarded Amazon’s “AWS Financial Services Competency.” Cloud computing continues to make significant strides in taking on key workloads for carriers, irrespective of size and line-of-business focus. Once approached cautiously because of concerns over things like security, the cloud provider space has significantly matured in recent years. Security, once considered an Achilles’ heel for cloud providers compared with traditional deployment methods, is now considered one of the strengths of top-tier service providers. Many CIOs now understand that the security a provider like Amazon, Microsoft, or Google (to name three) can deliver is better than what is available via company-owned facilities. To test this, and to build organizational comfort and support for cloud-based solutions, carriers have increasingly moved key workloads such as email, HR platforms, CRM solutions, and financial systems to cloud offerings. In fact, during recent Special Interest Group sessions (sample summary here) that Novarica hosted with carriers from widely different lines of business, one comment that insurance CIOs repeatedly offered was that it is getting increasingly difficult to procure non-cloud-based solution sets from companies like Oracle and Microsoft. The push is on to move ever more complex and mission-critical offerings to the cloud.
Up to this point, there have been somewhat limited deployments of PAS suites in this space, such as Liberty Mutual Benefits’ recent and notable decision to move forward with an EIS deployment on Amazon Web Services. Which brings us to this week’s news about EIS and Guidewire. As the press releases noted, this is a differentiating event for these companies, one that should have carriers and competing software solution providers taking note. While these two firms represent the start of something new, we expect this type of certification to become increasingly important for properly architected solutions. Given the increasing desire of carrier CIOs to leverage cloud-based computing, the certifications can be not only a means of achieving greater confidence in a particular solution’s ability to support their needs, but may also become a key differentiator when weighing alternatives for carrying specific workloads.
We’ve already seen a range of insurance applications, including underwriting workbenches and claims platforms, move to being delivered only as cloud offerings (e.g., ClaimVantage for Life, Disability and Absence claims). Having PAS suites join the mix of options is likely to be welcomed by many CIOs and their teams as they move forward with key transformation events in their organizations.