A Catastrophe of Its Own Making
Every time an insurance company goes bankrupt, the gods of irony must shake their heads in disgust. How can a business whose sole purpose is managing risk be so bad at managing its own risk that it suffers the worst fate that can befall a business: going out of business? These companies fail at the most basic duty of their trade, managing risk. Perhaps they were too laissez-faire about their raison d’être, or too entrepreneurial about their corporate risk management. Whatever the cause, the regulators probably did the customers a service by forcing such companies into bankruptcy and transferring their policyholders’ liabilities to a far more solvent and less risky carrier.
However, this is certainly a case of “you had one job…” If the executives at these bankrupt companies had shown more respect for their data and their data governance, they probably wouldn’t have found themselves in such a desperate, unthinkable situation. Instead, they let the situation spiral out of control and lost everything, ending up in the worst of all possible positions: the end of their business.
Going Up in Smoke
According to the California Department of Insurance (CDI), “A catastrophe model is a computer-based process that simulates thousands of plausible catastrophic events based upon statistical, financial, economic, physical, engineering, other scientific concepts and equations, and insurance policy coverage information to derive aggregate estimates of financial loss, including insured loss.” Insurance companies typically run catastrophe models on a regular basis, especially in sectors like property and casualty insurance.
Many insurers conduct comprehensive catastrophe modeling at least once a year to assess potential risks and recalibrate their policy pricing and reserves. Also, after significant catastrophic events, like wildfires, hurricanes, and earthquakes, insurers often run models to evaluate the industry impact as well as adjust future risk assessments and underwriting strategies. The more aggressive companies perform quarterly analyses to stay updated on emerging risks, market conditions, and changes in exposure. On top of the above modeling, insurers may also run models on an ad-hoc basis in response to changes in regulatory requirements, sudden acts of God, major climatic events, new scientific data, or even shifts in market conditions.
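The mechanics the CDI describes can be illustrated with a toy frequency-severity simulation. This is a minimal sketch, not a production catastrophe model: the Poisson event rate and the lognormal severity parameters below are assumed purely for illustration, and real models replace the random draws with hazard, engineering, and exposure modules.

```python
import math
import random
import statistics

def poisson(rng, lam):
    """Draw a Poisson-distributed event count (Knuth's method, fine for small lam)."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def simulate_year(rng, event_rate=0.6, mu=16.0, sigma=1.5):
    """Aggregate insured loss for one simulated year: a Poisson number of
    catastrophic events, each with a lognormal severity (assumed parameters)."""
    return sum(rng.lognormvariate(mu, sigma) for _ in range(poisson(rng, event_rate)))

def aggregate_loss_estimates(n_years=10_000, seed=42):
    """Run many plausible years and summarize the annual loss distribution."""
    rng = random.Random(seed)
    losses = sorted(simulate_year(rng) for _ in range(n_years))
    mean_annual_loss = statistics.fmean(losses)
    one_in_100_loss = losses[int(0.99 * n_years)]  # 99th percentile year
    return mean_annual_loss, one_in_100_loss
```

The two outputs mirror what insurers take from real models: an average annual loss for pricing, and a tail estimate (here, the 1-in-100-year loss) for setting reserves and reinsurance.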
California Wildfires
The recent wildfires in California, estimated to cost between $135 billion and $150 billion in total damage and economic losses, provide a perfect example of the insurance industry’s changing landscape. In her article, California’s Home Insurance Crisis: Rising Risks, Soaring Costs and Limited Options, Carla Ayers writes, “As the recovery begins, insurance companies are again reassessing coverage in wildfire-prone regions, citing mounting losses and increased risk. This comes as homeowners across the state have been facing a worsening insurance crisis with rising premiums, policy cancellations, and limited coverage options. This instability has led several major insurers to scale back or withdraw from the California home insurance market entirely, citing excessive risk and financial exposure.”
Sadly, many homeowners discovered their home insurance policies didn’t cover certain wildfire-related damage, while others found that low reimbursement caps fall far short of actual expenses in today’s inflationary environment. As Ayers reports, “Three of California’s largest home insurance companies denied nearly half of their claims in 2023 — a rate significantly higher than the national average.” The insurance companies may be protecting themselves, but they’re leaving their policyholders out in the cold. They’d argue they’re simply following the numbers, trusting data that shows an environment ever more prone to natural disasters and to destruction whose bill they cannot foot.
Success Has a Hundred Fathers, While Failure Is an Orphan
After the failure of the Bay of Pigs invasion, John F. Kennedy famously remarked that “victory has a hundred fathers and defeat is an orphan,” a line often rendered as “success has a hundred fathers, while failure is an orphan.” The saying echoes a sentiment at least as old as the Roman historian Tacitus, and it captures the idea that everyone jumps aboard the bandwagon of success while fleeing any association with failure. This is human nature. While the Bay of Pigs was a rare US military failure as well as a stinging political embarrassment, businesses fail every day, usually for the same reason: they simply run out of money. There are even thriving industries built on taking companies into and out of bankruptcy and on selling off forfeited assets.
The thing about business failure is that, in many ways, it is highly predictable. Most businesses fail for reasons that are entirely foreseeable. It’s rare for a functioning business to suddenly shutter; the writing is often on the wall months, if not years, in advance. When it comes to the bankruptcy of insurance companies, it is usually easy to identify what led to their dissolution. There’s a reason some of the oldest companies in the world are insurance companies: they tend not to fail.
The Seeds of Failure…
Poor financial management, regulatory issues, fraudulent activity, failure to meet capital requirements, or a deteriorating economic environment can each cause an insurance company to fail. In December 2020, the UK’s East West Insurance Company faced bankruptcy due to a combination of these factors. A year later, Gibraltar’s MCE Insurance and Denmark’s Gefion Insurance went bankrupt. A year after that, South Africa’s Constantia Insurance Company had to shutter its doors. Many of the factors leading to these failures could have been spotted and, potentially, mitigated or even eliminated had these companies implemented a strong data governance program from the very beginning.
Proper data governance could have caught a misallocation of funds or helped uncover some troubling business practices that led to an insurer’s financial instability. Data governance programs simplify regulatory compliance, which helps insurance companies reduce the potential for fines and penalties. Fraudulent activities, whether internal or external, can lead to unexpected financial strain on a company and contribute to bankruptcy. When a company understands its data, it can quickly root out fraud, which minimizes any potential damage. The reality of the insurance industry is, if you can’t manage risk, you won’t be in business long.
A business analog to JFK’s orphaned-failure quip might be, “The reasons for a business’s success are a thousandfold, but the cause of failure usually boils down to a single thing: a lack of money.” That lack of money is usually due to financial mismanagement, poor risk management, fraudulent behavior, or a sudden, unexpected economic downturn. In one way or another, the proper use of data, along with some added foresight, can root out and solve most of these problems before they turn grave.
What is Data Governance?
In my article, Foundations of Enterprise Data Governance, I state, “Data governance is the planning, oversight, and control over management of data and the use of data and data-related resources, and the development and implementation of policies and decision rights over the use of data. It is the foundational component of an enterprise data management or enterprise information management program.”
For insurance companies, data governance ensures data is accurate, secure, consistent, and compliant with regulations like GDPR, Solvency II, and anti-money-laundering (AML) legislation. Remaining compliant with these regulations reduces the risk of heavy fines and penalties, which in turn reduces the potential for reputational damage to the company’s brand.
High-quality data is essential for accurate reporting and decision-making. Strong data governance helps insurance companies manage risks, enhance customer trust, and drive business growth by ensuring data accuracy, security, and accessibility.
The Reasons for Failure
Factors Leading to Gefion Insurance Dissolution
Gefion Insurance, a Danish insurance company, dissolved due to the following interrelated factors:
1. Financial Mismanagement: Inadequate financial oversight led to poor investment decisions and an inability to maintain sufficient reserves to cover claims.
2. High Claims Ratios: A surge in claims, particularly in the property and casualty sectors, put immense pressure on the company’s finances.
3. Regulatory Scrutiny: Gefion faced increasing scrutiny from regulatory bodies due to its financial practices. Such scrutiny often brings higher compliance costs and operational restrictions.
4. Market Conditions: The competitive landscape in the insurance market made it difficult for Gefion to maintain profitability, and pricing pressure from competitors squeezed margins.
5. Solvency Issues: The company struggled to meet solvency requirements, which are critical for insurance firms to ensure they can cover future claims. This shortfall raised red flags for regulators.
6. Operational Challenges: Internal operational inefficiencies, including issues in underwriting and claims processing, contributed to escalating costs and customer dissatisfaction.
7. Economic Factors: Broader economic challenges, such as downturns or changes in consumer behavior, impacted the company’s performance and ability to attract new business.
These factors collectively led to the decline and eventual failure of Gefion Insurance, highlighting the complexities and risks involved in the insurance industry. Companies like East West Insurance and MCE Insurance, as well as countless others, faced bankruptcy due to a combination of factors, including the inability to secure reinsurance, which is critical for managing risk and ensuring solvency. Before long, growing creditor claims swamped these companies, liabilities exceeded assets, and administration (or bankruptcy, as it’s known in the US) became the only option, ultimately ending in liquidation.
An Industry Running on Data
Data might be “the new oil,” as Clive Humby famously claimed, but data is also what makes the insurance industry go round, and the industry is now awash in it. As Tony Boobier notes in his Analytics for Insurance: The Real Business of Big Data, the volume of information that insurers manage is huge. “In 2012, the UK insurance industry created almost 90 million policies, which conservatively equates to somewhere around 900 million pages of policy documentation,” Boobier says, adding that 90% of the world’s available data had been created in the previous two years. Those figures are from 2012, more than twelve years ago, long before the explosion of social media, IoT, and cloud data, so the numbers are far larger today.
“Insurance companies collect large amounts of data every day. Auto insurers, for example, track everything from how many miles a customer travels to what type of car they drive. Insurers use this data to help determine whether a policyholder qualifies for discounts, offers, and coverage,” says Matt Turner in his article, Why Data Governance is Essential for the Insurance Industry. All this data collection means the industry isn’t wanting for data, but it does mean the industry players need to figure out what to do with it.
Insurers analyze historical data, demographics, and various risk factors to set premium pricing and coverage limits and to assess the likelihood of claims. Statistical models crunch vast amounts of data to devise pricing strategies, allowing for dynamic pricing based on real-time data and leading to more competitive rates. Analyzing customer data reveals customer behavior, preferences, and needs, which lets insurers segment their customer base effectively.
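As a concrete (and deliberately simplified) illustration of that pricing logic, the sketch below computes a gross premium as expected claim frequency times expected claim severity, plus loadings. Every number here is hypothetical: base frequency, severity, mileage factors, and loadings would all be actuarially derived from the historical data described above, not hand-picked.

```python
def risk_premium(expected_frequency, expected_severity,
                 expense_loading=0.25, risk_margin=0.05):
    """Pure premium = expected claim frequency x expected claim severity;
    the gross premium adds loadings for expenses and a profit/risk margin."""
    pure_premium = expected_frequency * expected_severity
    return pure_premium * (1 + expense_loading + risk_margin)

# Hypothetical auto-insurance segment factors: high-mileage drivers are
# assumed to file claims 1.4x as often as the base segment.
BASE_FREQUENCY = 0.05   # expected claims per policy-year
BASE_SEVERITY = 4000.0  # expected cost per claim
MILEAGE_FACTOR = {"low": 0.9, "medium": 1.0, "high": 1.4}

def quote(mileage_band):
    """Quote an annual premium for a customer segment."""
    frequency = BASE_FREQUENCY * MILEAGE_FACTOR[mileage_band]
    return round(risk_premium(frequency, BASE_SEVERITY), 2)
```

The segmentation step is what makes pricing “dynamic”: swap the static mileage table for factors recomputed from telematics or fresh claims data and the same quote function reprices the book.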
Charles Nyce’s Predictive Analytics
Charles Nyce’s influential 2007 paper on predictive analytics, simply called Predictive Analytics, provides a comprehensive overview of the methodologies, applications, and ramifications of predictive analytics in the insurance industry. Written for the American Institute for Chartered Property Casualty Underwriters and the Insurance Institute of America, the paper’s two main sections, “Drivers of Insurers’ Use of Predictive Analytics” and “Insurers’ Use of Predictive Analytics,” clearly show its intended audience was insurance providers.
“While accurately forecasting factors such as operations, budgets, supplies, or product demand is crucial to any organization’s success, insurance organizations are particularly reliant on predicting future activities,” says Nyce. “An insurer’s ability to forecast a policy’s ultimate cost determines how accurately it prices its product and, in turn, the extent to which it can avoid adverse selection,” he adds. Advanced analytics and pattern recognition identify suspicious claims and mitigate fraud risks. Tailored marketing strategies and personalized product offerings improve customer engagement and help with retention.
Core Concepts
Explore fundamental data management concepts and organizational structures for effective governance.
Crunching the Numbers
Nyce explains how machine learning techniques can be used to understand relationships between variables, predict credit scores, and forecast economic outcomes and future values. He explains how techniques like decision trees, neural networks, and ensemble methods can help with risk assessment and fraud detection. Predictive and prescriptive analytics can also help insurers understand customer behavior, manage inventory, and define marketing strategies. Predictive analytics can even be used for dynamic pricing, which can improve risk management and customer engagement.
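One of the simplest screens behind the fraud detection Nyce describes can be sketched in a few lines: flag claims whose amounts sit unusually far from the book’s typical severity. Real fraud models combine many more features (claim timing, claimant history, provider networks) and use the tree and ensemble methods mentioned above; this z-score filter is only a minimal stand-in.

```python
import statistics

def flag_suspicious_claims(claim_amounts, z_threshold=3.0):
    """Return the indices of claims whose amounts sit more than z_threshold
    standard deviations from the mean severity (a plain z-score screen).
    Assumes the amounts are not all identical (stdev would be zero)."""
    mean = statistics.fmean(claim_amounts)
    stdev = statistics.stdev(claim_amounts)
    return [i for i, amount in enumerate(claim_amounts)
            if abs(amount - mean) / stdev > z_threshold]
```

Flagged claims would then go to a human adjuster; the point of the model is triage, not a verdict.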
“The statistical techniques used in predictive analytics are computationally intensive,” says Nyce. “Depending on the amount of data they use, some require performing thousands of millions of calculations. Advances in computer hardware and software design have yielded software packages that quickly perform such calculations, allowing insurers to efficiently analyze the data that produce and validate their predictive models,” he adds.
The use of analytics in the insurance industry is growing. According to Xenonstack, “Insurance companies invested $3.6 billion in 2021. Companies that invested in big data analytics have seen 30% more efficiency, 40% to 70% cost savings, and a 60% increase in fraud detection rates.”
Of course, the veracity and usefulness of any predictive model depend on the quality and quantity of the data going into it. “Junk in, junk out,” as the modeler’s lament goes. Good data in, and the business thrives. Bad data in, and the business dies.
In addition to the proprietary data insurers collect from their customers and potential customers, there are countless third-party tools and sources of data, such as ratings agencies, regulators, advisory organizations, and data brokers, that insurers can use to develop their predictive models, says Nyce.
As more insurers use predictive analytics, those who don’t will increasingly be exposed to adverse selection because their market will be limited to a subsection of the general population that has worse-than-average loss ratios, contends Nyce.
Insurance Industry Use of Predictive Analytics
Marketing: Property-casualty insurers can use predictive analytics to analyze the purchasing patterns of insurance customers. This information can be used to increase the marketing function’s hit ratio and retention ratio.
Underwriting: Insurers can use predictive analytics to filter out applicants who do not meet a pre-determined model score. This type of screening can greatly increase an insurer’s efficiency by reducing the employee hours spent researching and analyzing an applicant who ultimately is not a desired insured. If an applicant’s model score is sufficient for consideration, the score can be used as a rating mechanism on which the insurer can base a variety of price/product points.
Claims: Insurers can use predictive analytics to help identify potentially fraudulent claims. It can also be used to score claims based on the likely size of the settlement, enabling an insurer to more efficiently allocate resources to higher-priority claims.
Source: Predictive Analytics White Paper
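The two-step underwriting use described above (screen applicants against a pre-determined minimum score, then use the same score as a rating mechanism) can be sketched as follows. The features, weights, and thresholds are invented for illustration; a real underwriting model would be fitted to historical loss data.

```python
# Hypothetical features and hand-picked weights; a real underwriting model
# would be fitted to historical loss data, not specified by hand.
WEIGHTS = {"prior_claims": -12.0, "years_insured": 2.0, "credit_tier": 5.0}
BASE_SCORE = 50.0
MINIMUM_SCORE = 40.0  # the pre-determined screening threshold

def model_score(applicant):
    """Simple linear score over a few applicant features."""
    return BASE_SCORE + sum(w * applicant[f] for f, w in WEIGHTS.items())

def triage(applicants):
    """Screen out applicants below the minimum score, then map the
    survivors to a price tier based on the same score."""
    accepted = []
    for applicant in applicants:
        score = model_score(applicant)
        if score >= MINIMUM_SCORE:
            tier = "preferred" if score >= 70 else "standard"
            accepted.append((applicant["name"], score, tier))
    return accepted
```

The efficiency gain Nyce cites comes from the first step: applicants below the threshold never consume underwriter hours at all.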
Managing Data Quality
Insurance companies operate in a world of increasingly complex rules and regulations. Frameworks like GDPR, CCPA, and industry-specific regulations both demand robust data governance practices and impose exorbitant fines for violations. Insurance companies must also navigate requirements for data protection, data retention, and reporting obligations while maintaining compliance with state, national, and sometimes international laws. These regulations mandate strict controls over personal information, financial data, and claims history, making comprehensive data governance not just beneficial to the company but essential for operational continuity and risk mitigation.
Data governance is an all-inclusive framework designed to effectively manage an insurance company’s data assets. It encompasses several key components that ensure data integrity, security, and accessibility, all of which are particularly critical in the insurance sector.
Data quality management involves systematic practices designed to maintain high standards of data quality. This includes processes for data validation, data cleansing, and data enrichment. Effective data quality management is essential for mitigating errors from poor data quality while also ensuring that data remains a reliable asset for corporate decision-making.
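A concrete way to picture the validation step in that pipeline: run every incoming record through a set of checks and collect the failures. The field names and rules below are hypothetical, not a standard insurance schema.

```python
import re
from datetime import date

def validate_policy(record):
    """Return a list of data quality issues for one policy record.
    An empty list means the record passed every check."""
    issues = []
    # Format check: policy IDs are assumed to look like POL-00000000.
    if not re.fullmatch(r"POL-\d{8}", record.get("policy_id", "")):
        issues.append("policy_id: must match POL-XXXXXXXX")
    # Range check: premiums must be positive numbers.
    premium = record.get("annual_premium")
    if not isinstance(premium, (int, float)) or premium <= 0:
        issues.append("annual_premium: must be a positive number")
    # Consistency check: the policy must expire after it takes effect.
    try:
        start = date.fromisoformat(record.get("effective_date", ""))
        end = date.fromisoformat(record.get("expiry_date", ""))
        if end <= start:
            issues.append("expiry_date: must fall after effective_date")
    except ValueError:
        issues.append("dates: must be valid ISO dates (YYYY-MM-DD)")
    return issues
```

Cleansing and enrichment follow the same pattern: records that fail are routed to correction queues or matched against reference data rather than silently flowing into underwriting and reserving models.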
Detailed data policies serve as a guideline that governs the use of data across an organization. These policies define acceptable practices for data handling, ensuring consistency and compliance with regulatory requirements. Well-defined data policies help reduce data discrepancies and promote a unified approach to data governance.
Case Study
Texas Mutual Insurance Company
In his Why Data Governance is Essential for the Insurance Industry blog entry, Matt Turner states, “Texas Mutual Insurance leverages Alation with Snowflake to streamline data usage, leverage data as an asset, and promote data literacy. On the platform, Texas Mutual built a consolidated view of the full ‘life-cycle’ with definitions, including, quotes, written, earned, billed, and net premiums. This allows every business function to report on key areas consistently and make decisions using the same, foundational understanding.” The result: an increase in data trust as well as faster data delivery, says Turner.
Texas Mutual reduced data delivery time for key business dashboards by 80%, he says, with executives now given daily access to dashboards they trust, empowering them to make in-the-moment decisions about where critical capital should be deployed each day. With trustworthy data, knowledgeable business decisions affecting the success or failure of the company can be made in real time.
Modern Data Governance
In his article, The Path to Modern Data Governance, Dave Wells offers a data governance framework containing six layers: goals, methods, people, processes, technology, and culture (Figure 1). Each layer must be addressed as part of a data governance modernization project. Data governance modernization “isn’t quick and easy. It is a journey, not an event,” warns Wells, adding that planning is crucial.
Figure 1: Data governance framework. Source: Eckerson
For Wells, the framework breaks down as follows:
Goals: Why the data is governed.
Methods: How the data is governed.
People: Who governs the data.
Processes: The series of actions taken to achieve specific results.
Technology: The features and functions that fill many roles in data management.
Culture: The environment where data governance, agile projects, and self-service data analysis and reporting all work together in harmony.
Data governance involves monitoring and measuring to identify improvements. It ensures that only the right people have the right access to the right (and often highly sensitive) data. Implementing a robust and reliable data governance framework allows insurers to conduct regular audits and monitor compliance, which enhances overall operational efficiency. A robust framework also enables effective risk management by identifying and mitigating data-related risks, helping insurers manage credit, operational, and compliance risks.
Leveraging Data as an Asset
The brutal reality of the insurance industry is, if you can’t manage risk in the short term, you won’t be in business for the long term. Perhaps this is why the insurance industry embraced analytics early on: it understood the existential stakes inherent in the business of managing risk. It’s been a good marriage. High-quality data is critical for accurate underwriting, claims processing, and risk assessment. Governance frameworks help maintain data accuracy, consistency, and reliability, and they uncover risks associated with data breaches or misuse.
Well-governed data can streamline processes, reduce redundancies, and improve decision-making, leading to better customer service and operational efficiency. Effective data governance enhances the ability to leverage data analytics for business insights, helping insurers better tailor customer products and improve risk assessments.
Creating a successful data governance framework is crucial for insurance companies trying to ensure data integrity, security, and usability. Key components of an effective framework include data stewardship, data quality management, data validation processes, and compliance monitoring. These frameworks also define clear roles and responsibilities, establishing accountability for overall data quality and compliance. By leveraging data-driven insights, insurers have been able to improve risk assessment and deliver better services to their customers.
Arguably, no industry is more reliant on data than insurance, which makes data governance more important here than in just about any other industry in the world. It’s a truism as old as the industry itself, which has been around since Biblical times: if you can’t manage risk in the risk management business, you’ll soon be out of business.