Last Updated 03/27/2020
Issue: As insurers collect more granular data about insurance consumers, state insurance regulators need greater insight into what data is available to the industry, how it is being used, and whether insurers should be using it. Big data refers to large, complex volumes of data and the set of technologies used to analyze and manage them. While the use of big data can aid insurers' underwriting, rating, marketing, and claim settlement practices, the challenge for insurance regulators is to examine whether that use benefits or harms consumers. Additional consumer concerns include how collected data is safeguarded and how consumer privacy is maintained. Furthermore, state insurance regulators may need to collect data beyond the financial and market conduct data gathered today to gain greater insight into insurers' models and further enhance regulation.
Background: The digital revolution has allowed for the collection and storage of large and diverse amounts of information. This data is referred to as big data because it is too complex for traditional data processing techniques. For insurance purposes, big data refers to unstructured and/or structured data being used to influence underwriting, rating, pricing, forms, marketing and claims handling. Structured data refers to data in tables and defined fields. Unstructured data includes sources such as social media postings, typed reports and recorded interviews. Predictive analytics allows insurers to use big data to forecast future events. The process uses a number of techniques—including data mining, statistical modeling and machine learning—in its forecasts. According to SNS Telecom & IT, insurance companies invested $2.4 billion in big data technologies, a figure expected to increase to $3.6 billion by 2021.
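At its simplest, the predictive analytics described above means fitting a statistical model to historical data and projecting it forward. The sketch below illustrates the idea with an ordinary least-squares trend fit over hypothetical claim-rate data; all figures and the forecasting approach are illustrative assumptions, not an actual insurer model.

```python
# Illustrative sketch only: fit a simple least-squares trend line to
# historical claim frequencies and forecast the next period.
# All figures are hypothetical.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (single predictor)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

# Hypothetical observations: claims per 1,000 policies over five years.
years = [2015, 2016, 2017, 2018, 2019]
claim_rates = [42.0, 44.5, 43.8, 46.2, 47.1]

a, b = fit_line(years, claim_rates)
forecast_2020 = a + b * 2020
print(f"Forecast 2020 claim rate: {forecast_2020:.1f} per 1,000 policies")
```

Real predictive models are far richer (generalized linear models, gradient boosting, neural networks), but the regulatory questions raised later in this article—transparency, bias, reviewability—apply to this same fit-then-forecast structure.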
Insurers use big data in a number of ways. Insurers can use it to:
- More accurately underwrite, price risk and incentivize risk reduction. Telematics, for example, allows insurers to collect real-time driver behavior data and combine it with premium and loss data to provide premium discounts.
- Enrich customer experience by quickly resolving service issues.
- Improve marketing effectiveness by tailoring products to individual preferences.
- Create operating efficiencies by streamlining the application process. An example of this is a pre-filled homeowners application.
- Facilitate better claims processing by applying machine learning algorithms to outcomes.
- Reduce fraud through better identification techniques. For example, text analytics can identify potential "red flag" trends across adjusters' reports.
- Improve solvency through the ability to more accurately assess risk.
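The fraud-reduction bullet above can be made concrete with a toy version of "red flag" text analytics: scan adjuster report text for fraud-indicator phrases and flag reports that accumulate too many. The term list, report text, and threshold below are hypothetical; production systems use much more sophisticated natural language processing.

```python
# Hedged sketch of "red flag" text analytics across adjusters' reports.
# Terms, threshold, and report text are hypothetical examples.

RED_FLAG_TERMS = {"no police report", "cash only", "prior claim",
                  "inconsistent", "no witnesses"}

def flag_report(text, threshold=2):
    """Return (flagged?, matched terms) for one adjuster report."""
    text = text.lower()
    hits = [term for term in RED_FLAG_TERMS if term in text]
    return len(hits) >= threshold, sorted(hits)

report = ("Claimant was inconsistent about the date of loss, "
          "filed no police report, and requested cash only settlement.")
flagged, hits = flag_report(report)
print(flagged, hits)
```

Even this crude keyword approach shows why regulators care about the inputs: whatever terms (or learned features) drive the flag directly determine which consumers receive extra scrutiny.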
Big data has tremendous potential to positively affect insurers and consumers. According to Yes Magazine, the implementation of big data has resulted in 30% better access to insurance services, 40-70% cost savings, and 60% higher fraud detection rates. However, all disruptive technologies bring challenges. Big data concerns include:
- Complexity and volume of data may present hurdles for smaller-sized insurers.
- Strain on insurance regulatory resources needed to review complex rate filings.
- Lack of transparency and potential for bias in the algorithms used to synthesize big data.
- Highly individualized rates that lose the benefit of risk pooling.
- Collection of information that is sensitive to consumers' privacy or potentially discriminatory.
- Cyberthreats to stored data.
Status: The age of big data brings both positive and negative impacts to society. The job of state insurance regulators is to ensure regulations and regulatory activities sufficiently protect consumers from harm. To assist with this, the NAIC created the Big Data (EX) Working Group of the Innovation and Technology (EX) Task Force.
For 2020, the working group is charged to: 1) review current regulatory frameworks used to oversee insurers' use of consumer and non-insurance data; 2) propose a mechanism to provide resources and allow the states to share resources to facilitate their ability to conduct technical analysis of, and data collection related to, the review of complex models used by insurers for underwriting, rating and claims; and 3) assess data needs and required tools for state insurance regulators to appropriately monitor the marketplace.
Additionally, the Casualty Actuarial and Statistical (C) Task Force is working to 1) propose revisions to the Product Filing Review Handbook to include best practices for review of predictive models and analytics filed by insurers to justify rates; 2) draft and propose state guidance for rate filings based on complex predictive models; and 3) facilitate training through predictive analytics webinars. The Task Force continues to work on its draft white paper on best practices for the regulatory review of predictive analytics and is reviewing comments from 11 interested parties.
Ethical Considerations of Big Data Analytics
June 2018, NAIC Summit Presentation
Big Data Analytics: Changing the Calculus of Insurance
November 2017, CIPR Newsletter
How Artificial Intelligence is Changing the Insurance Industry
August 2017, CIPR Newsletter
The Year Before Us: Perspectives from NAIC President Ted Nickel
March 2017, CIPR Newsletter