Big Data

Last Updated 05/27/2021 

Issue: As insurers collect more granular data about insurance consumers, state insurance regulators need greater insight into what data is available to the industry, how it is being used, and whether it should be used by insurers. Big data refers to large, complex volumes of data and the set of technologies used to analyze and manage them. While the use of big data can aid insurers’ underwriting, rating, marketing, and claim settlement practices, the challenge for insurance regulators is to determine whether its use benefits or harms consumers. Additional consumer concerns include how collected data is safeguarded and how consumer privacy is maintained. Furthermore, state insurance regulators may need to collect data beyond the financial and market conduct data collected today to gain greater insight into insurers’ models and further enhance regulation. 

Background: The digital revolution has allowed for the collection and storage of large and diverse amounts of information. This data is referred to as big data because it is too complex for traditional data processing techniques. For insurance purposes, big data refers to unstructured and/or structured data being used to influence underwriting, rating, pricing, forms, marketing and claims handling. Structured data refers to data organized in tables with defined fields. Unstructured data includes social media postings, reports, recorded interviews, and imagery such as satellite photos. Predictive analytics allows insurers to use big data to forecast future events. The process uses a number of techniques—including data mining, statistical modeling, machine learning and, in some cases, narrow artificial intelligence—in its forecasts.  
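To make the idea of predictive analytics concrete, the sketch below shows a deliberately simplified statistical model: estimating claim frequency from telematics-style driving data. The records, field names, and the braking threshold are all hypothetical illustrations, not actual insurer data or methodology; real predictive models use far richer data and techniques such as those named above.

```python
# Hypothetical historical policy records (illustrative only):
# (annual_mileage, hard_brakes_per_100_miles, filed_claim: 1 = yes, 0 = no)
HISTORY = [
    (8_000, 1.2, 0), (15_000, 4.5, 1), (12_000, 2.0, 0),
    (20_000, 6.1, 1), (6_000, 0.8, 0), (18_000, 5.0, 1),
]

def fit_claim_rate_by_bucket(records, brake_threshold=3.0):
    """Crude statistical model: compute the empirical claim frequency for
    low- vs. high-braking drivers. A stand-in for a full predictive model."""
    buckets = {"low": [0, 0], "high": [0, 0]}  # bucket -> [claims, policies]
    for _mileage, brakes, claim in records:
        key = "high" if brakes >= brake_threshold else "low"
        buckets[key][0] += claim
        buckets[key][1] += 1
    # Observed claim frequency per bucket; a rater might use this to
    # justify a discount for the lower-risk bucket.
    return {k: claims / n for k, (claims, n) in buckets.items()}

rates = fit_claim_rate_by_bucket(HISTORY)
```

In this toy dataset every high-braking driver filed a claim and no low-braking driver did, so the two buckets separate cleanly; real data is noisier, which is why insurers layer machine learning and actuarial judgment on top of simple frequency counts like this.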

Insurers use big data in a number of ways. Insurers can use it to: 

  • More accurately underwrite, price risk and incentivize risk reduction. Telematics, for example, allows insurers to collect real-time driver behavior and usage data to provide premium discounts and usage-based insurance. 

  • Enrich customer experience by quickly resolving service issues. 

  • Improve marketing effectiveness by tailoring products to individual preferences. 

  • Create operating efficiencies by streamlining the application process. An example of this is a pre-filled homeowners application. 

  • Facilitate better claims processing by applying machine learning algorithms to outcomes. 

  • Reduce fraud through better identification techniques. For example, text analytics can identify potential "red flag" trends across adjusters' reports. 

  • Improve solvency through the ability to more accurately assess risk. 
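The fraud-reduction bullet above mentions text analytics that surface "red flag" trends across adjusters' reports. A minimal sketch of that idea is simple phrase matching over report text, shown below; the sample reports, flag phrases, and threshold are invented for illustration and are not an actual fraud-detection lexicon.

```python
# Hypothetical adjusters' report snippets (illustrative only).
REPORTS = [
    "Claimant reported whiplash; no police report filed.",
    "Vehicle total loss; receipts provided; police report attached.",
    "Late-night incident, no witnesses, no police report, prior claim history.",
]

# Assumed red-flag phrases, purely for demonstration.
RED_FLAGS = ("no police report", "no witnesses", "prior claim")

def flag_reports(reports, phrases=RED_FLAGS, min_hits=2):
    """Count red-flag phrases in each report and flag reports that meet
    the threshold. Returns a list of (report_index, hit_count) pairs."""
    flagged = []
    for i, text in enumerate(reports):
        lowered = text.lower()
        hits = sum(phrase in lowered for phrase in phrases)
        if hits >= min_hits:
            flagged.append((i, hits))
    return flagged

suspicious = flag_reports(REPORTS)  # flags only the third report here
```

Production text-analytics systems use natural language processing rather than exact phrase matching, but the workflow is the same: score each report, then route high-scoring claims to investigators.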

According to Yes Magazine, the implementation of big data has resulted in 30% better access to insurance services, 40–70% cost savings, and 60% higher fraud detection rates. However, like all disruptive technologies, big data brings challenges. Concerns include: 

  • The complexity and volume of the data may present hurdles for smaller insurers. 

  • Strain on insurance regulatory resources needed to review complex rate filings. 

  • Lack of transparency and potential for bias in the algorithms used to synthesize big data. 

  • Highly individualized rates that lose the benefit of risk pooling. 

  • Collection of information that is sensitive to consumers' privacy or potentially discriminatory. 

  • Cyberthreats to stored data. 

Status: The age of big data brings both positive and negative impacts to society. The job of state insurance regulators is to ensure regulations and regulatory activities sufficiently protect consumers from harm. To assist with this, the NAIC created the Big Data (EX) Working Group, recently combined with the Artificial Intelligence (EX) Working Group of the Innovation and Technology (EX) Task Force. 

For 2021, the new working group is charged with researching the use of big data and artificial intelligence (AI) in the business of insurance and evaluating existing regulatory frameworks for overseeing and monitoring their use. As part of this, it will review current audit and certification processes and assess the data and tools state insurance regulators need to appropriately monitor the insurance marketplace. 

Additionally, the Casualty Actuarial and Statistical (C) Task Force has published the Regulatory Review of Predictive Models white paper, which identifies best practices for reviewing the predictive models and analytics insurers file to justify rates and provides guidance to states for reviewing these rate filings.