Data Quality Management

The Solvency II Directive introduces strict requirements for data quality. Insurance companies will need to put processes in place to manage data quality and comply with the Directive.

Solvency II is a fundamental review of the capital adequacy regime for the European insurance industry. Its aim is to ensure the financial soundness of insurance companies, not only to protect policyholders’ interests but also to increase competition in the EU insurance market. Preparing to meet Solvency II requirements is one of the highest priorities for insurance companies in Europe. Data management activities represent the largest portion of the work involved in Solvency II projects, with data quality playing a prominent role.

Without a doubt, having good quality data is an essential prerequisite to calculating technical provisions correctly. The implementation measures of Solvency II have been conceived to establish guidance for a consistent approach to data quality across Pillar 1 to support an accurate calculation of the Solvency Capital Requirement (SCR) and the Minimum Capital Requirement (MCR).

[Figure: Modeling systems characterized by multiple sub-models]

The European Insurance and Occupational Pensions Authority (EIOPA) introduces specific data quality requirements and recognizes data quality as a critical aspect of sound risk management practices and calculations. For many insurers, current data governance, data quality and monitoring capabilities are not yet sufficient to comply with these new regulations, and this must change. The more reliable the data, the more reliable the model output will be. This leads to more confident forecasts based on internal modeling and, ultimately, a better risk management decision-making process, avoidance of regulation-imposed capital add-ons and an increased return on capital.

The guidelines state that the "quality" of this data must be assessed against three criteria (a minimal illustrative sketch of such checks follows the list):

  • Appropriateness - the data needs to be appropriate to the specific risks that are being assessed and to the calculation of the capital requirements to cover these risks. For example, data used in the management of risk in life insurance might not be appropriate for home insurance or car insurance.
  • Completeness - all of the main risk groups in the company’s portfolio of liabilities must be included in the data. The data must also be sufficiently detailed and must include both historical and current information to enable analysis of trends in the behaviour of the underlying risks.
  • Accuracy - the data must be free from errors and omissions, and must be collected and stored in a timely and consistent manner. The business needs to be confident in the data, and to demonstrate this confidence by using it in operational decision-making.
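
To make these criteria concrete, the short sketch below shows how automated checks might look against a policy-level extract. The column names, the pandas-based approach and the plausibility rules are illustrative assumptions, not a prescribed implementation.

```python
from datetime import date

import pandas as pd

# Hypothetical policy-level extract; column names are illustrative only.
policies = pd.DataFrame({
    "policy_id": ["P001", "P002", "P003"],
    "line_of_business": ["life", "life", "motor"],
    "sum_insured": [100_000.0, 250_000.0, None],
    "valuation_date": [date(2023, 12, 31)] * 3,
})

def check_appropriateness(df: pd.DataFrame, expected_lob: str) -> pd.Series:
    """Flag records from a different line of business than the model they feed."""
    return df["line_of_business"] != expected_lob

def check_completeness(df: pd.DataFrame, required: list[str]) -> pd.Series:
    """Flag records with missing values in fields the calculation needs."""
    return df[required].isna().any(axis=1)

def check_accuracy(df: pd.DataFrame) -> pd.Series:
    """Flag implausible values, e.g. missing or non-positive sums insured."""
    return df["sum_insured"].fillna(0) <= 0

report = pd.DataFrame({
    "inappropriate": check_appropriateness(policies, expected_lob="life"),
    "incomplete": check_completeness(policies, ["sum_insured", "valuation_date"]),
    "inaccurate": check_accuracy(policies),
})
print(report)
```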

Our solution: A data directory mapped to your Solvency model

As an actuarial and IT consultancy with more than 20 years of experience in the insurance market, Brainmize has both the technical skills and the industry knowledge to help you meet your obligations under Solvency II. As one of our consultancy services to help you meet the Solvency II requirements:

We can deliver a clearly structured and fully automated data directory.

Take a big step towards meeting your Solvency II requirements with a data directory. Our solution automates the storing and profiling of data, giving you and the regulators a complete view of your data lineage from the Solvency model through to the data sources, and ensuring that your risk calculations are both credible and robust. We determine the mapping between the source and the model to provide full transparency in data flows throughout the organization.
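
As an illustration of what such a mapping can look like, the sketch below represents the data directory as lineage entries linking each model input back to its source system and field. All system, field and function names are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LineageEntry:
    """One entry in the data directory: a model input traced back to its source."""
    model_input: str        # field consumed by the Solvency model
    source_system: str      # system of record the value originates from
    source_field: str       # field name in the source system
    transformation: str     # description of any transformation applied

# Hypothetical directory entries for a handful of model inputs.
data_directory = [
    LineageEntry("claims_paid", "claims_admin", "CLM_PAID_AMT", "sum per policy year"),
    LineageEntry("sum_insured", "policy_admin", "SUM_INS", "none"),
    LineageEntry("lapse_flag", "policy_admin", "STATUS_CD", "mapped to boolean"),
]

def trace(model_input: str) -> list[LineageEntry]:
    """Return the lineage entries feeding a given model input."""
    return [e for e in data_directory if e.model_input == model_input]

for entry in trace("claims_paid"):
    print(f"{entry.model_input} <- {entry.source_system}.{entry.source_field} ({entry.transformation})")
```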

Our solution: Providing software and actuarial services to help you increase the quality of your processes

Data management is key for the successful implementation of Solvency II, and insurers must take an integrated approach that involves data governance, data quality processes and data management technology.

Brainmize solves the problem by providing software and actuarial services to help you in four areas:

1. The Data Quality Management Process

Data quality management is an iterative, ongoing process that is applied continuously as an organization introduces new data. Its four phases are described below:

  • Data definition - The Directive requires that the data describing the business undertaken be appropriate and complete. The data requirements should include a proper description of the individual data items and the relationships between them.
  • Data Quality Assessment - Data quality assessment involves validating the data according to three criteria: appropriateness, completeness, and accuracy.
  • Problem resolution - The problems identified during the data quality assessment are addressed in this phase. It is important to document data limitations and to historize the data, so that the remedies applied to deficient data can be justified to the supervisor.
  • Data Quality Monitoring - Data quality monitoring involves monitoring the performance of the associated IT systems, based on data quality performance indicators (a simple sketch of such indicators follows this list).
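
As a rough sketch of the monitoring phase, the example below computes simple data quality performance indicators over successive snapshots and flags breaches of an assumed tolerance. The indicator definitions, column names and threshold are assumptions for illustration.

```python
import pandas as pd

# Hypothetical monthly snapshots of the policy extract, with counts of issues found.
snapshots = pd.DataFrame({
    "snapshot_month": ["2023-10", "2023-11", "2023-12"],
    "records": [10_250, 10_310, 10_400],
    "records_with_missing_fields": [310, 210, 95],
    "records_failing_validation": [120, 80, 40],
})

# Simple data quality performance indicators per snapshot.
snapshots["completeness_ratio"] = 1 - snapshots["records_with_missing_fields"] / snapshots["records"]
snapshots["accuracy_ratio"] = 1 - snapshots["records_failing_validation"] / snapshots["records"]

# Flag snapshots that breach an (assumed) tolerance agreed with the actuarial function.
TOLERANCE = 0.98
snapshots["breach"] = (snapshots[["completeness_ratio", "accuracy_ratio"]] < TOLERANCE).any(axis=1)
print(snapshots[["snapshot_month", "completeness_ratio", "accuracy_ratio", "breach"]])
```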

2. Data Collection, Storage, and Processing

The data collection, storage, and processing should be conducted according to the following criteria:

  • Transparency - Refers to the fact that the logical connection between inputs and outputs should be clear, rather than a "black box".
  • Granularity - Refers to the level of detail at which data is accumulated in the insurance risk data repository. The finer the level of detail (ideally the policy level, possibly aggregated later via model points), the more robust the downstream calculation process will be (see the aggregation sketch after this list).
  • Accumulation of historical data - Historical “relevant data” should be stored and accumulated on a regular basis in order to evaluate certain technical provisions. 
  • Traceability - Refers to the important requirement that any update to the data must be properly documented and audited.
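
The granularity point can be illustrated with a short sketch: policy-level records are kept at the finest level in the repository and aggregated into model points only for the downstream calculation. The grouping keys and column names below are assumptions.

```python
import pandas as pd

# Hypothetical policy-level records kept at the finest granularity.
policies = pd.DataFrame({
    "policy_id": ["P001", "P002", "P003", "P004"],
    "product": ["term", "term", "endowment", "term"],
    "age_band": ["30-39", "30-39", "40-49", "30-39"],
    "sum_insured": [100_000, 150_000, 80_000, 120_000],
})

# Aggregate to model points (one row per product / age-band combination),
# while the policy-level data remains available in the risk data repository.
model_points = (
    policies
    .groupby(["product", "age_band"], as_index=False)
    .agg(policy_count=("policy_id", "size"), total_sum_insured=("sum_insured", "sum"))
)
print(model_points)
```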

3. Auditors and the Actuarial Function

Internal and external auditors will be in charge of auditing specific datasets, conducting a formal and systematic examination, employing techniques commonly adopted by audit professionals.

Conversely, the insurance company’s actuarial function does not have the responsibility to execute a formal audit on the data.

However, the actuarial function is required to review data quality by performing "informal examinations of selected datasets" in order to determine and confirm that the data is consistent with its purpose.

4. Identification and Management of Data Deficiencies

Many deficiencies result from low-quality data caused by singularities in the nature or the size of the portfolio. If a deficiency is due to the internal process of data collection, storage or data quality validation, responsibility is often attributed to the IT department or to the high cost of data collection and maintenance.

The original text stated that the insurance company should take immediate measures to remedy this situation, but in the final version "immediate" was changed to "appropriate", reflecting the fact that the fix might require significant IT effort, which should be scheduled according to priorities.

These adjustments must be justified and documented and should not overwrite the raw data. This means that there is a clear requirement for tracing and historizing data modifications.
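
One possible way to meet this requirement, sketched below under assumed field names, is to record every adjustment alongside the untouched raw value, together with its justification, author and timestamp.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DataItem:
    """A data value with its full adjustment history; the raw value is never overwritten."""
    key: str
    raw_value: float
    adjustments: list[dict] = field(default_factory=list)

    def adjust(self, new_value: float, justification: str, user: str) -> None:
        """Record an adjustment with its justification instead of replacing the raw value."""
        self.adjustments.append({
            "value": new_value,
            "justification": justification,
            "user": user,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })

    @property
    def current_value(self) -> float:
        """Latest adjusted value, falling back to the raw value."""
        return self.adjustments[-1]["value"] if self.adjustments else self.raw_value

item = DataItem(key="claims_paid/P001/2023", raw_value=1250.0)
item.adjust(1300.0, justification="late-reported claim added", user="analyst_a")
print(item.raw_value, item.current_value, len(item.adjustments))
```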