Data Dump

Using master data management, an insurer can turn a data junkyard into a mechanized marvel.

March 06, 2011
An insurer's store of data is one of its prime assets. Historically, this information has been locked away in the business units where the data originated. Customer information, policy and product data, preferred-vendor and business-partner records, and claim information are usually accessible only in the systems in which they originate and reside. That siloing can inhibit the company's ability to exploit the richness of the data and can limit its value in the evaluation of claims. One solution is master data management, or MDM: software and programs that make data assets available and usable enterprise-wide.

Insurers that have grown through acquisition compound the volume of data for which they are responsible. Post-merger integration plans typically account for the costs to integrate systems; however, they rarely account for the costs to integrate data from multiple sources and the analysis required to resolve differences in data models, business validation rules and data relationships.

Master Data Management
Master data are the core pieces of information needed to uniquely define the entities that are shared throughout the enterprise. They have a common set of characteristics, regardless of domain or industry: each record is uniquely identifiable and belongs to a distinct domain that reflects data important to running the business. For an insurer, those domains may include products, claims, customers, agents, employees or vendors.

Master data should be shareable across business functions and defined according to company business policies. Finally, master data must be consistent, accurate, highly available and of high quality in order to be used across the enterprise.

Master data management is the process of creating, managing and sharing such data across an enterprise. The ultimate goal of MDM is to create a single version of the truth so the business can be confident in the information it is using to make decisions. To achieve this goal, MDM requires convergence of processes, organization and technology to enable common and shared data definitions and to provide timely, accurate, accessible and complete data to all enterprise applications.

MDM unlocks data stored across the organization in various systems and formats (e.g., claims applications, policy admin systems, customer relationship management, self-service portals, and other servicing applications). For most insurers, there are likely several applications capturing data for each line of business sold. An insurer selling both commercial workers' compensation and personal auto insurance is unlikely to have a single application that captures both types of policies. However, the information captured in these applications may be similar and helpful to other business functions, such as first notice of loss (FNOL) or special investigations.
In claims processing, MDM can lead to:
  • a reduction in data input errors during FNOL and claim setup;
  • a consistent view of the customer's claim and interaction history throughout the claim lifecycle;
  • an improved ability for customer self-service through the claim adjudication process; and
  • an efficient approach to analyzing claim history across an insurer's book of business.

Today's insurance customers demand an improved and consistent experience across every channel and interaction with their insurer. These expectations benefit insurers that maintain a single view of the customer based on the customer's products, claims and relationships with the insurer. MDM applications, specifically those focused on customer data integration, relate customer records from various sources to provide a complete and comprehensive customer view.
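
To make the idea concrete, the sketch below shows one simple way such consolidation might work, building a single "golden record" from two source-system entries. The source names, field names and the precedence rule (latest non-empty value wins) are illustrative assumptions, not a description of any particular MDM product.

    # Illustrative sketch of consolidating customer attributes from multiple
    # source systems into one "golden record." Source names, fields and the
    # precedence rule are assumptions for this example.
    from datetime import date

    source_records = [
        {"source": "policy_admin", "updated": date(2010, 6, 1),
         "name": "Jane Smith", "email": None, "phone": "555-0142"},
        {"source": "claims", "updated": date(2011, 1, 15),
         "name": "Jane Smith", "email": "jane.smith@example.com", "phone": None},
    ]

    def build_golden_record(records):
        """For each attribute, keep the most recently updated non-empty value."""
        golden = {}
        for rec in sorted(records, key=lambda r: r["updated"]):
            for field, value in rec.items():
                if field in ("source", "updated"):
                    continue
                if value:  # later, non-empty values overwrite earlier ones
                    golden[field] = value
        return golden

    print(build_golden_record(source_records))
    # {'name': 'Jane Smith', 'phone': '555-0142', 'email': 'jane.smith@example.com'}

A registry-style MDM hub would hold only the keys linking such source records, while a consolidated hub would persist the merged record itself; the choice between the two styles is discussed later in this article.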

Historically, policy administration systems have allowed users to input more than one customer in unstructured data fields. If a single data field contains multiple policyholders (e.g., "Mr. John V. Smith and Mrs. Samantha Browne-Smith"), then automated policy verification during claim intake can fail. Automated front-end parsing of customer name information at the time of data entry would separate the above example entry into two distinct customer records associated with the policy. As a result, the insurer would find more claims eligible for straight-through processing, reduce manual work required on claim intake, and improve downstream claims analysis on its customer base for future underwriting and risk management.
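
As a rough illustration, a first pass at that kind of front-end parsing might look like the sketch below. The splitting rule and field layout are simplifying assumptions; a production system would rely on a dedicated name-parsing component.

    import re

    # Honorifics recognized when splitting a combined named-insured field.
    # Illustrative only; real parsers handle far more variations.
    HONORIFICS = r"(?:Mr|Mrs|Ms|Dr)\.?"

    def split_named_insureds(raw):
        """Split a free-text policyholder field into one record per person."""
        parts = re.split(r"\s+(?:and|&)\s+", raw.strip(), flags=re.IGNORECASE)
        records = []
        for part in parts:
            match = re.match(rf"(?:(?P<prefix>{HONORIFICS})\s+)?(?P<name>.+)", part)
            tokens = match.group("name").split()
            records.append({
                "prefix": (match.group("prefix") or "").rstrip("."),
                "first_name": tokens[0],
                "middle_name": " ".join(tokens[1:-1]),
                "last_name": tokens[-1],
            })
        return records

    print(split_named_insureds("Mr. John V. Smith and Mrs. Samantha Browne-Smith"))
    # Two distinct customer records, each of which can be linked to the policy.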

Claims applications typically capture and store claim information locally, often replicating the data to an enterprise data warehouse. There is tremendous benefit in integrating that same claims information via an MDM tool and making it available across multiple channels. An agent able to see claim or loss data as part of a customer's consolidated interaction history can reinforce customer service and address the customer's needs at the point of sale.

How MDM Improves Data Quality
Ensuring quality and accuracy of master data, especially in the customer data domain, is a common challenge for insurers. Business processes typically address data quality through downstream audits and resolution of discrepancies in operational data stores. As a result, insurers have difficulty parsing, standardizing or evaluating data against pre-defined business rules at the point of capture, as well as identifying patterns for predictive modeling.

Current trends in proactive data quality management integrate technologies such as name and address standardization software into front-end business processes. When address information is added for an involved party on a claim, standardization software can validate it against USPS standards so that a verified, deliverable mailing address is stored. It can also identify placeholder values, or "dummy data" (such as a Social Security number of 999-99-9999 or a date of birth of 01/01/01). Data standardization software can cleanse, verify and standardize names and addresses.
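
A minimal illustration of that kind of placeholder check appears below. The specific values flagged and the field names are assumptions; commercial cleansing tools ship with far richer rule sets.

    # Placeholder ("dummy") values often keyed in when the real data is unknown.
    # Illustrative rules and assumed field names only, not any vendor's API.
    DUMMY_SSNS = {"999-99-9999", "000-00-0000", "123-45-6789"}
    DUMMY_DOBS = {"01/01/01", "01/01/1900"}

    def flag_dummy_values(party):
        """Return the fields on an involved-party record that look like dummy data."""
        flags = []
        if party.get("ssn") in DUMMY_SSNS:
            flags.append("ssn")
        if party.get("date_of_birth") in DUMMY_DOBS:
            flags.append("date_of_birth")
        return flags

    party = {"name": "John Smith", "ssn": "999-99-9999", "date_of_birth": "01/01/01"}
    print(flag_dummy_values(party))  # ['ssn', 'date_of_birth']

Flagging these values at intake lets the claim handler correct them while the caller is still on the line, rather than in a downstream audit.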

Data inaccuracies, especially with customer contact data, can affect an insurer's ability to comply with regulatory requirements for written communications to claimants. If a customer address is incorrect, it may be several weeks until the claims department is alerted to the error. Not only does this impact the customer experience, it also may put the insurer at risk for non-compliance.
Improved data quality management within the vendor data domain has the potential to provide operational efficiency and reduce allocated loss adjustment expense on individual claims. For example, claim handlers may key vendor data directly into the electronic claim file as free-form text. As a result, downstream systems are unable to accurately reconcile the aggregate amounts paid to individual vendors. Consider two claims:
  • Claim 1 is an auto claim that includes physical damage to the claimant's car. The body work is performed by "Joe's Auto Shop," and the vendor is recorded on the claim file under that name.
  • Claim 2 is a similar type of claim, and the body work is performed at the same shop. However, the claim handler records the vendor as "Joe Brown Auto Garage, LLP."
Without a master vendor list, these two vendors may appear in analytical reports as two separate vendors. The insurer will be unable to negotiate potential pricing discounts because it cannot relate spending across the two seemingly unrelated records. An MDM solution would use other available information about the vendor (address, phone number, taxpayer identification number) to connect the records and present a more complete view of the insurer-vendor relationship. The master vendor data can also be pushed upstream in the claims process by allowing claim handlers to search and retrieve the best-known vendor data and associate it with the claim. This not only reduces the time to enter data, but also increases the claim handler's efficiency.
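
A simplified sketch of that matching step follows. The master vendor list, field names and the exact-match rule on taxpayer ID or phone number are assumptions; commercial MDM hubs typically apply probabilistic matching across many more attributes.

    # Hypothetical master vendor list keyed by an internal master ID.
    MASTER_VENDORS = [
        {"master_id": "V-1001", "name": "Joe's Auto Shop",
         "tin": "12-3456789", "phone": "555-0199"},
    ]

    def match_vendor(claim_vendor):
        """Return the master ID of a vendor whose TIN or phone matches the claim entry."""
        for master in MASTER_VENDORS:
            if claim_vendor.get("tin") == master["tin"] or \
               claim_vendor.get("phone") == master["phone"]:
                return master["master_id"]
        return None  # no match: candidate for a new master record or manual review

    # Two differently keyed claim entries resolve to the same master vendor.
    claim_1 = {"name": "Joe's Auto Shop", "tin": "12-3456789"}
    claim_2 = {"name": "Joe Brown Auto Garage, LLP", "phone": "555-0199"}
    print(match_vendor(claim_1), match_vendor(claim_2))  # V-1001 V-1001

Aggregating payments by the returned master ID then produces the spend-by-vendor view needed to support pricing negotiations.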

The Keys to Success
Planning and executing a successful master data management initiative requires process, organization and technology changes. According to Gartner Predicts 2011, "Through 2015, 66% of organizations that initiate an MDM program will struggle to demonstrate the business value of MDM." Governance across all three areas is therefore a prevalent theme, and it relies on both business and technology ownership to manage and integrate the data across domains.

The first step is to design an organizational structure that satisfies enterprise requirements. Insurers should identify an owner for master data assets and for each domain across business functions, then develop a data stewardship program to enforce consistency and reusability for data delivery and use. This can be accomplished by defining roles and responsibilities to be performed by data architecture teams to help ensure adherence to standards across the enterprise over time.

The next step is to create common and shared processes for defining, administering and governing master data across the organization. Establish governance processes and decision-making guidelines for master data to manage both implementation and maintenance. Identify and inventory sources, producers and consumers of master data, documenting how they use information.
Finally, insurers should look at their technology needs to create, administer and share global data resources. This includes software to help establish a single version of master data containing uniquely identified records from various business and information domains. Other tools to manage supporting data (hierarchies, metadata, and reference data) across the enterprise are also needed. As an insurer considers data usage, it should develop integration platforms to support the movement of master data between operational systems and across external entities.

The success of MDM programs hinges on strong senior executive support. Insurers attempting an MDM program must lay out both strategic elements, such as how the program supports customer intimacy, customer centricity and increased share of wallet, and tactical elements, such as improving day-to-day data management processes, increasing operational efficiency, and reducing costs through data accuracy and reuse. Without clear goals, the enterprise may not align with the program's objectives, and organizational resistance to change will limit the program's benefits.

The technology marketplace for MDM solutions is growing as vendors mature their product offerings and new vendors enter the market with expanding solutions. Due to the growing demand for MDM, the current trend of solution providers is to acquire niche products and integrate them into larger MDM software suites. Informatica's purchase of MDM vendor Siperian, IBM's acquisition of Initiate Systems and TIBCO's acquisition of data-matching vendor Netrics are indicative of the expected growth and activity in this space. For these and other vendors, products range from consolidated models that centralize all data to lightweight registry-style MDM solutions that manage relationships to federated data across the enterprise. An insurer's product selection will vary based on its objectives and data management goals.

Data management can be done piecemeal, but insurers that undertake master data management across the full gamut of organizational structure, data governance processes and key technology enablers will realize far greater returns on the investment.
Joshua Schwartz is Director, PwC's Diamond Advisory Services. (845) 893-7327, joshua.schwartz@us.pwc.com