Welcome to the MedMetrics Blog

The MedMetrics blog provides commentary and insight on the world of Workers’ Compensation, principally its medically related issues. The blog offers viewpoints on matters affecting the industry, written by people with long experience in it. Our intent is to offer additional context, perspective, and, we hope, inspiration to our readers.

Tuesday, January 9, 2018

Avoid Analytic Death by Bad Data

by Karen Wolfe

“The promise of technology is not yet at pace with the reality of the data, plagued by continued issues of cleanliness, connectivity, and availability.”[1] No analytic method, however sophisticated, will generate value unless the data it runs on is clean, complete, and integrated. Unfortunately, in Workers’ Compensation today, the data is too often none of those things.

Analyzing faulty data is costly and misleading; moreover, it sabotages every analytic initiative and outcome going forward. True value can never be achieved from analytics executed on bad data. However elaborate or elegant the analytic initiative, its output can never be trusted unless data quality is addressed first.

Unfortunately, accurate and complete data sets are rare, if not non-existent, in today’s Workers’ Comp world. Whether the data is manually key-entered or transmitted to the organization in digital form, it arrives replete with errors, duplicates, and omissions.

Data quality precedes analytics
Before engaging in any form of analytics designed to develop business insights and efficiency solutions, Workers’ Comp organizations must begin by addressing data quality. It is the first step in any analytic endeavor and leads to far greater satisfaction with the results. Every effort should be made to correct faulty data and to maintain its quality through automated technology wherever possible.

Consider the grocery industry. All pricing is computerized, side-stepping human data entry. Dollar amounts for items are entered automatically and adjusted frequently for changes in wholesale prices or sales promotions. Because accuracy is critical, personnel at the register never enter dollar amounts. Instead, they enter a code for the item, and the system looks up that code and presents the correct price.

At the supermarket corporate level, systems read bar codes or QR codes on individual items as they arrive from vendors and automatically record the wholesale price. Markups to retail are calculated automatically as a percentage for that vendor and item, and pricing changes are made by selecting a percentage adjustment from a list. The opportunity for human error is virtually eliminated. The question is: how can that level of automated accuracy be transferred to the Workers’ Comp industry?
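
To make the pattern concrete, here is a minimal sketch of code-based pricing. The item codes, prices, and markup percentages are hypothetical; the point is that the only human input is a code, and every dollar amount is looked up or derived:

```python
# A sketch of code-based pricing: the only human input is an item code;
# every dollar amount is looked up or derived, never hand-keyed.
# All codes, prices, and markup percentages here are hypothetical.

WHOLESALE_PRICE = {
    "1042": 1.25,  # loaded automatically when the item is received
    "2817": 3.10,
}
MARKUP_PERCENT = {
    "1042": 30.0,  # retail markup set per vendor and item
    "2817": 45.0,
}

def retail_price(item_code: str) -> float:
    """Derive the retail price for an item code; reject unknown codes."""
    if item_code not in WHOLESALE_PRICE:
        raise KeyError(f"Unknown item code: {item_code}")
    wholesale = WHOLESALE_PRICE[item_code]
    return round(wholesale * (1 + MARKUP_PERCENT[item_code] / 100), 2)

print(retail_price("1042"))  # 1.62, computed rather than typed by a cashier
```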

Productivity tools
While the degree of technical automation enjoyed by supermarkets is not yet practical in the Workers’ Comp claims process, some of its principles can be adapted and applied to ensure better data. A system designed to avoid manual data entry is the first step. Wherever possible, provide a pop-up list from which to choose data items rather than typing them in free-form. Develop simple pick lists so the data are always entered the same way and spelled correctly.
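
A minimal sketch of pick-list-constrained entry, using a hypothetical field and allowed values rather than any actual claims system:

```python
# A sketch of pick-list-constrained entry: free text is rejected, so a
# field can only ever hold one of a fixed set of identically spelled
# values. The field and its allowed values are hypothetical examples.

BODY_PART_PICK_LIST = {"Shoulder", "Knee", "Lower back", "Wrist"}

def enter_body_part(value: str) -> str:
    """Accept a value only if it appears on the pick list."""
    if value not in BODY_PART_PICK_LIST:
        raise ValueError(
            f"'{value}' is not on the pick list; "
            f"choose one of {sorted(BODY_PART_PICK_LIST)}"
        )
    return value

enter_body_part("Knee")    # stored as 'Knee', identically every time
# enter_body_part("knee")  # raises ValueError: not an approved spelling
```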

These pick lists should also include standard formats for abbreviations: “Suite” is never “Ste”, which avoids duplicate records for the same entity. Go even deeper and create pick lists for adjuster and nurse notes that standardize the documentation of initiatives and establish measurable outcomes.
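
The same idea can be enforced in software by normalizing abbreviations before a record is saved. A minimal sketch, with a small illustrative mapping rather than a complete standard:

```python
# A sketch of abbreviation standardization for addresses, so the same
# location can never be stored two ways and spawn duplicate records.
# The mapping is a small illustrative subset, not a complete standard.

ABBREVIATIONS = {
    "Ste": "Suite", "Ste.": "Suite",
    "Rd": "Road", "Rd.": "Road",
    "Blvd": "Boulevard", "Blvd.": "Boulevard",
}

def normalize_address(address: str) -> str:
    """Replace known abbreviations with their standard long forms."""
    return " ".join(ABBREVIATIONS.get(word, word) for word in address.split())

a = normalize_address("100 Main Blvd Ste 4")
b = normalize_address("100 Main Boulevard Suite 4")
assert a == b  # one canonical form, so no duplicate provider record
```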

The ability to add, change, or delete items on these lists should be limited to the supervisory level, where accuracy is maintained and accountability is monitored. This approach is immediately suited to, and urgently needed for, keeping medical provider demographic records valid and serviceable for medical provider performance analytics.
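
In code, that restriction can be as simple as a role check on every list change. A minimal sketch, with hypothetical roles and user names:

```python
# A sketch of supervisor-only list maintenance: pick-list changes are
# accepted only from users holding a supervisor role, so every change
# is both controlled and attributable. Roles and names are hypothetical.

SUPERVISORS = {"jsmith"}  # users permitted to maintain pick lists

def update_pick_list(pick_list: set, user: str, action: str, item: str) -> None:
    """Add or remove a pick-list item, permitted to supervisors only."""
    if user not in SUPERVISORS:
        raise PermissionError(f"{user} is not authorized to modify pick lists")
    if action == "add":
        pick_list.add(item)
    elif action == "remove":
        pick_list.discard(item)
    else:
        raise ValueError(f"Unknown action: {action}")

specialties = {"Orthopedics", "Neurology"}
update_pick_list(specialties, "jsmith", "add", "Physiatry")  # allowed
# update_pick_list(specialties, "intern1", "add", "X")       # PermissionError
```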

Create data quality teams
Provide incentives or rewards for those who find errors and omissions in the data, and set up competitive teams to uncover bad data. Because data is often transmitted to an organization digitally, errors and omissions may already be present on arrival. Use interns or trainees to scrutinize the data rather than accepting it “as is”.
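
An automated scan can give such a team a starting point by flagging obvious problems in an incoming feed for human review. A minimal sketch, with hypothetical field names and sample records:

```python
# A sketch of the kind of automated scan a data quality team might run
# on an incoming digital feed instead of accepting it "as is": it flags
# missing required fields and duplicate IDs for a human reviewer.
# Field names and the sample feed are hypothetical.

REQUIRED_FIELDS = ("claim_id", "provider_npi", "date_of_service")

def scan_records(records):
    """Return human-readable findings: omissions and duplicate claim IDs."""
    findings, seen_ids = [], set()
    for i, rec in enumerate(records):
        for field in REQUIRED_FIELDS:
            if not rec.get(field):
                findings.append(f"Record {i}: missing {field}")
        claim_id = rec.get("claim_id")
        if claim_id and claim_id in seen_ids:
            findings.append(f"Record {i}: duplicate claim_id {claim_id}")
        seen_ids.add(claim_id)
    return findings

feed = [
    {"claim_id": "A100", "provider_npi": "1234567890", "date_of_service": "2018-01-02"},
    {"claim_id": "A100", "provider_npi": "", "date_of_service": "2018-01-03"},
]
for finding in scan_records(feed):
    print(finding)  # flags the missing NPI and the duplicate claim ID
```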

Create a team that monitors data quality, introduces changes as appropriate, and rewards personnel for accurate performance. Not so long ago, in the days of typewriters, typists were measured on speed and accuracy, and their jobs depended on good performance. What happened to those standards of excellence?

Priority shift
Elevate the importance of good data in the organization. Create an automated audit log of user performance and include each worker’s data accuracy as an element in performance evaluations. Workers will not value data quality if the organization does not make it a priority.
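
A minimal sketch of what such an audit log and accuracy measure might look like; the schema and metric are hypothetical illustrations, not a prescribed design:

```python
# A sketch of an automated audit log feeding performance evaluation:
# every entry records who entered what, and an accuracy measure is
# derived from entries later flagged as errors during audits.
# The schema and metric are hypothetical illustrations.

from datetime import datetime, timezone

audit_log = []  # in practice, an append-only database table

def log_entry(user, field, value):
    """Record a data-entry event; auditors may later flag it as an error."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "field": field,
        "value": value,
        "flagged_error": False,  # set True when an audit finds a mistake
    }
    audit_log.append(entry)
    return entry

def accuracy(user):
    """Share of a user's logged entries never flagged as errors."""
    entries = [e for e in audit_log if e["user"] == user]
    if not entries:
        return 1.0
    return sum(not e["flagged_error"] for e in entries) / len(entries)

log_entry("adjuster1", "body_part", "Knee")
bad = log_entry("adjuster1", "provider_npi", "123")  # invalid NPI
bad["flagged_error"] = True
print(accuracy("adjuster1"))  # 0.5
```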

Cost of data quality
Prioritizing and maintaining data quality is not an insignificant shift in most organizations’ value systems, and it will generate additional costs. However, data quality is a prerequisite for trusted, analytics-informed insights, for efficiency, and for remaining competitive. Analytics-informed value is generated by good data; bad data is just costly.

Karen Wolfe is the founder and President of MedMetrics®, LLC, a Workers’ Compensation, predictive analytics-informed medical loss management and technical services company. MedMetrics offers intelligent medical management systems that link analytics to operations, thereby making insights actionable and the results measurable. karenwolfe@medmetrics.org


[1] Bieda, L. Big Idea: Competing With Data & Analytics Blog. September 27, 2017.
