Welcome to the MedMetrics Blog

The MedMetrics blog provides commentary and insight on the world of Workers’ Compensation, principally on issues that are medically related. The blog offers viewpoints on issues affecting the industry, written by people with long experience in it. Our intent is to offer additional fabric, perspective, and, hopefully, inspiration to our readers.


Thursday, August 20, 2015

Seven Ways Your Data Can Hurt You



by Karen Wolfe

Your data could be your most valuable asset. Participants in the Workers’ Compensation industry have been collecting and storing data for decades. Big Data (meaning a great deal of data) is available, as are vast numbers of smaller data sets, yet few organizations analyze that data to improve processes and outcomes or to take action in a timely way.

Analytics (data analysis) is crucial to every business today for gaining meaningful insight into product and service quality and business profitability, and for measuring the value contributed. But how data is collected, analyzed, and reported must be examined to determine and realize its current and potential value. Attention to data and its processes is crucial to ensuring data is an asset, not a limitation. Begin by examining these seven ways data can hurt or help.

     1.  Data silos
Data silos are common in Workers’ Compensation. Individual data sets are used within organizations and by their vendors to document claim activity. Without interoperability (the ability of a system to work with other systems without special effort on the part of the user) or data integration, the silos naturally fragment the data, making it difficult to gain full understanding of the claim and its multiple issues. A comprehensive view of a claim includes all its associated data.
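
To make the idea concrete, here is a minimal sketch of that kind of integration, assuming each silo can export a file keyed by a shared claim number. The file names, column names, and the pandas-based approach are illustrative assumptions, not a description of any particular vendor's system.

    # A minimal sketch of pulling siloed claim data into one view, assuming
    # each source system can export a CSV keyed by a shared claim number.
    # File names and column names are hypothetical.
    import pandas as pd

    claims = pd.read_csv("claims_system.csv")      # core claim and adjuster data
    bills = pd.read_csv("bill_review.csv")         # medical bills and charges
    pharmacy = pd.read_csv("pharmacy_pbm.csv")     # prescription history

    # Left-join the vendor extracts onto the core claim record so every claim
    # carries its associated billing and pharmacy data in a single view.
    comprehensive = (
        claims
        .merge(bills, on="claim_number", how="left")
        .merge(pharmacy, on="claim_number", how="left")
    )

    comprehensive.to_csv("comprehensive_claim_view.csv", index=False)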


     2.  Unstructured data
Unstructured documentation in the form of notes leaves valuable information on the table. The notes sections of systems contain important information that cannot be readily tapped and integrated into business intelligence. The cure is to incorporate structured data elements, such as drop-down lists, that describe events, facts, and actions taken. Such data elements capture claim knowledge and can be monitored and measured.
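
As a hypothetical illustration of what such data elements might look like in a claim system (the event categories below are invented examples, not an industry code set), a drop-down selection maps to a fixed set of coded values that can later be counted and trended:

    # A minimal sketch of coded data elements replacing a free-text note.
    # The event categories are invented examples, not an industry standard.
    from dataclasses import dataclass
    from datetime import date
    from enum import Enum

    class ClaimEvent(Enum):
        INITIAL_CONTACT = "initial_contact"
        RTW_DISCUSSED = "return_to_work_discussed"
        SURGERY_SCHEDULED = "surgery_scheduled"
        ATTORNEY_INVOLVED = "attorney_involved"

    @dataclass
    class ClaimNoteEntry:
        claim_number: str
        event: ClaimEvent        # picked from a drop-down list, not typed free-form
        event_date: date
        comment: str = ""        # optional free text kept alongside the coded value

    entry = ClaimNoteEntry("WC-12345", ClaimEvent.RTW_DISCUSSED, date.today())
    print(entry.event.value)     # a structured value that can be monitored and measured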


     3.  Errors and omissions 
Manual data entry is tedious work and often results in skipped data fields and erroneous content. When users are unsure of what should be entered into a data field, they might make up the input or simply skip the task. Management has a responsibility to hold data-entry staff accountable for what they add to the system. It matters.

Errors and omissions can also occur when data is extracted by OCR (Optical Character Recognition), the recognition of printed or written text characters by a computer. OCR output should be reviewed regularly for accuracy and to confirm the full scope of content is being retrieved and added to the data set. Changing business needs may also create new data requirements.
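
One common safeguard, sketched here with purely illustrative field names and rules, is a validation pass that flags skipped fields and implausible values before a record, whether keyed by hand or produced by OCR, enters the data set:

    # A minimal sketch of field-level validation for an incoming claim record.
    # The required fields and plausibility rules are illustrative assumptions.
    REQUIRED_FIELDS = ["claim_number", "injury_date", "icd10_code", "jurisdiction"]

    def validate_record(record: dict) -> list[str]:
        """Return a list of problems found in one claim record."""
        problems = []
        for field in REQUIRED_FIELDS:
            if not record.get(field):
                problems.append(f"missing {field}")
        # Plausibility check: billed amounts should be positive numbers.
        amount = record.get("billed_amount")
        if amount is not None:
            try:
                if float(amount) <= 0:
                    problems.append("billed_amount not positive")
            except ValueError:
                problems.append("billed_amount not numeric")  # a common OCR artifact
        return problems

    record = {"claim_number": "WC-12345", "injury_date": "", "billed_amount": "3S0.00"}
    print(validate_record(record))
    # ['missing injury_date', 'missing icd10_code', 'missing jurisdiction',
    #  'billed_amount not numeric']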

     4.  Human factors
Besides manual data entry, other human factors affect data quality. One is intimidation by IT (Information Technology). The intimidation is usually unintended, but IT staff are often perceived that way. Remember that people in IT are not claims adjusters or case managers; the things that interest and concern them can be completely different, and they use different language to describe them.


People in the business units often have difficulty describing to IT what they need or want. When IT says a request will be difficult or time-consuming, the best response is to persist. Fulfilling such requests is IT's job, and emphasizing their complexity is often a way of guarding it.


     5.  Timeliness
Timeliness here does not mean analysis of historical data; it means prompt reporting of critical information found in current data. The data can often reveal important facts that can be reported automatically and acted upon quickly to minimize damage. Systems should continually monitor the data and report on it, thereby gaining workflow efficiencies. Time is of the essence.
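
A rough sketch of that kind of automated monitoring follows; the thresholds, field names, and the single hard-coded claim are assumptions standing in for a scheduled job running against live claim data:

    # A minimal sketch of rule-based monitoring over current claim data.
    # Thresholds, field names, and the hard-coded claim are hypothetical.
    from datetime import date

    def check_claim(claim: dict) -> list[str]:
        """Return the alerts triggered by one claim record."""
        alerts = []
        days_open = (date.today() - claim["injury_date"]).days
        if days_open > 90 and claim["status"] == "open" and not claim["rtw_plan"]:
            alerts.append("open more than 90 days with no return-to-work plan")
        if claim["opioid_med"] > 50:   # morphine-equivalent dose threshold
            alerts.append("opioid MED above 50")
        return alerts

    # In practice this would run on a schedule against the live data store;
    # here a single claim record stands in for that feed.
    claim = {
        "claim_number": "WC-12345",
        "injury_date": date(2015, 4, 1),
        "status": "open",
        "rtw_plan": None,
        "opioid_med": 80,
    }
    for alert in check_claim(claim):
        print(f"{claim['claim_number']}: {alert}")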


     6.  Data fraud
Fraud seems to find its way into Workers’ Compensation in many ways, even into its data. The most common data fraud is found in billing: overbilling, misrepresenting diagnoses to justify procedures, and duplicate billing are a few of the methods. Bill review companies endeavor to uncover these schemes.


Another, less obvious form of fraud occurs when a provider seeks anonymity through confusion by using multiple tax IDs or NPIs (National Provider Identifiers) for the same individual or group. The fraudulent provider obfuscates the data, thereby undermining analysis: the system treats the multiple identities as different providers and never captures the culprit.


The same result is achieved by the provider using different names and addresses on bills. Analysis of provider performance is made difficult or impossible when the provider cannot be accurately identified.
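
One countermeasure, sketched below with deliberately crude and purely illustrative normalization rules, is to group bills by a normalized provider name and address rather than by tax ID or NPI alone, so multiple identifiers for the same practice collapse into a single record for analysis:

    # A minimal sketch of collapsing multiple provider identities into one key.
    # The normalization is deliberately crude and purely illustrative.
    import re
    from collections import defaultdict

    def provider_key(name: str, address: str) -> str:
        """Build a rough matching key from name and address, ignoring case,
        punctuation, and common suffixes that re-registrations tend to vary."""
        text = f"{name} {address}".lower()
        text = re.sub(r"\b(llc|inc|pc|md|group)\b", "", text)
        return re.sub(r"[^a-z0-9]", "", text)

    bills = [
        {"npi": "1111111111", "name": "Smith Ortho Group", "address": "12 Main St."},
        {"npi": "2222222222", "name": "Smith Ortho, LLC", "address": "12 Main St"},
    ]

    # Group the NPIs seen on bills under each normalized provider key.
    by_provider = defaultdict(set)
    for bill in bills:
        by_provider[provider_key(bill["name"], bill["address"])].add(bill["npi"])

    for key, npis in by_provider.items():
        if len(npis) > 1:
            print(f"possible duplicate identities: NPIs {sorted(npis)}")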


     7.  Data as a work-in-process tool
Data can be used as a work-in-process tool for decision support, workflow analysis, quality measurement, and cost assessment, among other initiatives. Timely, actionable information can be applied to workflows and services to optimize quality performance and cost control.

Accurate and efficient claims data management is critical to quality, outcome, and cost management. When data accuracy and integrity are overlooked as a management responsibility, the data will hurt the organization.

Karen Wolfe is the founder and President of MedMetrics®, LLC, a Workers’ Compensation medical analytics and technology services company. MedMetrics analyzes the data and offers online apps that link analytics to operations, thereby making them actionable. MedMetrics analyzes data continuously and sends alerts as appropriate. MedMetrics also analyzes and scores medical provider performance. karenwolfe@medmetrics.org