“Data makes all the difference.” So says a white paper published by LexisNexis®, entitled “More Data, Earlier: The Value of Incorporating Data and Analytics in Claims Handling,” which states that carriers can reduce severity payments by up to 25 percent.[1]
This is true for P&C carriers generally, but especially for Workers’ Compensation payers, where medical costs have increased steadily for decades. In Workers’ Compensation, medical services are not limited by plan design. Medical costs now amount to more than 60% of claim cost, and they continue to climb. Nevertheless, data managed correctly can make all the difference and save real dollars.
Big data
Big Data is currently in vogue. Everyone is talking about Big Data as though it will deliver some sort of panacea. The notion is that organizing and analyzing copious amounts of data will produce new and improved insights, thereby achieving desired results. That may be true; however, the outcome is a function of complete, consistent, and accurate data. Unfortunately, data purity is rare, regardless of the size of the data set.
Bad data
The gains promised by Big Data depend on data quality. Whether Big Data is composed of a few large data sets or many small ones, quality may be the elusive factor. To achieve positive results with any data set, it must be complete and accurate. Duplicate records must be cleansed and merged, for starters. More importantly, bad data input processes must be corrected upstream, where the data is created. Standards for data quality must be set and enforced.
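As a minimal sketch of the cleansing-and-merging step described above, the following assumes a hypothetical claims extract with claim_number, claimant_name, and paid_amount columns; the actual matching rules would come from the payer’s own data standards.

```python
# Minimal deduplication sketch; column names and matching rules are
# illustrative assumptions, not any particular payer's standard.
import pandas as pd

claims = pd.DataFrame({
    "claim_number":  ["wc-1001 ", "WC-1001", "WC-1002"],
    "claimant_name": ["J. Smith", "John Smith", "A. Jones"],
    "paid_amount":   [1200.00, 1200.00, 450.00],
})

# Normalize obvious formatting differences before comparing records.
claims["claim_number"] = claims["claim_number"].str.strip().str.upper()

# Treat rows with the same claim number and paid amount as duplicates of the
# same payment record and keep only the first occurrence.
deduped = claims.drop_duplicates(subset=["claim_number", "paid_amount"])
print(deduped)
```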
One reason data is of such poor quality is that little value has been placed on its veracity. That is changing as the vision for improved outcomes based on analytics becomes increasingly clear. Nevertheless, data input should be held to rigorous standards from the outset, with accountability checks along the way. Automated imaging systems must be regularly calibrated to ensure accuracy, while individuals who input data, along with their managers, must be held responsible for its quality.
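A sketch of what an accountability check might look like at the point of data entry; the required fields, including a record of who entered the data, are assumptions for illustration, not a prescribed standard.

```python
# Hypothetical intake-time quality check: required fields and rules are
# illustrative; a real standard would be defined and enforced by the payer.
REQUIRED_FIELDS = ("claim_number", "injury_date", "icd_code", "entered_by")

def quality_issues(record: dict) -> list[str]:
    """Return the data-quality problems found in one incoming record."""
    issues = [f"missing {name}" for name in REQUIRED_FIELDS if not record.get(name)]
    if record.get("paid_amount", 0) < 0:
        issues.append("negative paid_amount")
    return issues

record = {"claim_number": "WC-1003", "entered_by": "adjuster_17", "paid_amount": -10}
print(quality_issues(record))
# -> ['missing injury_date', 'missing icd_code', 'negative paid_amount']
```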
Voluminous data
In Workers’ Compensation, as with all insurance lines, comprehensive data is a fait accompli. Data has been collected digitally for decades, driven by claims payment requirements. In Workers’ Compensation, the claim is set up in the payer’s system and continually fed by incoming data. Mandatory reports of injury are submitted by employers and treating physicians. Bills from medical providers and others are streamed through bill review systems and then to claims systems throughout the course of the claim. Events such as litigation, court dates, and bills paid are documented in the claims system. Pharmacy is managed by the PBM (Pharmacy Benefit Manager), creating an additional, unique database related to the claim. Most payers also collect medical utilization review and medical case management data. The question is not the amount of data, but its quality and what can be done with it. How is it applied?
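To make the variety of claim-related data concrete, here is a rough sketch of how those separate feeds might be represented around a single claim; every field name below is an illustrative assumption rather than any particular system’s schema.

```python
# Rough sketch of the data that accumulates around one claim; the fields are
# assumptions for illustration only.
from dataclasses import dataclass, field

@dataclass
class MedicalBill:            # from the bill review system
    provider: str
    procedure_code: str
    billed: float
    paid: float

@dataclass
class PharmacyTransaction:    # from the PBM's separate database
    drug_name: str
    paid: float

@dataclass
class Claim:
    claim_number: str
    injury_report_date: str
    bills: list[MedicalBill] = field(default_factory=list)
    pharmacy: list[PharmacyTransaction] = field(default_factory=list)
    events: list[str] = field(default_factory=list)  # litigation, court dates, payments
```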
Disparate data
Unfortunately, in Workers’ Compensation much of the data remains in separate silos. The focus has been on collecting the data. Now the question is how to make data an operational tool that achieves the kind of positive savings results reported in the LexisNexis study. A different approach is needed.
Integrated data
Making data a useful work-in-progress tool is first a matter of integrating the data across the multiple data sets relating to claims. This is sometimes a tedious process, but an invaluable one. The request and funding must come from the business units, where anything related to data is not usually a priority. Business managers must begin to value the process of collecting good data and converting it into actionable information.
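A minimal sketch of that integration step, assuming hypothetical extracts from a bill review system and a PBM that both carry the claim number as a common key.

```python
# Joining two hypothetical silos on the claim number; table and column names
# are assumptions for illustration.
import pandas as pd

bill_review = pd.DataFrame({
    "claim_number": ["WC-1001", "WC-1002"],
    "medical_paid": [18250.00, 2400.00],
})
pbm = pd.DataFrame({
    "claim_number": ["WC-1001", "WC-1003"],
    "pharmacy_paid": [3100.00, 150.00],
})

# One row per claim, combining what the separate systems each know about it.
integrated = bill_review.merge(pbm, on="claim_number", how="outer")
integrated["total_medical"] = integrated[["medical_paid", "pharmacy_paid"]].sum(axis=1)
print(integrated)
```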
Analyzed data
Once the data is collected and integrated, the task is to analyze it for business knowledge. Business managers can learn to articulate for IT what they want and need for decision support and other initiatives, and IT has a role in helping business managers ask more effectively for what they need. Cost drivers and trends can be uncovered in the analyzed data. Raw data is not a usable claims management tool, but analyzed and logically portrayed information can be powerful.
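As a sketch of how cost drivers and trends might be surfaced from the integrated data, the following assumes a hypothetical payment table tagged with a service category and date; the categories, amounts, and dates are made up for illustration.

```python
# Surfacing cost drivers and a monthly spend trend from a hypothetical
# payment table.
import pandas as pd

payments = pd.DataFrame({
    "claim_number": ["WC-1001", "WC-1001", "WC-1002", "WC-1003"],
    "category":     ["surgery", "pharmacy", "physical therapy", "pharmacy"],
    "paid":         [12500.00, 3100.00, 2400.00, 900.00],
    "service_date": pd.to_datetime(["2014-01-15", "2014-02-03", "2014-02-20", "2014-03-05"]),
})

# Which service categories drive the most spend?
drivers = payments.groupby("category")["paid"].sum().sort_values(ascending=False)

# How is spend moving month over month?
trend = payments.groupby(payments["service_date"].dt.to_period("M"))["paid"].sum()

print(drivers)
print(trend)
```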
Current data
The power of data is best exploited when it is analyzed and made available to the business units while it is still current. Intervention is far more effective when it is mobilized early.[2] Damage control is best achieved before the damage becomes irretrievable. Moreover, the analyzed data must be linked to operations, thereby making it actionable. Knowledge derived from analytics is useless until it is acted upon.
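One way to link the analysis to operations is a simple rule that flags claims for early intervention; the threshold and the suggested action below are assumptions for illustration, not a prescribed protocol.

```python
# Hypothetical early-intervention rule: flag open claims whose medical spend
# has crossed a threshold so someone acts while intervention still helps.
def early_intervention_prompts(claims: list[dict], threshold: float = 10_000.0) -> list[str]:
    """Return action prompts for claims whose medical paid-to-date exceeds the threshold."""
    return [
        f"Claim {c['claim_number']}: refer for nurse case management "
        f"(medical paid to date ${c['medical_paid']:,.0f})"
        for c in claims
        if c["medical_paid"] > threshold
    ]

open_claims = [
    {"claim_number": "WC-1001", "medical_paid": 21350.0},
    {"claim_number": "WC-1002", "medical_paid": 2550.0},
]
for prompt in early_intervention_prompts(open_claims):
    print(prompt)
```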
Linked data
Regardless of how impressive the analyzed data is, it is useless unless acted upon. To put the data to useful application, it must be analyzed and re-presented to the business units in ways that can be easily accessed, understood, and applied. Through analytics, the data is transformed into knowledge: knowledge about conditions in claims, events, costs, and vendor performance. That knowledge should not only be current, but should reach the operational front lines and be portrayed in ways that promote action.
Actionable knowledge
Actionable knowledge is derived from analysis of the data, presented to the business units in a functional form. To achieve measurable cost savings, continuously monitor, integrate, and analyze the data, then re-present it to the business units as easily interpreted knowledge and action tools. Individuals can be prompted by the system to take specific initiatives based on that knowledge, creating a structured and powerfully enhanced approach to medical management with measurably positive results.

Karen Wolfe is the founder and President of MedMetrics®, LLC, a Workers’ Compensation medical analytics and technology services company. MedMetrics analyzes the data and offers online apps that super-charge medical management by linking analytics to operations, thereby making them actionable. karenwolfe@medmetrics.org