March 28, 2024

Why Data Quality Control is a Prerequisite for True Interoperability

By Carl Rudow

VP, Client Success, Healthcare Practice

Interoperability is about more than integrations and data exchange. If the technical capabilities are there but one or both parties can’t use the data, there’s no value, no true interoperability.

For example, both payers and providers run EMR systems, but the ways those systems operate and classify data differ enough to drive misalignment. So when the two systems attempt to integrate, data is often misclassified, resulting in errors, gaps, and poor quality.

To achieve true interoperability, you need data quality control. Here’s how to build the systems and processes necessary to make that happen.

Why data quality control is a prerequisite for true interoperability in healthcare

Let’s consider a scenario. A provider implements a new clinical system. The goal is to clean up the unusable data accumulated under the old system so the new system can operate efficiently going forward.

But even if you clean up your data and have a perfect implementation, if your underlying data control processes don’t change, the new data will be as unusable as the old. Between your initial assessment and implementation, you’ll have potentially months of bad data—and that’s before the new system starts to operate.

Or, in other cases, you’ll redefine and reclassify old data based on how you think the new system is supposed to work. As time goes on, however, your processes fall out of line with the expectations built into the new system, resulting in misaligned data.

For example, your EMR can send an interoperable medical record from a provider to a payer. But the payer needs to do more than receive that record: it must understand what was said and done, code all of the procedures and practices correctly, and verify that the correct amount is being charged. That means both systems need to be able to classify and coordinate that specific data.

While there are data exchange standards in healthcare, there are nuances within those standards. The differences may seem small, but across massive datasets they add up quickly.
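
To make this concrete, consider something as small as how two systems record the same procedure. The sketch below, in Python with hypothetical field names and formats, shows how a normalization step can reconcile one such nuance before records are compared:

    from datetime import datetime

    # Hypothetical exports: two systems describing the "same" procedure.
    provider_record = {"proc_code": "99213", "date": "03/28/2024"}   # US-style date
    payer_record = {"procedure": "99213", "svc_date": "2024-03-28"}  # ISO date

    def normalize(record: dict) -> dict:
        """Map either system's field names and date format to one canonical shape."""
        code = record.get("proc_code") or record.get("procedure")
        raw_date = record.get("date") or record.get("svc_date")
        for fmt in ("%m/%d/%Y", "%Y-%m-%d"):  # the formats we know about
            try:
                parsed = datetime.strptime(raw_date, fmt).date().isoformat()
                break
            except ValueError:
                continue
        else:
            parsed = None  # flag for review rather than guessing
        return {"procedure_code": code, "service_date": parsed}

    # Only after normalization are the two records comparable, apples to apples.
    assert normalize(provider_record) == normalize(payer_record)

Multiply that one mismatch across millions of records and dozens of fields, and the cost of skipping normalization becomes obvious.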

The result: wasted time and resources, inefficiencies, and dissatisfied patients and healthcare workers alike.

Whether you’re dealing with EMRs or other systems, the goal of true interoperability is to reach a state where you’re constantly comparing apples to apples. That’s only possible with rigorous, continuous data quality control.

Two-step approach to healthcare data quality control

True data quality control comes in two steps. Unfortunately, most healthcare organizations and their technology partners focus on one and not the other. This, as we’ve seen, causes major problems.

Here are the two steps necessary for effective data quality control:

  • Proactive data quality control. This is where you change your current actions so data in any future state aligns with the standards you’ve put in place.
  • Retroactive data quality control. This is where you look back on all your data thus far and update it to align with your desired standards.

If you have the latter without the former, you’ll inevitably end up working against yourself: old data gets cleaned up while new data remains a mess. To make matters worse, these two processes are usually managed by different teams or functions, often so siloed that neither knows it’s working against the other.

Step 1: Proactive data quality control

Before you fix your existing data, you need to change your current data capture and management processes. Change today so things will be better tomorrow. Otherwise, you’ll destroy what you’ve worked hard to build.

Most technology and data partners don’t account for this when initiating data quality control efforts. One example is the gap between assessment and implementation. Assessments typically happen at the beginning of the engagement, defining the workload for retroactively correcting data quality.

But if the engagement takes six months to complete, and the organization has done nothing to correct their processes over those six months, they end up starting the new system with six months of bad data.

Rather than starting by fixing what came before, change your current operations going forward by adopting these key practices:

  • Real-time data verification to correct errors as they arise
  • Data entry analysis and integrity constraints to keep incomplete data from entering the system (see the sketch after this list)
  • Data normalization and standardization techniques to drive consistency
  • Real-time data capture to ensure all information is up to date
  • Removal of unnecessary or irrelevant data to make analysis more efficient
  • Protection of data from unauthorized access, tampering, or breaches; compliance with HIPAA and other standards; and safeguards for patient privacy and confidentiality
  • Governance policies, including defined roles and responsibilities, data standards, and quality control processes
  • Comprehensive documentation of data sources, collection methods, transformations, and changes
  • Regular data audits and cross-references with external sources
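
As a concrete illustration of the first two practices, here is a minimal sketch in Python of point-of-entry verification backed by integrity constraints. The field names and rules are hypothetical; the point is that invalid records are rejected at capture time rather than discovered months later:

    import re
    from datetime import date

    # Hypothetical integrity constraints for a patient encounter record.
    REQUIRED_FIELDS = {"patient_id", "encounter_date", "procedure_code"}
    CPT_PATTERN = re.compile(r"^\d{5}$")  # CPT codes are five digits

    def verify_encounter(record: dict) -> list:
        """Return a list of constraint violations; an empty list means the record is clean."""
        errors = []
        missing = REQUIRED_FIELDS - record.keys()
        if missing:
            errors.append("missing required fields: " + ", ".join(sorted(missing)))
        code = record.get("procedure_code", "")
        if code and not CPT_PATTERN.match(code):
            errors.append("malformed procedure code: " + code)
        enc_date = record.get("encounter_date")
        if isinstance(enc_date, date) and enc_date > date.today():
            errors.append("encounter date is in the future: " + enc_date.isoformat())
        return errors

    # Reject at the point of entry so bad data never reaches the system of record.
    violations = verify_encounter({"patient_id": "P-1001", "procedure_code": "9921X"})
    if violations:
        print("rejected:", violations)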

Only after you’ve implemented these data quality control processes will you be in a position to fix your existing data.

Step 2: Retroactive data quality control

Retroactive data quality control looks back at all the data you’ve captured thus far and adjusts it to align with current standards. That includes cleaning up the data, correcting errors, filling in incomplete data sets, and converting everything into a usable format.
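
What this looks like in practice varies by system, but a minimal sketch using pandas (version 2.x assumed, with hypothetical column names) captures the common moves: deduplicate, coerce values into a standard format, and flag gaps for human review rather than guessing:

    import pandas as pd

    # Hypothetical legacy export: duplicates, mixed date formats, missing codes.
    legacy = pd.DataFrame({
        "patient_id":     ["P-1001", "P-1001", "P-1002", "P-1003"],
        "svc_date":       ["03/28/2024", "03/28/2024", "2024-04-01", "unknown"],
        "procedure_code": ["99213", "99213", None, "99214"],
    })

    # Drop exact repeats left behind by old import jobs.
    cleaned = legacy.drop_duplicates().copy()

    # Coerce dates into one canonical type; unparseable values become NaT.
    cleaned["svc_date"] = pd.to_datetime(cleaned["svc_date"], format="mixed", errors="coerce")

    # Flag records that need review instead of guessing at clinical facts.
    cleaned["needs_review"] = cleaned["svc_date"].isna() | cleaned["procedure_code"].isna()

Note that the sketch flags unresolvable values for review rather than fabricating them; in a clinical context, a silent guess is worse than an acknowledged gap.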

Once you finish these two steps, you’ll be in a much better position to quickly apply complex data analysis. As problems arise, you’ll be able to retrieve information at the point of need, enabling faster, better decision making.

Not only will this improve your current operations, but it will also lay the groundwork for future scalability efforts, as you’ll have the governed processes in place to handle data from new systems and integrations.

Final thoughts on data quality control and healthcare interoperability

Interoperability in healthcare isn’t about just turning on an integration and calling it a day. It’s about ensuring the data shared via that integration is usable for both parties. This requires a more stringent data quality control process than most healthcare organizations currently have.

While implementing the policies listed above is a big lift, it’s easier when you have a data and technology partner who has proven expertise in these areas. Learn more about 3Pillar Global’s healthcare data expertise here.