
Improving Clinical Document Exchange

SMART C-CDA infographic

2014 will see wide-scale production and exchange of Consolidated CDA (C-CDA) documents among healthcare providers. Indeed, live production of C-CDAs is already underway for anyone using a Meaningful Use 2014 certified EHR.

C-CDA documents fuel several aspects of meaningful use, including transitions of care and patient-facing download and transmission.

This impending deluge of documents represents a huge potential for interoperability, but it also presents substantial technical challenges.

We forecast these challenges with unusual confidence because of what we learned during the SMART C-CDA Collaborative, an eight-month project conducted with 22 EHR and HIT vendors.

Our effort included analyzing vendor C-CDA documents, scoring them with a C-CDA scorecard tool we developed, and reviewing our results through customized one-on-one sessions with 11 of the vendors.

The problems we uncovered arose for a number of reasons, including:

  • material ambiguities in the C-CDA specification
  • accidental misinterpretations of the C-CDA specification
  • lack of authoritative “best practice” examples for C-CDA generation
  • errors generated by certification itself, i.e., vendors are incentivized to produce documents that trigger no warnings in the official validator (rather than documents that correctly convey the underlying data)
  • data coding errors that reflect specific mapping and translation decisions that hospitals and providers may make independent of EHR vendors
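As one illustration of the last category, many coding errors are mechanically detectable: C-CDA identifies terminologies by OID, so a medication coded against NDC where RxNorm is expected can be flagged with a simple check. The fragment, the RxNorm expectation, and the helper below are illustrative sketches, not code from our scorecard:

```python
import xml.etree.ElementTree as ET

# RxNorm's OID; the fragment below (illustrative values) codes the
# medication against NDC (2.16.840.1.113883.6.69) instead.
RXNORM_OID = "2.16.840.1.113883.6.88"
NS = {"hl7": "urn:hl7-org:v3"}

fragment = """
<substanceAdministration xmlns="urn:hl7-org:v3">
  <consumable>
    <manufacturedProduct>
      <manufacturedMaterial>
        <code code="197361" codeSystem="2.16.840.1.113883.6.69"
              displayName="amlodipine 5 MG Oral Tablet"/>
      </manufacturedMaterial>
    </manufacturedProduct>
  </consumable>
</substanceAdministration>
"""

def medication_coding_errors(xml_text):
    """Flag medication codes not drawn from RxNorm."""
    root = ET.fromstring(xml_text)
    errors = []
    for code in root.findall(".//hl7:manufacturedMaterial/hl7:code", NS):
        if code.get("codeSystem") != RXNORM_OID:
            errors.append(
                f"code {code.get('code')} uses codeSystem "
                f"{code.get('codeSystem')}, expected RxNorm {RXNORM_OID}"
            )
    return errors

print(medication_coding_errors(fragment))
```

A check like this catches only the mapping layer; it cannot tell whether the underlying clinical fact was captured correctly in the first place.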

SMART C-CDA Scorecard examples

Our full findings are set out in a detailed briefing we provided to the Office of the National Coordinator for Health IT (ONC).

The key takeaway from our effort is this: live exchange of C-CDA documents will omit relevant clinical information and increase the burden of manual review for provider organizations receiving the C-CDA documents.

Not all of these C-CDA difficulties will be fixable, but many could be if they were easily and consistently characterized. To that end, we have proposed a lightweight data quality reporting measure that, combined with automated, open-source tooling, would allow vendors and providers to measure and report on C-CDA document errors in an industry-consistent manner.

We crafted a proposal for how this might be done, the key section of which follows:

Proposal: Informatics Data Quality Metrics on Production C-CDAs
Our findings make a case for lightweight, automated reporting to assess the aggregate quality of clinical documents in real-world use.
We recommend starting with an existing assessment tool such as Model-Driven Health Tools or the SMART C-CDA Scorecard.
This tool would form the basis of an open-source data quality service that would:
  • Run within a provider firewall or at a trusted cloud provider
  • Automatically process documents posted by an EHR
  • Assess each document to identify errors and yield a summary score
  • Generate interval reports to summarize bulk data coverage and quality
  • Expose reports through an information dashboard
  • Facilitate MU attestation
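A minimal sketch of the per-document assessment step, assuming each check is a named predicate over the parsed document (the actual Scorecard applies a far richer rule set than these two placeholder checks):

```python
import xml.etree.ElementTree as ET

NS = {"hl7": "urn:hl7-org:v3"}

# Each check is (name, predicate); the score is simply the percentage
# of checks that pass. Real rules would cover sections, codes, units, etc.
CHECKS = [
    ("has_patient_name",
     lambda r: r.find(".//hl7:patient/hl7:name", NS) is not None),
    ("has_structured_body",
     lambda r: r.find(".//hl7:structuredBody", NS) is not None),
]

def score_document(xml_text):
    """Assess one C-CDA: identify failed checks and yield a summary score."""
    root = ET.fromstring(xml_text)
    results = {name: check(root) for name, check in CHECKS}
    score = 100 * sum(results.values()) // len(results)
    return {"score": score,
            "failed": [name for name, ok in results.items() if not ok]}

SAMPLE = """
<ClinicalDocument xmlns="urn:hl7-org:v3">
  <recordTarget>
    <patientRole>
      <patient><name><given>Eve</given><family>Everywoman</family></name></patient>
    </patientRole>
  </recordTarget>
</ClinicalDocument>
"""
print(score_document(SAMPLE))
```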

We recognize that MU2 rules already impose an administrative burden on providers, including quality reporting requirements that are not always of direct clinical utility. Here we propose something distinctly different:

  1. ONC’s EHR Certification Program. We propose two laser-focused requirements:
    a. Any C-CDA generated by an EHR as part of the certification testing process must be saved and shared with ONC, as a condition of certification.
    b. In production, any certified EHR must be able to perform “fire-and-forget” routing of inbound and outbound C-CDAs, posting to a data quality service.
  2. CMS’s MU Attestation Requirements. We propose a minimal, straightforward, copy/paste reporting requirement. The PHI-free report is directly generated by the data quality service and simply passed along to CMS for attestation.

These two steps constitute a minimal yet effective path for empowering providers to work with EHR vendors to assess, discuss, and ultimately improve data quality.
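The "fire-and-forget" routing in (1b) could be as simple as a background worker, so that document exchange never blocks on the data quality service. In this sketch, the `post_fn` hook is a placeholder standing in for the actual HTTP POST to the service:

```python
import queue
import threading

class QualityServiceRouter:
    """Route C-CDAs to a data quality service without blocking the EHR."""

    def __init__(self, post_fn):
        self._queue = queue.Queue()
        self._post = post_fn  # e.g. an HTTP POST to the service endpoint
        worker = threading.Thread(target=self._drain, daemon=True)
        worker.start()

    def route(self, ccda_xml, direction):
        """Enqueue and return immediately -- the caller never waits."""
        self._queue.put((direction, ccda_xml))

    def _drain(self):
        while True:
            direction, doc = self._queue.get()
            try:
                self._post(direction, doc)  # failures logged, never raised
            except Exception:
                pass
            self._queue.task_done()
```

The key property is that a slow or unavailable quality service can never delay clinical document exchange itself.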

At a technical level, the following components are required to support the initiative:

  • A data-quality service that leverages existing C-CDA validation technology
  • EHRs that route inbound and outbound C-CDAs to the data-quality service
  • A dashboard web application that generates simple reports enabling providers, hospitals, or IT staff to monitor C-CDA data quality and perform MU attestation
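The interval reports feeding the dashboard need carry no PHI at all; counts and aggregate scores suffice. A hypothetical shape for such a report, assuming per-document scores on a 0-100 scale and an arbitrary 80-point threshold:

```python
from statistics import mean

def interval_report(scores, period="2014-Q1"):
    """Summarize per-document scores into a PHI-free report suitable
    for a dashboard or for copy/paste attestation: aggregates only."""
    return {
        "period": period,
        "documents": len(scores),
        "mean_score": round(mean(scores), 1),
        "below_threshold": sum(1 for s in scores if s < 80),
    }

print(interval_report([95, 70, 88]))
```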

Our proposal describes a service that is technically doable (even "nearly done," if the SMART C-CDA Scorecard were made a web service). What is unknown is whether ONC and CMS will support this and whether providers will adopt processes that put quality metrics to good effect. (See Thomas Redman’s recent Harvard Business Review article, Data’s Credibility Problem.)

To sum up: the case for data quality metrics is a strong one, and we encourage those who believe in their efficacy to voice support with ONC and CMS.

Joshua C. Mandel, MD, SB is on the faculty of the Boston Children’s Hospital Informatics Program and Harvard Medical School, where he serves as lead architect for the SMART Platforms team.

David D Kreda is an independent consultant and translational advisor to SMART Platforms.

2 replies »

  1. @Curly – you’re asking the right question, but as I read it this article posits no new layer. Rather the authors point to a missing data validation step that can be imposed on the current EHR. Yes, the SMART tool scores the C-CDA after it has been mangled by yet another undisciplined EHR. But what if the tool were a web service that scored the data at data entry? Or, said another way, what if EHRs were more coherently normative at curating the data elements that flow into a C-CDA? Or, using the terminology of the manufacturing supply chain, the SMART C-CDA scoring tool is designed to flag defects after the product (the encounter note) has been manufactured by the EHR, and I am asking what if instead the tool were to flag potential defects in the visit note prior to allowing the EHR to “manufacture” the data so badly? We understand that it is inefficient to “inspect” quality into a manufacturing chain that is designed to produce defects. Yet we tolerate an extreme lack of data discipline in our EHRs.

  2. How many more layers of data distillation are needed to simplify the morass of jabberwocky known as the EHR?