But Wes Rishel's post, "PCAST Opportunity: Documents vs. "Atomic Data Elements"", reviews the PCAST recommendations and digs into the details of standards development and the basis for the exchange of information (documents vs. snippets, or molecules/atoms), as well as exchange formats (the Universal Exchange Language (UEL) vs. HL7 CDA). It does an excellent job of dissecting some of the controversies, homing in on the exchange of data and the standards in question, and highlighting the challenges associated with pre- and post-coordination of data (read: fewer codes required vs. more codes required). Interestingly, Wes estimates
from 20,000 to 100,000 (data elements), but a number of physicians seem to agree that a very useful collection of molecules and radicals would contain many fewer than 20,000. Stan's presentation describes several parallel efforts to enumerate the molecules using siloed methodologies; the one he is working on has identified more than 4,000. He points to several related posts, including this from John Halamka: Detailed Clinical Models, which points to multiple other standards in development in other countries (openEHR in Australia, ISO 13972, Tolven's Open Source Clinical Data Definitions, and the National Health Service Logical Record Architecture). All of which adds up to a pressing need for a Universal Exchange Language... and, in my mind, translates to a significant challenge in developing it and then keeping it up to date.
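The pre- vs. post-coordination trade-off can be made concrete with a toy sketch (all codes, field names, and the phrase below are hypothetical, not real SNOMED CT or other terminology identifiers): a pre-coordinated terminology needs one code for every clinically useful combination, which is why the estimates run into the tens of thousands, while a post-coordinated approach assembles the same meaning from a much smaller set of atomic elements.

```python
# Toy illustration of pre- vs. post-coordination.
# Codes and field names are hypothetical, not real terminology identifiers.

# Pre-coordinated: one code captures the entire concept, so the
# terminology must define a code for every useful combination.
pre_coordinated = {"CODE-9001": "acute bacterial pneumonia of left lower lobe"}

# Post-coordinated: the same concept is assembled from smaller "atoms",
# so far fewer codes are needed, but receiving systems must understand
# how to compose them.
post_coordinated = {
    "disorder": "pneumonia",
    "severity": "acute",
    "causative_agent": "bacterial",
    "body_site": "left lower lobe",
}

def render(expr):
    """Flatten a post-coordinated expression into a human-readable phrase."""
    return (f'{expr["severity"]} {expr["causative_agent"]} '
            f'{expr["disorder"]} of {expr["body_site"]}')

# The composed expression carries the same meaning as the single
# pre-coordinated code.
print(render(post_coordinated))
```

The design tension Wes describes falls out directly: pre-coordination is simple to exchange but explodes the code count, while post-coordination keeps the vocabulary small at the cost of composition logic in every system.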
Both blogs point to an excellent, detailed (almost two-hour) presentation by Stan Huff (CMIO at Intermountain Healthcare): Practical Modeling Issues: Representing Coded and Structured Patient Data in EHR Systems.
And a link to some of the original work, "Detailed Clinical Models for Shareable, Executable Guidelines" (PDF from MEDINFO 2004), which detailed the NIST-funded, multi-institutional SAGE research project on
sharable, executable clinical guidelines. The project envisions a system that enables the authoring, localization, and execution of significant clinical guidelines in a vendor-independent manner.

One thing is for sure: this is a complex issue requiring much attention and focus, since the coordination and sharing of data is a fundamental building block of effective, efficient, and safe healthcare. Effective sharing of data appears to require standards and exchange languages. But today is the first of three episodes of Jeopardy featuring Watson. I talked about this back in June and again in December. Tonight is the first night of the Watson Jeopardy challenge.
In the countdown to Jeopardy, it is clear IBM has pushed the limits of computing technology to "understand" complex human language, and Nuance has partnered with IBM to bring this same technology to healthcare (Clinical Documentation Challenges), applying a whole new way of looking at clinical documentation that reduces our dependence on codes, structure, and defined clinical data models.
If we consider the adeptness of the human mind and our ability to understand the fine, nuanced details of clinical reports without the data being tagged or encoded, the concept of Watson applied to healthcare may hold a critical key to the exchange and intelligent use of data. Building on the existing standard of "documents", we can innovate with the documents that are already a natural part of clinical care.
So, as Wes Rishel stated in his summary:
Documents will continue to be at the heart of information flow for patient care and one primary way of bundling clinical information about people.

In conjunction with some form of:
- evolving universal exchange language (UEL),
- encoding of data, and
- reliable and simple ("as simple as possible, but not simpler") data representation,
In conjunction with machine-based understanding, which I think we will see tonight has been advanced to new and exciting levels, we will have the foundation for sharing data efficiently and intelligently in our healthcare system.