I had the pleasure of talking with Chuck Webster, MD (@EHRWorkflow) over the last few days and weeks, and he has just posted this interview at his website.
We found much in common (not just in healthcare but in books, space, and science fiction), and he has kindly agreed to join me tomorrow for the #VoiceoftheDr radio show at 2:30 ET.
We will continue the theme from this interview, discussing EMR usability: why interface design matters so much, how it can be improved with intelligent speech interfaces, and why enabling clinicians to use narrative documentation as the source of truth is essential as we march toward the digitization of medicine and the medical record.
Natural language processing (NLP) applied to medical speech and text, also known as Clinical Language Understanding (CLU), is a hot topic. It promises to improve the EHR user experience and to extract valuable clinical knowledge from free text about patients. In keeping with this blog's theme, NLP/CLU can improve EHR workflow and sometimes uses sophisticated workflow technology to span between users and systems.

You can see the full interview at "Video Interview and 10 Questions for Nuance's Dr. Nick on Clinical Language Understanding" via chuckwebster.com.
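To make the idea of extracting clinical knowledge from free text concrete, here is a deliberately minimal sketch. A real CLU engine uses trained statistical models, medical ontologies, and negation/context handling; this toy version (not Nuance's actual technology, and all names are mine) just pulls structured medication mentions out of narrative text with a pattern:

```python
import re

# Toy illustration only: extract {drug, dose, unit, frequency} from
# free-text narrative. Real clinical language understanding goes far
# beyond pattern matching, but the input/output shape is the same idea:
# unstructured narrative in, structured clinical facts out.
MED_PATTERN = re.compile(
    r"(?P<drug>[A-Za-z]+)\s+(?P<dose>\d+(?:\.\d+)?)\s*(?P<unit>mg|mcg|g)\s+"
    r"(?P<freq>twice daily|daily|BID|TID|QID)",
    re.IGNORECASE,
)

def extract_medications(narrative):
    """Return a list of dicts, one per medication mention found."""
    return [m.groupdict() for m in MED_PATTERN.finditer(narrative)]

note = "Patient started on lisinopril 10 mg daily and metformin 500 mg twice daily."
meds = extract_medications(note)
# -> [{'drug': 'lisinopril', 'dose': '10', 'unit': 'mg', 'freq': 'daily'},
#     {'drug': 'metformin', 'dose': '500', 'unit': 'mg', 'freq': 'twice daily'}]
```

The point is the shape of the problem, not the method: narrative stays the source of truth, and structured data is derived from it rather than the other way around.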
10. Most of my previous questions are pretty "geeky." So, to compensate, from the point of view of a current or potential EHR user, what's the most important advice you can give them?

To me, the core issue is usability and interface design. The technology has struggled to take off in part because it has been complex, hard to master, and in many instances has required extensive and repeated training to use. The combination of SR [speech recognition] and CLU technology offers the opportunity to bridge the complexity chasm, removing the major barriers to adoption by making the technology intuitive and "friendly." We can achieve this with intelligent design that capitalizes on the power of speech as a tool to remove the need to remember gateway commands and menu trees, and that doesn't just convert what you say to text but actually understands your intent and applies the context of the EMR to the interaction. We have seen the early stages of this with Siri, which offers a new way of interacting with our mobile phones, using the context of your calendar, the day and date, your location, and other information to create a more human-like technology interface that is intuitive and less intimidating.
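The distinction between converting speech to text and understanding intent can be sketched in a few lines. This is purely illustrative (no vendor's actual API; the function and field names are invented): a plain recognizer returns only the transcript, while an "understanding" layer maps the transcript plus context, such as the open chart and today's date, to an action.

```python
from datetime import date

def understand(utterance, context):
    """Toy intent layer: map recognized text + EMR context to an action.

    A bare speech recognizer stops at the transcript; this extra step is
    what lets a spoken phrase drive the EMR instead of just filling a note.
    """
    text = utterance.lower()
    if "order" in text and "lab" in text:
        return {"action": "order_lab",
                "patient": context["patient"],          # the open chart
                "date": context["today"].isoformat()}   # resolved from context
    if "show" in text and "meds" in text:
        return {"action": "show_medications", "patient": context["patient"]}
    # Anything unrecognized falls back to ordinary dictation.
    return {"action": "dictate", "text": utterance}

ctx = {"patient": "MRN-12345", "today": date(2012, 5, 1)}
result = understand("Order a lab panel for this patient", ctx)
# -> {'action': 'order_lab', 'patient': 'MRN-12345', 'date': '2012-05-01'}
```

Note the role of context: "this patient" and "today" are resolved from the EMR session, which is exactly the Siri-style behavior described above.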