When the hypothetical naked, unconscious, and alone patient presents at your ER with no immediately evident reason for his distress, presumably holding his driver's license between his clenched teeth, would you find it helpful to see a neatly typed, or handwritten, list of diagnoses and current medications for this hapless person?
When a family moves across the country and brings in their eight-year-old for her first visit with the new pediatrician, would it be helpful to see a slightly fuzzy image of her immunization list from back home?
When an elderly patient you've been seeing for umpteen years is shipped to the hospital in the middle of the night, would it be helpful to find the admission record on your to-do list for today?
Perhaps these things would be nice to have, but EHRs can't talk to each other, so before any of these miracles can occur, we must make EHRs communicate.
How do we make EHRs talk to each other? That's simple: we look at how people talk to each other, and apply the same principles to EHRs. Thus, EHRs have to share the same language, use the same syntax, know when to speak and when to listen, and, when not in physical proximity, use a variety of paraphernalia to carry voice over large stretches of land and sea. And since EHRs are really computers and this is, after all, the 21st century, we have the blueprint for a solution in our hands, because any computer in Papua New Guinea can talk to any computer in Boonville, Missouri. How? By using the magic of the Internet.