
What if I Had To Do HIT All Over Again?

This post is aimed at serving as an interlude to the “public option/death panels” discussions. No matter what healthcare reform bill, if any, is passed this fall, HIT will be part of the program.  Four short years ago I was involved in the creation of a comprehensive, some would say monolithic, EMR/Practice Management/Billing system. This new product was built in reaction to the very large, very expensive and very clunky systems already on the market.

Remembrance of Things Past – The driving design considerations four years ago

  • The problem – Paper charts are causing inefficient workflows in physician offices. It is hard to find pertinent information in a big chart and it is hard to analyze that information. Charts can only be accessed by one person at a time and cannot be accessed from outside the office. Charts are sometimes misplaced and may be lost during a fire or natural disaster. Every new chart costs money to create, store, pull and maintain.
  • The solution – Application software that provides a computerized version of the paper chart – an Electronic Medical Record. Computers are great at storing and arranging data in all sorts of ways and formats. Computers can analyze, graph and report on enormous amounts of data. The software should be web based so it can be easily accessed from anywhere by multiple users simultaneously. No more misplaced charts, no more wasted office space, and a SaaS solution would ensure the records are disaster proof.
  • Constraints – The application must be pleasing to the eye, easy to use, customizable and economical to purchase. It should include standard practice management and billing features, or be able to easily integrate with such software. The application should ensure that all patient data is secure at all times.
  • Insights – Medical records, whether they are stored on paper or electronically, are dispersed across multiple care systems. If the medical record is to be of any value, it must be comprehensive. Any provider, care giver or patient must be able to access data aggregated from all those disparate sources, either “just in time” or from a centralized location. Evidence suggests that the few providers with EMRs in their practice are having difficulties using these systems. Aside from clunky features and bug infestations that plague the majority of EMRs out there, there seems to be one common complaint: cumbersome data entry and quality of resulting documentation. Last, but not least, is the cost issue. Most EMRs are still too expensive, particularly for small practices. There are two components to cost: upfront investment and loss of productivity over time due to the complicated nature of the software itself.
  • Perspectives – Any attempt to mitigate the lack of continuity in the medical record must begin with standards: terminology and data standards, as well as communications standards. This does not necessarily imply one monolithic standard. There is room for many different ways of storing and transferring data, as long as the standards are documented and understood. There could be an industry niche for translation gateway providers (similar to the Star Trek universal translator). Of course, all standards should be open and free. Once the vast majority of providers, payers and patients have electronic capabilities, an addressing system should be created to allow “Just In Time” (JIT) access to medical records, no matter where the information resides. The aggregation should occur at the point of request and the translation gateways may fulfill this function as well. Some argue that it is better to aggregate all data in a centralized massive storage system. Building and securing such a “database in the sky” is a monumental task and the recent NHS experience suggests that this may not be the right approach. I am, by no means, discounting the logistic difficulties involved in JIT medical record aggregation, but technology is bound to advance and attenuate such difficulties.
  • Musings – The other day I was doing laundry in my basement. I found myself staring at the washer and dryer thinking that we tried to do too much with our EMRs and the technology was not there to allow it. Think about washers and dryers. You wash the clothes in one, manually move them to its neighbor to be dried, and neither one sorts nor folds laundry. Yet every household has a washer and dryer. Nobody is missing the old vats of boiling water and the big old wringer. The washer and dryer industry did not insist on a perfect, complete and seamless automated process. So why are we shooting for a paperless office? Because it sounds good? Let’s face it, no other industry has paperless offices. Not even the banks. Maybe we just pick a few things that are important and do them really well, instead of doing a mediocre job for everything. Maybe we computerize only data that is both pertinent to patient care and easily captured, codified and standardized. Maybe the EMR should neither sort nor fold laundry. Maybe it shouldn’t attempt to create prose while physicians are required to painstakingly click on a multitude of little boxes. Maybe we take a step back and provide simple, basic, robust and really useful tools instead of one big unwieldy glob of software. Maybe one day technology will advance enough to obviate the need for manually collecting data at the point of care. Maybe the EMR will just sit there and quietly observe the patient/doctor interaction, while continuously processing and recording pertinent data on its own. The perfect scribe. It sounds like science fiction, I know, but so did many other things that we now take for granted, like washers and dryers.

After all the customary trials and tribulations, a software product was born. Features were added, directions were changed, certifications were obtained, and like all software applications, the EMR kept on growing and progressing on a predictable trajectory. I was fairly pleased. But, what if…? What if I had to do it all over again? What if I was starting today with a blank piece of paper (or whiteboard) trying to design the perfect EMR? Would my considerations be different than four short years ago? Would the design principles be the same? What about the implementation? Would today’s technologies be able to provide solutions to yesterday’s insurmountable problems?

Prelude to a Philosophy of the Future – Insights, perspectives and musings

So if I had to do it all over again, I would take a hard look at Microsoft Office. I would build multiple useful applications, like Word, Excel, PowerPoint, etc. I would make sure I can export data from one to the other. I would make sure that the user interface is consistent between them. I would allow others to create templates and integrate their software into my tool bars. I would borrow from Google and make it all web enabled and capable of communicating with all interested parties. And if I had all the money in the world, I would make it open source and free. Yes, I know, it sounds an awful lot like Clinical Groupware.

Margalit Gur-Arie is COO at GenesysMD (Purkinje), an HIT company focusing on web based EHR/PMS and billing services for physicians. Prior to GenesysMD, Margalit was Director of Product Management at Essence/Purkinje and HIT Consultant for SSM Healthcare, a large non-profit hospital organization.


  1. I appreciate your point of view about XML and CSV. I believe the conflict typically stems from how CSV is commonly viewed, i.e., just an array of data in which fields (columns) are separated by commas and records (rows) by line breaks. People are surprised to discover that all the elements, attributes, and hierarchies from an XML file can be conveyed more simply in a specially constructed CSV or spreadsheet file, which can also be pre-serialized and whose data elements are able to maintain “live” numeric values without transformation. Nevertheless, this is not relevant to HTML presentation.
    Instead, CSV has the greatest benefit when its contents are used in computational models, as opposed to strictly textual models. Since I’ve been involved for the past two decades in developing a P2P pub/sub cyber-infrastructure focused on managing the exchange, evolution, and use of computational decision-support models, we routinely use a novel method by which spreadsheet-based templates (a) produce and transmit data delivered in CSV (or XLS) data files (via publisher functionality), and (b) retrieve, consume, and render those same files (via subscriber functionality). Due to the way the data and template files are constructed (using “positional correspondence”), the data elements and attributes are conveyed from publisher to subscriber, and rendered in offline interactive reports, without the use of markup tags and HTML (even though the templates can consume and transform XML and HTML).
    Anyway, there is room for both methods of data transfer.
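The “positional correspondence” idea can be illustrated with a minimal sketch (the element names and column order below are invented for illustration, not the commenter’s actual templates): if publisher and subscriber agree on a column ordering in advance, a flat CSV row can carry the same elements, attributes, and hierarchy as an XML fragment, with no markup tags traveling alongside the data.

```python
import csv
import io
import xml.etree.ElementTree as ET

# A small XML fragment with an element, an attribute, and one level of nesting.
xml_doc = "<patient><name>Jane Doe</name><meds><med code='rx1'>aspirin</med></meds></patient>"
root = ET.fromstring(xml_doc)

# Publisher and subscriber agree on this column order in advance;
# the ordering itself ("positional correspondence") encodes the hierarchy,
# so no markup tags travel with the data.
header = ["patient/name", "patient/meds/med/@code", "patient/meds/med"]
row = [
    root.findtext("name"),
    root.find("meds/med").get("code"),
    root.find("meds/med").text,
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(header)
writer.writerow(row)
print(buf.getvalue())
```

The subscriber, holding the same column list, can reconstruct the hierarchy from position alone.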

  2. Well, well,
    Just thinking about going from XML to CSV is ludicrous. Please pardon my characteristic bluntness.
    XML is based on a principle required for efficient information interchange: It is structured data that can be easily validated. CSV is difficult to validate in a universal sense.
    XML derives from the very successful HTML. Billions of successful information exchanges are performed every second with HTML.
    XML is in high use in every industry that uses information technology.
    CCR and CCD leverage XML for this reason. CCD derives from CCR but it also leverages the HL7 RIM, so in essence they are very similar.
    Nowadays the difference in transmitting text information whether plain or formatted in XML style is insignificant.
    XML is human readable as well.
    Thanks,
    The EHR Guy
    Note: (EHR .NE. EMR) (EHR = Universally Accessible Patient Information)
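The validation point can be seen in a toy sketch (illustrative only; production validation would use an XSD schema rather than a bare well-formedness check): an XML parser rejects a malformed document at parse time, while a CSV reader accepts almost any text and leaves correctness checks to the application.

```python
import csv
import io
import xml.etree.ElementTree as ET

# Malformed XML (mismatched tag) is rejected at parse time.
try:
    ET.fromstring("<patient><name>Jane Doe</patient>")
    xml_ok = True
except ET.ParseError:
    xml_ok = False

# The same kind of mistake in CSV goes unnoticed: the reader
# cannot know that a column is missing or mislabeled.
rows = list(csv.reader(io.StringIO("name,dob\nJane Doe")))

print(xml_ok)   # False: the parser caught the mismatched tag
print(rows)     # the short row is accepted despite the missing field
```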

  3. Dr. Beller, click on the common network tab at that site. Every flow chart element has an associated PDF. There is a wealth of information in each one, including network communications.

  4. Thanks, Margalit. I knew Carol is with Markle’s Connecting for Health, but I’m uncertain where she discusses P2P pub/sub architectures.
    Concerning CCR/CCD, I believe one thing making the CCR simpler is that a CCD contains embedded HTML tags and much more tagged metadata in addition to the XML data and their markup tags, whereas the CCR is just the XML data and their tags.
    A corresponding CSV, on the other hand, has no such tags and converts immediately into a basic data grid that can be rendered, for example, in an MS Office-based report that contains (conditional) formatting instructions. XML-based CCDs and CCRs need XSD or XSLT files to properly render a report.
    Note that I have nothing against XML, per se, although there are benefits to simpler data formats, such as novel ways to use CSV, that (a) are more human readable in their native form, (b) are smaller to transmit and store, and (c) parse very rapidly into a searchable data grid (such as a spreadsheet) that can be rendered quickly into an offline interactive report.
    Anyway, this discussion exemplifies the importance of truly out-of-the-box (original, creative) thinking that, as I mentioned earlier, can be stifled by conventional standards.

  5. Dr. Haughton, I find the work you are doing at DocSite very impressive. I have been working in the comprehensive EHR/PMS/Billing arena for quite some time, and as the article above implies, I am starting to wonder whether this is the only way, or even the best way.
    It’s beginning to look to me that having a well-appointed toolbox may be more valuable than having a glitzy all-in-one big tool, and any mechanic would likely say the same.
    I’m probably going to have to think some more and write some more just to clarify things for myself…..

  6. Dr. Beller, Carol Diamond’s work is part of the Connecting for Health initiative of the Markle Foundation.
    Here is the URL for that project:
    http://www.connectingforhealth.org/
    Regarding the CCD/CCR standards, I find the CCD to be unnecessarily complex. The CCR is rather simple and nimble, but that could be just a personal preference.

  7. Dr. Haughton:
    I’m interested in learning more about Carol Diamond’s call for use of a pub/sub node-based HIE / NHIN infrastructure. Do you have any links you can share?
    And concerning CCD vs. CCR, I think their reliance on XML makes them both more complicated and inefficient than is necessary. I say this because the data they contain can more easily be laid out in a comma separated value (CSV) file (including any parent-child hierarchies, although they are rarely, if ever, required for health data exchange).
    In fact, I’ve developed an open source app that uses an MS Excel VBA macro to convert a CCD into a much slimmer and much more human readable CSV file at https://sourceforge.net/projects/convertxmltocsv/. Note that the CSV could be used instead of the CCD for transmitting data from node to node. Nevertheless, CCDs/CCRs are today’s standards and thus cannot be dismissed.

  8. Interesting thread… Some thoughts…
    1) The last post about A and B getting translated by an intermediary pretty well describes the desire behind RxNorm (input Multum, First Databank or other and translate to RxNorm or one of the other systems) — Good idea, would be even better if the Government would create an open wiki or similar to create a crowd-sourced comprehensive drug-drug interaction system (Would cut about $20/Doctor/Month off the cost of e-prescribing, now paid to Drug Data manufacturers).
    2) The Pub/Sub node with some reporting central store – this describes well Carol Diamond’s, and lots of others’, architecture for an HIE / NHIN infrastructure (e.g., hybrid federated – pub/sub nodes – and centralized – central clinical repository).
    3) The Vermont Blueprint and VITL exchange that Governor Douglas (Vermont Gov, also Chair of National Governors Assoc this year – different topic, but look at his RxReform platform for accessible, affordable accountable healthcare – pretty interesting) — Anyway, the exchange started generating data for reporting and for community coordination by doing 2 things – 1) Agreeing on transport (started as CCR, then moved to CCD – both work, but as Phil Marshall from WebMD stated in his HIT Policy committee testimony – CCR is easier to use unless one needs to use CCD for standards reasons) and 2) Agreeing on a LIMITED semantic dictionary – make sure to collect a few important things in a structured, easy to manage fashion, and the system can be used by lots of parties.
    Bottom line feels as if designing to solve the GOAL of the PROJECT or TASK ends up with a simple, effective solution – the heart of the original post – it was right on target.
    Feel free to contact me for further info 919 256 9510 or j haughton at docsite dot com (no spaces).

  9. “The interesting thing is that the only ‘standard’ that clinicians use in the daily care of people is English. I think this is unlikely to change, Dr. Beller.”
    Good point, Dr. Dussia. I’d go one step further: I believe our country should be engaged in international collaboration and research, so English isn’t even a universal standard.
    In any case, using a pub/sub node-to-node architecture, there can be one or more nodes between the publisher and subscriber that serve a data translation/conversion function via mapping methodology. That is, if the publisher uses a local terminology standard “A” and the subscriber uses local standard “B,” then the data can be sent to an intermediary node where corresponding terms are translated into the subscriber’s parlance. This would not only improve communications between clinicians in different regions and facilities, but also between clinicians in different disciplines. Likewise, the terms could be translated into layman’s language when communicating with patients!
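A minimal sketch of that intermediary translation step (the term codes and mapping table are invented for illustration): the in-between node holds a mapping from the publisher’s local vocabulary “A” to the subscriber’s vocabulary “B” and rewrites each term as the message passes through, leaving unrecognized terms untouched.

```python
# Hypothetical mapping held by the intermediary node: publisher
# vocabulary "A" on the left, subscriber vocabulary "B" on the right.
A_TO_B = {
    "MI": "myocardial infarction",
    "HTN": "hypertension",
    "DM2": "type 2 diabetes mellitus",
}

def translate(problem_list, mapping):
    """Rewrite each term into the subscriber's parlance, passing
    through any term the intermediary does not recognize."""
    return [mapping.get(term, term) for term in problem_list]

incoming = ["MI", "HTN", "asthma"]
outgoing = translate(incoming, A_TO_B)
print(outgoing)  # ['myocardial infarction', 'hypertension', 'asthma']
```

A second mapping table pointed the other way would handle replies, and a layman-oriented table would handle patient-facing messages the same way.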

  10. Ms. Gur-Arie, you said, “However, in the more common situation where the ED has a totally different system, there is a need to somehow export/access the Medisyn records to/from the ED system. It will be almost impossible for the ED system to decipher and assimilate your very nice, but free text based, problem list and medications.”
    In my opinion, there is a big difference between the ED doc and the ED system. Since I am interested in taking care of patients, I designed the Medisyn system to allow the ED doctor (or any other clinician) to have necessary and sufficient clinical information without a patient interrogation. If the ED doctor desires to use the ED system to generate a clinical note, then the simple mechanism of creating an HL7 message (which most of the COTS EMRs claim to be able to do) and using a free interface allows the generated note to move into the clinical repository.
    For the majority of clinicians in the country who do not have an EMR, the Medisyn system can be used as an in office tool or a point of service clinical tool if the clinician moves around to see patients.
    The big problem is that the patients move around, as was seen in the Medisyn tutorial. The interesting thing is that the only “standard” that clinicians use in the daily care of people is English. I think this is unlikely to change, Dr. Beller.
    For the EHR Guy, I use voice recognition for many applications, but not in my clinical practice. It takes too much of my time when I am creating a clinical note. I measured and found out that a more efficient use of my time was to send my dictation to a transcriptionist who could do the formatting of my note in a much more cost effective manner.

  11. I don’t think there’s a lack of standards … It’s more like there are many different standards (including technology/messaging standards, terminology/data standards, and care measurement and process standards), which continue to evolve over time. As I discuss at this link — http://curinghealthcare.blogspot.com/2007/05/art-of-health-knowledge-creation-use.html — standards can be a double-edged sword: While on the one hand they can promote interoperability, knowledge growth, and better understanding, they can just as likely impede creativity, increase costs, and inhibit knowledge and understanding. That’s why I argue that HIT tools should be agile enough to embrace all useful standards… the ones that are currently mainstream, the ones that are valuable to niche markets, and the ones that will emerge in the future.

  12. Michael, those standards you mention are all fine. However, if you are going to have ad hoc requests for information, whether in a peer to peer, or larger network, there needs to be more than that. Maybe a predefined set of methods exposed via web services with very clear in and out variable definition, or something similar. Also, the terminology needs to be standardized. As discussed above, you can have multiple standards published and delegate translation to an intermediary.
    Let me know when your hologram is ready… I’d like to beam it up here for a quick look….

  13. Margalit,
    I’ve always stated that the input to the software is the big problem. Interrupting the clinicians’ workflow by forcing them into a non-natural state for their activities, such as someone working at a desk, in order to feed the computers is detrimental.
    Products such as the ones Bill Crounse refers to address this problem.
    We have to leverage technology as much as we can. Voice recognition is one of them and there are many others as well.
    And no, we don’t have to use interactive holograms yet. I’m working on that and it’s not quite ready.
    Thanks,
    The EHR Guy

  14. I keep reading there are no standards for healthcare IT interoperability.
    Could you please expound?
    As far as I know there are many standards (e.g., DICOM, HL7, ASTM CCR).
    Thanks,
    The EHR Guy

  15. I also agree that an intermediary would be useful for larger scale P2P implementations so that each peer/node can find other peers/nodes during the publisher-subscriber activation process (i.e., when two nodes connect with each other for the first time, which includes authentication and authorization). A RHIO/HIE would be an ideal intermediary supporting such P2P connectivity regionally. A Federal government agency, or even a “supra-RHIO/HIE” node that connects the regional ones, could do this nationwide.

  16. Health Care costs will not/cannot come down unless/until the government recognizes the need to prevent further blockage by Big Pharma and the AMA of real, meaningful PREVENTIVE MEDICINE – the kind practiced by naturopaths, integrative,and complementary physicians.
    They prescribe safe, natural supplements instead of pharmaceuticals that kill! And it’s much, much cheaper!

  17. I agree Alexander. It won’t work on a very large scale without an intermediary or a super node or a translation gateway, whatever we end up calling it.
    What I like about eCW’s announcement is the change in the way vendors are thinking. Exchanging information is finally becoming a worthy goal. As long as they are moving in that direction, every small step is an achievement.

  18. P2P communication works great when a PCP refers a patient to a specialist or orders a test. And there are already exchange formats widely used for that, such as HL7, CCD and CCR. But in order to get all patient EHRs through P2P connections, (1) the requester has to somehow find out which peer systems have that information, (2) make sure they are connected, and (3) send a request to each of them. And every EHR application must have its own authentication and authorization module to handle external requests… I just don’t see how this may work without an intermediary.
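The three steps above, and the intermediary being argued for, can be sketched roughly as follows (all names are hypothetical): a directory node answers step (1), and the requester then connects to and queries each peer in turn, with each peer applying its own authorization check.

```python
# Hypothetical intermediary: a directory mapping patient IDs to the
# peer systems known to hold records for that patient (step 1).
DIRECTORY = {
    "patient-42": ["pcp-emr", "lab-system", "hospital-ehr"],
}

# Stand-ins for the peers' own record stores.
PEERS = {
    "pcp-emr": {"patient-42": "problem list"},
    "lab-system": {"patient-42": "lab results"},
    "hospital-ehr": {"patient-42": "discharge summary"},
}

def fetch_records(patient_id, credentials):
    peers = DIRECTORY.get(patient_id, [])          # (1) locate the peers
    records = []
    for peer in peers:                             # (2) connect to each one
        if credentials != "authorized":            # each peer authorizes on its own
            continue
        records.append(PEERS[peer][patient_id])    # (3) request the data
    return records

print(fetch_records("patient-42", "authorized"))
```

Without the `DIRECTORY` intermediary, step (1) would require broadcasting the query to every peer on the network, which is exactly the scaling problem the comment raises.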

  19. Hey Paul, just for my own curiosity, what is your vision of what will work?

  20. “I’m not ruling out RHIOs or other intermediaries, but I believe the actual data need not reside anywhere other than the provider system.”
    Yes! I suggest that one important role for RHIOs, HIEs, etc. would be (1) to aggregate de-identified patient data; (2) to make those data available to authorized research organizations (universities, etc.) who study the data to help develop and evolve evidence-based preventive, diagnostic, self-maintenance/management, and treatment guidelines that focus on bringing ever-increasing value (i.e., cost-effectiveness) to the patient/consumer; and (3) to disseminate the resulting guidelines to all parties.
    In this scenario, using the decentralized node-to-node architecture, the patient data would be stripped of patient identifiers and shipped to a centralized research data warehouse. The stripping and shipping would be done by the nodes having direct access to where those data are stored, that is, to the nodes belonging to the clinician/provider that access the data from their EHRs, and to the patient nodes having access to their PHRs. Nodes having direct access to the research data warehouses would then receive the de-identified patient data. In other words, the clinician and patient nodes would implement their publisher (sender/transmitter) function to transmit the data, and the RHIO/HIE’s data warehouse nodes would implement their subscriber (receiver) function to retrieve the data. And the resulting guidelines would be shipped via the RHIO/HIE nodes by implementing their publisher function; the guidelines would be received by the clinician nodes implementing their subscriber functions and subsequently be presented through clinical decision support software programs.
    This scenario is an example of a hybrid mesh node network architecture in which both centralized and decentralized networks work in harmony. BTW, another example of a hybrid mesh is when a multi-site healthcare organization with a centralized EHR system (behind a firewall) connects via nodes to the EHRs and PHRs of other parties outside their organization (beyond their firewall).
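The “strip and ship” step described above can be sketched in a few lines (the field names are invented for illustration): the publisher node removes direct identifiers before transmitting the record to the research warehouse node.

```python
# Fields a publisher node would strip before shipping to the
# research data warehouse (an illustrative, not exhaustive, list).
IDENTIFIERS = {"name", "ssn", "address", "phone"}

def deidentify(record):
    """Return a copy of the record with direct identifiers removed."""
    return {k: v for k, v in record.items() if k not in IDENTIFIERS}

chart = {
    "name": "Jane Doe",
    "ssn": "000-00-0000",
    "dx": "hypertension",
    "a1c": 6.9,
}
print(deidentify(chart))  # {'dx': 'hypertension', 'a1c': 6.9}
```

In the hybrid mesh described above, this function would run at the clinician or patient node (the publisher), so identified data never leaves the provider system.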

  22. Dr. Beller, I think a biometric ID is probably a very good choice, short of implanting a chip 🙂
    The NCVHS has been tinkering with this for over a decade, but nothing happened. There seems to be some reluctance on the part of most people to have such an identifier. I’m not sure why, since we all get SSNs immediately after birth and think nothing of it.
    I think the technology is available for biometrics and the logistics are not insurmountable (put a machine in every DMV).
    Alexander, I know that availability is an issue with the current crop of EMRs, but I strongly believe that SaaS is the future. Besides, as Dr. Beller mentioned, we all use phones without the operator having to patch calls through anymore and without having to run to the telegraph office to send something. Technology changes fast and I can see a device or an executable installed in every office to ensure availability.
    I’m not ruling out RHIOs or other intermediaries, but I believe the actual data need not reside anywhere other than the provider system.

  23. Reminds me of the conversation in the movie City Slickers when Billy Crystal tells his friend his life is a do-over.
    From where I sit, I think a do-over is exactly what’s needed on two fronts. On the provider side, EHR decisions need to be based on what business problems are being addressed and on an ROI, not on what DC may or may not do.
    On the interoperability or transport side of the record I do not believe much of what is being worked on today will exist in 3-5 years (which further compounds the difficulty of what the providers are doing). I think Meaningful Use and Certification will cease to exist, and that the structure of hundreds of RHIOs and HIEs will cease to exist because they will have failed to work.

  24. Margalit, what you describe basically reflects the principles on which the proposed NHIN infrastructure is based. The only difference is that it is supposed to connect RHIOs rather than separate EHR systems. Without a nationwide patient ID, though, it is going to be very challenging to find and link all medical records on the same person, since some important data fields used by matching algorithms can be empty or contain incorrect values. Besides, as I mentioned before, it is much more difficult to predict availability of EHR systems installed in small medical offices or hospitals, unless they use cloud-based applications.
    Actually, as far as I understand, the Conservatives in Britain do not reject the idea of national repository of medical records, they are just criticizing the way the Government handled the project, and consider contracting Google or Microsoft to host the data.
    How about taking this discussion offline? I would like to learn more about your approach, and could share my views on the topic if you are interested. Objections to a centralized data repository are less about its efficiency and availability than about widespread mistrust of the Feds and their intentions.

  25. “…what I envision is that any new EHR installation would register with a gateway and establish a live connection. The gateway would have unmitigated access to a list of patient identifiers (need to figure an acceptable identifier). When the patient presents at a connected provider, the provider system will make a request to the gateway, including something that authenticates the patient as the requester”
    Margalit – What do you think of using a biometric index to create a unique patient identifier (medical record number)? It would negate the necessity to establish and connect to a central repository, and it would enable the fluid exchange of patient health info between any nodes in a mesh network architecture, which is similar to the way communication is done in telephone networks (see http://wellness.wikispaces.com/Network+Architectures)

  26. Alexander, I keep thinking that assembling the bits and pieces of the medical record in real time is really what needs to happen sooner rather than later. Even now, with all the disparate technologies and terminologies, there is much that can be gained.
    I think a Microsoft or IBM or Google or any experienced data aggregator would be best suited to tackle something of this magnitude, but I can see opportunity for determined new ventures as well.
    I could write volumes on this, but briefly, what I envision is that any new EHR installation would register with a gateway and establish a live connection. The gateway would have unmitigated access to a list of patient identifiers (need to figure an acceptable identifier). When the patient presents at a connected provider, the provider system will make a request to the gateway, including something that authenticates the patient as the requester. The gateway would scan its connections, locate the records, authenticate the patient to the other systems, retrieve the records, reformat as best it can and present to the requesting provider.
    Of course it is much more complex than that and there are many possible variations and roadblocks, but that would be the general idea and we do have the technology to accomplish this today.
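As a rough sketch of the gateway flow envisioned here (class and method names are invented, and the token check is a stand-in for real patient authentication): providers register live connections, and a patient-authenticated request causes the gateway to scan its connections, retrieve matching records, and return them collated by source.

```python
class Gateway:
    """Toy model of the envisioned record-locating gateway."""

    def __init__(self):
        self.connections = {}  # provider name -> live EHR connection

    def register(self, provider, ehr):
        """A new EHR installation checks in and establishes a live connection."""
        self.connections[provider] = ehr

    def request(self, patient_id, patient_token):
        """Locate and retrieve a patient's records across all connected systems."""
        if patient_token != f"token-for-{patient_id}":  # stand-in for real authentication
            raise PermissionError("patient could not be authenticated")
        found = {}
        for provider, ehr in self.connections.items():  # scan every live connection
            if patient_id in ehr:
                found[provider] = ehr[patient_id]       # retrieve and collate
        return found

gw = Gateway()
gw.register("clinic-a", {"p1": "allergy list"})
gw.register("hospital-b", {"p1": "imaging report", "p2": "op note"})
print(gw.request("p1", "token-for-p1"))
```

The reformatting step mentioned in the comment (normalizing what each source returns) would slot in where the records are collated.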

  27. Dr. Dussia, I just looked at your Medisyn system tutorials and I like the simplicity and clean look & feel. I can definitely see how it can be a very useful tool for clinicians, even though it doesn’t have all the bells and whistles of a full blown EHR, and that is basically what I was trying to say above.
    There is one basic problem though, you are assuming that various care providers all have access to see and modify the Medisyn records (like the ED example). In this case there really isn’t a need for codifying anything.
    However, in the more common situation where the ED has a totally different system, there is a need to somehow export/access the Medisyn records to/from the ED system. It will be almost impossible for the ED system to decipher and assimilate your very nice, but free text based, problem list and medications.
    I am not necessarily advocating ICD9 codes, just a common terminology that can be electronically transmitted and understood by all software systems. This is imperative for automated community wide measurements and observations.

  28. Oh, my bad… I misread “NHS” for “HHS”. There are a number of publications that mention various problems with the NHS medical records repository, but offer little information on the technical details of the project. The need for an integrated patient health record, across all care settings and state borders, is apparent. What should be clearly defined upfront, before geeks even enter the room, is: ownership of the record; its contents; what is updateable or immutable; who is responsible for maintaining its sections; access rights, including “breaking the glass” rules, and consent management; security and so on.
    The simplest scenario would be to give everybody a memory stick that the doctor can plug into his computer; he enters the decryption key and his EHR application opens your records. At the end of the visit, he saves your demographic information and encounter details in his system for billing purposes, and copies them to your drive. Would this be a perfect solution? I am not sure, unless we keep our PHR cards on our keychains at all times.
    In order to aggregate all bits and pieces of our EHR, scattered around multiple clinics and hospitals and systems, at least, (1) there must be some unique identifier that links them together and to the patient and (2) all those systems have to be available at the time of aggregation.
    I am still leaning towards a nationwide patient record repository. The NHS medical record woes are most likely a result of terrible project management. I remember a failed attempt to create the Integrated National Crime Information System in New Zealand ( http://en.wikipedia.org/wiki/INCIS )…

  29. Alexander, I was referring to the British healthcare system, which tried to centralize everything and failed miserably.
    Dr. Duriseti, I think you hit the nail on the head with your comment. The entire EMR/EHR conversation is upside down. If we defined a set of short-term and long-term goals, proposed solutions, and only then looked for appropriate technology to implement those solutions, probably none of the current products would exist in their current format.
    Dr. Dussia, I totally understand an individual clinician’s need to see what other physicians are thinking, and this sort of information can be stored and transferred in document form. However, that may not be enough for the community. If we are to measure outcomes, improve quality and research effectiveness, we must also collect some discrete data. This data must be standardized (terminology and structure) so it can move between disparate systems and allow them to make meaningful use (no pun intended) of it. The alternative is to have one national EHR system. I wouldn’t choose to go that route.
    Think about various e-mail applications. They are mostly proprietary, yet any e-mail client can easily connect to any e-mail server and immediately “understand” its content. You can even export/import data between servers. This is possible because all mail systems use similar standards.
    Does the government need to create/impose those standards? It would have been better if it didn’t have to. However, since the EMR industry has failed to create its own standards, and we are engaged in healthcare reform anyway, I don’t see why the government shouldn’t step in.
    Dr. Crounse, although I think Microsoft Office is one of the better software applications of the past decade, I was referring more to its design principles than anything else. What the HIT industry should borrow are the plug-and-play capabilities, the unified user interface and, most of all, the ability to let users decide on their desired level of expertise while remaining productive at every stage.
    Cindy, I must confess that it took me a bit too long, but I came to realize that all the patient-centric talk is pretty much useless without an informed and actively participating patient at the actual center of care. All the interoperability and data liquidity considerations must include the patient, not just provider-to-provider communications.
    Dr. Beller, I am very encouraged that folks out there are actually starting to think differently about these issues. I hope your prototype lives long and prospers 🙂

  30. One core functionality that must be redesigned to achieve a more “perfect” EHR product is the flawed process used by all existing ambulatory and hospital-based EHRs to report the results of patient diagnostic tests to physicians and patients.
    It is a wasteful and costly anachronism of the mainframe computer era for EHRs, and growing numbers of PHRs and HIE platforms, to continue to use infinitely variable formats to display the results of the 6,000 different patient diagnostic tests as fragmented data rather than as organized, easy-to-read information.
    It’s also ironic at a time when reducing unnecessary costs and waste in the current health care system is a critical national priority and there is a seller’s market for any member of Congress who has a credible plan or amendment to cut costs while also improving quality.
    The practical solution is to combine a standard reporting format with clinical data integration to enable more efficient viewing and sharing of the more than 30 billion annual test results. Facilitating results viewing and sharing would help physicians minimize duplicate and non-contributory testing costs, improve patient safety and satisfaction, foster widespread EHR and PHR adoption by enhancing interoperability and usability, and leverage the highest value from public and private investments in all three types of HIT platforms.
    The major reasons this hasn’t been achieved to date by any of the hundreds of HIT vendors are eloquently described by Clayton M. Christensen at: http://innovatorsprescription.com .
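As an illustration of the “standard reporting format” idea above, here is a hypothetical normalization step that maps two differently shaped vendor result feeds onto one common record. The vendor layouts and field names are invented for the sketch, not taken from any real system:

```python
# Hypothetical sketch: normalizing two vendor-specific lab feeds into a
# single standard result record. All field names are invented.

def normalize(raw, vendor):
    if vendor == "A":   # vendor A: flat key/value layout
        return {"test": raw["test_name"], "value": raw["result"],
                "units": raw["units"], "flag": raw.get("abnormal", "")}
    if vendor == "B":   # vendor B: nested "observation" layout
        obs = raw["observation"]
        return {"test": obs["code"]["display"], "value": obs["value"],
                "units": obs["unit"], "flag": obs.get("interpretation", "")}
    raise ValueError(f"unknown vendor: {vendor}")

a = normalize({"test_name": "Hemoglobin", "result": 13.2,
               "units": "g/dL"}, "A")
b = normalize({"observation": {"code": {"display": "Hemoglobin"},
                               "value": 13.2, "unit": "g/dL"}}, "B")
assert a == b  # both feeds reduce to the same standard record
```

Once every feed reduces to the same shape, a single viewer can display all results consistently, which is the point the comment is making.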

  31. “So if I had to do it all over again, I would take a hard look at Microsoft Office. I would build multiple useful applications, like Word, Excel, Power Point, etc. I would make sure I can export data from one to the other. I would make sure that the user interface is consistent between them. I would allow others to create templates and integrate their software into my tool bars.”
    Wow, Margalit, that’s exactly what we’ve done! We actually just presented the first live public demonstration of a prototype of our system to doctors, educators, and insurers. It went very well!
    The demo showed, in real time, how this MS Office based system enables: (1) primary care physicians (PCPs) to send personalized referrals to specialists, (2) the specialists to reply to those referrals, (3) the PCPs to respond to the specialists’ acceptance by sending them XML-based continuity of care documents (CCDs) and other supporting data files, and (4) the specialists to access and view the resulting patient info.
    This is all done with encrypted e-mail attachments, a small software program, and macro routines that process the e-mails automatically. On the sending side they encrypt, zip, and attach the files to an e-mail and place it in the outbox; on the receiving side they retrieve the e-mail from the inbox, then unzip, decrypt, format, and display those files, storing them encrypted on the recipient’s computer.
    It requires as few as 5 mouse-clicks per end user for the entire process. There is no need for central servers (or any other infrastructure build-out), little if any need for IT support, and no other costly complexities.
    And all the data are stored locally in encrypted files, which are automatically retrieved and rendered at any time with a few button clicks. From a technical perspective, it’s a simple node-to-node (peer-to-peer), publisher-subscriber, asynchronous, decentralized desktop solution that uses Office macros, .NET, and SMTP. It is literally the easiest, most convenient, and least costly way I know to exchange and present patient health information securely between any EHRs in a way that promotes care coordination.
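The zip-and-attach step this comment describes can be sketched with only the Python standard library. This is not the commenter’s actual Office-macro implementation, just an illustration of the same idea; the encryption step the system relies on is deliberately omitted here (the payload bytes would be encrypted, e.g. with an AES library, before attaching), and all names and addresses are invented:

```python
# Hypothetical sketch of the zip-and-attach exchange described above.
# Encryption is omitted; a real system would encrypt the bytes first.
import io
import zipfile
from email.message import EmailMessage

def package_ccd(ccd_xml: bytes, filename: str = "ccd.xml") -> bytes:
    """Zip the continuity-of-care document for attachment."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(filename, ccd_xml)
    return buf.getvalue()

def build_message(sender: str, recipient: str, payload: bytes) -> EmailMessage:
    """Build the referral e-mail carrying the zipped CCD."""
    msg = EmailMessage()
    msg["From"], msg["To"], msg["Subject"] = sender, recipient, "Referral CCD"
    msg.set_content("CCD attached.")
    msg.add_attachment(payload, maintype="application", subtype="zip",
                       filename="ccd.zip")
    return msg  # in practice handed to smtplib.SMTP(...).send_message(msg)

msg = build_message("pcp@example.org", "specialist@example.org",
                    package_ccd(b"<ClinicalDocument/>"))
```

The receiving side reverses the process: pull the message from the inbox, decode and unzip the attachment, decrypt, and render.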

  32. Margalit, thanks for sharing your experience and reflections. You have many good points, including:
    “Maybe we computerize only data that is both pertinent to patient care and easily captured, codified and standardized.”
    Ram, I agree with you and can’t figure out why this isn’t obvious to more people. Without developing standards for interoperability in advance, we are going to end up with a lot of clunky EMR systems that can’t communicate with each other. That won’t fix health care.
    We need free and open standards to be established (and meaningful use defined) before EMR systems are expanded. We need more people like Margalit and Ram to play a role in guiding national HIT implementation (instead of people who don’t know what interoperability means). We need designers and usability experts who can work with the IT people, systems engineers, clinicians, and patients to develop systems that aren’t clunky and unusable.

  33. “…So if I had to do it all over again, I would take a hard look at Microsoft Office. I would build multiple useful applications, like Word, Excel, Power Point, etc. I would make sure I can export data from one to the other. I would make sure that the user interface is consistent between them. I would allow others to create templates and integrate their software into my tool bars.”
    Thanks for the unsolicited plug 🙂 Of course, I tend to agree. I’m extremely pleased to see vendors picking up on MS Office (Word) as a documentation tool for the EMR (www.gloStream.com) and other vendors (www.ushealthrecords.com for example) also making use of the investment we have made in the common user interface (see http://www.mscui.net and check out the patient journey demonstrator).
    I also agree that much of what is on the market today is way too expensive. We need solutions that are priced more like MS Office. Cloud computing, clinical groupware, and ICT solutions that focus as much on the “C” as the “I” are also needed. I don’t even mind open source so long as the playing field is level and people understand that the cost of acquiring software is often the smallest slice of total cost of ownership.
    Bill Crounse, MD Senior Director, Worldwide Health
    Microsoft

  34. I am not a wonk, just a doctor.
    I just need one thing—I want to know what the other attending clinicians, who share the care of my patient, are thinking.
    The “data” I want is the work product of the clinician, the document of the clinical encounter, and the “standard” is English (well, maybe, medicalese). This isn’t complicated. I just want the clinical information straight from the clinical author.
    Ms. Gur-Arie is right in noting that the problem is that the “charts” are lost to the variety of others who need the “chart.” It doesn’t make any difference whether the chart is paper or a site specific EMR—it is still lost to the attending clinicians when the patients distribute themselves across multiple care systems away from the site of storage.
    As for the “one common complaint”: cumbersome data entry is a product of poor software design, and the quality of the document is a function of the author/clinician. The author/clinician has to sign what they create, so the rest of the clinical community can know what they are doing and how. Quality is easy to spot.
    Mr. Saip is right in two different areas: first, Federer must be using some strange herb, and second, we do need an infrastructure. The good news is that Ms./Mr. Federer will probably find the right blog to answer, and the infrastructure is already in place. It is the Internet.
    Dr. Duriseti and I are both clinicians and engineer/designers, but we have a very different view of the feds. He thinks they can help. I think they want governance over the clinical information.
    Dr. Duriseti mentioned ICD. I am still looking for a code for “illiterate,” and about 30% of the population is functionally illiterate. Codes don’t come close to covering the variety of things the busy primary care doctor sees in practice. Coding systems are only useful because the computers that receive the claim forms are programmed to accept them. That is what should be changed.
    The clinicians’ solution is a cheap conduit that allows the busy clinician to push the completed document of the clinical encounter into a repository that is readily available to other busy clinicians at the point of service. No software, no hardware and no EMR—just a community of clinicians who participate in a “lending library” where each clinician-author can expand on the continuing saga of our patients’ lives in a logically connected and efficient manner. I’ve been using it since the 1990s.
    We don’t need the federal government for that.

  35. Interesting post. I suspect only wonks like me will flock to it. I also suspect that, like me, you see EMRs (however one chooses to define the entity) as simply a means to an end. The ends are error reduction, quality improvement, and information, with a dollop of portability and accessibility.
    To facilitate these 3 major objectives, there are basically 2 critical components: discrete data capture and standard terminologies for information storage, analysis, and exchange. I have yet to find an EMR company that was built on these principles. They seem to have largely grown out of a billing-capture tradition that enslaves itself to a very different set of objectives.
    Given the above, I have often wondered why the scores of people getting paid millions of dollars in the Federal Government have not come to the following obvious conclusion: 1) build a federally certified data model around which other data structures and any user presentation can be built; 2) define the set of terminologies, or create it (it’s a finite task), to cover the range of issues encountered in clinical care. There isn’t a physician or informatician who doesn’t recognize the failings of ICD (not comprehensive and not accurate) or SNOMED (overly complicated and embedded with an implied goal of auto-classification). Maybe my background as a clinician and an engineer who has actually built applications gives me a different perspective. I readily submit that it could be an entirely incorrect perspective…
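The “discrete data capture with standard terminologies” point can be made concrete with a small sketch. The structure below is invented for illustration (it is not the federal data model the comment calls for), though the SNOMED-style code shown is the kind of machine-readable identifier meant:

```python
# Hypothetical sketch of discrete, coded data capture: a problem-list
# entry stored with a terminology code rather than free text alone.
from dataclasses import dataclass, asdict
import json

@dataclass
class Problem:
    code_system: str   # e.g. "SNOMED-CT" or "ICD-9"
    code: str          # machine-readable identifier
    display: str       # human-readable label the clinician sees

entry = Problem(code_system="SNOMED-CT", code="38341003",
                display="Hypertension")

# Because the entry is discrete and coded, another system can exchange
# and interpret it without parsing free text:
wire_format = json.dumps(asdict(entry))
assert json.loads(wire_format)["code"] == "38341003"
```

A free-text problem list can only be read by a human; the coded version can be queried, aggregated, and exchanged between disparate systems, which is what outcome measurement requires.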

  36. I believe that digitizing medical records is not just good for the environment, but can help improve care coordination, clinical decision support, early epidemic detection, etc.
    “…If the medical record is to be of any value, it must be comprehensive.”
    Does that include physician and nurse notes, reports, and other documents? If yes, then your question, “So why are we shooting for a paperless office?”, sounds somewhat contradictory. I absolutely agree that user-application interaction should fit into care-setting workflows as seamlessly as possible, but that will still require some adjustments on the user side. Computers are extremely efficient at analyzing, processing and sorting structured data, but they are not at all good at dealing with ambiguity.
    “…Some argue that it is better to aggregate all data in a centralized massive storage system. Building and securing such a “database in the sky” is a monumental task and the recent NHS experience suggests that this may not be the right approach.”
    Are you referring to the Nationwide Health Information Network (NHIN) ( http://healthit.hhs.gov/portal/server.pt?open=512&objID=1142&parentname=CommunityPage&parentid=2&mode=2&in_hi_userid=10741&cached=true )? From my perspective, the project has quite the opposite goal in mind: to create an infrastructure for patient information exchange between disparate RHIOs. The problem is that they all have very different data requirements and privacy restrictions, apart from differing formats and communication protocols.
