
An Open Letter to the New National Coordinator for Health IT: Part 2 – Opening the Aperture of Innovation


One of the important decisions before Dr. Blumenthal and his colleagues at ONC and HHS is whether the national health information network will be one of closed appliances that bundle together proprietary hardware, software, and networking technology, or one of open data exchange and management platforms in which the component parts required to do medical computing can be assembled from different sources. If the former direction is chosen, power and control will be concentrated in the hands of a very few companies. If the latter, we could see an unprecedented burst of disruptive innovation as new products and services are developed to create the next generation of e-health services in this country.

Separating the data from the devices and applications, and maintaining a certain degree of independence of both from the networks used for transmission, is far more than a technical quibble. It can determine the economics of technology in stunning ways.

A familiar example may help here. On April 10, 2009, BusinessWeek reported that Apple was approaching a milestone: one billion iPhone apps downloaded. This is an amazing number in that it occurred in less than a year, but even more surprising because it might never have happened at all. Prior to July 2008, the Apple iPhone was a closed appliance that could only offer applications developed for the iPhone by Apple; for example, the iPhone offered a calendar app and a contact management app, but these were Apple products. Third-party developers and programmers had no way to make their apps run on the iPhone, and Apple might have kept it this way.

Then, in the summer of 2008, Apple released an upgrade of the software that runs the iPhone and iPod Touch which included an SDK (software development kit) that allows third-party developers to create software that can be downloaded and run on those devices; some of these apps are free and others cost up to a few dollars. Whamo! Suddenly the iPhone became the hottest development platform on the planet. As the New York Times technology blog observed, “Fizzy pints of virtual beer, lightsaber simulators and ancient flutelike instruments [and several health and medically related applications] all have one thing in common: they’re flying off the digital shelves of Apple’s App Store for the iPhone and iPod Touch.” Apple has created a website with a counter showing the number of apps downloaded as it climbs, and a list of the 20 most popular apps at the iTunes store. No one knows for sure the size of the economic stimulus that has resulted from opening up Apple’s iPhone platform to third-party development, but it must be many hundreds of millions of dollars, perhaps even billions, and it has certainly created thousands of jobs.

*****

It is not coincidental that the Internet itself was made possible by several legal cases that cost AT&T its monopoly power to mandate the use of a specific device – an AT&T telephone – to handle voice data on AT&T’s private network. As Jonathan L. Zittrain writes in his remarkable book “The Future of the Internet and How to Stop It” —

These [legal] decisions [against AT&T] paved the way for advances invented and distributed by third parties, advances that were the exceptions to the comparative innovation desert of the telephone system. Outsiders introduced devices such as the answering machine, the fax machine, and the cordless phone that were rapidly adopted. The most important advance, however, was the dial-up modem, a crucial piece of hardware bridging consumer information processors and the world of computer networks, whether proprietary or the Internet. With the advent of the modem, people could acquire plain terminals or PCs and connect them to central servers over a telephone line. Users could dial up whichever service they wanted: a call to the bank’s network for banking, followed by a call to a more generic “information service” for interactive weather and news.

The separation of the devices that use data from the data itself eventually made it possible for millions of computers of all kinds, makes, and configurations to connect with one another over the Internet. Today, audio data in many different formats travel freely over the Internet, and they are consumed, interpreted, and put to use by many thousands of devices and the applications running on those devices. If we send you an MP3 file of a song or a conference recording, we don’t need to specify the device or application you will use to decode and “play” that file. If we call you on the phone today, you might answer on a cellular phone, a landline phone, or a computer running Skype. Each of these needs to “understand” the data in great detail, but no one dictates which device the consumer has to use.
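To see how little a published data format demands of any one device, here is a toy sketch – ours alone, written in Python for illustration – that reads the ID3v1 metadata tag many MP3 files carry in their last 128 bytes. Because the layout is openly specified, anyone can write a reader without the encoder’s permission; the file name below is a placeholder.

    # A toy ID3v1 reader: the tag, when present, occupies the last 128
    # bytes of the file and begins with the ASCII marker "TAG". Field
    # offsets come from the publicly documented ID3v1 layout.
    def read_id3v1(path):
        with open(path, "rb") as f:
            f.seek(-128, 2)              # 128 bytes before end of file
            tag = f.read(128)
        if tag[:3] != b"TAG":
            return None                  # no ID3v1 tag in this file
        def text(raw):
            return raw.rstrip(b"\x00 ").decode("latin-1", "replace")
        return {
            "title":  text(tag[3:33]),   # 30-byte title field
            "artist": text(tag[33:63]),  # 30-byte artist field
            "album":  text(tag[63:93]),  # 30-byte album field
            "year":   text(tag[93:97]),  # 4-byte year field
        }

    print(read_id3v1("some_song.mp3"))   # placeholder file name

Any number of independently built applications can run this same logic against the same file; the data does not care which one the consumer chooses.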

*****

What do these examples have to do with health IT and HITECH? Just about everything. They illustrate the layers of standards that operate on a digital network: the physical wires, cables, or wireless components of the network; the devices connected to the network; the applications running on those devices; the data of various kinds created and used by the applications; and finally, and most importantly, the social interaction among users that rests on the content – that is, the purpose and meaning – of the whole.

The future of today’s health care IT systems seems to be converging on a handful of enterprise-run networks that do not interconnect, and a dozen or so health IT applications from companies that have the power to decide who can connect, what tasks they can accomplish, under what terms, and at what cost. The irony here is that while Microsoft and Google appear eminently capable of finding ways to exchange health data securely over the Internet in partnerships with New York Presbyterian Hospital and CVS Pharmacy, among others (see “Patient Records Inch Into 21st Century,” New York Times, April 5, 2009, and “CVS joins Google Health Rx network: millions can access medication records online,” April 6, 2009), the traditional health IT industry seems to be declaring itself unable to free the data from its proprietary applications and devices quite so easily. Instead, it wants a certification process attached to the software applications of a select group of vendors BEFORE we get interoperability, a certification based on the products’ features and functions, only one of which is the ability to handle the data layer. In the process, we believe the industry is acting more like the old AT&T than like Apple with the iPhone.

Congress in its wisdom attached the incentive payments to the social interaction layer of the network: “meaningful uses” are by definition the outcomes of deploying networked technologies and data exchanges, not necessarily traceable to specific applications and devices themselves. This is real progress. Having done that, the next layer down, the data layer, is where attention ought to be paid so that widespread meaningful uses can proceed and economic stimulus can be unleashed in the health IT economy.

The government can and should facilitate the private sector’s arrival at agreement about the content of the clinical data that it wants to become accessible to providers and patients over the network; it can also decide in what structures those data elements are formatted or packaged; and it can certainly set expectations and specifications for the protection of the privacy of the packages and their contents, whether coded data, text, images, audio, or video. With a limited set of available and well-tested standards, this could be done quickly and easily, and would immediately be seen as advancing implementation of the “meaningful uses” that Congress and HHS wish to see.
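To make that division of labor concrete, here is a toy sketch – our invention, not anything ONC or HHS has specified – of a standardized clinical data package: an agreed set of named fields plus an integrity check that stands in for whatever privacy and security specification is actually adopted. Nothing in it constrains the application, device, or network that produces or consumes it.

    import json, hmac, hashlib

    # A hypothetical minimal "clinical data package": agreed field
    # names and structure at the data layer, nothing said about the
    # applications or devices that handle it.
    package = {
        "patient_id": "example-123",    # placeholder identifier
        "observations": [
            {"code": "blood_pressure", "value": "132/84", "units": "mmHg"},
        ],
    }

    # Integrity protection with a shared secret, standing in here for
    # whatever real privacy/security specification would be adopted.
    secret = b"demo-key-not-for-real-use"
    body = json.dumps(package, sort_keys=True).encode("utf-8")
    envelope = {
        "body": body.decode("utf-8"),
        "hmac": hmac.new(secret, body, hashlib.sha256).hexdigest(),
    }
    print(json.dumps(envelope, indent=2))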

Here is where the Gordian knot of HITECH needs to be untied. There is no need for ONC to regulate the next layers – the applications, the devices, or the networks – or to link its specifications for the data to specifications for the products and services that will use the data. ONC needs to know when to stop tying the knot, in other words.

*****

Here’s the risk: limiting the kinds of devices and software applications that can handle standardized health care data to a few government (or government agent)-certified products would dramatically stifle innovation and utility while raising (or at least maintaining) costs. Imagine if the government told consumers they had to buy a Dell computer to connect to the Internet, or an iPod to listen to digital music recordings, or a computer running Microsoft Vista to use email. Worse still, imagine we all had to use a modern-day version of the private network Prodigy in order to exchange health data of any kind. By locking the use of data to a specific product, device, or application – or even a specific class of products, devices, or applications – consumers lose the choice that an open market permits. Nor do they benefit from the innovation and competition that arise from comparisons of service, features, usability, and price. And society doesn’t get the economic stimulus; or rather, the stimulus goes only to a few large corporations.

Physicians are, generally speaking, fearful that they will see their EHR choices limited to the expensive and difficult-to-implement products that CCHIT now certifies. David Blumenthal is quite right to worry that doctors will “rebel” if given only the choice of a small number of products that are notoriously not “user-friendly” and not designed to meet the goals of “meaningful use” set by Congress.

But we don’t believe that the deeper problem is CCHIT or the dubious politics of hiring a vendor-run organization to certify the vendors’ products. That real or apparent conflict of interest is an issue, but it’s not the major one. Rather, it is the potential linkage of incentive payments to a certification process that would require specific applications to manage health data. That is the root of the dilemma that Dr. Blumenthal et al. now face; this is the Gordian knot that needs to be undone.

David C. Kibbe MD MBA is a Family Physician and Senior Advisor to the American Academy of Family Physicians who consults on healthcare professional and consumer technologies. Brian Klepper PhD is a health care market analyst and a Founding Principal of Health 2.0 Advisors, Inc.

Click here to read Part 1 of the Kibbe & Klepper series: An Open Letter to the New National Coordinator for Health IT – Untying HITECH’s Gordian Knot

21 replies

  1. I think it is interesting that the Federal Government is willing to give me, a physician, what amounts to a pass-through payment to the vendors of its choice (the ones who are on the CCHIT board) and call it a “stimulus.” These are the same vendors who supply the docs with off-the-shelf products that they can “customize,” and who have never figured out what it is the doctors need.
    Most doctors in the US practice in 1-6 doctor offices. What the vendors offer is front office and back office automation services–I call it electrifying the mess of the local office.
    What the doctors need is a clinical information management tool with a credentialed network for clinicians only, not more coded languages (also known as “standards”).
    This is ultimately about who controls the clinical information, not doctor office automation. CCHIT is a long excursion down a rabbit trail.
    I like the FOSS option as well because it answers the four major concerns of doctors expressed in the paper that appeared in the NEJM last summer: 1) it costs too much, 2) I can’t make it work for me, 3) I can’t figure my ROI, and 4) it might become obsolete.
    There are better ideas in the pipeline.

  2. Whilst I agree with your point, I think the iPhone example – whilst a popular concept – actually makes a pretty poor analogy…
    Third-party developers and programmers still have no way to make their apps run on the iPhone except through the App Store. Except on jailbroken handsets, software developed by third parties MUST be paid for and downloaded through Apple.
    Contrary to the point you’re making, Apple (and AT&T to an extent) dictate exactly which device the consumer has to use and how they must pay for it.
    In this instance Apple have forced everyone to use one centralized system. Apple tell consumers they have to buy an iPhone, open an AT&T account, and set up an App Store account. The iPhone–AT&T–App Store combination is actually an example where the “next layers, the applications, networks” have been highly regulated. And whilst this has limited the offering, it has actually delivered countless benefits – including usability, adoption, revenues, and value – in a much quicker time frame than more open offerings.

  3. Tom is right that the translation of global data standards to local ones is a tricky business. Nevertheless, it is something we ought to strive to achieve. I have ideas about how to start doing it, such as mapping terminology to a provider’s specialty.
    The problem I have with forcing everyone to adopt a particular technology (e.g., messaging) standard is that it can drive out radical innovation. What if, say, someone invented a way to pack a lifetime of detailed cross-disciplinary personal health information for each person into a tiny file (i.e., each personal health record is its own file)? And what if this file could be a simple encrypted delimited text file that, unlike the XML required by HL7, needs no markup tags? And what if this file could be easily split up and/or combined with other data files, and shared with authorized persons who use personalized report writers to present that information in ways most meaningful to each person (including terminology translation)? Demanding that only XML formats be used would stifle adoption of such innovation. The same goes for communication architectures; e.g., forcing everyone to use centralized systems prevents decentralized innovations (such as P2P). A toy sketch of the kind of file I have in mind appears below.
    These are some reasons why we ought to be wary about forcing global terminology and technology standards on the healthcare industry.
    Steve Beller, PhD
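    A minimal sketch of the kind of file imagined above, assuming a pipe-delimited layout and gzip compression – every field name here is invented for illustration, and a real system would add genuine encryption (e.g., via a library such as cryptography) on top:

        import gzip

        # Hypothetical pipe-delimited personal health record: one line
        # per entry, no markup tags, easily split up or merged with
        # other data files.
        entries = [
            ("1978-03-02", "immunization", "tetanus booster"),
            ("2004-11-17", "diagnosis", "hypertension"),
            ("2009-04-01", "medication", "lisinopril 10 mg daily"),
        ]
        lines = "\n".join("|".join(fields) for fields in entries)

        # Compress the whole record into a tiny file; encryption would
        # be layered on top of this in any real system.
        with gzip.open("phr_demo.txt.gz", "wt", encoding="utf-8") as f:
            f.write(lines)

        print(len(lines), "bytes uncompressed")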

  4. I have skimmed the posts and read the original article. I think the development of electronic records with a standard is critical for innumerable reasons. For one, associations never thought of will ultimately appear.
    I am very concerned about anything done over the internet, including the current transmission of lab data. We all are, or should be, concerned about security to protect financial information, and there have been enough instances of identity theft which, while perhaps eventually solvable with the return of the cash, are still a nightmare. Just spend some time talking with someone who has had a credit card number stolen.
    The risks to individuals are enormous. Consider the following example. My wife and I have been mutually monogamous since the mid-70s. We have never had transfusions, nor as far as I know engaged in risky behavior prior to when we met. There is a huge push to make HIV testing routine. Indeed, when I have had my colonoscopies and other procedures, I had to cross out the permission to do so from the consent form. My risk of being HIV positive is vanishingly small. According to Bayes’ theorem, or really I guess its corollary, a positive result is far more likely to be a false positive than a true one, and would almost certainly be shown to be false on further testing (the arithmetic is sketched below).
    Anyone who believes that a false positive will not follow them around for the rest of their lives, if the information is available through the internet for even a day or two, is naive. In the absence of a single payor system they will never be insurable. In the presence of such a system, they will still face challenges, possibly including employment, etc. It is foolish to believe that laws designed to protect individuals in such cases would not be circumvented.
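    The arithmetic behind that point is easy to check with Bayes’ theorem. The numbers below are purely illustrative stand-ins for a very-low-risk patient and a good screening test, not published test statistics:

        # Positive predictive value via Bayes' theorem, with
        # illustrative numbers chosen only for the sake of the example.
        prevalence  = 0.0001   # 1-in-10,000 prior probability of infection
        sensitivity = 0.997    # P(test positive | infected)
        specificity = 0.999    # P(test negative | not infected)

        true_pos  = sensitivity * prevalence
        false_pos = (1 - specificity) * (1 - prevalence)
        ppv = true_pos / (true_pos + false_pos)

        print(f"P(infected | positive test) = {ppv:.1%}")
        # With these inputs, roughly nine of every ten positives are false.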

  5. Dear Tom and Margalit: Thanks for your opinions. I know there is a growing chorus of people who would prefer to change the law and remove the certification requirements. Who knows? Stranger things have happened. DCK

  6. Steve Beller says:
    > while setting arbitrary global standards for health-
    > related terms is one way to foster widespread
    > communications, the serious downside to eliminating
    > the local standards people rely upon includes loss of
    > important information due to reduced semantic
    > precision and nuance.
    Bracketing the pejorative “arbitrary” designation, this is Virgil Slee’s point, except he says (more or less) that the proper local standard is “whatever terminology the doctor recorded in his notes, probably longhand.” In his view, this is the REAL medical record, and coding, no matter how granular or sophisticated, amounts to a stylized abstract of the real record. I have gone a step further and said there is no reason we can’t have more than one abstraction of the real record for different purposes, and that we can abstract old records again when our methods improve. I have great hope for natural language processing systems in this area.
    > It would be much better, therefore, to keep local
    > standards, support their evolution, and use the data
    > translation methods to ensure everyone gets the
    > information needed using the terms they need and
    > understand.
    This begs the question “Translated into what?” And translation is a tricky, tricky business.
    Dr. Kibbe says:
    > Our point in this blog post is to “do this right” if
    > you must certify something.
    Right. I’d prefer to certify a transmission protocol. The level to which the DICOM messaging standard is specified is what I have in mind. HL7 V2.x is hopeless — V3 shows some promise. I think Margalit prefers this too, and she quite correctly points out that the X12 messaging standards and things like NCPDP already work fine in their own problem domains. There is absolutely no reason whatever to insist on one grand messaging protocol.
    What a system does with the messages it receives is a quality of implementation issue to be sure, driven in large measure by the organizations implementing the systems.
    t

  7. Wow! Great article. I am going to Tweet it right now and I am also going to post it in my LinkedIn Group – iPhone Healthcare (http://www.linkedin.com/groups?gid=1893202).
    As much as I love developing for the iPhone, and as much as I think it is a great device for healthcare workers, I AGREE WITH YOU 100% that any device for the healthcare industry that becomes a hit (I am paraphrasing) needs to be removed from monopolistic strangleholds (at least that is what I think you are saying).
    I am going to write a blog posting about that right now on my site http://www.iphonebiotech.com
    Great article!

  8. Just don’t make the doc work in real time with whatever beast is wrought. It will crush productivity.

  9. “the web browser platform”
    Most systems are, or can be, made available on an ASP basis, which automatically means that they are accessible through any web browser.
    Too much is made in any case of “web-accessible” versus software installed on a LAN. A LAN, needless to say, is a component of the Internet, which is obviously the encompassing network of all these subnetworks.
    Margalit Gur-Arie’s comments could be mine. The “certification” issue was inserted in the bill, now law, to restrict offerings to those software developers who go through the unnecessary hoops imposed by the certification process. Restriction of supply, nothing else, and it should be ignored. This is a case where a competitive marketplace will work to get the best software into the hands of users.
    I reiterate that the sponsorship of a FOSS model should be the primary effort of the USA federal government using as a starting point any current good system. That would immediately bring the outlandishly high price of many commercial EMR/PM systems to some more reasonable level, likely largely predicated on the cost of user training, service and documentation.

  10. David, I suspect the certification obligation imposed by the law was included there based on input from current “certification authorities”. If Dr. Blumenthal takes a fresh look at the goals of ARRA, I don’t see how he concludes that “certifying something” is in any way advancing those goals.
    Laws can be amended and changed. I personally don’t see any reason to spend money and effort in certifying something just so we can say that we certified something.

  11. Margalit: I don’t think there’s any disagreement between us. The question is really this: the ARRA is a statute that requires “certification” of “EHR technology.” Unless the law is changed, this is a fixed obligation for Dr. Blumenthal et al. So, what will be “certified”?
    Our point in this blog post is to “do this right” if you must certify something. At least understand the basics of network architecture, and don’t tie knots that will lead to strangling innovation.
    Kind regards, DCK

  12. This is a needed exchange, as I’m curious about the pending “Big Bang” of ICD-10. As the ARRA opens the floodgates for incentives relative to EMR – the USA’s collective investment to go digital in healthcare – what will happen if investments go into systems that are not able to meet the October 2013 requirements for ICD-10?
    Perhaps the opportunity is to consider the ARRA incentives and timeline relative to the ICD-10 requirements, as all the resources required for the ARRA alone could be at risk. I understand no extensions will be considered.
    Thoughts?

  13. Parley, I may not have been very clear.
    I am not talking about the EMR market. I am talking about a variety of payers, reference labs, and the vast majority of pharmacies. These entities adopted the X12, HL7, and NCPDP standards without any mandates from the government, and in turn forced EMRs and other portal applications to communicate with them using those standards – and the EMRs complied because the consumers demanded it.
    As to EMRs, for the last 4 years CCHIT has been trying to “standardize” and certify a collection of irrelevant functionality in a semi-federal capacity. That is not my definition of free market. I have no idea why the government, and CMS in particular, is allowing this to happen, but I do know that we’d be better off without any of that. So I guess we agree on this one.
    I have no problem with the government endorsing existing standards, such as HL7, NCPDP, X12, XML, or recommending new ones. The thing to understand is that standards do exist and are being used all day, every day. The EHR mess is not so much due to a lack of standards per se as it is due to a lack of perceived value to the physician. The current “certified” products are mostly hard to use and extremely expensive, and CCHIT is there to protect those products and their manufacturers by constantly erecting barriers to innovation. No free market here, really.

  14. “…I share Dr. Beller’s concerns about having the government define data standards. Why should it? There are millions of healthcare transactions over the internet every day, that are very “meaningful” – labs and medications. The data standards were NOT defined or enforced by the government. They sprung from the free market because they provide real value to those using them.” – Margalit
    Please. Data standards have miraculously sprung up from the “free market”? You mean, just like how the free market has miraculously given us a sensible, inexpensive, and very efficient health care system with incomparably better outcomes?
    Yes, we’re all for leveraging the forces unleashed by market competition. And that’s what can be achieved if there are a national set of data standards adopted. The current application certification process merely serves to limit the field to a few major corporations peddling old technology with proprietary standards at ridiculous cost. This amounts to more of the same corporate welfare that has marked public/private interaction for the last 30 years. That’s not a free market – that’s a manipulated market that is both anti-competitive, and coddles a few well-connected corporations, often at the expense of the public good.
    Markets perform best when they are confined to a set of rules. When these rules are dismantled or never implemented in the first place – with a misplaced, quasi-religious faith in the “invisible hand of the free market” – why should we continually be surprised by the outcomes? In the case of EHR, the lack of publicly defined standards has led to… well, the mess we see today.
    btw, in the cases you cited of networks and the internet, data standards *were* in fact created by the government.

  15. Great post as usual, and I completely agree. I just think that things are much simpler than that.
    Network – The network is there already. It’s called the Internet, and it’s free and accessible by everybody, physicians included.
    Transport – I’m not sure why this is even a question. Millions of individuals and businesses are securely transporting data over the Internet every day. We know how to do that.
    Data – I share Dr. Beller’s concerns about having the government define data standards. Why should it? There are millions of healthcare transactions over the internet every day, that are very “meaningful” – labs and medications. The data standards were NOT defined or enforced by the government. They sprung from the free market because they provide real value to those using them. Physicians, pharmacists, labs and, yes, patients are driving this adoption of useful applications and strangely enough they all found a way to “meaningfully” communicate without anybody forcing them to do so or dictating how they should communicate.
    My point here is that if the government wants to pay for data, then it should just do that. It should be irrelevant how the data gets pulled out, or consumed, by whatever tools the provider has in his office. The market will find a way, if this is indeed of value to suppliers and/or consumers of such data. If it turns out that the transfer of data has no value, it will not be transmitted, no matter how many standards and incentives are defined.
    As to the tools, David, I wouldn’t want to define those either. Why Mozilla? Chrome is really nice. Some people have only IE on their machines. Why not leave the browser decision to the user?
    Having several applications interacting in one browser window is not as simple as it sounds. Why don’t we leave that to the industry as well? Let people think independently and come up with solutions and prices and features and whatever else makes an innovation.
    The only requirement should be ability to move the data sets that the government is willing to pay for, and that requirement will be enforced by the physician (customer) because he/she wants to participate and get paid.
    There really is no need for standards and certifications and all that overhead. Just tell people what data you want and how much you are willing to pay for it. The free market will do the rest.

  16. I agree with the post, but would go one step further: even dictating global standards for clinical data is a double-edged sword. As I discussed two years ago in a series of posts at http://curinghealthcare.blogspot.com/2007/05/knowledge-standards-and-healthcare.html, data standards can be divided into at least four categories: terminology, measurement, care process, and messaging format standards. There I show how forcing everyone to accept global standards is problematic for many reasons.
    Take terminology standards, for example. The Healthcare Information Technology Standards Panel, which is setting technical standards for a nationwide record system, identified an initial set of 90 medical and technology standards, out of an original list of about 600. These standards specify such things as how laboratory reports are to be exchanged electronically and entered into a patient’s electronic record, as well as how past lab results are to be requested. More than 190 organizations—representing consumers, providers, government agencies, and standards development organizations—participate in the panel. It’s no wonder, therefore, that a consensus on medical standards is so difficult and fraught with politics as standard-setting involves intense negotiations and delicate compromises. And once such IT standards are set, software systems and databases must be designed to conform with those standards.
    Politics aside, there is good reason to enable people to maintain their “local” terminology standards and use a “universal translation” process to assure everyone gets the information needed in the specific terms they need. Take, for example, the term “high blood pressure” — the following terms are synonyms of high blood pressure or the names of conditions referring to it: accelerated hypertension; arteriolar nephrosclerosis; benign hypertension; benign intracranial hypertension; chronic hypertension; essential hypertension; familial hypertension; familial primary pulmonary hypertension; genetic hypertension; hypertension-essential; hypertension-malignant; hypertension-renovascular; hypertensive crisis; idiopathic hypertension; idiopathic pulmonary hypertension; malignant hypertension; nephrosclerosis-arteriolar; pph; pregnancy-induced hypertension; primary obliterative pulmonary vascular disease; primary pulmonary hypertension; primary pulmonary hypertension (pph); primary pulmonary vascular disease; pulmonary arterial hypertension, secondary; pulmonary hypertension; renal hypertension; secondary pulmonary hypertension; severe hypertension; toxemia; toxemia of pregnancy, hyperpiesia, and hyperpiesis.
    Now imagine two electronic health record systems attempting to exchange patient data. One system is able to recognize the term “high blood pressure” and the other the term “hypertension,” but neither can recognize both terms. These two computers would be unable to share the data because they don’t “understand” what each other is “saying.” This is because computers cannot deal with synonyms (using different words to say the same thing) or homonyms (when the same terms or phrase means different things in different contexts). When multiple healthcare providers treat the same patient, exchanging patient data can thus be difficult (due to the issues of semantics and syntax).
    So, while setting arbitrary global standards for health-related terms is one way to foster widespread communications, the serious downside to eliminating the local standards people rely upon includes loss of important information due to reduced semantic precision and nuance.
    It would be much better, therefore, to keep local standards, support their evolution, and use data translation methods to ensure everyone gets the information needed, using the terms they need and understand. Required are truly innovative software systems that work with any standards, rather than demanding everyone comply with a single standard. A toy illustration of such translation appears below.
    Steve Beller, PhD
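    A toy illustration of that translation idea, using an invented concept code and a few of the synonyms listed above – nothing here is drawn from SNOMED, ICD, or any real vocabulary:

        # Map each system's local terms to a shared concept identifier,
        # then render that concept in the receiving system's own term.
        LOCAL_TO_CONCEPT = {
            "high blood pressure":    "C-0001",   # invented concept code
            "hypertension":           "C-0001",
            "essential hypertension": "C-0001",
            "hypertensive crisis":    "C-0001",
        }
        CONCEPT_TO_PREFERRED = {
            ("C-0001", "system_a"): "high blood pressure",
            ("C-0001", "system_b"): "hypertension",
        }

        def translate(term, target_system):
            concept = LOCAL_TO_CONCEPT.get(term.lower())
            if concept is None:
                return None   # unmapped local term: flag for human review
            return CONCEPT_TO_PREFERRED.get((concept, target_system))

        print(translate("Essential Hypertension", "system_b"))  # hypertension

    Note that this toy resolves synonyms only; homonyms and lost nuance, the harder problems raised above, would still require context-aware mapping and human review.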

  17. Dear Pam: A couple of things. HIPAA clearly states, and the new ARRA reiterates, that patients have the right to access their health records upon asking, and if those health records are in electronic or digital format, to ACCESS THEM IN ELECTRONIC FORMAT.
    Secondly, there are a number of formats for retrieving older paper records, the easiest of which would be a summary document written or transcribed by your personal physician. I used to do this all the time for patients who were moving, or who migrated between my practice and doctors elsewhere in the world. Some elements of your medical history, such as immunizations, may retain their value and meaning for a long time, even a lifetime. There is no rule that “after 4 years it’s obsolete.”
    For Wendell and Tim: I am a great fan of FOSS, and I do believe that an approach to EHR technology that is web based and a platform for apps that plug-and-play would be a huge boon to FOSS development. I’ve looked with some chagrin at open source EMRs that are OS, yes, but repeat most of the errors of proprietary EMRs.
    What I’d like to see is a FOSS project to create a) the web browser platform, using Mozilla; b) the data exchange services; and c) the specific apps needed to qualify physicians for ARRA incentive payments. These include: a viewer (the platform); an ePrescribing app; an app to consume, create, and transfer CCR XML files for care coordination (a minimal sketch of producing such a file appears below); and a registry for managing populations of patients, i.e., diabetics and patients with hypertension. That’s it.
    Of course, that’s a lot! But it’s much less than the full blown comprehensive CCHIT-certified EHR. The real trick will be to keep it simple and inexpensive to own and operate.
    Kind regards, DCK
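    A minimal sketch of producing a CCR-style XML file with nothing but the Python standard library; the element names below are a simplified approximation for illustration, not the full ASTM CCR schema:

        import xml.etree.ElementTree as ET

        # Simplified, CCR-inspired structure (illustrative element names
        # only; consult the ASTM CCR specification for the real schema).
        ccr = ET.Element("ContinuityOfCareRecord")
        body = ET.SubElement(ccr, "Body")

        problems = ET.SubElement(body, "Problems")
        problem = ET.SubElement(problems, "Problem")
        ET.SubElement(problem, "Description").text = "Hypertension"

        meds = ET.SubElement(body, "Medications")
        med = ET.SubElement(meds, "Medication")
        ET.SubElement(med, "ProductName").text = "Lisinopril"

        ET.ElementTree(ccr).write("ccr_demo.xml", encoding="utf-8",
                                  xml_declaration=True)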

  18. As I note repeatedly elsewhere, this is an opportunity for the USA federal government to sponsor, on its own or through another entity, a FOSS EMR/PM product. The government could likely convince eClinicalWorks or another commercial vendor to provide its product in open source, or it could build on existing open-source products such as Indivo or PatientOS or openEMR. PatientOS is primarily designed as a hospital system, so it could be the basis for a hospital and/or ambulatory facility system. The VA’s VistA system could be refactored and used as software for any large hospital/health plan complex.
    The $19 billion in funds should go primarily, if at all (no obligation to spend it), to training of clinical and other provider staff and to patients rather than to pay licensing fees to software vendors or to pay hardware manufacturers for equipment.
    Training is the big cost in any case and offers the greatest potential for a good return on the spending. Money – not much would be needed, perhaps tens to hundreds of thousands of dollars – should be allocated to developing comprehensive documentation addressing both software and clinical issues.
    Commercial software suppliers would adapt to the circumstances just as companies such as IBM or Oracle provide commercial application servers (WebSphere and BEA WebLogic) that compete with FOSS application servers such as Geronimo and JBoss. This is true in most areas of software.

  19. >>>>>
    But we don’t believe that the deeper problem is CCHIT or the dubious politics of hiring a vendor-run organization to certify the vendors’ products. That real or apparent conflict of interest is an issue, but it’s not the major one. Rather, it is the potential linkage of incentive payments to a certification process that would require specific applications to manage health data. That is the root of the dilemma that Dr. Blumenthal et al. now face; this is the Gordian knot that needs to be undone.
    <<<<<
    I couldn't agree more. Just to add to the depth of this incentive payment issue: the customers are ill-trained to make the decisions themselves (in most, not all, cases) and therefore will rely completely on those recommendations.
    I believe that there should be national (financial) involvement but not under this current incentive structure.
    Also, thanks for the analogies. I believe that they help people understand what is possible and what is at stake.
    –Tim Cook

  20. Dear Mr. Kibbe and Mr. Klepper,
    I have been a Kaiser patient for 7 1/2 years but my records have been computerized for only the last 4 years now. Anything prior to 4 years is stored in facilities located counties away from the hospital and clinics where treatment was given. When I moved last July from Oakland, California to Marin County across the Bay, I requested that these paper and film records be sent to the Marin location. After several very frustrating snafus, my records finally arrived in my own mailbox in March, because Kaiser regulations do not provide that records be forwarded or duplicated or located in any way at the new facility. Doctors may request the file much like borrowing a book at the library. Then the file is returned. Mine was misplaced for some time and, when found, was impossible to get transferred.
    My point is that for patients with a long health history and/or chronic disease some summarization of existing paper files must be done in order to provide electronic records that are useful. Are there any standards being proposed about converting existing paper records to electronic ones? At one point in my odyssey one young Kaiser employee suggested that any record older than 4 years was useless anyway. Do you agree?
    Pam Drew