An article in Information Week caught my eye recently. It reviews a new program offered by Texas A&M, with support from Dell, to help medical students and other healthcare professionals “come to terms with the ways technology is changing their jobs”. The article, Doctors Can Go Back to Tech School, says Texas A&M will launch its new health technology academy later this year as part of its continuing medical education program.
Now, don’t get me wrong. I’m all for education and career improvement. I’m just not sure that the best way to improve Health IT is to get more physicians trained in IT so they can, as the article suggests, move into IT roles. How about giving full-time clinicians who have an interest in improving Health IT some extra support and time so they can help those who work in IT better understand what clinicians need to do their jobs efficiently and safely? How about just a little paid time away from the daily treadmill of patient care to educate IT about the nuances of medicine and clinical workflow? I believe that kind of understanding would do more to help IT deliver better solutions.
Over the course of my career, I’ve been many things. First and foremost, I am a physician. Only a true clinician understands how clinicians think and work. For many years, I continued to practice even when it no longer made a whole lot of sense with regard to my income or available time. I was a biology major in college. I went to medical school and did a residency in family medicine. I never had any formal training in either business or technology. I learned the ropes by doing. It was often trial by fire. I’ve had my share of success, as well as a few failures, along the way. When I advanced into the role of hospital CIO and CMIO, it wasn’t because I knew tech. When my then CEO asked me to step into the CIO role, I’ll never forget what he said to me: “I want to put a civilian in charge of the military”, meaning a doctor in charge of a department that existed to serve clinicians and their patients but had become a renegade army running out of control and way over budget.
What I lacked in technical knowledge, I was able to hire and manage. I surrounded myself with really smart people who could execute on what I as a clinician envisioned that could better serve my professional colleagues and their patients. Looking back on it, I think we did some really good work together. During my time as a CMIO, I always viewed my role as a kind of “interpreter”. My job was to listen and communicate back and forth between IT and my fellow clinicians using a language that both sides could understand equally.
All of this eventually paved a path to where I am today, in a role not all that different from my “interpreter” days as a CMIO. I now help a multi-billion dollar, worldwide tech company better understand health and healthcare while helping my professional colleagues learn how to apply our technologies to improve patient care and community health around the globe.
I’m not sure that sending docs to tech school is the answer. Goodness knows the time and education required to become, and continue to perform as, a really good clinician is demanding enough. Considering that the people who get into medical school are generally some of the smartest among us, they will pick up what they need to know about tech. I’d rather they spend their time learning how to be really great clinicians, and then use that knowledge to help tech understand how to develop the software, devices, and services that will allow clinicians everywhere to do their work without needing an extra degree in Health IT.
I think it’s not going to work. To become fully trained, doctors are required to have years of training, which means a waste of money and time.
This posting, and the Information Week article it references, are oblivious to decades of work developing the field of biomedical informatics, and its subfield of clinical informatics that is now a formally recognized medical subspecialty for physicians.
Talk about re-inventing the wheel! Where have these people been over the last 5, 10, or 20 years? The field of clinical informatics has been addressing the issues raised here for a long time.
Physicians don’t need to learn IT; they need to learn informatics, which is the science of using information, assisted by technology, to improve the quality, safety, and cost-effectiveness of health and healthcare:
http://www.biomedcentral.com/1472-6947/9/24
Most physicians do not need to be “tech experts,” but they do need to know how to use informatics to function as better physicians. They need to be facile in finding and appraising knowledge, entering and using data in the EHR, and using aggregate data for quality improvement and population health management. We recently published a paper on competencies required for medical students in clinical informatics:
http://www.dovepress.com/beyond-information-retrieval-and-electronic-health-record-use-competen-peer-reviewed-article-AMEP
A smaller cadre of physicians need to be informatics leaders. There is a vibrant community of Chief Medical Informatics Officers (CMIOs) and others, along with emerging pathways for their training and certification:
http://informaticsprofessor.blogspot.com/2014/06/eligibility-for-clinical-informatics.html
It would be great if people would learn from the experience of others and not re-invent the wheel.
I have recently called for a more intensive role for clinicians in designing EHRs and systems controlling clinical workflow:
http://www.emrandehr.com/2014/08/13/could-clinicians-create-better-hie-tools/
The article sort of straddles the two sides of the debate: on the one hand, the letter from Mory Weschler (which you can see by following the links in Crounse’s article) suggests the EHR vendors should take responsibility for usable products, while the Texas A&M classes at least imply that clinicians need to take responsibility. It would be easy to say, after decades of unsatisfactory EHRs, that we can’t depend on the vendors. The EHR rated highest by doctors, VistA, was reportedly developed by clinicians, or with clinicians and programmers working side by side.
On the whole, I’d be reluctant to ask any busy professional to learn another field and become cross-disciplinary, but some understanding of the capabilities of IT, what’s currently viable and what isn’t, would help doctors know what to ask for.
Agree with the thrust of Bill’s essay.
Here’s a revolutionary idea: “Why not design health IT to make it easy to use!” It seems like clinicians get into the process too late to affect usability. Ease of use is the central issue, not something you tinker with after you’ve put your over-engineered monstrosity in place… sort of like what that Apple guy, whatshisname, did with the mobile phone.
Where docs have been the inventors, or have worked directly with those who build the systems, the results have been better. MedicaLogic and athenahealth bear examination because they both sprang directly from clinical practice.
If I were the CEO, I would want to be sure that my executives who order systems implemented were required to log a certain number of hours and demonstrate basic proficiency with them.
I would also want to track my docs’ usage and administrative time: not so that we could identify “problem docs” or “slackers” and issue stern administrative warnings (this appears to be the goal of many of the less enlightened bureaucrats who are in “implement from above” mode); rather, I’d want to figure out who is spending too much of their time on the system and figure out how to help them.
A med school boot camp for health IT folks might not be a bad idea …
We all might learn something