As we look back over the past year and some of the amazing medical breakthroughs like wearable robotic devices, genomic sequencing and treatments like renal denervation that are improving people’s lives, it is worth reflecting on what else we could be doing better. Our world has changed more in the past century than in thousands of years of human history. We not only know more about our biology than ever before, but science and technology are unlocking the secrets of the very building blocks of our health. Somehow, in the midst of this incredible innovation, we’ve gotten fat, and not just a little. The result? Alarming rates of obesity and related chronic disease that threaten to crush us physically and financially.
But is it technology’s fault that we’ve become fat? A recent Milken Institute study suggests so: it tied the amount an industrialized country spends on information and communication technologies directly to the obesity rate of its population.
Most of us are guilty of a little overindulgence around the holidays, but for many, overindulgence is a normal way of life. As economies transition to more sedentary work, the physical movement that once burned calories and kept us fit simply does not occur. Our lifestyles compound the issue — dual-income homes rely on the convenience of packaged meals, and our leisure activities have shifted to heavy “screen time” with movies, games and social media.
What if the next time you step into your doctor’s office for an examination, she reaches into her white coat pocket and pulls out an iPhone instead of a stethoscope? That’s the idea behind The Smartphone Physical, a re-imagination of the physical exam using only smartphones and a few devices that connect to them. These include a weight scale, blood pressure cuff, pulse oximeter, ophthalmoscope, otoscope, spirometer, ECG, stethoscope, and ultrasound. Want to know more? I’ve answered some questions here for THCB. And have a few myself.
What are the pros and cons of using smartphones for clinical data collection?
Smartphone penetration in virtually every market has exceeded expectations, and healthcare is no exception. More than 80% of physicians in the US have smartphones, and of those, three-quarters use them at work. Much of this is currently personal communication, but increasingly physicians are using smartphones as reference tools; between 30% and 40% report using their smartphones for clinical decision support. It seems like a logical next step to go beyond reference apps and to start using peripheral devices, such as cases that convert the smartphone into an ECG or otoscope as well as peripherals such as pulse oximeters and ultrasound probes, for easy and reliable data collection.
At TEDMED we found that using our smartphones and the clinical devices actually improved our ability to engage with the “patient,” because we were able to share and explain the physical exam findings directly at the point of care. We could take a quick snapshot of the carotid arteries and tympanic membrane and, for the first time ever, show the patient what theirs looked like and field any questions they may have. Ideally, in the near future we’d be able to go one step further and upload this data to the patient record. That is one of the most powerful aspects of the Smartphone Physical, because we will be able to establish baselines for individuals. For example, instead of the current model of a primary care ophthalmologic exam, where a physician will write “W.N.L.” (within normal limits) or “unremarkable” for a patient without a concerning optic disc finding, we will be able to take and store an actual image of what the patient’s optic disc looked like at an earlier time point. This may be particularly useful for patients who present years later with concerning visual changes.
Furthermore, smartphone-based collection of clinically relevant data will help patients become their own data collectors. This may abstract away the mundane and standardize the unreliable aspects of the physical exam, and allow for trending data that needs to be taken in context and not just at once-yearly visits (e.g., blood pressure or temperature).
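The value of trending patient-collected readings against a personal baseline, rather than judging a single office measurement, can be sketched in a few lines of Python. Everything here is hypothetical: the sample readings and the `flag_outliers` helper are illustrations for this post, not any real device’s API.

```python
from statistics import mean, stdev

# Hypothetical home blood-pressure log: (day, systolic mmHg) readings,
# as a patient with a connected cuff might accumulate them.
readings = [(1, 118), (2, 121), (3, 119), (4, 122), (5, 120),
            (6, 138), (7, 141)]

def flag_outliers(readings, window=5, threshold=2.0):
    """Flag readings that deviate from the patient's own rolling
    baseline by more than `threshold` standard deviations."""
    flagged = []
    for i in range(window, len(readings)):
        # Baseline is this patient's own recent history, not a
        # population norm -- the point of individual baselines.
        baseline = [value for _, value in readings[i - window:i]]
        mu, sigma = mean(baseline), stdev(baseline)
        day, value = readings[i]
        if sigma > 0 and abs(value - mu) > threshold * sigma:
            flagged.append((day, value))
    return flagged

print(flag_outliers(readings))  # days 6 and 7 stand out from days 1-5
```

A once-yearly visit would see only one of these numbers in isolation; a trend like this makes the jump on days 6 and 7 obvious against the patient’s own earlier baseline.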
A lot of people think Google Glass can be used as a development platform to create amazing healthcare apps. So do I.
Many of these ideas are relatively obvious, and many of them could be relatively simple to develop. But we won’t see most of them commercialized in the first year Glass is on the market. Maybe not even in two. Why?
The most obvious analogy to Glass is the iPhone. It’s a revolutionary new technology platform with an incredible new user interface. Glass practically begs the iPhone analogy. Technologically, the analogy has the potential to hold true. But economically, it does not. Because of the economics of Glass, many of these great ideas won’t see the light of day anytime soon.
First, there’s the cost. Glass will run a cool $1,500 when it lands in the US this holiday season. There’s no opportunity for a subsidy because Glass doesn’t have native cellular capabilities.
My job and my life intersected in a profound way when my daughter was diagnosed with Type 1 diabetes. Years working in mobile innovation didn’t prepare me for how personally relevant mHealth would so quickly become. Her clinical trial at Stanford University, supported by the National Institutes of Health through Congress’ Special Diabetes Program, featured a world-class endocrinologist working alongside software coders, applications developers, algorithm writers, network engineers and other mobile innovators. They were all pushing together for what could be a revolution in diabetes management—the artificial pancreas.
Recently, when I was invited to testify before Congress on mobile innovation and health care, I had the opportunity to talk about my daughter’s experience and share my thoughts on how government can help encourage the next wave of mHealth innovation.
America’s leadership in the mobile economy — 40,000 apps and counting in the broad mHealth category — matches America’s leadership at the cutting edge of medical technology.
Mobile devices, wireless networks and targeted applications are enabling better, more seamless and cost-effective care that empowers and informs stakeholders on both sides of the stethoscope.
The virtuous cycle of investment in the mobile ecosystem — from networks, to handsets and tablets, to applications — provides an unparalleled foundation for dramatic advances in the nation’s health and wellness. My message to Congress was to lean in and strike a reasonable and circumspect balance that both protects patient safety and privacy and propels the dramatic, mobile-fueled advances we are seeing through American medicine today.
In 2004, I was managing a hospital division at the University of Chicago when our clinic director walked into my office and asked whether I thought all physicians should be issued smartphones. My first internal thought was, “Hmm, what’s a smartphone?”
Today, we all know how dramatically different mobile phones are from what they were a year or two ago, much less back in 2004. But as the power of mobile technology increases, tech entrepreneurs have taken the lead in challenging old rules that haven’t been revisited in decades. What if the development of the smartphone could give us some clues into the future of healthcare IT?
Recently, I was on a business trip to Boston and met a friend for dinner. As we discussed where to go, I wanted to go someplace close, thinking that getting a taxi would be a pain. My friend pulled out his smartphone and requested a car to pick us up through the car-sharing service Uber. If you haven’t heard of Uber, or Sidecar, or Lyft, the essence is that the headache, the wait, and sometimes the expense of getting a taxi are virtually eliminated.