It happened again. I was talking to a particularly sick patient recently who related another bad experience with a specialist.
“He came in and started spouting that he was busy saving someone’s life in the ER, and then he didn’t listen to what I had to say,” she told me. “I know that he’s a good doctor and all, but he was a real jerk!”
This was a specialist I hold in particularly high esteem for his medical skill, so I was a little surprised and told her so.
“I think he holds himself in pretty high esteem, if you ask me,” she replied, still angry.
“Yes,” I agreed, “he probably does. It’s kind of hard to find a doctor who doesn’t.”
She laughed and we went on to figure out her plan.
This encounter made me wonder: was this behavior typical of this physician (something I’ve never heard about from him), or was there something else going on? I thought about a recent study which showed that doctors are significantly more likely than people in other professions to suffer from burnout.
Compared with a probability-based sample of 3442 working US adults, physicians were more likely to have symptoms of burnout (37.9% vs 27.8%) and to be dissatisfied with work-life balance (40.2% vs 23.2%) (P < .001 for both).
This is consistent with other data I’ve seen indicating higher rates of depression, alcoholism, and suicide among physicians compared to the general public. At first glance it would seem that physicians would have lower rates of problems associated with self-esteem, as the medical profession is still held in high esteem by the public, is full of opportunities to “do good” for others, and (in my experience) is one in which people are quick to express their appreciation for simply doing the job as it should be done. Yet this study showed not only burnout but also a feeling of self-doubt few would associate with my profession.
“Would I let my son play football?”
It’s a question that more and more parents are asking themselves these days. There are some people out there who say, “No way!”
Football is way too violent and should be abolished as a sport. Even some NFL players admit that they would not let their own sons play football. Then there are others, fierce advocates who think football is a wonderful game with tremendous benefits for its participants and who think all of the media hype about injuries is just overrated scare tactics and headline grabbing.
But the majority of us are probably somewhere in the middle and aren’t quite sure what to think. So why don’t we spend a little time sifting through all the facts and emotions and see if we can make some logical decisions about the subject? I have an interesting perspective: I am a sports medicine physician who is a true fan of the game, has played it, has sustained injuries, and has a son of my own.
Thus I can see the argument from all sides. Let’s start with the physician side. My job is taking care of injured athletes. I see patients with fractures, sprains, strains, overuse injuries, head injuries, concussions, trauma, you name it. During the months of August, September, October and November, I probably see more patients than I do for the entire remainder of the year. Why? Football season.
Skype and videoconferencing have surpassed the tipping point of consumer adoption. Grandparents Skype with grandchildren living far, far away. Soldiers converse daily with families from Afghanistan and Iraq war theatres. Workers streamline telecommuting by videoconferencing with colleagues in geographically distributed offices.
In the era of DIY’ing all aspects of life, more health citizens are taking to DIY’ing health — and, increasingly, looking beyond physical health for convenient access to mental and behavioral health services.
The Online Couch: Mental Health Care on the Web is my latest paper for the California HealthCare Foundation. Among a range of emerging tech-enabled mental health services is videoconferencing, for which there is a growing roster of choices for platforms that market a variety of features beyond pure communications.
A top executive I know recently decided to take Inderal before making high-pressure/high-anxiety presentations. The impact was immediate. She felt more relaxed, confident and effective. Her people agreed.
Would she encourage a comparably anxious subordinate to take the drug? No. But if that employee’s anxiety really undermined his or her effectiveness, she’d share her story and make him or her aware of the Inderal option. She certainly wouldn’t disapprove of an employee seeking prescription help to become more productive.
No one in America thinks twice anymore if a colleague takes Prozac. (Roughly 10% of workers in Europe and the U.K. use antidepressants as well.) Caffeine has clearly become the (legal) stimulant of business choice and Starbucks its most profitable global pusher (two shots of espresso, please).
Increasingly, prescription ADHD drugs like Adderall, designed to treat attention deficits, are finding their way into gray-market use by students looking for a cognitive edge. Given the existing and in-the-pipeline drugs for Alzheimer’s and other neurophysiological therapies for aging OECD populations with retirements delayed, the odds are that far more employees are going to be taking more drugs to get more work done better.
Performance-enhancing (or degraded performance-delaying) drugs will become as common as that revitalizing cup of afternoon coffee.
Should that be encouraged? Or should management pretend those options don’t exist?
Most managers would believe they’re doing a good thing if they encouraged a hard-of-hearing employee to explore a hearing aid or a visually impaired colleague to consider glasses. By contrast, encouraging an under-performing subordinate to lose 25 pounds, get a hair transplant or wear contact lenses would likely inspire a formal complaint to Human Resources and/or a possible lawsuit. Ironically, money isn’t the issue here; the business norms associated with perceived cosmetic and aesthetic concerns are radically different from those attached to job performance and productivity.
After a decade of conflict in Iraq, our troops have come home, producing the largest increase in the number of American veterans since the 1970s. After Vietnam, an America tired of war and consumed with political angst neglected its veterans. Fortunately, the veterans of today are receiving the homecoming they deserve. To make that homecoming complete, America needs to ensure that our returning warriors have access to one of the most important benefits they have earned: health care provided by the Department of Veterans Affairs.
A Health Care Challenge: Fewer Battlefield Deaths, More Injuries
The United States military is the most technologically sophisticated fighting force in the world. This technological advantage means that our troops in Iraq and Afghanistan suffer fewer casualties than in Vietnam. And those who are injured are significantly more likely to survive because of body armor and the high quality of medical care. According to a study conducted by the University of Pennsylvania, only 13 percent of those injured in Iraq died of their wounds, compared with nearly 25 percent of those injured in Vietnam. But our ability to save lives also means that many more veterans are returning home after losing limbs or suffering the after-effects of traumatic brain injuries (TBI) from blasts experienced in battle or as a result of improvised explosive devices.
A frightening aspect of TBI is that it can be quite difficult to diagnose. It is possible for someone exposed to an explosion to show no signs of injury until weeks or months later when symptoms—such as depression, anxiety or anger issues—become apparent. Untreated, these symptoms can lead to major depression, substance use problems, unemployment and ruined family relationships. In addition to TBI, other problems—from back injuries to exposure to toxins—may only become apparent after the veteran has been separated from service for months or even years.
Imagine for a moment you are suffering from an illness that makes you feel like your soul has been run over by an angry defensive lineman, a disease that interferes with your desire to sleep, eat and make love. Oh, and this illness will continue to make you feel this way for the rest of your life. How much would you be willing to pay for a treatment that makes you feel normal again?
My colleagues and I posed that question to a nationally representative sample of more than 700 Americans and we discovered something troubling—people’s willingness to pay for medical interventions depends in large part on whether the illness in question is “physical” or “mental.” People are much less willing to part with money to treat mental illnesses, even after accounting for the perceived severity of those same illnesses. Our article—“What’s It Worth?”—is available online at the Journal of Psychiatric Services.
Let me tell you a bit more about our study. We described a handful of illnesses to people and asked them to tell us, in effect, how bad each one would be to experience. For instance, we described type 2 diabetes to people, and told them that it was uncomplicated by any other medical problems. People thought that would be pretty hard on their quality of life. We also described below-the-knee amputation, and they thought that would be even worse than diabetes. We described severe blindness, which leaves one able only to distinguish shadows. People thought that one was worse than either of the first two problems.
We also described a case of moderately severe depression to people, a level bad enough to cause the victims to “feel sad and downhearted a lot of the time.” The description went on to explain that it would make people “feel like a failure” and lose interest in food and sex. Trust me, it was a thorough and devastating picture of how depression can affect people’s lives. Indeed, people thought it was horrendous, at least as bad as any of the physical illnesses we described.