The CDC has noted an early and nasty start to the flu season. Perhaps their own website has caught it, because as I’m writing this, the whole thing is down. Assuming it recovers, I will insert relevant links per routine. Otherwise, I wish it well, and leave you to find your way there on your own.
It’s a bit soon to say, but the virus and the outbreak pattern at this point seem to resemble those of the 2003-2004 flu season, in which nearly 50,000 Americans died. At least two children have already died of flu complications this fall.
This is not the sort of stuff a public health physician can ignore.
So, I recently noted on LinkedIn and Twitter that I’ve been vaccinated — as I am every year — and recommend this year’s vaccine, which appears to match the prevailing viral strain quite well, to everyone else. I promptly got comments back from naysayers, including at least one self-identified microbiologist, who noted he never got vaccinated, and had “never gotten the flu.”
I believe him. But this is like that proverbial “Uncle Joe” everyone knows, who smoked three packs a day and lived to be 119. It could happen — but I wouldn’t bet the farm on it. Uncle Joe is that rare character who somehow comes away from a train crash with a minor flesh wound. The rest of us are mortal.
But there is something more fundamentally wrong with the “I’ve never gotten the flu, and therefore don’t need to be vaccinated” stance than the Uncle Joe fallacy. Let’s face it — those who were ultimately beneficiaries of smallpox or polio immunization never had smallpox or polio, either. If they ever had, it would have been too late for those vaccines to do them any good.
If all of us were simply to make better use of our feet, our forks, and our fingers — if we were to be physically active every day, eat a nearly optimal diet, and avoid tobacco — fully 80 percent of the chronic disease burden that plagues modern society could be eliminated. Really.
Better use of feet, forks, and fingers — and just that — could reduce our personal lifetime risk for heart disease, cancer, stroke, serious respiratory disease, or diabetes by roughly 80 percent. The same behaviors could slash both the human and financial costs of chronic disease, which are putting our children’s futures and the fate of our nation in jeopardy. Feet, forks, and fingers don’t just represent behaviors we have the means to control; they represent control we have the means to exert over the behavior of our genes themselves.
Feet, forks, and fingers could reshape our personal medical destinies, and modern public health, dramatically, for the better. We have known this for decades. So why doesn’t it happen?
Because a lot stands in the way. For starters, there’s 6 million years of evolutionary biology. Throughout all of human history and before, calories were relatively scarce and hard to get, and physical activity — in the form of survival — was unavoidable. Only in the modern era have we devised a world in which physical activity is scarce and hard to get and calories are unavoidable. We are adapted to the former, and have no native defenses against the latter.
Then, there’s roughly 12,000 years of human civilization. Since the dawn of agriculture, we have been applying our large Homo sapiens brains and ingenuity to the challenges of making our food supply ever more bountiful, stable, and palatable; and the demands on our muscles ever less. With the advent of modern agricultural methods and labor-saving technologies of every description, we have succeeded beyond our wildest imaginings.
So now, we are victims of our own success. Obesity and related chronic diseases might well be called “SExS” — the “syndrome of excessive successes.”
Massachusetts has a long track record of making headlines in the area of health care reform, whether or not Mitt Romney likes to talk about it.
In 2008, Massachusetts released results of its initiative requiring virtually all of its citizens to acquire health insurance. In short order, nearly three-quarters of Massachusetts’ 600,000 formerly uninsured acquired health insurance, most of them private insurance that did not run up the tab for taxpayers. The use of hospitals and emergency rooms for primary care fell dramatically, translating into an annual savings of nearly $70 million.
But that’s pocket change in the scheme of things, so the other shoe had to drop — and now it has. Massachusetts made news recently, this time for passing legislation that aims to impose a cap on overall health care spending. That ambition implies, even if it doesn’t quite manage to say, a very provocative word: rationing.
Health care rationing is something everyone loves to hate. Images of sweet little old ladies being shoved out the doors of ERs that have met some quota readily populate our macabre fantasies.
But laying aside such melodrama, here is the stark reality: Health care is, always was, and always will be rationed. However much people hate the idea, it’s a fact, not a choice. The only choice we have is to ration it rationally, or irrationally. At present, we ration it — and everything it affects — irrationally.