How do you tell the family members of a critically ill patient that their loved one is going to die because there are no antibiotics left to treat the patient’s infection? In the 21st century, doctors are not supposed to have to say things like this to patients or their families.
Ever since penicillin entered widespread clinical use in the 1940s, patients have expected a pill or an intravenous injection to cure their infections. But our hubris as a society with respect to antibiotics has been exposed by the rise of antibiotic-resistant “superbugs.”
The Centers for Disease Control and Prevention (CDC) recently issued a report, “Antibiotic resistance threats in the United States, 2013,” finding that each year at least 2 million people become infected with bacteria that are highly resistant to antibiotics and at least 23,000 die as a direct result of these infections. These estimates are highly conservative; many more people die from other conditions that were complicated by an antibiotic-resistant infection.
Meanwhile, our arsenal against such infections keeps shrinking: the number of new antibiotics reaching patients has fallen by more than 90% since 1983.
Interventions are needed to encourage investment in new antibiotics, to prevent infections in the first place, to slow the spread of resistance, and to discover new ways to attack microbes without driving resistance.
A major reason for the “market failure” of antibiotics is that they are taken for short courses, so they yield a lower return on investment than drugs taken for years, such as cholesterol-lowering agents. The Food and Drug Administration can help reverse this market failure by adopting new regulatory approaches that encourage development of critically needed new antibiotics.