- The Washington Times - Tuesday, October 23, 2007

There is a constant war between pathogenic bacteria and humans, and the microbes seem to be winning. New data from the Centers for Disease Control and Prevention (CDC) indicate the incidence of serious invasive infections from a strain of bacteria resistant to most first-line, commonly used antibiotics was higher than previously thought. The CDC estimates methicillin-resistant Staphylococcus aureus (MRSA) kills 18,000 Americans each year and causes serious infections in more than 90,000.

The phenomenon is not new, but reports of outbreaks in schools across the nation and the death of a high school student in Virginia earlier this month have focused national attention on the problem.

The medical community has been worried for years about growing antibiotic resistance in many kinds of bacteria. Especially if an infection is contracted in a hospital — in a surgical wound, for example, or in the form of pneumonia — there is a high probability the bacteria responsible will be resistant to one or more antibiotics, and the outcome is often deadly. Almost 2 million patients contract infections in U.S. hospitals each year — approximately 4½ percent of admissions — and 100,000 die, according to earlier data from the CDC.

The death rate in such cases is alarmingly high not because the patients initially are gravely ill, but because hospital germs increasingly are resistant to multiple antibiotics: About 70 percent of those infections are resistant to at least one drug, so the infections are hard to treat. In many cases, we're already out of good second- or third-line alternatives that are effective, can be administered by mouth and have few side effects, so we must resort to drugs that are inconvenient to administer or have significant toxicity.

Many bad bugs are spreading beyond our hospitals into the greater community. Bacteria are masters of evolutionary adaptation: Given sufficient time and exposure, they use a variety of clever genetic and metabolic tricks to resist any drug we invent. There is no antibiotic in clinical use today to which some resistance has not developed. A future with few effective antibiotics would be treacherous; many of today’s routine medical procedures, from surgical operations to chemotherapy, would be far more dangerous if we permit the bacteria to outwit us.

To combat this public health emergency, important initiatives are under way by both government and the private sector to promote more sparing and intelligent use of antibiotics. Regulators and livestock producers are collaborating to reduce the amounts of antibiotics used to prevent disease in livestock, and many HMOs have adopted policies that restrict antibiotics to infections that seem unequivocally to be caused by bacteria. (For example, patients should not routinely get antibiotics for colds, which are caused by viruses, not bacteria.)

The CDC is promoting four strategies to prevent antibiotic resistance in health-care centers — prevent infection, diagnose and treat infection, use antimicrobials wisely, and prevent transmission — but federal officials have paid little attention to the flip side of the problem: the shortage of new antibiotics.

Twenty years ago, about a half-dozen new antibiotics would appear on the market each year; now it’s at most one or two. For decades we’ve relied largely on new variations on old tricks to combat rapidly evolving pathogens: Most antibiotics in use today are chemically related to earlier ones discovered between 1941 and 1968. During the last 38 years, only two antibiotics with truly novel modes of action have been introduced — Zyvox in 2000 and Cubicin in 2003, the latter of which must be infused intravenously.

Market forces and regulatory costs have exacerbated the antibiotics drought. Until about a decade ago, all the major pharmaceutical makers had antibacterial research programs, but they have dramatically trimmed or eliminated these efforts, focusing instead on more lucrative drugs that treat chronic ailments and lifestyle issues: drugs for lowering cholesterol and treating erectile dysfunction, for example. Whereas antibiotics cure a patient in days, and may not be required again for years, someone with high cholesterol or erectile dysfunction might pop expensive pills every day for decades. Moreover, drug development has become hugely expensive, with the direct and indirect costs to bring a drug to the U.S. market now averaging about a billion dollars. Only about a half-dozen new antibiotics are now in late-stage clinical trials.

To address this public health threat, we need multiple strategies. In the short term, improved infection-prevention procedures at hospitals would have a tremendous impact. A pilot program at the University of Pittsburgh found that screening tests, gowns and other precautions that cost only $35,000 a year saved more than $800,000 a year in infection costs. A review of similar analyses published last year concluded that screening for MRSA bacteria both increases hospital profits and saves lives.

Longer term, we need to adopt the kinds of critical policy reforms suggested by the Infectious Diseases Society of America to spur new drug development. Among them: expediting the publication of updated guidelines for clinical trials of antibiotics, including a clear definition of what constitutes acceptable surrogate markers as endpoints; encouraging “imaginative clinical trial designs that lead to a better understanding” of antibiotics’ efficacy; and the exploration of animal models of infection, in vitro technologies and microbiological surrogate markers to reduce the number of efficacy studies required.

In addition, regulation needs to be more enlightened. Regulators should grant accelerated review status to priority antibiotics and be more sensitive generally to the critical need for new antibiotics.

The two novel antibiotics that have been introduced since 2000 won’t be enough to keep rapidly mutating pathogens at bay for long. Once resistance appears, it will spread rapidly. Unless we create economic and regulatory incentives for companies to develop antibiotics, it’s unlikely we’ll see many more wonder drugs in the near future. That’s something to think about next time you contract bronchitis, or are hospitalized for elective surgery.

Henry I. Miller, a physician and molecular biologist, is a fellow at Stanford University’s Hoover Institution and a former official at the National Institutes of Health and the Food and Drug Administration. He is the author of the book “The Frankenfood Myth.”
