"A good gulp of hot whiskey at bedtime — it's not very scientific, but it helps." That was the advice that Alexander Fleming, the father of antibiotics, gave to a gathering of the American Academy of General Practice in Cleveland, when asked about the best way to combat the common cold. It was March of 1954, and Fleming, who had been awarded the Nobel Prize just nine years earlier, was touring the United States to talk about his medical breakthrough.
It was the golden age of antibiotics. So the doctors in attendance, and those who subsequently read about the whiskey comment in the newspapers (it echoed from Ohio to Australia), may have been surprised by Fleming's answer. Thanks to penicillin, people were living through bacterial infections that would have killed them a decade earlier. Pharmaceutical companies were racing to develop these miracle drugs that could cure everything from a sore throat to pneumonia. In just a few short years, antibiotics had revolutionized the practice of medicine.
Today, infectious disease experts wish that more doctors heeded Fleming's caution. His comment may have been seen as a joke, but embedded in his observation is this simple fact: Colds are caused by viruses, not by bacteria. And antibiotics only work against the latter.
Doctors know this, of course. Yet up to 50 percent of the antibiotics given out in hospitals are either unnecessary or inappropriate, according to the Centers for Disease Control and Prevention. That number is about the same for antibiotics prescribed in outpatient settings, though it's more difficult to track. Anecdotal evidence, however, is easy to come by.
“We all know the stories of patients coming into the office and demanding antibiotics for things like sinus infections and bronchitis, diseases that are often viral,” said Dr. Arturo Casadevall, a Bloomberg distinguished professor and a member of the division of infectious diseases at Johns Hopkins. “They want a pill to make their symptoms go away, and doctors oblige because they don’t have the time to explain the downsides to their patients. It’s also true that doctors in general aren’t learning enough about antibiotics to know which ones to give in which scenario.”
The issue of physician training has come into sharper focus thanks to the rise of antibiotic-resistant pathogens, which kill roughly 23,000 people in the United States and 700,000 people internationally each year. By 2050, the number of deaths is expected to climb to 10 million worldwide, with the bulk of those occurring in Asia and Africa.
A variety of factors explain why this problem is getting worse, including large-scale use of antibiotics in farming, the lack of diagnostic tools, and the lack of research going into new drugs. But, Fleming’s quip takes on greater significance in this context because one of the main drivers of resistance starts with the prescription pad.
By and large, infectious disease experts argue there needs to be a fundamental culture shift when it comes to the way doctors across all specialties view antibiotics. There is a sense that these medications are less risky for patients — compared to, say, chemotherapy drugs — so it can’t hurt to prescribe them just to cover all the bases.
"The problem is that antibiotics are seen as very routine and straightforward and that anybody can prescribe them because there aren't any real side effects," said Dr. Sean Elliott, professor of pediatrics and infectious diseases at Banner – University Medicine in Arizona. "That's the perception. It's not accurate. There is harm in noncontrolled, uneducated prescribing practices."
While antibiotics may seem relatively harmless, they can have a number of side effects that shouldn’t be taken lightly. Some patients can experience debilitating diarrhea or large-scale rashes. More extreme cases of kidney dysfunction and severe allergic reactions have resulted in hospitalization and death. And, unlike any other class of drugs, antibiotics come with a compounding social cost. The more they are incorrectly prescribed, the more they contribute to resistant bacterial infections.
This problem has been accelerated by the aggressive marketing of broad-spectrum antibiotics, medications that kill almost all bacteria — both good and bad — in their path. Researchers have only recently begun to understand the effects these drugs have on the good bacteria in our bodies; we rely on these microbes to perform basic functions like breaking down food and regulating our immune systems.
Medical students, for the most part, once learned about these issues in classes that required rote memorization about different types of antibiotics, their side effects, and correct dosing. But over the past two decades, medical schools have moved away from this model, requiring less classwork and more hands-on experience treating patients. There is no specific, nationwide curriculum today that is required to teach safe and appropriate antibiotic use.
Instead, the hope is that young doctors will pick up what they need to know as they practice various procedures or go on rounds in hospitals during their residencies. The system, as it stands, is haphazard. It also doesn’t track what medical students and residents are absorbing.
“We are relying on chance encounters for our medical students and residents to learn how to use antibiotics,” said Dr. Sara Cosgrove, an infectious disease expert and associate professor of medicine at Johns Hopkins. “We say that they’ll learn about this when they are in clinical rotations, but again that depends on how well trained their teachers are.”
In 2013, Cosgrove, along with a handful of other doctors, published a study that revealed more than 90 percent of the graduating medical students surveyed wished they had more training with antibiotics. That same study asked a series of questions about the proper use of antibiotics. The respondents got only about 50 percent of those questions right. (Forty-one percent also admitted to consulting Wikipedia to learn about prescribing antibiotics, while only 29 percent turned to the guidelines created by the Infectious Diseases Society of America.)
That’s not surprising given how the curriculum has evolved. Though the approach to teaching antibiotics varies among schools and residency programs, other requirements have crept in, pushing aside the time once dedicated to the diagnosis and treatment of bacterial infections. There are new diseases and medicines to study, but the old maladies haven’t gone away, making it difficult to find time to cover everything.
“You really have to work to understand antibiotics,” Cosgrove said. “There is so much information to memorize, so what’s happening is that doctors will get into a comfort zone. They’ll say, I know that these one or two antibiotics cover everything, so I’m going to use them all the time. Then the medical student that’s working with them picks up the same habit and does the same thing.”
At the same time, fewer students are opting for infectious disease rotations — it’s an elective — and fewer are going into the field. There are more slots open for infectious disease fellowships than there are doctors who apply for them each year. That means there are fewer experts studying the dozens of antibiotics on the market today, in which situations they work the best, and their side effects. What got lost along the way was the knowledge of how to prescribe narrow-spectrum antibiotics that target specific infections.
"If every doctor across every specialty can prescribe antibiotics, we should all have some specific training on them," said Dr. Liise-anne Pirofski, chief of the infectious diseases division at the Albert Einstein College of Medicine. "That should be the driver's license. If there was more of a consensus that you need to have satisfied certain educational goals to prescribe antibiotics, then that would be helpful in combating the rise of resistance."
While most physicians agree antibiotic resistance is a crisis and acknowledge overprescribing is, at least in part, driving the problem, there is less consensus across the medical community that all doctors need more training in infectious diseases. Still, those who argue antibiotics require more specialized instruction have made some progress.
One idea that is gaining traction is to create squads of highly trained experts to oversee antibiotic use in hospitals. Several of the top academic facilities already have these “antimicrobial stewardship teams,” which have shown promise in both lowering costs and improving patient outcomes.
It is often the case that when a critically ill patient with an infection goes to a hospital, he or she is started on a broad-spectrum antibiotic because the treating physician doesn't know what the person has. The stewardship team's role is to analyze that patient's lab results and advise the front-line doctors on whether they can cut back the dosage or switch to a narrower-spectrum drug.
Policy makers in Washington, D.C., are also starting to take notice. In its multipart plan to deal with drug-resistant pathogens, the White House has made stewardship programs a centerpiece of its initiative. (The Obama administration wants all acute care hospitals to have one by 2020, an ambitious goal that will require a substantial financial commitment.) The Centers for Medicare and Medicaid Services may also soon require hospitals to have an antimicrobial stewardship team in place to be eligible for payment — a powerful incentive for these facilities to fall in line.
“Drug resistance makes the use of antibiotics even more complex,” said Arjun Srinivasan, the associate director for health care associated infection prevention programs at the CDC. “It creates even more of a need for expertise in this area.”
The CDC has been pushing for improved antibiotic use since the late 1990s and is now leading some of the efforts to get all 4,000 acute care facilities to establish one of these teams. So far, a little less than half have satisfied antimicrobial stewardship requirements. Given the momentum surrounding President Obama’s national action plan to combat antibiotic-resistant infections, Srinivasan expects most, if not all, of the rest to adopt the practice sooner rather than later.
But getting doctors to buy into a new system where someone is essentially looking over their shoulder will require a cultural shift. Physicians are used to having complete autonomy in prescribing medications for their patients.
“There is a need for education on how to use antibiotics properly at the earliest levels of medical school and with the on-the-job training that doctors get during their residencies,” Srinivasan said. “That’s something I think we struggle with a lot — how best to provide that education.”
Laura Colarusso is a Cambridge-based writer. Follow her on Twitter @LauraColarusso.