I started medical school in the late 1960s, when physicians answered patient calls at all hours of night and day. Hospitals were somewhat frumpy, primarily community servants rather than businesses. The businesses associated with medicine existed primarily to provide the materials and drugs physicians needed to help patients, while making a modest profit. In the aggregate, health care made up 7 percent of the gross domestic product.
I don’t miss those days — other than treating infections, setting broken bones, and removing cancers, there was very little we could do to change the course of disease. Over the next 20 years, health care changed dramatically, in ways that made things a whole lot better and a whole lot worse at the same time.
The big changes involved knowledge, money, and desire. What we know about disease has exploded in the last 50 years, thanks to basic biological and medical research, funded largely by the National Institutes of Health. Research scientists were able to figure out new ways to diagnose and treat everything from allergies to life-threatening heart disease and cancer.
Getting new kinds of treatments out of the lab is expensive. The founding of Medicare in 1965, combined with increasing numbers of people with health insurance, meant there were a lot more financial resources in health care. Hospitals started to think of themselves as businesses rather than community services. And industries like pharmaceuticals and medical devices were able to turn risky investments into profitable new products.
These developments met with an eager market. Patients were no longer patient. They wanted results that would affect their health, and they knew it could be done. For example, in the late 1980s, patients with HIV/AIDS forced the Food and Drug Administration to cut approval times for new treatments from years to months. People with other illnesses wanted to know why approvals could not be accelerated for their medical conditions too. A few years later, FDA “user fees” were established, and review times for drugs and devices across the board fell from years to months.
With this alignment, medicine blossomed in the late 1990s and beyond. We are fixing hearts and brains that were considered irreparable in the 1990s; we are treating what were once untreatable cancers; new medicines have emerged to successfully treat common conditions such as asthma and hepatitis, and uncommon disorders such as cystic fibrosis and Gaucher disease. Big operations are done through small incisions; hip, knee, and other prostheses are extending our ability to participate in the exercise needed to protect our hearts and minds; and physical trauma can be mitigated to a much greater extent than in the past. We are living longer and healthier lives. But here’s the part that’s unresolved — healthier lives cost a lot of money. Health-related expenditures now consume nearly 18 percent of the gross domestic product. Costs can be ruinous for individuals. This raises hard questions: Have we progressed beyond our means? Are we spending too much on our health?
I think the answer is no. Wellness is worth a lot. I’d wager that most people would rather be healthy than have a week’s vacation in the Caribbean in the winter. But many people feel that the cost they have to pay for their health is more than they want to spend. Is there a way to have a healthy life and spend less?
I think so. We could do this by improving how we spend on health. Right now, when new drugs come to market, they are advertised directly to consumers. We should put a moratorium on such advertising — probably three to five years — until we know whether the new treatment lives up to its promise. When a drug is approved by the FDA, it means that the drug is safe and effective, but not necessarily that it is safer or more effective than the treatments we are already using. The new drug, however, is almost always much more expensive. Rather than spurring consumer demand, we could create a post-approval research phase, in which people prescribed a brand-new drug would be followed closely. This would give the medical community time to understand how the drug works in the real world, as opposed to the research world. It would also make clear to patients that when they use a brand-new drug, it is still, to some extent, experimental. None of this comes through in the slick ads we see on TV or the Internet.
Drugs could still be marketed to medical professionals, but a waiting period on advertising to consumers would help us avoid wasting money on drugs that turn out to have marginal incremental value, such as rosiglitazone (Avandia) for diabetes, or that carry unknown risks, such as rofecoxib (Vioxx) for arthritis.
We are living longer and healthier lives than ever before. We are spending more money in so doing, and much of this is money well spent. Adopting the discipline to determine a treatment’s worth will help us spend more wisely, and get the value we deserve for our health care dollar.
Jeffrey M. Drazen is editor-in-chief of the New England Journal of Medicine, a professor at Harvard Medical School, and a physician at Brigham & Women’s Hospital. Send comments to firstname.lastname@example.org.