‘At the cusp’ of digital healthcare
Medicine arrived late to the information age. Years after banking, travel, retail, and most other industries went digital, hospitals and doctors’ offices still featured endlessly ringing telephones, indecipherable handwritten documents, and busy mailrooms.
But in the last five years, spurred in part by the Affordable Care Act’s mandate for electronic health records, the medical world has embraced computers as a tool as vital as stethoscopes and X-rays. This past year, Partners HealthCare, which includes Brigham and Women’s Hospital, MGH, and several other Boston-area hospitals, spent over a billion dollars adopting Epic, a system that allows a doctor to look up test results, answer a patient’s questions, communicate with consultants, write prescriptions, and bill for his or her services, all with a few clicks.
For many in the healthcare field and beyond, Dr. Robert Wachter’s book “The Digital Doctor: Hope, Hype, and Harm at the Dawn of Medicine’s Computer Age,” published in April, has served as a guide to a transition that has been less than smooth. In it, Wachter, a professor of medicine at the University of California, San Francisco, and a clinical pioneer (he coined the term “hospitalist”), explores why the long-foreseen digitization of healthcare has been so difficult.
Wachter is no Luddite — in a recent phone interview, he mentioned that he’d just located a Starbucks using an app and visited his 80-year-old mom via FaceTime. But, he said, medicine’s embrace of the computer has been plagued by “one unintended consequence and one disappointment after another.” Medical staff contend with charts bloated with cut-and-pasted gobbledygook; patients are frustrated by doctors who stare at the screen instead of making eye contact; doctors get burned out by finding themselves doing data entry after years of medical training. Though computerized systems have helped catch potential medical errors, it was an incident in which a computer led to a child at his hospital receiving 39 times the correct dose of an antibiotic that ultimately inspired Wachter to write the book.
Wachter sees no villains in this story, preferring to highlight those leading the charge to improve healthcare with digital technology, including several physicians in Boston. Few argue against giving patients greater access to their own medical information or requiring doctors and nurses to document, in legible, electronic form, that they’ve given someone a flu shot or counseled them about quitting smoking.
Still, Wachter points out, many well-intentioned ideas have fallen to the clinician to implement. “If you were a computer designer, you would have said, ‘We want to bake all that in; what a great opportunity we have.’ But if you’re a primary care doctor, your life has been ruined,” he said. For patients “generally disappointed with the health care system in terms of its access and responsiveness to their needs,” digitization offers greater control. But, Wachter believes, patients worry justifiably about being treated like data rather than human beings.
Wachter thinks that a new generation of doctors, raised on the screen but no less caring than their elders, will find ways to reimagine the use of the computer in medicine. “I just think that we’re so early in this that we have not thought deeply about whether there is a way to digitize what we do — because we clearly have to — and retain, or possibly enhance the humanity of our relationships with one another and our relationships with patients,” he said. He sees this as a 10- to 20-year process and, he notes, “We’re really just at the cusp of it.”