Ideas | Thomas Levenson

Let’s waste more money on science

(Illustration: Rahul for The Boston Globe)

Two men came to Zurich in the spring of 1913, Max Planck and Walther Nernst, both future Nobel laureates. Their mission: to persuade Switzerland’s best-regarded young physicist, Albert Einstein, to join them in Berlin. As the three of them talked, Planck asked Einstein what he was working on at the moment. Einstein replied that he was wrestling with a new theory of gravity that would, if he could work it all out, supplant Isaac Newton’s universal law of gravitation — the most famous idea in the history of science.

Taking pity on the younger man, Planck broke into his pitch. “As an older friend,” he told Einstein, “I must advise you against it, for in the first place, you will not succeed; and even if you succeed, no one will believe you.”

He was wrong, of course. Einstein would complete the general theory of relativity two years later, and its confirmation in 1919 would propel him to worldwide fame.

Still, Planck was, up to a point, quite right. It took Einstein eight years of repeated error to arrive at his final, correct answer. For all of his public celebrity, his bravura new idea was largely ignored by his fellow physicists. From the late 1920s to the late ’50s, general relativity was a backwater and seemingly divorced from anything that mattered in the real world.

Now, of course, general relativity is everywhere: It explains the structure and the history of the entire universe — and has real, ubiquitous practical applications, down to the smartphone in your pocket. The GPS receiver inside it depends on relativistic corrections to the satellites' clocks to get you where you want to go.
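How large is the relativistic effect? A back-of-the-envelope sketch makes it concrete. (The calculation below is an illustration, not the column's; the orbital radius and the simple weak-field formulas are standard textbook approximations.)

```python
# Rough estimate of why GPS must correct for relativity: satellite clocks
# drift relative to clocks on the ground. The constants and the approximate
# GPS orbital radius below are assumptions, not figures from the article.
C = 299_792_458.0        # speed of light, m/s
G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24       # mass of Earth, kg
R_EARTH = 6.371e6        # radius of Earth, m
R_SAT = 2.657e7          # GPS orbital radius (~20,200 km altitude), m

v_sat = (G * M_EARTH / R_SAT) ** 0.5           # circular orbital speed, m/s

# General relativity: weaker gravity in orbit makes the clock run fast.
gr_rate = (G * M_EARTH / C**2) * (1 / R_EARTH - 1 / R_SAT)
# Special relativity: orbital speed makes the clock run slow.
sr_rate = -0.5 * (v_sat / C) ** 2

drift_per_day = (gr_rate + sr_rate) * 86_400   # seconds gained per day
print(f"clock drift: {drift_per_day * 1e6:.1f} microseconds/day")  # ~38.5
print(f"ranging error: {drift_per_day * C / 1000:.1f} km/day")     # ~11.5
```

Left uncorrected, that tiny clock drift would walk position fixes off by kilometers every day.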

And yet, for half a century, general relativity would have seemed a bust. If the goal of science is to seek knowledge of the natural world that humans can use for their benefit, then it could have seemed sensible, at any moment, to withhold or withdraw support for Einstein and his handful of heirs.

The United States has long boasted the world’s most successful scientific research establishment, built and mostly sustained with federal money. That funding often comes with great pressure to demonstrate immediately valuable results, and work that seems to lack practical value sometimes draws enormous scorn. In 2008, Sarah Palin famously sneered at a federal grant to study fruit flies in France. She thought the research was pointless; in fact, it was part of the investigation of an agricultural pest threatening California’s olive crop. Just last month, Senator James Lankford, an Oklahoma Republican, attacked studies on glaciers and the role of stress in drug use.

But the “make it pay or don’t do it” impulse is more than simple showboating at the expense of a handful of projects. It’s possible to cripple a research community in subtle ways. Science as a body of knowledge is powerful, valuable, essential to human well-being. Science as a human process is wasteful — necessarily so, and even usefully so.

The settled knowledge captured in textbooks emerges from a process that cannot be fully planned or managed. Science is a set of ways of thinking and acting through which, sometimes, results emerge. Failure and “wasted” time are intrinsic to that process. It’s impossible to do science without a tolerance for uncertainty over the years or decades it can take to reveal what many discoveries actually mean.

The challenge now is to sustain that effort in the face of dangers, new and old, that could do much more than derail individual projects or research programs. The approach of a Donald Trump presidency has raised particular fears for the future of American research — his threats to NASA climate research, his dalliances with antivaccine activists, his vice president’s belief in young-earth creationism. At a minimum, it’s unclear how federal science budgets will fare over the next four years.

But the challenge of funding research that has no clear, immediate application predates Trump by decades, as both political parties have overseen a long decline in funding for basic science. Total federal funding for research and development dropped from 1.2 percent of gross domestic product in 1976 to under 0.8 percent in 2016. Even funding for the research most obviously connected to human well-being, the biomedical investigations supported by the National Institutes of Health, hit a peak in 2003 (barring the one-time spike from stimulus spending in 2009) and declined by more than 20 percent before reaching a plateau in 2012. In 2000, over 30 percent of the projects submitted to the NIH were funded. By 2014, that number had dropped to 18 percent.

It is impossible to know what we’ve already lost — what wasn’t discovered as support dropped away. But in any tight budget environment, safe research that promises to deliver clear, useful, quick outcomes gets funded first. For both researchers and funders, all the incentives favor the kind of studies Senator Lankford won’t make fun of.

History reveals what’s at risk as the work that could fail gets squeezed out.

In the early 1960s, a young biologist named Thomas Brock went to Yellowstone National Park. He knew that the park’s hot springs were home to microbes that live on sunlight, and he wanted to understand their ecology — how all those micro-organisms interacted in such neatly contained environments. This was pure curiosity-driven research on what Brock thought were stable microbial communities.

In 1965, though, he started to wonder about some pink filaments he’d noticed in the hot waters of the outflow channel of Yellowstone’s Octopus Spring. These turned out to be bacteria that could live in near-boiling water — thought then to be an utterly inhospitable environment. The next year, with an undergraduate research assistant, Hudson Freeze, he found another such microbe living downstream, at slightly cooler temperatures. That fall, the two researchers managed to grow that organism, named Thermus aquaticus, in their lab.

In the years since, the tally of what have come to be called extremophiles has exploded. There are microbes that thrive in highly acid environments, in exceptionally alkaline ones, inside rocks, at the bottom of the deep ocean, in nuclear waste, in the pillar of salt that was Lot’s wife, and more. As pure discovery, this work is beautiful, revealing a living world more complex, more opportunistic, more ubiquitous than we had previously imagined. Because of extremophiles, scientists have drastically expanded their view of which settings might harbor life beyond Earth. Thanks to a couple of biologists who poked around Yellowstone, it seems less likely that we are alone in the cosmos.

Still, nothing in Brock and Freeze’s initial work suggested extremophiles might actually matter in any dollars-and-cents way. Paying to send a couple of guys to poke around with pretty-in-pink microbes in a national park could just as easily have been ridiculed as a French-fruit-flies moment.

But if such a view had kept Brock and Freeze from their hot springs, the cost to humanity would have been enormous. In 1976, a decade after the bacterium was first identified, a different team of scientists found in T. aquaticus a molecule they named Taq polymerase, a version of the enzyme that cells use to synthesize new DNA from an existing strand of genetic information. In T. aquaticus, this ubiquitous molecular tool possessed one striking property: Like its parent organism, it could function at much higher temperatures than other enzymes.

Seven years later, another young biologist, driving through an April night, had an epiphany. Kary Mullis went on to create what’s called the polymerase chain reaction, or PCR. PCR is a cheap, swift process that “amplifies” a section of DNA, creating as many copies of the desired genetic information as needed — billions in a few hours. The procedure is now used across the whole spectrum of biotechnology, from genetic testing and disease detection to the analysis of ancient DNA.

One more thing: Each PCR cycle has to flip between hot and cold — heat to separate the two DNA strands, cooler temperatures to copy them. Ordinary polymerases are destroyed by the heating step; the heat-tolerant Taq polymerase survives it, which makes it the vital cog in the reaction. Human beings, lots of them, are alive today thanks to work that took almost two decades to bring home, begun when two curious people waded into a hot spring.
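The arithmetic behind PCR’s speed is simple doubling. A minimal sketch of the principle (an illustration with a simplified per-cycle efficiency, not any lab’s actual protocol):

```python
# Each PCR thermal cycle roughly doubles the number of copies of the
# target DNA sequence; 30 cycles turn one molecule into about a billion.
def pcr_copies(start: int, cycles: int, efficiency: float = 1.0) -> float:
    """Estimated copy count after a PCR run.

    efficiency is the fraction of strands duplicated per cycle
    (1.0 means perfect doubling; real reactions run somewhat lower).
    """
    return start * (1 + efficiency) ** cycles

print(f"{pcr_copies(1, 30):.2e}")  # ~1.07e+09 copies from a single molecule
```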

Many other seemingly useless (if wonderful) discoveries have yielded unexpected human benefits. Sometimes a question leads to very specific unintended happy outcomes, as when the study of coral reefs led to the development of artificial bone grafts, or when work on snail locomotion inspired wall-climbing robots. Or, there can be a shift in the worldview of an entire field, as Siddhartha Mukherjee documented in the book “The Emperor of All Maladies,” when decades of research into the fundamental molecular mechanisms of cell division finally began to inform how physicians diagnose and treat cancers.

Any of these lines of research could have been strangled in the crib by an insistence on an immediate payoff. Such demands crush not just individual projects, but the way the entire infrastructure of science functions. When even the slightest hint of blue-sky thinking will kill chances for funding, then lead investigators won’t entertain such ideas. Their students and collaborators will keep their heads down. With each passing funding cycle, the scope for imagination, for possible failure, and, perhaps, for life-changing discoveries will diminish.

As funding has slid, some of this has already begun to happen. The coming four years promise a host of other conflicts, especially in politically fraught areas like climate change. But as specific battles play out — for instance, over protecting agency budgets and defending NASA’s earth-sensing satellites — the more subtle, more hidden danger will persist. Science takes time, and it occurs within a community that develops ideas over generations of professional lives. Requiring immediate practical payoffs crushes ideas and disrupts the ways that community develops, year over year, decade after decade.

Do so long enough, and it will become ever more painful, expensive, and difficult to rebuild American scientific capacity — which, for now, remains the most powerful engine for investigating the material world humankind has ever known.


Thomas Levenson is a professor of science writing at MIT and an Ideas columnist. His latest book is “The Hunt for Vulcan.”

Correction: An earlier version of this piece misspelled the name of scientist Kary Mullis and contained an incorrect given name for scientist Walther Nernst.