IN EARLY 2019, American journalists sounded a warning: China was about to — no, scratch that — has figured out how to monopolize the 21st century economy by seizing control of 5G, the next generation of super-high-speed internet. Critics fear that because the Chinese developed it first, they will have exclusive access to the technology behind a new, fully interconnected version of the Internet, delivering it through a network of Chinese-only fiber-optic connections designed to work only with Chinese-made equipment.
Game over; the Chinese have won.
Writing in Wired, Susan Crawford warned that if the Chinese take control of the 5G ecosystem, “American companies don’t stand a chance,” while David Brooks in The New York Times proclaimed that “It’s become increasingly clear that China is a grave economic, technological and intellectual threat to the United States and the world order.”
These prophecies have a familiar ring. Not long ago, Japan was going to dominate modern manufacturing; and at one point, reasonable people feared that Microsoft’s Windows operating system would concentrate too much computing control into a single private company’s hands.
Then, as now, such fears can cross over into outright racism. Brooks, for example, asks if China is the “other,” a framing device that depicts economic adversaries as inherently different from — and less worthy than — ourselves.
Regardless of such economic and potentially nationalistic concerns, the focus on China’s potential 5G dominance misses a subtler shift at work, with stakes that go far beyond immediate commercial opportunities.
Which nation will drive the study of our universe over the next century? Since World War II, America has been the unquestioned leader in basic scientific research, especially the curiosity-driven inquiry that underpins many of the signature technological advances on which we now depend. If the US loses that dominance, the consequences will be far-reaching, exceeding — and outlasting — any short-term commercial stakes.
If our scientific dominance ends, it will not be because of Chinese perfidy, but because the US chose to surrender its commanding role in the search for knowledge – and by doing so, abandoned a unique and often overlooked source of American power.
THE AMERICAN SCIENTIFIC century effectively began on July 16th, 1945, when the world’s first nuclear weapon detonated in a remote corner of the New Mexico desert. For Vannevar Bush, who had helped oversee the Manhattan Project, the lesson of the Trinity test was obvious: America, now armed with the atomic bomb, should refocus its power and resources on maintaining its lead in an increasingly science-driven race for international wealth and power.
Shortly thereafter, Bush delivered a report to President Truman titled Science: The Endless Frontier. Part manifesto, part roadmap, Bush’s report promoted a single idea: that 20th century progress — in medicine, industry, and invention — would “require continuous additions to knowledge of the laws of nature.”
He noted that the extraordinary advances made during the war years, including nuclear weaponry, mass production of antibiotics, radar, code-breaking, and computing, had all depended on curiosity-driven scientific inquiry. If you focused on solving specific commercial problems, Bush argued, you would miss the fundamental ideas that produce much larger and bolder advances.
“This essential, new knowledge,” he told Truman, “can be obtained only through basic scientific research,” to be funded by the federal government.
It’s difficult now to imagine how radical this was. Before World War II, US basic science had been a relatively small affair, with its biggest projects — the giant telescopes at Mt. Wilson and Mt. Palomar in southern California, or E. O. Lawrence’s atom smashers in Berkeley — generally funded by private philanthropy. The war expanded the reach and cost of science beyond anything private money, even corporate funds, could reasonably support.
Even with the successful use of the atomic bomb to remind Congress how fundamental physics could end a war, it still took five years to build a coalition willing to inject federal funds into basic scientific inquiry. In 1950, the National Science Foundation (NSF) was finally founded, channeling significant money into curiosity-driven research projects and keeping politicians (mostly) out of the loop. Funding decisions were made by scientists and domain experts, not civil servants.
Bush’s vision created a world-leading approach to basic science. Though Nobel prizes are an imperfect measure, with 269 science wins through 2018, US-based researchers have utterly outpaced the second-place nation, Britain, with its 89 Nobels.
More importantly, money spent on basic research produces more discoveries, enhancing a nation’s soft power. US astronauts on the moon may not have affected the price of eggs, but they did establish America as the most technologically advanced culture on the planet for the next few decades.
Unexpected technological advances have also flowed from seemingly impractical pursuits. For one classic example, the polymerase chain reaction, a Nobel-winning discovery in the 1980s that enables the creation of an unlimited number of copies of a stretch of DNA, is one of the basic, essential tools of the modern bioengineering industry. The key to the process was found in the 1960s by two microbe researchers taking samples in Yellowstone’s hot springs, just to find out how bacteria could survive in the heat. Transistors, invented in the late 1940s, depend on quantum theory. GPS relies on Einstein’s general theory of relativity to make the corrections needed to locate your phone to the stretch of sidewalk you’re passing. Some studies suggest that the economic return on science spending may range up to $80 for each dollar invested.
The frequency of American Nobel wins peaked in 1972. Since then, Claudius Gros writes, awards have declined “at a continuously accelerating rate.” Why? Certainly not for lack of money. Dr. Marc Kastner, formerly MIT’s dean of science, and now president of the Science Philanthropy Alliance, notes that “US research and development funding has been roughly keeping up with our GDP” for the last several years.
But rivals have been accelerating their own funding, especially in China, where, Kastner says, “spending as a fraction of GDP has not reached ours, but is rising rapidly and will surpass ours within a decade.”
China began heavily investing in basic scientific research a mere 25 years ago, but it has quickly become a major force in the field. The country’s investment in science rose from $9 billion in 1991 to over $400 billion by the mid 2010s. “There has been a huge influx of money into the top 40 to 50 universities,” says John Zhang, professor of chemistry at NYU Shanghai. “That’s had a huge impact in terms of basic research.”
China is now approaching US levels of research investment: According to a 2018 National Science Foundation report, the US still leads, with $497 billion in research and development spending as of 2015, or about 26% of the world’s total investment. But China is a close second, at $409 billion.
And by one measure, any appreciable gap has disappeared. In 2016, China-based scientists became the world’s most prolific scientific authors, exceeding US researchers in number of papers published, 426,000 to 409,000. That said, American and European papers continue to be more frequently cited by researchers than Chinese papers.
Some observers argue that institutional obstacles in China will continue to hinder researchers in that country from doing their best work. Zhang says that up until very recently, funding agencies have graded researchers on a strict quantitative formula: how much they’ve published, and the prestige of the journals in which that work appeared. That rigid framework tended “to encourage researchers to follow the hot trend in the United States,” he says. As a result, he concludes, “there is a lack of innovation.” Zhong-Lin Lu, a neuroscientist at the Ohio State University, who is also affiliated with NYU Shanghai, agrees. He says that “the goal has been to publish in high impact journals, not to build solid research programs. [China] is pumping a lot of money into the system, but I don’t see a lot of really good outcomes.”
But Zhang also notes that these kinds of funding directives may be changing, allowing Chinese researchers more latitude. Further, the Chinese have advantages in fields of research which “require a lot of labor,” like gene sequencing or large-scale animal studies. “10,000 rats would be hard to do in the US,” Lu says. “That’s easy in China.”
LAST YEAR AT MIT, the prospect of an imminent Chinese scientific dominance seemed to come true. On 12 April 2018, Jianwei Pan, Vice President of the University of Science and Technology of China, led a session of MIT’s prestigious Physics Department colloquium, where he described his group’s series of experiments probing a phenomenon called quantum entanglement — a fundamental property of the micro-universe, vital to the emerging technology of quantum computing.
For two hours, Pan described how his team was testing its ideas, from tabletop setups to a satellite that sends quantum signals across the globe.
Physicist and historian of science David Kaiser works on some of the same problems and, he says, “I expected to hear about the results in the cool work I knew about.” Instead, “the biggest result was the extraordinary scale of all the things [Pan] has been able to do. Pan and his colleagues can think up remarkable new ideas and set right to work to pursue them.”
Kaiser says that he felt pangs of envy when Pan described his seemingly limitless access to resources in China, similar to what foreign scientists 50 years ago might have experienced when an American particle physicist came to town. In those days, Kaiser says, “No project was out of range. If you could think it, you could go do it. It hasn’t felt like that in American physics for decades.”
Scientists follow opportunities. In the ’30s, refugees from fascist Europe helped jumpstart American science. Federal funding at the time enabled US universities to attract the best and the brightest from all over the world. And American doors were open to international talent.
Now, as the US becomes both less reliable as a funder of basic science and, in the age of Trump, more hostile to immigrants, it loses its ability to recruit the best minds. A decade ago, Kastner says, many scientists working in his sub-discipline of materials science would try to stay in the US. Now, he says, “we’ve started to see students from China going back.”
Unchecked, such developments will lead to the erosion of American economic and technological power.
THE RECENT CONCERNS over 5G technology and other shifts in scientific power may encompass both legitimate grievances and, sadly, xenophobic arguments that the US can only lose to foreigners if they cheat. It is true that intellectual property has been bitterly contested between China and the US (and other trade partners) in recent years — and that many nations, including China (and the US), engage in various forms of espionage to gain an edge.
But there’s no treachery involved when China channels Vannevar Bush. It’s hardly cheating to decide to funnel cash into applied research — in artificial intelligence, materials science, and biomedicine, to name a few of China’s priorities — along with basic inquiry.
Where’s the sabotage in choosing to spend more as a percentage of GDP on science, in the hope of getting the same results the US won when it did the same 70 years ago? And consider this: Scientific investment isn’t even that expensive. The $8.6 billion for a border wall sought in President Trump’s latest budget request, for example, would more than double the NSF’s research spending per year.
All of which is to say that if China’s expanding intellectual ambition worries Americans, we have a simple solution: Welcome into American labs as many people as possible who seek to work in them. And come up with the cash. Spend more. Boost the share of wealth the US devotes to science, including, vitally, that part of the work that seems, at first, to have little or no connection to everyday experience.
An earlier version of this story misstated the decade in which transistors were invented. It happened in the late 1940s.

Thomas Levenson is a professor of science writing at MIT and an Ideas columnist. His latest book is “The Hunt for Vulcan.”