Every day around 9 p.m., Joan Donovan bids her wife good night, heads into her home office — which she calls the “dungeon” — and binges white supremacist videos and conspiracy theories on YouTube.
Donovan’s nightly ritual, which often lasts until 2 a.m., is difficult but essential to her research at Harvard Kennedy School. At a time when false claims around COVID-19 and politics are running rampant, the former punk-rocker from Saugus has been able to predict what types of disinformation will travel from the darkest corners of the Internet to — in some cases — the highest levels of the US government.
She was one of the first researchers to predict medical disinformation would upend the fight against COVID. She also saw xenophobia associated with the pandemic on an alt-right YouTube show weeks before then-President Trump began calling the coronavirus the “China Virus.” And at 8:30 a.m. on Jan. 6, the day pro-Trump rioters stormed the Capitol, she warned on Twitter that “today we will witness the full break of the MAGA movement from representative politics.”
Now, she has a broader warning: If Facebook, Twitter, YouTube, and other social media companies don’t change their algorithms, any number of recent lies spreading online could take hold in the next few months and threaten the national discourse around the pandemic recovery, climate change, and racial inequality.
“Tech companies have built a system in which people spread misinformation so much further, so much faster, and at such high velocities, that fighting it is like bringing a garden hose to a 30-story building that’s on fire,” Donovan said in an interview. “We need updated regulations that would ensure protection for the public interest.”
On a recent summer day, Donovan sat in her suburban Boston home office and opened up her YouTube playlist. There were videos questioning the coronavirus’s existence, clips of people at town halls disputing critical race theory — which argues that racism is embedded in laws and policies — and footage of racist violence at street protests.
Donovan zeroed in on a clip of the controversial comedian Russell Brand delving into “The Great Reset,” a term far-right groups use to claim that billionaires and other leaders are exploiting the pandemic, climate change, and private philanthropy to realign the world to benefit the rich.
The term — which originated in 2020 at a World Economic Forum meeting in Davos, Switzerland — had started popping up on her playlist. Clips of Brand buying into the concept far outpaced his other videos. Cross-referencing the term on 4chan, an anonymous message board known in part for conspiracy theories, showed it was starting to circulate there as well.
“It always goes back to these very old antisemitic tropes about Jews controlling the world, or what they might call the ‘Deep State,’ or ‘New World Order,’” Donovan said. “And so ‘The Great Reset’ really clicked.”
Those tropes have started to show up in recent protests in California, she said, where some people have rallied against mask mandates, likening the requirements to Nazis forcing Jews to wear yellow stars.
At Donovan’s lab, based out of the Shorenstein Center on Media, Politics, and Public Policy at the Harvard Kennedy School, a team of more than 20 researchers dissects the major disinformation campaigns of the moment and documents the trends it sees. Much of the work is manual, with the team poring over reams of content on YouTube, online forums, and social media.
After Trump said in March 2020 that hydroxychloroquine could cure COVID-19, the team traced the false claim back to an obscure Google document, showing the power of “cloaked science,” disinformation dressed up in scientific jargon.
Now, Donovan and her team are watching the growth in “burn the mask” protests, where false beliefs that masks cause bacterial pneumonia are leading people to set them on fire and post videos online.
“The most pernicious forms of disinformation usually have a downstream effect where people do change their behaviors,” she said, “and some campaigns can lead to violence.”
To slow the spread of disinformation, Donovan has proposed several fixes. Social media companies should hire librarians to help curate content on news feeds, she said, rather than simply employ people to moderate it. “Moderation is a plan to remove what is harmful,” she wrote in Wired. “Curation actively finds what is helpful, contextual, and, most importantly, truthful.”

Donovan’s immersion in this world has exacted a personal toll. She’s a frequent target of death threats, online insults, and harassment. To ensure her safety, Harvard has installed a panic button in her office on campus. “It does weigh on you,” she said. “If my wife asked me to stop, I’d stop.”
In a way, Donovan’s rise to academic prominence is no surprise.
Her interest in studying extremist groups started in the early 1990s, when she was enmeshed in the underground rock scene in Massachusetts as a teenager. The subculture was rife with neo-Nazis, racists, and skinheads, she said. To stay safe, she had to master how people signaled their allegiances so she could tell who was extremist and who wasn’t. (For example, skinheads who wore white laces in their Doc Martens were racist, she said, while those who wore red were not.)
“You had to know about the subculture in order to decode what was going on,” she said. “I can see that as something that was formative in my own biography.”
Her path to academia was not linear. Donovan — who has a love for Dodge Chargers and Stephen King novels, and who helped invent the beaver emoji — took nine years to graduate college. She started in 1997 at Northeastern University and graduated in 2006 with a bachelor’s degree in sociology from Concordia University, with a stint in a punk-rock band in between. She got her doctorate in 2015, focusing on the Occupy movement. Her early research focused on the worldview of white supremacists.
At Harvard, Donovan keeps a small teaching course load and manages her research lab, which has upward of $10 million in funding from philanthropic foundations and donors. Her work on how to “detect, document, and debunk” misinformation on the web has gotten her into the halls of Congress and the boardrooms of social media giants. (Craig Newmark, the founder of Craigslist, recently gave the Harvard Kennedy School $5 million to support her lab’s work.)
Donovan has been spending more time recently with lawmakers, advising them on how to squash disinformation. In January 2020, she testified before the House Energy and Commerce Committee, urging politicians to curb the spread of deepfake videos, counterfeit propaganda campaigns, and other forms of online fraud.
Shortly after the 2020 election, she went before the House Select Committee on Intelligence to argue that companies including Facebook and Twitter must do more to adjust their algorithms and practices to promote timely, local, and accurate journalism.
In December 2020, five lawmakers in Congress wrote to President-elect Biden asking him to name Donovan to the COVID task force. We “urge you to add a member to the Task Force who has a deep understanding of misinformation, including its causes, exacerbating factors, and ways to combat it,” they said. (He did not name her.)
Some researchers believe Donovan’s work misses the real problem. Yochai Benkler, a professor at Harvard Law School, published research with colleagues on the 2020 election concluding that mass media outlets such as Fox News are more to blame than social media for the spread of disinformation. (In response to the criticism, Donovan said “misinformation-at-scale is a problem across all forms of media and it is not possible to disentangle social media from other channels.”)
Ultimately, Donovan believes the pandemic shows how sorely social media companies, journalists, citizens, and governments need to change the way they interact on the web.
Companies should focus less on debunking every medical myth and instead create protocols for deciding which disinformation campaigns are “reaching a tipping point” and should be addressed, she said. Local journalists should be vigilant about misinformation and avoid mistakenly reporting it as news. Lawmakers should subject social media platforms to stronger regulation, closer to the rules governing TV and radio stations.
“We should have an Internet that supports democracy,” she said.