
SAN FRANCISCO — YouTube is taking down several video channels associated with high-profile antivaccine activists including Joseph Mercola and Robert F. Kennedy, Jr., who experts say are partially responsible for helping seed the skepticism that’s contributed to slowing vaccination rates across the country.

As part of a new set of policies aimed at cutting down on antivaccine content on the Google-owned site, YouTube will ban any videos that claim that commonly used vaccines approved by health authorities are ineffective or dangerous. The company previously blocked videos that made those claims about coronavirus vaccines, but not about other vaccines, such as those for measles or chickenpox.


Misinformation researchers have said for years that the popularity of antivaccine content on YouTube was contributing to growing skepticism of lifesaving vaccines in the United States and around the world. Vaccination rates have slowed: about 56 percent of the US population has had two shots, compared with 71 percent in Canada and 67 percent in the United Kingdom. In July, President Biden said social media companies were partially responsible for spreading misinformation about the vaccines and needed to do more to address the issue.

The change marks a shift for the social media giant, which streams more than 1 billion hours' worth of content every day. Like its peers Facebook and Twitter, the company has long resisted policing content too heavily, arguing that maintaining an open platform is critical to free speech. But as the companies increasingly come under fire from regulators, lawmakers, and regular users for contributing to social ills — including vaccine skepticism — YouTube is again changing policies that it had held onto for months.

“You create this breeding ground and when you deplatform it doesn’t go away, they just migrate,” said Hany Farid, a computer science professor and misinformation researcher at the University of California at Berkeley. “This is not one that should have been complicated. We had 18 months to think about these issues, we knew the vaccine was coming, why was this not the policy from the very beginning?”


YouTube didn’t act sooner because it was focusing on misinformation specifically about coronavirus vaccines, said Matt Halprin, YouTube’s vice president of global trust and safety. When it noticed that incorrect claims about other vaccines were contributing to fears about the coronavirus vaccines, it expanded the ban.

“Developing robust policies takes time,” Halprin said. “We wanted to launch a policy that is comprehensive, enforceable with consistency, and adequately addresses the challenge.”

Mercola, an alternative medicine entrepreneur, and Kennedy, a lawyer and the son of Senator Robert F. Kennedy who has been a face of the antivaccine movement for years, have both said in the past that they are not automatically against all vaccines, but believe information about the risks of vaccines is being suppressed.

Facebook banned misinformation on all vaccines seven months ago, though the pages of both Mercola and Kennedy remain up on the social media site. Their Twitter accounts are active, too.

In an e-mail, Mercola said he was being censored and said, without presenting evidence, that vaccines had killed many people. Kennedy also said he was being censored. “There is no instance in history when censorship and secrecy has advanced either democracy or public health,” he said in an e-mail.

More than a third of the world’s population has been vaccinated and the vaccines have been proven to be overwhelmingly safe.


YouTube, Facebook, and Twitter all banned misinformation about the coronavirus early on in the pandemic. But false claims continue to run rampant across all three of the platforms. The social networks are also tightly connected, with YouTube often serving as a library of videos that go viral on Twitter or Facebook.

That dynamic is often overlooked in discussions about coronavirus misinformation, said Lisa Fazio, an associate professor at Vanderbilt University who studies misinformation.

“YouTube is the vector for a lot of this misinformation. If you see misinformation on Facebook or other places, a lot of the time it’s YouTube videos. Our conversation often doesn’t include YouTube when it should,” Fazio said.

The social media companies have hired thousands of moderators and used high-tech image- and text-recognition algorithms to try to police misinformation. YouTube has removed over 133,000 videos for broadcasting coronavirus misinformation, Halprin said.