
For years, experts have warned that misinformation and conspiracy theories can spread like wildfire on social media platforms like Facebook, and indeed they have: Foreign governments have interfered with elections around the world by spreading misinformation, and social media platforms have been some of their main conduits; in the United States, public officials all the way up to the president — perhaps the worst offender — have used their accounts to spread dangerous lies that undermine democracy and incite violence. Facebook had years to build safeguards to prevent this kind of misuse from unleashing another disaster in the 2020 election. Instead, it made piecemeal adjustments that often seemed to address its own public relations concerns rather than protect democracy. Now, two weeks after the latest presidential election, American democracy is suffering the consequences of Facebook’s failures — failures born of the company’s inaction and of the government’s inaction alike.

To prevent bad-faith actors from spreading misinformation, for example, Facebook said it would not accept new political ads in the week before the US election. But as this editorial board wrote when Facebook announced that measure, “If you acknowledge that such ads could compromise the integrity of the election, why enact this step over just a single week? And why wait?”

Indeed, Facebook’s decision to wait to ever-so-slightly curtail political ads was an example of the company’s poor judgment. It was also only one of the many bad decisions it has made — decisions that ultimately aided rather than hindered the ability of white supremacists and other extremists to propagandize and endanger the health and lives of people around the world. Just two weeks ago, for example, Donald Trump’s former advisor Steve Bannon posted a video on his Facebook page that suggested Dr. Anthony Fauci and FBI director Christopher Wray should be beheaded. The video stayed up on Bannon’s page for 10 hours and was viewed nearly 200,000 times before Facebook decided to take it down for violating its rules.

Facebook’s rules are notoriously murky. In Tuesday’s hearing before the Senate Judiciary Committee, Facebook CEO Mark Zuckerberg said, “If people are posting terrorist content . . . then the first time that they do it, then we will take down their account.” And yet for some reason, Bannon’s video suggesting that public officials should be murdered — beheaded — did not get blacklisted as terrorist content, and Zuckerberg’s company refused to permanently suspend Bannon, as Twitter had done. When Senator Richard Blumenthal of Connecticut pressed Zuckerberg on whether he would take down Bannon’s account, Zuckerberg doubled down. “Senator, no,” Zuckerberg said. “That is not what our policies would suggest that we should do in this case.”

Facebook was also used to sow doubt about the legitimacy of the election. A dubious “Stop the Steal” group that spread the lie that the election was being stolen from Trump, for example, amassed 320,000 members before Facebook took it down. It helped accelerate a movement, further propelled by the president himself, that took to the streets. Protesters armed themselves and stood outside vote-counting centers, and violence erupted at their rally in Washington, D.C., last week. And now Trump is continuing to use his account to lie about winning the election, long after he has clearly lost, and Facebook’s strategy to stunt the spread of these posts has only limited their circulation by 8 percent. They still remain some of the most engaged-with posts on the platform.

Facebook’s actions should not bring comfort to any concerned citizen. The company is simply not doing enough to protect democracy, and it probably never will unless its users demand it or the government steps in.

Since Facebook was founded, Zuckerberg has been heralded as a visionary — someone who, in his own words, wants to “connect the world.” As a result, lawmakers watched as Facebook grew, doing little to check its role as an amplifier of misinformation and hate speech. In a post he wrote after the 2016 election, Zuckerberg used the word “community” over 100 times. But in the years since, he’s done everything he can to advance his technology and media empire, and little to protect the community of people he says he cares so much about.

Now, as the largest shareholder and CEO of a social media platform with billions of users each month, Zuckerberg has unprecedented and unchecked power over what people around the world see and believe, power he has not used wisely.

It’s time to acknowledge that Zuckerberg’s priority was never building community; his priority has been maximizing profits. And trusting Zuckerberg and his colleagues in the social media world to do the right thing, or to provide a public service, has proved to be bad policy. That’s why lawmakers need to act: Facebook is never going to fix itself. One way to start is to update Section 230 of the 1996 Communications Decency Act, which, as it stands, has unintentionally absolved social media platforms of any responsibility to moderate their content and promote civil discourse. But it may well take more than that.

The first step is dispensing with the illusion that Facebook can be relied upon to protect democracy, which means governments will need to better regulate the technology company. Until then, Facebook users should direct their attention and their clicks elsewhere.

Editorials represent the views of the Boston Globe Editorial Board. Follow us on Twitter at @GlobeOpinion.