EDITORIAL

Facebook isn’t protecting the election

Mark Zuckerberg wrings his hands about the fate of democracy but stops short of reining in election-warping misinformation campaigns.

Facebook CEO Mark Zuckerberg. Alex Wong/Getty Images

Imagine you lived in a town where a big factory polluted the river. And imagine that your fellow townspeople and even the plant’s own employees had suggested changes to its operations so that it would pump out much less toxic sludge. But the head of the plant barely budged, even though he said he agreed that dirty water was dangerous.

Now let’s say that, one day, the CEO stood before the town and announced: “Because clean water is so important, two months from now, we will cut down on a big source of our pollution for one week!”

Wouldn’t that sound absurd? Why wait two months — and why do this for only one week?

That imaginary CEO proclamation is pretty similar to a real one that Facebook’s Mark Zuckerberg made last week. In order to “help secure the integrity of the US elections,” he said, Facebook would not accept new political ads in the week before the Nov. 3 election. To which a fair response is: If you acknowledge that such ads could compromise the integrity of the election, why enact this step over just a single week? And why wait?

After all — to borrow again from the clean-water analogy — a healthier information ecosystem is just as necessary now, when foreign nations are trying to discredit political candidates and democratic elections, and the president of the United States takes every opportunity to pollute the national discourse by sowing doubts about the reliability of mail-in voting and making up claims about what Joe Biden would do as president. And guarding against political mischief will be just as important a month from now when many people will begin voting early. It’s not as if the seven days before the election will be substantially more sacred and deserving of Facebook’s caution.

Political ads on TV are often misleading, but at least they’re widely seen, enabling opponents to respond and others, including news media, to critique their veracity. Political ads on Facebook are another matter because the company lets candidates and their supporters say obviously false things and lets them target such misinformation to relatively small audiences. Because of this potential for micro-targeted mischief, Google doesn’t allow political ads to be aimed at ultra-granular audiences — a step some Facebook employees have advocated as well. Twitter and Pinterest don’t accept ads from political candidates at all.

Not only is Facebook unwilling to follow its social media peers and prevent politicians from targeting ads to small groups of voters considered especially persuadable by certain messages, the company will even allow that practice to continue during the supposed one-week hiatus. As Zuckerberg explained in his statement last week, although “we won’t accept new political or issue ads” in the week before the election, “advertisers will be able to continue running ads they started running before the final week and adjust the targeting for those ads.” Those are pretty big loopholes.

Facebook is taking some other modest steps to counterbalance some of the misinformation posted on its site about voting — measures that Zuckerberg hypes as an effort to “protect our democracy.” Facebook will attach correction labels to posts that try to “delegitimize the outcome of the election” or claim that “lawful methods of voting will lead to fraud.” It will remove what it deems “misinformation about voting” — including posts that claim people will get COVID-19 from going to the polls — and will prominently display accurate voter information. It’ll work with Reuters to post real-time election results. And to reduce the chances that “misinformation and harmful content” will go viral, Facebook will limit the number of times people can forward posts over its Messenger service.

It’s better than nothing. But there’s a mismatch between the tepidness of Facebook’s policies — as evidenced by the nearly meaningless one-week semi-ban on political ads — and the very big problems Zuckerberg himself says he’s trying to address. He said the new measures were important because “with our nation so divided and election results potentially taking days or even weeks to be finalized, there could be an increased risk of civil unrest across the country.” He added: “This election is not going to be business as usual.”

Where has he been the last few years? “Business as usual” hasn’t described our civic life for a long time, and Facebook bears much of the blame.

As The Wall Street Journal reported in May, several of the company’s executives acknowledged in 2018 that Facebook’s algorithms for turbo-charging the virality of posts “were making the world more divided.” They suggested fixes, like tweaking the algorithms to reduce “the spread of content disproportionately favored by hyperactive users,” the Journal reported. Most such users are on the far right or far left politically, so if the change had occurred, “middle-of-the-road users would gain influence.” Only a watered-down version of the plan took hold, and Zuckerberg “signaled he was losing interest in the effort to recalibrate the platform in the name of social good,” according to the Journal.

Zuckerberg needs to decide how serious he really is about democracy. Never has one person had so much power and reach to influence what people see and believe, and that makes it important that he finally recognize and rein in these threats. In addition to the steps Zuckerberg announced last week, Facebook could do much more to elevate fact-checked material and counter-perspectives on viral posts. Since Zuckerberg acknowledges the risks that political ads pose, Facebook should ban them entirely from now through November. And it’s past time to end completely the practice of micro-targeting political ads.

You’d think that after Facebook was used as a tool for inciting genocide in Myanmar, Zuckerberg would err much more heavily on the side of caution when it comes to protecting the social fabric in his own country. Why won’t he?
Editorials represent the views of the Boston Globe Editorial Board. Follow us @GlobeOpinion.