After Monday’s huge global outage, Facebook is back. Not that this is an entirely good thing.
Thanks to Facebook whistleblower Frances Haugen, we’re getting deep new insights into just how dangerous the giant social network has become. Maybe now the federal government will get serious about stamping out online hate speech and misinformation.
But that prospect should also terrify you. Irresponsible speech is dangerous, but even worse is the prospect that the US government will seek to protect us by stifling the free exchange of ideas.
Haugen, the former Facebook data scientist, has done us a public service by revealing thousands of pages of internal company documents. The company’s own research found that Facebook and its sister product Instagram stoke political extremism and violence in the US and abroad, undermine efforts to promote COVID-19 vaccinations, and even lead to increased depression and anxiety among millions of teenage girls.
Like those damning secret files about the tobacco industry, the Haugen leaks, first reported by the Wall Street Journal, prove that Facebook’s leaders have long known their products are often toxic.
But what can be done? Haugen is testifying before Congress this week. She wants new federal regulations to bring Facebook under control. We already know what some leaders would like to do. And it ought to worry you.

Back in July, White House press secretary Jen Psaki said the Biden administration was “flagging problematic posts for Facebook that spread disinformation.” This could be innocent enough — maybe a White House staffer occasionally alerting Facebook to some egregious posting that the company would have taken down anyhow. Still, it’s a little creepy to imagine the White House constantly monitoring a major media source and demanding that it delete messages that don’t meet with the president’s approval.
A far more worrisome example comes from the office of Senator Elizabeth Warren of Massachusetts. Last month, Warren announced “the alarming discovery that Amazon is peddling misinformation about COVID-19 vaccines and treatments through its search and ‘Best Seller’ algorithms.” (In other words, you can log onto the world’s largest online bookstore and find materials of which Warren disapproves.)
Warren demanded that Amazon change its algorithms so that these books no longer appear in the search results. She’s not quite calling for a ban on dangerous books; she just wants to make sure that people can’t find them. Perhaps Warren will next demand that such books be deleted from the electronic catalogs of public libraries.
One can imagine an unholy alliance of Washington power brokers and Big Tech titans, quietly collaborating in schemes to protect the public from dangerous ideas and bad thoughts. No messy legislation involved. Just a quiet word from the White House or the office of a powerful senator. The tech giants, eager to avoid political unpleasantness, may simply go along, and make the unwelcome speech disappear. And since it’s a business doing the deed, not a government censor, the First Amendment doesn’t apply.
Paranoid? Perhaps. But these days, we may not be paranoid enough.
Any proper regulation of Facebook ought to take the form of explicit legislation. But antitrust won’t do the trick. Big as it is, Facebook isn’t a monopoly. There are lots of other social networks — Reddit, Twitter, TikTok, and plenty more. And though you might force Facebook to sell off some businesses, like Instagram, that doesn’t do a thing to weaken Facebook itself.
In testimony before Congress on Tuesday, Haugen said Facebook should be compelled to reveal the secret algorithms it uses to determine what its users see. “On this foundation, we can build sensible rules and standards,” said Haugen. Maybe. But any rule that would limit the spread of hateful or false information still bumps into that pesky First Amendment.
Some legislators want a law that would let users take their social media history from one service to another, just as you can transfer your cell phone number from AT&T to Verizon. This would make it easy to move from Facebook to a competing service like Parler or MeWe, and spawn other new and perhaps healthier rivals. And there’s no censorship involved.
But every social media company would have to adopt a standard data format, a massive challenge. Besides, your social media record includes lots of data from your friends. Do you have a right to move it to a different network without their permission? Facebook has offered tentative support for some form of data portability, perhaps because the company doubts it will ever happen.
Perhaps the most practical option is a tweak to Section 230 of the Communications Decency Act. This federal law says that blogs and social networks aren’t liable for the crazy stuff people post on them. If someone adds a comment on Facebook claiming that a famous movie star robs convenience stores in his spare time, the actor can’t sue Facebook, just the knucklehead who wrote the post.
This law was written in the Internet’s infancy, when loony rants about communism or poisonous vaccines were read only by people who went looking for them. But now, if you find stuff like that on Facebook, there’s a good chance it will automatically send you more of the same. Facebook doesn’t act as a neutral public forum, but actively promotes certain messages.
So maybe we should eliminate Section 230 protections for algorithmically powered social networks. For Internet sites that let readers find their own way around, the law would remain the same. But a Facebook or Twitter or YouTube or TikTok could be sued by private citizens — not the government — for postings that defame somebody or threaten violence.
Such a change in the law could have devastating consequences for Facebook's way of doing business. But given its indifference to its malignant global impact, the company shouldn't expect a whole lot of sympathy.
Hiawatha Bray can be reached at hiawatha.bray@globe.com. Follow him @GlobeTechLab.