
Frances Haugen has already accomplished something remarkably rare in Washington. The Facebook whistle-blower, who came forward last week and testified before a Senate panel about the ways the company prioritizes profits over the safety of even its youngest users, created a sense of urgency among a bipartisan group of lawmakers to finally hold the social media giant responsible.

Now that Haugen has captured that lightning in a bottle, Congress must seize the momentum and quickly pass common-sense measures that force Facebook to mitigate the harm its content can cause children.

At the same time, lawmakers must begin the longer, more complicated, but equally crucial task of overhauling the way the federal government regulates social media companies, to force them not only to keep children safe but also to guard against the dangers of misinformation and attacks on our democracy. Haugen, who is also set to testify before the House select committee investigating the Jan. 6 insurrection about hateful and false information the platform helped spread, can also be a powerful resource in that effort.

Facebook and its founder, Mark Zuckerberg, have become deft at skirting accountability for the harm the platform has caused, while the regulatory system has failed to keep up with the fast pace of emerging media forms. Efforts by the Federal Trade Commission to use antitrust suits to try to force Facebook to break off from Instagram have failed, as have congressional efforts so far to add sharper teeth to antitrust laws.

In a message to Facebook employees that was also posted publicly last Tuesday, Zuckerberg called Haugen’s account a misrepresentation of the company’s practices, adding “most of us just don’t recognize the false picture of the company that is being painted.”

But the documentary evidence Haugen disclosed, first to The Wall Street Journal and then to Congress after she sought whistle-blower protections, belies his denial. She testified that Facebook “prioritized growth and virality and reactiveness over public safety.”

A 2020 internal document detailed how Facebook studied ways to better market products to preteens — despite the fact that the platform supposedly bars anyone under 13 from having an account.

“Why do we care about tweens?” the document read. “They are a valuable but untapped audience.”

Haugen testified that Facebook officials know how to modify the platform’s algorithms to guard against dangerous content, like ads and images that promote unhealthy body image, causing depression and even suicidal thoughts in young users. But doing so would negatively affect profitability, so officials opted against it.

And one reason Facebook can get away with it is that there’s no regulatory body with the authority or expertise to keep the company honest.

“The only people in the world who are trained to analyze these experiments to understand what is happening outside of Facebook, are people who grew up inside of Facebook or Pinterest or another social media company,” Haugen testified. “There needs to be a regulatory home where someone like me can do a tour of duty after working at a place like this and have a place to work on things like regulation.”

Setting up such a federal watchdog is one of several smart recommendations, advocated by Haugen and already put forward by lawmakers, that would serve as key starting points for protecting kids.

Those include passing the bipartisan-sponsored CAMRA Act, which would authorize the National Institutes of Health to study the physical, emotional, and mental impact of social media content on children.

Congress could also bolster the Children’s Online Privacy Protection Act to prohibit social media companies from specifically microtargeting ads or other content to children.

Lawmakers can also pass the KIDS Act, cosponsored by Senator Edward Markey of Massachusetts, which would ban potentially violent, inappropriate, manipulative, or dangerous content from being amplified by auto-play features, push alerts, or “influencer” marketing.

How to respond to social media’s broader impact on society and democracy is a more daunting challenge. Haugen urged caution about amending a perennial target of lawmakers: Section 230 of the Communications Decency Act, which largely shields social media platforms from liability for content posted by users. Defenders say the section protects speech on the Internet, while critics assert it has freed platforms to look away from hateful and violent content.

Haugen suggested a middle ground that would draw a distinction between the content generated by users — which would remain protected, no matter how hateful or scurrilous — and the decisions by companies about if and how their algorithms promote that content. “I strongly encourage reforming Section 230 to exempt decisions about algorithms,” Haugen said. “Modifying 230 around content, I think, is very complicated, because user-generated content is something that companies have less control over. They have 100 percent control over their algorithms.”

Striking the right balance on reforming Section 230 must be part of the longer-range plan by lawmakers to bring federal regulatory authority into the social media age. But Congress can and should act now to prioritize children’s safety over Facebook’s profits.

Editorials represent the views of the Boston Globe Editorial Board. Follow us on Twitter at @GlobeOpinion.