Josh Bernoff

Fixing Facebook’s broken news feed


What is Facebook? To most of us as we interact with it day to day, it’s the news feed.

The news feed is what makes it fun — seeing all those baby and graduation pictures from our far-flung friends.

It’s also where the worst of Facebook shows up: the anti-vaccination links, the clueless posts about the Mueller report that no one has seen, and the memes originated by Russian trolls to influence our democracy.

And despite CEO Mark Zuckerberg’s recent feint in the direction of more privacy-focused social media and messaging, the news feed promises to remain influential for years to come.


Given its broad reach — the company also owns Instagram — Facebook is now the most influential voice in America. But because the company hides how its feeds work, we have no idea how that influence functions.

If the company’s recent conduct showed it was acting in our best interests, that might be fine. But clearly, it isn’t.

Senator Elizabeth Warren’s proposal to break up Facebook won’t shed any light on how it works. Transparency will. That’s what the Federal Trade Commission should demand, now.

I don’t want the government deciding what’s shown on Facebook. I want Facebook to be clearer about how it decides. Facebook should clarify the workings of its news feed algorithm at all levels. It should provide a sophisticated technical specification for computer scientists as well as a simpler explanation for ordinary citizens.

More importantly, Facebook should open up its news feeds to testing. Anyone who wants to perform experiments on the news feeds should be able to. That means not just government regulators but academics, news organizations, watchdog nonprofits like the Electronic Frontier Foundation, and even advocacy organizations like the Heritage Foundation and the Economic Policy Institute. They should be able to take a standard set of test profiles — “people” who look like a cross-section of America, with test sets of friends — and analyze what those people are seeing every day.


Just as Consumer Reports tests dishwashers and electric drills, we should be able to test America’s most powerful voice. We should be able to see if it is favoring liberal or conservative views, how it screens for offensive material, and whether lies spread faster than truth does.

Balance comes from sunshine. When these groups can assess and publish the results of their experiments — and of Facebook’s changing news feed policies — they’ll help quantify where Facebook is biased and how it might improve. This is the 21st-century version of broadcasters’ now-abandoned “fairness doctrine.”

Facebook must also invest more in a transparent set of rules blocking pernicious content. “They need to begin to take responsibility for what’s on their platforms,” says Jonathan Taplin, author of the cautionary tale “Move Fast and Break Things: How Facebook, Google, and Amazon Cornered Culture and Undermined Democracy.” “While they say they are not doing editorial work, they need to be held to the same standards as anyone else in the information business.”

Facebook is engaged in a battle with its worst users. As Altimeter Group senior fellow Charlene Li, a close observer of Facebook for over a decade, says, “They can’t control what people want to share — it’s an absolutely impossible situation.” But Facebook does gate content: its artificial intelligence attempts to recognize and block nudity and hate speech. It kicked off the Infowars provocateur Alex Jones for inciting violence and hate. It just needs to do better. It took Facebook 29 minutes to take down the Christchurch shooting video. Then, while its algorithm identified and blocked 80 percent of uploaded edits of the video, it still allowed hundreds of thousands to appear on the platform, at least briefly.


Facebook’s current content blocking system depends not just on AI, but also on user reports and human evaluators. As a graphic exposé in The Verge revealed, Facebook outsources its “content moderation” to companies like Cognizant, where emotionally traumatized workers labor for starvation wages. Regulators’ next Facebook consent decree must require increased investments in better compensated human screeners working with better automated screening tools.

Finally, we must demand transparency in Facebook’s data practices and relationships. The default at Facebook is to maximize data collection and minimize restrictions and reporting. Christian J. Ward, data partnership expert and coauthor of “Data Leverage,” a book I edited, suggests replacing this with “radical transparency surrounding customer data usage.” He believes that Facebook must reveal “exactly what data it stores about Facebook users (and the non-users it tracks) . . . and who (companies and personnel) has access to that,” including advertisers. Facebook should publish and update that information monthly.

There’s a certain justice to requiring more openness from Facebook. After all, that’s what it’s asked of us all these years.


“Facebook has been incredibly cavalier and paternalistic when it comes to how they use people’s data,” says Forrester Research vice president and privacy analyst Fatemeh Khatibloo. “[Its] perspective is that the world must be open and connected. That’s the [original] mission statement. That is not good and right for everyone, and yet Facebook has unilaterally made the decision to make the world more open and connected for everyone.”

With profit margins of 37 to 40 percent, Facebook can afford to invest in transparency. Media companies with far less reach than Facebook are subject to regulation. Given the scale of Facebook’s influence — and its commitment to a more open and connected world — it certainly owes us all a clearer picture of how it works.

Josh Bernoff is the author or coauthor of six business strategy books. Follow him on Twitter @jbernoff. He blogs daily at Bernoff.com.