Hiawatha Bray | Tech Lab

When the Facebook trend is not our friend

A Facebook employee walked past a sign at the company’s headquarters in Menlo Park, Calif. (Jeff Chiu/Associated Press/file 2013)

In 2014, the social network Facebook ran a controversial secret experiment that tried to change its users’ emotions. Now a startling report from the tech website Gizmodo suggests that company employees are secretly trying to change our opinions.

Facebook denies the whole creepy story, but true or not, it’s still an alarm ringing in the night. We were told that cool, dispassionate computers select what a billion people see every day on Facebook. If even a small part of it is really being done by humans with axes to grind, watch out.

The website interviewed several people hired to curate or review stories before including them in Trending Topics, a feature that shows news stories popular among Facebook users.

The sources, who insisted on anonymity, claimed that some curators rejected stories from right-of-center news sources. If a popular story originated at a conservative site like Breitbart.com or RedState.com, it would be ignored unless it also appeared on a “mainstream” site like CNN or The New York Times.

The workers told Gizmodo the policy wasn’t ordered by Facebook management; it just reflected the attitudes of the curation team. “We were doing it subjectively,” said one of them. “It just depends on who the curator is and what time of day it is.”

The whistle-blowers also claimed that Facebook adds stories to Trending Topics even when they aren’t trending. For instance, editors posted news of the Charlie Hebdo terrorist attack almost immediately, and they elevated nontrending stories about the civil rights movement Black Lives Matter. This isn’t always bad; editors are expected to know what’s important before readers do. But marking a story as “trending” can slant the news by giving minor stories more prominence than they deserve, and it runs counter to Facebook’s claim that Trending Topics are entirely data-driven.

Tom Stocky, the company’s vice president of search, wrote on his Facebook page Monday night that curators are supposed to dismiss “junk or duplicate topics, hoaxes, or subjects with insufficient sources.” Everything else is left alone, he wrote.

“Facebook does not allow or advise our reviewers to systematically discriminate against sources of any ideological origin, and we’ve designed our tools to make that technically not feasible,” Stocky said. Neither does the company add nontrending stories to the list. “Our reviewers’ actions are logged and reviewed,” Stocky said, “and violating our guidelines is a fireable offense.”

I sure hope so, though we’ll have to take Stocky’s word for it. But since 61 percent of American millennials and 39 percent of baby boomers rely on Facebook for political news, according to the Pew Research Center, should his word be good enough?

It hasn’t satisfied US Senator John Thune. The South Dakota Republican has demanded an explanation from Facebook chief executive Mark Zuckerberg. Thune should buzz off. Even if Facebook slants the news, it has a right to do so, like any magazine or blog.

The possibility of personal bias is an ironic counterpoint to my column last week. There, we saw that computer algorithms introduce bias into Internet services because the algorithms provide different results depending on what’s asked and who’s asking. For instance, your Trending Topics on Facebook will differ from mine because we have different Facebook friends and have clicked on different kinds of stories. Facebook’s software generates a personalized view of the world, which inevitably adds some bias.

Human editors could provide a little more balance. But as the Facebook controversy shows, we risk replacing automated bias with the human variety.

There was a nationwide furor in 2014, when Facebook and researchers at Cornell University found they could alter people’s moods by sending them upbeat or depressing news stories. But some Facebook employees learned nothing from that public relations debacle. In March, dozens of them asked Zuckerberg, “What responsibility does Facebook have to help prevent President Trump in 2017?” Happily, the company replied that Facebook is politically neutral. But the Gizmodo story suggests that some Facebookers would gladly put a thumb on the scale.

Pot, meet kettle. Traditional media have always been edited by humans. Pick up the Globe, and you’re getting a view of the world that is, inevitably, colored by our personal tastes, interests, and biases.

But traditional media companies have checks and balances to keep us reasonably honest. Some have full-time ombudsmen to call out bias. Nearly all welcome letters to the editor. The names, e-mails, and phone numbers of staff members are easy to find. Most effective is the reporter’s byline. My name is attached to everything I write, so there’s a price to pay when I blunder.

But Facebook’s curators have no bylines. The software engineers who created the algorithms are equally anonymous. The entire system is relentlessly opaque, and likely to remain so. The choice is simple — trust Facebook, or not.

I vote for “not.”

No offense, Mr. Zuckerberg. The same attitude applies to all news media, even the good old Globe. No one news source will ever deliver a flawlessly unbiased view of reality, so never settle for just one. Look at other newspapers and magazines and Internet sites, not just the ones that Facebook or anybody else wants you to see. We’re all trying to affect the way you think. Whether we succeed is entirely up to you.