
Facebook draws fire on ‘related articles’ push

The question arises as to why Facebook doesn’t try to verify or debunk stories that it pushes as “related articles.” (Handout)

A surprise awaited Facebook users who recently clicked on a link to read a story about Michelle Obama’s encounter with a 10-year-old girl whose father was jobless.

Facebook responded to the click by offering what it called “related articles.” These included one that alleged a Secret Service officer had found the president and his wife having “S*X in Oval Office,” and another that said “Barack has lost all control of Michelle” and was considering divorce.

A Facebook spokeswoman did not try to defend the content, much of which was clearly false, but instead said there was a simple explanation for why such stories are pushed on readers. In a word: algorithms.

The stories, in other words, apparently are selected by Facebook based on mathematical calculations that rely on word association and the popularity of an article. No effort is made to vet or verify the content.
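As a rough illustration of what such a selector might look like, the sketch below ranks candidate links by headline word overlap with the clicked story and by share counts. It is a minimal, hypothetical reconstruction based only on the description above; the function name, data fields, and scoring formula are assumptions, not Facebook’s actual system.

```python
# Hypothetical sketch of a "related articles" selector: rank candidate
# links by word association with the clicked story and by popularity,
# with no step that checks whether the content is true.

def related_articles(clicked_headline, candidates, top_n=3):
    """candidates: dicts with 'headline' and 'shares' keys (assumed schema)."""
    clicked_words = set(clicked_headline.lower().split())

    def score(article):
        # Word association: headline terms shared with the clicked story.
        overlap = len(clicked_words & set(article["headline"].lower().split()))
        # Popularity: how widely the link is being shared.
        return overlap * article["shares"]

    # Note what is absent: no source vetting, no fact-checking filter.
    return sorted(candidates, key=score, reverse=True)[:top_n]

candidates = [
    {"headline": "Barack has lost all control of Michelle", "shares": 40000},
    {"headline": "Michelle Obama has no dignity", "shares": 25000},
    {"headline": "City council passes budget", "shares": 90000},
]
print(related_articles("Michelle Obama accepts resume from girl", candidates))
```

On this scoring, the widely shared but unrelated budget story ranks last while the Obama gossip items rank first, showing how a purely associative, popularity-driven selector could surface false stories.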

Facebook’s explanation, however, is drawing sharp criticism from experts who said the company should immediately suspend its practice of pushing so-called related articles to unsuspecting users unless it can come up with a system to ensure that they are credible.

“They have really screwed up,” said Emily Bell, director of Columbia Journalism School’s Tow Center for Digital Journalism. “If you are spreading false information, you have a serious problem on your hands. They shouldn’t be recommending stories until they have got it figured out.”

The incident is important, Bell said, because it illustrates the danger of having a company such as Facebook become one of the world’s most widespread purveyors of news and information.

The website relies on the idea that people trust stories posted by friends. But this recent practice, announced last December, is a departure from that ethos because no human being, much less a friend, vets related articles that are posted as a result of Facebook’s algorithms.

Moreover, the practice is bound to raise questions because it comes a month after Facebook announced that it is creating its own news service, called FB Newswire, based on social media information that it promises to verify with its partner, Storyful. These verified stories would be offered to news organizations around the world, further expanding Facebook’s influence on the way people get their news.

Storyful said on its website that it would ensure that stories are verified before they are posted on the news service, pledging that it would be “debunking false stories and myths.”

That only underscores questions about why Facebook does not similarly try to verify or debunk stories that it pushes to readers as related articles. Asked to respond, a Facebook official made clear that the company does not apply the same fact-checking standard when offering readers related stories on their news feed, such as the ones about the Obamas.

“These news feed units are designed to surface popular links that people are sharing on Facebook,” Facebook spokeswoman Jessie Baker said via e-mail. “We don’t make any judgment about whether the content of these links are true or false, just as we don’t make any judgment about whether the content of your status updates are true or false.”

She declined to make any other comment on the record, or to make herself or any other Facebook official available for an interview.

It is not unusual, of course, for some Facebook users to link to outrageous or false stories, and no one expects them to be verified by the company. What makes this case different is that Facebook itself posts the supposedly related articles from sources that a user never chose to trust, in effect giving them Facebook’s imprimatur.

Facebook has, however, taken a very different line when advising businesses what to post on the news feed. On a corporate Web page, it says that the company’s goal for the news feed “is to show the right content to the right people at the right time” and “to show high-quality posts to people.” On April 10, the company said in a press release that it had introduced new algorithms intended to stop people who try to “game news feed to get more distribution than they normally would.”

The links that Facebook itself posted on the Michelle Obama story surfaced nearly three weeks after that announcement.

Facebook’s news feed — the stream of articles, images and other content recommended by a user’s friends that greets users who log on to the service — is one of the most prized commodities in the world of digital information. An array of studies has shown that the news feed’s content can have a significant impact on public opinion and voter turnout. For example, a study published in the journal Nature found that a single, compelling news feed message, indicating that a friend had voted, increased national turnout in 2010 by hundreds of thousands of voters.

A reporter came across the Michelle Obama links by clicking on an Associated Press story that had been posted on Facebook by The Boston Globe. That story was legitimate; it told how Michelle Obama accepted a resume on behalf of the jobless father of a 10-year-old girl who met the first lady at the White House.

As soon as the link to that story was clicked, however, Facebook offered what it called three related articles.

The link to a story about the first couple’s supposed encounter in the Oval Office led to an article that was clearly fake and was filled with language not suitable for a family newspaper.

The link to the story saying that the president had “lost all control” of his wife quoted a supposed insider saying the first couple was “considering divorce.”

A third link, to a story saying that the president’s wife “has no dignity,” was a piece of commentary.

The White House declined comment on the portrayal of the Obama family.

Nicholas Diakopoulos, a fellow at Columbia’s Tow Center who has studied the way major websites rely on data to disseminate information, said that it is not a defense for Facebook to say that it relies on algorithms when posting “related stories.”

He said that humans devise the algorithms and are responsible for their quality. An algorithm, for example, can be designed to accept stories only from a list of trusted sources. By allowing related articles from obscure and unreliable sources, Diakopoulos said, Facebook is offering its huge platform but ceding control of the content.
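A whitelist of vetted outlets, for example, is the kind of design choice Diakopoulos describes. The sketch below is hypothetical; the domain list and helper function are illustrative assumptions, not anything Facebook has disclosed.

```python
from urllib.parse import urlparse

# Illustrative whitelist; these entries are assumptions, not a real vetted list.
TRUSTED_SOURCES = {"apnews.com", "bostonglobe.com", "reuters.com"}

def from_trusted_source(url):
    """Accept a candidate link only if its domain is on the vetted list."""
    domain = urlparse(url).netloc.lower()
    domain = domain.removeprefix("www.")  # normalize www-prefixed hosts
    return domain in TRUSTED_SOURCES

candidates = [
    "https://www.apnews.com/michelle-obama-resume",
    "http://obscure-gossip.example/oval-office-story",
]
vetted = [url for url in candidates if from_trusted_source(url)]
print(vetted)  # only the AP link survives the filter
```

Running such a filter before ranking would keep obscure, unreliable sources out of the candidate pool without requiring article-by-article fact-checking.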

While the stories about Michelle Obama’s supposed Oval Office encounter and marriage might easily be seen as fake, the broader concern among experts is about articles that might be more subtly slanted and thus harder to detect. Such stories, pushed by their originators to make them popular on Facebook, could gradually help shape opinion.

“There is absolutely a danger here to the manipulation of an information platform,” said Diakopoulos. “The algorithms influence what we see, influence us to do certain things. It certainly could skew perceptions. It could spread misinformation.”

Google takes a different approach in compiling articles for its Google News service. In addition to using algorithms, the company said it requires news organizations to meet rigorous standards for inclusion.

Within the world of politics and social media, any move by Facebook or other major sites is closely monitored because it can change the way millions of people get information.

Tarleton Gillespie, a Cornell University associate professor who is studying what he calls “the politics of internet platforms and algorithms,” said it is increasingly important to understand how companies such as Facebook can affect the way people learn about and understand issues.

It is relatively uncommon, he said, for a company such as Facebook to use word association to recommend stories from disparate sources, with no way for users to know whether those sources are legitimate. That is much riskier than recommending articles from a news source that a user has already chosen to follow.

By doing so, Gillespie said, Facebook is veering further from being an aggregator toward being a content provider, all while insisting it has no responsibility for the content.

“That walks into a very dangerous space,” Gillespie said. “It would be precarious for Facebook not to think about its public responsibility.”

Michael Kranish can be reached at kranish@globe.com or on Twitter @GlobeKranish.