Facebook’s mind game was a violation of trust

Every so often, the giant social network Facebook does something guaranteed to rile us up or creep us out. And when it happens, we log onto Facebook and complain to our friends about it. Sometimes I wonder if it’s part of a plan, a deliberate effort by Facebook to mess with our heads. Facebook is capable of it, as we learned last weekend.

About 700,000 of Facebook’s one billion or so users recently served as test subjects in a psychology experiment. Researchers altered the users’ “news feeds” — the stream of stories and photos that rolls across everyone’s Facebook home page. Some users got stories full of positive-sounding words; others got stories laden with unpleasant, negative terms.

The result? People who got more positive messages tended to post positive messages themselves; those bombarded with negativity were more likely to publish darker thoughts.

This experiment, recently published in the Proceedings of the National Academy of Sciences, spawned a remarkable torrent of outrage. I blame the hot-button language that Facebook data scientist Adam D. I. Kramer and his two academic partners in the experiment used to describe their achievement. The scientists wrote that they had achieved “emotional contagion,” as if congratulating themselves for spreading typhoid fever.

And the scientists also bragged about keeping users in the dark — “leading people to experience the same emotions without their awareness.”

Facebook has since repented. “It was poorly communicated,” Sheryl Sandberg, the company’s chief operating officer, said Wednesday in New Delhi. “And for that communication we apologize. We never meant to upset you.”

And Kramer added — in a posting on Facebook — that he and his fellow scientists were not intending to manipulate people. Rather, he said the goal of the experiment was “to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper.”

Already, though, regulators in several European countries, where Internet privacy rules are tougher than in the US, said they’re investigating whether Facebook’s experiment violated their laws.

None of us enjoy being guinea pigs. And yet Facebook, Google and every other big-name Internet site specialize in brainwashing. They learn our online behaviors and then make billions serving up ads designed to make us buy stuff. How is this latest experiment any worse?

The lack of explicit informed consent, for one thing. But perhaps more important is the sense of violated trust. We assume that items show up in our Facebook news feeds only because our friends posted them or a favorite page shared them, not because somebody at Facebook has a hidden agenda.

The clearly marked ads we understand — nothing hidden about that agenda. But for everything else, “people really are trusting them to be acting more or less in their interests,” said Harvard law professor Jonathan Zittrain.

So what happens if our Facebook news or Google search results are subtly changed, because someone is placing a thumb on the scale?

The integrity of the Internet is already in peril. In May, the Court of Justice of the European Union ruled that individuals can force Internet search engines to delete certain types of embarrassing information about them. Within days, Google had received deletion requests from over 12,000 people. Yesterday the British newspaper the Guardian reported that six of its stories had already been dropped from Google, and that many more could follow. It’s a slow-motion rewriting of history that will make Internet research far less reliable.

But Google, Facebook and other Internet titans could do far more damage. They’ve got direct access to billions of pages of data, and limitless power to manipulate it.

In a recent article in the New Republic magazine, Zittrain wrote about an experiment Facebook conducted to encourage Americans to vote in the 2010 midterm elections. It boosted nationwide turnout by about 340,000 voters.

What if Facebook had aimed its get-out-the-vote effort entirely at Republicans or Democrats? The company could sway a close election. Facebook chief executive Mark Zuckerberg might become a political kingmaker. And millions of Americans might never realize they’d been used.

“That’s a line too far,” Zittrain said.

How to fend off such manipulation? That’s a tough one. In the case of Internet search, you can ignore Google, or at least dip into alternate services like Bing or Yahoo, or the privacy-centric search engines DuckDuckGo and StartPage, to make sure you’re getting a fair and balanced view of online information.

But for many people there’s no substitute for Facebook. What do you do about a service that seemingly everyone uses, or at least visits occasionally?

The US does not have anywhere near the same attitude toward regulating the Internet as Europe, and even if we were to adopt tougher restrictions here, Zittrain points out that they would likely violate the companies’ First Amendment right to publish what they choose.

So Zittrain suggests an alternative — Internet gatekeepers would voluntarily agree to abide by ethical standards similar to what doctors, lawyers, and financial planners pledge. Those standards would be codified in the companies’ terms of service, so they would be legally bound to follow them.

So, just as they do now, Facebook or Google could use your data to sell ads and to give you the best possible Internet data. But they could be sued for mishandling your data to benefit themselves, by messing with your head, for instance.

Why should Facebook, Google and all the others take the pledge? Zittrain suggests offering them an inducement to do so — say, tax breaks? But the outrage over the Facebook experiment has created a more immediate incentive — a rebellion among the lab rats.

Hiawatha Bray can be reached at hiawatha.bray@globe.com. Follow him on Twitter @GlobeTechLab.
