Social networks like Facebook know a lot about their users and deploy the data to target advertising. The vast majority of people on Facebook accept that they are voluntary prey for advertisers — it’s the soft cost of using the service. But Facebook took its algorithmic engineering to a whole new and highly questionable level, as users learned last week, by turning a small percentage of those users into unwitting guinea pigs for a study on human emotions.
Facebook set out to explore how emotional states spread on social media. Over the course of a week in January 2012, it conducted a secret psychological experiment, using an algorithm to filter posts containing positive or negative words out of almost 700,000 unsuspecting users’ news feeds. In other words, Facebook skewed some users’ feeds toward negative posts and others toward positive ones, to see whether emotions are contagious on social media.
Facebook ran this creepy experiment because, as one of its data scientists explained, “We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out.”
Facebook may have the legal right to treat its users like lab rats: everyone who registers on the site must accept a 9,000-word user agreement, which includes consent for one’s information to be used for “research.” But the company violated a good-faith understanding with its members, who trust the site to make their experience as relevant as possible. By running a secret experiment that could genuinely alter how users feel, without their knowledge, Facebook took its clout and power too far.