The University of Cambridge Psychometrics Centre took one look at my life on Facebook and concluded that I am both liberal and politically libertarian, contemplative and competitive, artistic but with a strong interest in engineering. Also “organized and hard-working,” and highly intelligent.
All that just from using a software program the Psychometrics Centre wrote that analyzes all the stuff I’ve “liked” or posted about on Facebook. It’s an amusing parlor trick. But could you elect a president by analyzing Facebook likes? Maybe, if you multiply my test results by 50 million or so.
According to The New York Times and The Observer of London, political researchers from England, including several with connections to the University of Cambridge, did exactly that, on behalf of the 2016 presidential campaign of Donald Trump. Combined with the evidence that Russian conspirators used Facebook advertising in an effort to influence the race toward Trump, it starts to feel like the American who had an outsized hand in the presidential election was Facebook founder Mark Zuckerberg.
The credit or blame for Trump’s election should still fall to the 62,984,825 American voters who backed him, not foreign conspirators. But that doesn’t let Facebook off the hook. The company’s anything-for-a-buck business practices made it easy for the Russians to secretly feed voters a toxic stew of lies and distortions.
And Facebook’s disregard for its users’ privacy enabled a company called Cambridge Analytica — originally staffed by researchers out of the University of Cambridge — to scoop up reams of sensitive data and use it to predict, or maybe influence, the political behavior of millions.
According to the Times and the Observer, rogue Cambridge academic Aleksandr Kogan created a personality-profiling app in 2014, based on research conducted at the school’s Psychometrics Centre. It worked on the same principles as the version you can try at the center’s website.
Kogan received permission from Facebook to ask its users to participate in his research, and hundreds of thousands agreed to download the app and share personal information. At the time, Facebook also allowed researchers and developers to collect information on the friends of people using those apps. That allowed Kogan to significantly increase the amount of data he collected. Facebook halted this practice in 2015, but too late.
It’s hard to know why the company tolerated this kind of fishing expedition. In 2011, Facebook settled a Federal Trade Commission complaint that accused the company of being cavalier about its users’ privacy. Among other things, Facebook said it would no longer share users’ personal information without asking them first. Presumably, the company believed that revealing our friends’ likes was still permissible, because the social network kept at it for another four years.
In addition, Facebook claims that Kogan promised to use the data only for academic research. “Real academics have stronger privacy and confidentiality safeguards than corporations do,” said Christian Sandvig, a professor at the University of Michigan who has conducted research using Facebook data. Sandvig said academic researchers have to get their experiments approved by university ethics boards, must destroy the data they collect once their research is complete, and cannot resell it.
It seems nobody told Kogan. The Times and the Observer said he sold the data to Cambridge Analytica — which, according to the Times, got funding from the wealthy GOP donor Robert Mercer and counted Stephen Bannon, later a top official in Trump's campaign, among its leaders — along with the know-how to use it. Facebook’s deputy general counsel, Paul Grewal, called the arrangement “a serious abuse of our rules.”
The Psychometrics Centre built its software by examining the Twitter and Facebook posts and likes of thousands of users who also underwent standard psychological testing to measure specific traits, such as diligence or open-mindedness. The researchers found that certain Internet habits correspond to particular test results. Gather enough of a person’s Facebook history and you can determine whether he or she is an introvert, for instance, or relatively open-minded. Political operatives could combine this with other publicly available information, such as voter registration data, to figure out which personality types favor their candidate, and which messages will best motivate them.
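For readers curious about the mechanics, the correlational approach described above can be sketched in a few lines of Python. The page names and trait scores below are invented for illustration, and the real models are far more sophisticated — regressions trained over millions of likes — but the underlying idea is the same: score each like by the personality-test results of the people who share it, then predict a new user from the likes they have in common.

```python
# Toy sketch (not the Centre's actual model) of like-based trait
# prediction: score each page-like by the average trait score of the
# training users who liked it, then predict a new user's trait as the
# mean score of their known likes.

# Hypothetical training data: each user's liked pages plus an
# "openness" score from a standard personality test (0-100 scale).
training = [
    ({"modern_art", "poetry", "travel"}, 82),
    ({"poetry", "philosophy"}, 75),
    ({"nascar", "hunting"}, 40),
    ({"hunting", "country_music"}, 35),
]

def like_scores(data):
    """Average trait score per liked page across the training users."""
    totals, counts = {}, {}
    for likes, score in data:
        for page in likes:
            totals[page] = totals.get(page, 0) + score
            counts[page] = counts.get(page, 0) + 1
    return {page: totals[page] / counts[page] for page in totals}

def predict(likes, scores):
    """Predict a trait as the mean score of the user's known likes."""
    known = [scores[p] for p in likes if p in scores]
    return sum(known) / len(known) if known else None

scores = like_scores(training)
print(predict({"poetry", "travel"}, scores))  # high predicted openness
```

With enough likes per person, even a crude averaging scheme like this separates personality types surprisingly well — which is why the published research behind the Centre's tool found likes alone could rival judgments made by friends and family.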
In discussing its work for the Trump campaign, Cambridge Analytica doesn’t mention its use of Facebook data. Instead, it says it combined intensive polling with other research to forecast voter behavior.
“Every time we polled an individual, we matched their information with existing data in our database. Analyzing everything from their voting history to the car they drive, we identified behaviors that were correlated with voting decisions. These models allowed us to predict the way individuals would vote — even if we didn’t know about their political beliefs,” Cambridge Analytica says on its website.
It sounds plausible, but it’s hardly foolproof. For instance, after also saying that I have the mind of a 29-year-old, the Cambridge center’s software informed me that I’m single. My wife might have something to say about that.
It just goes to show that personality algorithms have their limits. After all, Cambridge Analytica came late to the Trump campaign. It had previously worked its magic for a rival GOP candidate, Ted Cruz, who didn’t win the nomination. Besides, whether through targeted psychometrics or goofy Russian memes, foreign interlopers did nothing more than try to change voters’ minds.
On the surface I don’t see anything particularly wrong with that — unless our would-be persuaders lie about their identities and motives, as the Russians did. Or unless their efforts rest on a massive violation of privacy, made possible by stolen data, as seems to have happened with Cambridge Analytica.
In both cases, the ultimate responsibility rests with Facebook, a company that knows pretty much everything there is to know about one-fourth of the human race, except perhaps for how much we hate being lied to and how much we value our privacy.