NEW YORK — In April 2018, Mark Zuckerberg, Facebook’s chief executive, told Congress about an ambitious plan to share huge amounts of posts, links, and other user data with researchers around the world so they could study and flag disinformation on the site.
“Our goal is to focus on both providing ideas for preventing interference in 2018 and beyond, and also for holding us accountable,” Zuckerberg told lawmakers questioning him about Russian interference on the site in the 2016 presidential election. He said he hoped “the first results” would come by the end of that year.
But nearly 18 months later, much of the data remains unavailable to academics because, Facebook says, it has struggled to share the information while also protecting its users’ privacy. And the information the company eventually releases is expected to be far less comprehensive than originally described.
As a result, researchers said, the public may have little more insight into disinformation campaigns heading into the 2020 presidential election than they had in 2016.
Seven nonprofit groups that have helped finance the research efforts, including the Knight Foundation and the Charles Koch Foundation, have even threatened to end their involvement.
BuzzFeed News earlier reported on researchers’ concerns.
“Silicon Valley has a moral obligation to do all it can to protect the American political process,” said Dipayan Ghosh, a fellow at the Shorenstein Center at Harvard University and a former adviser at Facebook. “We need researchers to have access to study what went wrong.”
Last week, Oxford researchers said that the number of countries with disinformation campaigns had more than doubled, to 70, in the last two years, and that Facebook remained the leading platform for those campaigns.
Three months after Zuckerberg spoke in Washington last year, Facebook announced plans to provide approved researchers with detailed information about users, like their age and location, where a false post appeared in their feeds, and even their friends’ ideological affiliations.
Dozens of researchers applied to get the information.
The company partnered with an independent research commission, Social Science One, which had been set up for the initiative, to determine what information could be sent to researchers.
Facebook and Social Science One also brought in the Social Science Research Council, an independent nonprofit, to sort through applications and conduct a peer review and an ethical review on research proposals.
But privacy experts brought in by Social Science One quickly raised concerns about disclosing too much personal information. In response, Facebook began trying to apply what’s known in statistics and data analytics as “differential privacy,” in which researchers can learn a lot about a group from data, but virtually nothing about a specific individual. It is a method that has been adopted by the United States Census Bureau and promoted by Apple.
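The core idea behind differential privacy can be illustrated with a small sketch (this is a generic textbook example of the Laplace mechanism, not Facebook’s or the Census Bureau’s actual implementation): calibrated random noise is added to an aggregate statistic, so group-level totals stay roughly accurate while any one person’s contribution is masked.

```python
import math
import random

def dp_count(true_count, epsilon=1.0):
    """Return a differentially private version of a count query.

    A single user can change a count by at most 1 (sensitivity = 1),
    so Laplace noise with scale 1/epsilon suffices. Smaller epsilon
    means stronger privacy but a noisier answer.
    """
    # Sample Laplace(0, 1/epsilon) noise via inverse-CDF sampling.
    u = random.random() - 0.5
    scale = 1.0 / epsilon
    sign = 1.0 if u >= 0 else -1.0
    noise = -scale * sign * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise
```

Averaged over many queries, the noisy counts track the true total closely, but no single released number reveals whether a particular individual was in the data — which is why the approach appeals to organizations releasing statistics about sensitive populations.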
Facebook is still working on that effort.
But researchers said that even when Facebook delivers the data, what they can learn about activity on the social network will be far more limited than they had planned for.