There’s still much that we don’t know about the decisions that shape our daily experiences on Facebook, Google, and other online platforms, but a single watchdog nonprofit group, the Tech Transparency Project (TTP), has revealed a wealth of information that these companies have tried to keep hidden.
When news stories describe the internal machinations at big tech companies, findings from TTP often inform much of that reporting. In the past few months alone, TTP has unearthed information on dangerous Facebook groups, including militias and presidential election conspiracy theorists; Facebook and Google’s problematic conduct toward the press; and, perhaps most revealingly, communications between tech executives and government officials.
TTP originally got funding from Oracle, the huge software company, but now it says it takes no corporate money. Its donors include computer scientist David Magerman, the George Soros-founded Open Society Foundations, and Craigslist creator Craig Newmark.
I talked with Katie Paul, who directs TTP, about how her organization gets its information and how she’d like to see tech companies regulated differently. This interview has been condensed and edited for clarity.
What are your organization’s goals, and have they shifted since its founding?
TTP actually started as the Google Transparency Project, in 2016, and the vision back then was to fill a gap in coverage. People thought Google was just a search engine, but it’s actually this multibillion-dollar company with a lot of influence. Our goal was to help educate the public on what, exactly, the company was doing, so they weren’t so in the dark about where the big tech money that’s influencing Congress, and influencing policies, was coming from.
Since then, it’s evolved, and this year, we officially launched as the Tech Transparency Project. We look at more than just Google — we’re looking at Apple and Facebook, and likely Amazon as well. We want to expose the things platforms said they were going to do and show that they’re not following through on those promises. We’re unmasking information that the average person isn’t necessarily going to find, but everything we find comes through open-source research.
All of your research is open source? How does that work?
Yes, everything we do is open source. That way, everything we find can be reproduced, and we also share our findings. We do a lot of FOIA [Freedom of Information Act] requests. Those get us content and communications between officials in the government and people at Facebook, Google, and other tech companies. That is one way we can show how the major tech platforms get what they want from government. We do this as a public service and make tens of thousands of pages of these documents available on our website.
TTP uncovered some fascinating information about how Facebook dialed back its voter registration efforts after getting complaints.
Facebook set ambitious goals for registering voters ahead of the 2020 election, but the company has also shown itself to be susceptible to Republican pressure campaigns. TTP’s research found some worrying signs that Facebook scaled back the initial stages of its voter registration project, and our report sought to hold Facebook accountable for its promises.
What do you think have been some of TTP’s biggest successes?
Some of our research has led to what we feel are really impactful, important changes. For instance, we had a series on the Boogaloo movement, and how its followers were using private Facebook groups to coordinate and organize for a civil war. They were sharing bomb-making instructions and coordinating on a local level. We first released a report about that back in April. It was pretty concerning material, and we really tried to raise the alarm. Facebook didn’t ban the movement until June 30, after somebody had been murdered and several people arrested on terrorism-related charges.
The fact that our small group of researchers routinely flags troubling content on Facebook and Google — which each have billions in profits and tens of thousands of moderators — is a sign of how ineffective their protections are.
You’ve argued for legislative solutions to these problems, correct?
Yes. We need to see accountability for allowing illicit content on a platform. If we look at the banking industry, for instance, if banks were allowing cartels and criminal organizations to openly launder money in the same place where you have your personal savings, there would be a huge public outcry. We need to look at massive tech companies the way we’ve looked at other industries that require regulation to protect consumers.
For two decades, the dominant tech platforms have argued they can regulate themselves, but we now know the harms to our society and democracy are simply too great. Governments must step in to ensure that technology works in the public interest and not just to the benefit of the companies and their shareholders.
Sarah Ruth Bates is a freelance writer in Boston. Follow her on Twitter at @sarahrbates.