Last spring, Apple launched a new privacy feature giving iPhone users greater control over how apps track their activity across other apps and websites. It consolidates the confusing and sometimes overwhelming privacy options people often face into a single setting, and the update has been extremely popular.
Yet while Apple’s new privacy feature is a step in the right direction, it is merely the latest in a long line of individualistic solutions to data privacy. Our public conversation about these issues is dominated by the flawed premise that privacy is an individual problem that can be solved by finding the right settings for users to better control their data. Although personal preferences certainly matter, this individual rights framework offers an incomplete understanding of the problem and an inadequate set of solutions.
Data protection is a collective problem. Each person’s choices about what data to share affect people they know, generally without those other people’s permission. If you email someone with a Gmail account, Google has a record of your message even if you aren’t a Google user. If your cousin gets her DNA sequenced, a company like 23andMe could conceivably glean some information about your genetic makeup.
Just as important, businesses infer much of what they know about us from patterns identified in massive data sets covering millions of users. A few scattered pieces of data about a single individual have almost no financial value without the context that comes from information about countless others. The ads Facebook shows you or the products Amazon recommends are not merely guesses based on a handful of data points about you but fine-tuned suggestions drawn from the behavior of scores of other people with similar likes and buying habits.
A collective problem like this requires collective responses. Addressing privacy solely by giving individuals greater control over their data makes about as much sense as trying to eliminate air pollution only in your own backyard. Because data is generated from networks, even if the vast majority of users opt into an individual privacy feature like Apple’s, the information gleaned from the small percentage who still allow tracking will reveal a great deal about the rest of the group.
Fortunately, policymakers around the world are beginning to pay serious attention to privacy and data sharing, with new legislation passed or pending in the European Union, several US states, India, the UK, and elsewhere.
But while these new regulations are critical, many of them fall into the same trap as Apple’s privacy feature by focusing largely on individual rights and personal choice. They give users more mechanisms to control their own information while failing to account for the fluid, interconnected nature of data. That’s why these top-down regulations should be complemented by bottom-up solutions that democratically represent our collective interests.
We, the authors, are part of a group of researchers and experts laying the groundwork for data cooperatives, a new type of organization that would act as a sort of union, brokering negotiations between the people who produce data and the companies that use it. If you chose to join a data co-op, it would negotiate with platforms on your behalf for better terms and conditions for the use of your data. These data co-ops could take a number of forms, all united by the goals of facilitating shared decisions about the use of digital services; negotiating privacy policies and terms of service on a collective rather than an individual basis; and using the collective nature of the co-op to expose problematic platform behavior.
Collective interventions like data co-ops are critical to righting the balance of power in our data economy. Just as a minimum wage and workplace safety rules set a basic standard for worker well-being without negating the need for workers to collectively push for better wages and conditions in their workplaces, government privacy regulations leave room for collective action that can address how different groups of consumers want to share and protect their data.
None of us can tackle this challenge on our own. Most of the time, we don’t even know how data about us is being used. We’re overwhelmed and confused by the dizzying array of privacy options we already have, and businesses are skilled at coaxing us to share more information. To make real progress, we must understand data as the product of and the property of groups and networks, not just individuals, and we must look for collective solutions.
Katrina Ligett is an associate professor of computer science and director of the Internet & Society Program at the Hebrew University of Jerusalem. Kobbi Nissim is a professor of computer science at Georgetown University and affiliated with Georgetown Law School. Matt Prewitt is a lawyer and president of RadicalxChange Foundation. They are part of the Data Co-Ops Project, a cross-disciplinary initiative designing frameworks for data cooperatives.