This past March, as Congress prepared to make it legal for Internet service providers to sell user data to third parties, Angela Grammatas, a coder based in San Francisco, took her two children to a local wildlife center to hear a zookeeper give a talk about camouflage.
In the wild, the zookeeper explained, prey species have evolved different strategies to hide from predators. An animal with a striped pelt might blend into tall grass; the spots on a butterfly’s wings might mimic an owl’s eyes.
As she listened, Grammatas’ mind drifted to the project she was already working on: a way to camouflage user data on the Internet.
Online, Grammatas knew, tracking algorithms stalk users through the web like predators in a jungle. And it’s getting harder and harder to hide. The digital spoor we leave behind us as we click from site to site is highly valuable to companies that want to know not just what we buy and read but how we think and how best to influence us. Forget garden-variety identity theft, or even simple surveillance. Data tracking can allow companies, lobbyists or even political parties to determine what we see online, showing us content tailored to reinforce our prejudices, or instill new ones. The result: Each person inhabits their own Internet, their online experiences controlled and curated by agents they know nothing about.
“I think about data and how it’s collected all the time,” Grammatas said. “[Data tracking] can be used for good, but I think it can also be used very creepily.”
To counter online tracking, Grammatas was working on a browser plug-in she calls Noiszy. While traditional methods of hiding activity online have involved encryption, Noiszy works differently: As you surf the web, it quietly visits random websites in the background of your browser, leaving “misleading digital footprints,” as Grammatas explained in a post.
Instead of making user data invisible, it leaves it out in the open — but all those extra clicks “pollute the data,” Grammatas said, drowning out the signal of a person’s web activity with meaningless, randomized “noise” the same way a fawn’s dappled spots might hide it from the gaze of a wolf, or the riot of stripes on a pack of fleeing zebras keeps a color-blind lion from singling out any one animal as prey.
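The data-pollution idea is simple enough to sketch in a few lines. The Python below is a toy illustration, not Noiszy’s actual code: the site list, timing, and function name are all assumptions made for the example.

```python
import random

# Placeholder list of harmless sites; a real tool like Noiszy lets
# users influence which sites the decoy traffic may visit.
SEED_SITES = [
    "https://example.com",
    "https://example.org",
    "https://example.net",
]

def noise_schedule(visits=5, max_gap_seconds=120.0, rng=random):
    """Return (url, delay) pairs describing a batch of decoy page visits.

    Randomly chosen sites and irregular gaps between requests make the
    decoy traffic harder to filter out of a user's real browsing trail.
    """
    return [
        (rng.choice(SEED_SITES), rng.uniform(0.0, max_gap_seconds))
        for _ in range(visits)
    ]
```

A browser extension built on this idea would then fetch each URL after its delay; to a tracker logging page visits, the decoy requests blend in with the clicks the user made deliberately.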
We generally conceive of cybersecurity as a matter of building high walls to keep viruses, spyware, malware, and other intruders out. On your computer or phone, the icon you click to toughen up your privacy and security settings is probably a shield or a padlock. And nature has its own ingenious ways of defending the perimeter — from a turtle’s shell to a porcupine’s spikes.
But as Grammatas and others look to the animal kingdom for inspiration, they’re seizing on a very different self-defense strategy: one that strives not to be impermeable, but undetectable. What has emerged is an intriguing cross-pollination between evolutionary biology and a seemingly unrelated field of scientific inquiry.
Animal camouflage has been studied since long before “information technology” existed, but it is in itself a form of information technology. The flow of data between predator and prey is constantly being obscured, manipulated and, to use Grammatas’ word, polluted by the misleading markings, behavior, and colorations prey animals have evolved. The parallel has profound implications: Camouflage strategies perfected by nature may have applications in the digital realm, but we might also be able to predict how corporate trackers will evolve in response by looking at how predators have done so.
For Helen Nissenbaum, a technologist and professor at Cornell Tech and New York University, there’s no better example of anti-surveillance techniques than the orb-weaving spider. To catch prey, the spider has to be exposed, perched on its web, but that also leaves it vulnerable. So the spider “weaves silk around prey to create lumps that are about the same size as the spider, and the idea is to create decoys for wasps and other predators,” Nissenbaum explained. “It’s a very robust strategy.” (There are also spiders that take the concept further and build entire dummy spiders, complete with legs, out of debris in their webs.)
Back in 2006, Nissenbaum was involved in one of the earliest attempts at using noise to camouflage online activity: TrackMeNot, a program that “helps protect Web searchers from surveillance and data profiling,” as Nissenbaum and her partner, technologist Daniel C. Howe, wrote. It does so, they continued, “not by means of concealment or encryption (i.e. covering one’s tracks), but instead by the opposite strategy: noise and obfuscation . . . essentially hidden in plain view.”
It’s a strategy Nissenbaum calls “obfuscation,” and she describes it as “a weapon of the weak” that depends on “obscurity, unintelligibility, and bewilderment.” In 2015, she and her NYU colleague Finn Brunton co-wrote a book-length anti-surveillance manifesto entitled “Obfuscation: A User’s Guide for Privacy and Protest.” Chapter two begins with orb-weaving spiders.
“Nature has been doing this forever,” Nissenbaum said.
In the natural world, the most common form of obfuscation is simple camouflage, known to biologists as “crypsis.” The goal here is to be invisible — it’s nature’s equivalent of using an encrypted browser, hiding your activity from your Internet provider and its tracking algorithms altogether. Species that have evolved this strategy simply blend into the background; a predator scanning for prey won’t pick up any information at all.
But the strategy that most interests Nissenbaum is one biologists call “masquerade.” Think of an insect that has evolved to closely mimic a twig, or a fallen leaf. Masquerading animals are meant to be seen, but not recognized for what they are. The predator will register that something is there, but it won’t be able to draw the right conclusions from what it’s seeing. That’s not dissimilar to how “noisy” online obfuscation tactics work.
As computer scientists draw inspiration from biologists, the influence runs the other way, too. Biologists are beginning to discuss animal masquerade in technological terms: “For understanding various mechanisms of camouflage, the concept of signal-to-noise ratio provides a useful tool,” wrote Sami Merilaita, a researcher at Åbo Akademi University in Finland and an expert on animal camouflage, in a paper published in May. “In the case of mimicry and masquerade, the signal is not diminished; in this case there is increased noise arising from a salient but false signal from the prey.”
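Merilaita’s signal-to-noise framing can be made concrete with a toy model (my construction, not his): suppose a tracker estimates a user’s interest in a topic as that topic’s share of all the page visits it observes. Decoy visits scattered across unrelated topics dilute that share.

```python
def observed_topic_share(topic_visits, real_visits, noise_visits):
    """Tracker's estimate of interest in a topic: the topic's share of
    all traffic it sees. Assumes, as a worst case for the tracker, that
    decoy visits never land on the topic in question."""
    return topic_visits / (real_visits + noise_visits)

# A user with a strong, genuine interest: 40 of 50 real visits on-topic.
true_signal = observed_topic_share(40, 50, 0)    # 0.8
# The same user running a noise generator that adds 150 decoy visits.
polluted = observed_topic_share(40, 50, 150)     # 0.2
```

The user’s real behavior hasn’t changed, but the false signal swamps the true one: the tracker’s estimate drops from 80 percent to 20 percent.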
Both in the natural and the digital world, the success of such a ruse relies on exploiting the ways the algorithm or predator “thinks” — its cognition.
For example, because predators evolved neural mechanisms for discerning the edges of objects, some prey animals evolved what scientists call “disruptive” coloration: mottled markings that make it harder to discern their outlines. Others developed markings that create faux edges, so that they appear smaller or a different shape than they really are.
Disruptive markings are an example Nissenbaum and her co-author Finn Brunton name-check in “Obfuscation”: “Breaking up the outlines doesn’t make a shape disappear entirely, as when a flounder buries itself in sand or an octopus uses its mantle to masquerade as a rock,” they write. “Rather, for situations in which avoiding observation is impossible — when we move, change positions, or are otherwise exposed — disruptive patterns and disruptive coloration interfere with assessments of things like range, size, speed, and numbers.”
Of course, as animals in the wild evolve camouflage patterns to deter lions and wasps — or as consumers use similar techniques to fend off data collectors — predators are likely to adapt as well. “If [predators] have a hard time finding prey, they respond to that by adjusting their behavior,” Merilaita said by phone from Finland. “For example, they decrease their search rate, they are maybe scanning the environment slower.”
In fact, the use of obfuscation tactics in nature may eventually result in smarter predators. The more prey use camouflage tactics, the more pressure there is on predator species to become more perceptive.
And online, tracking algorithms will soon face the same pressure to adapt. Grammatas isn’t the only one working on digital camouflage: In early April, digital activist Dan Schultz released the Web app Internet Noise, which Googles random terms while you browse. Later that month, an artist named Ben Grosser created Go Rando, a browser plug-in that obfuscates Facebook user data by randomizing how you respond to posts. Around the same time, coder Cathy Deng released Noisify, which generates random searches on Facebook.
As obfuscation programs proliferate, “[Algorithms] will certainly get smarter, smart enough to filter out Noiszy-type stuff,” Grammatas said. “And then things will need to evolve.”
But there are other implications, too. For example, prey species can actually manipulate predator behavior through camouflage: False eyespots on a tail, for example, can redirect a predator’s strike, or even scare them off. Could obfuscation programs eventually do the same thing — feeding, say, misleading browsing patterns to tracking algorithms to convince corporations of something that’s not true?
Imagine modifying Grammatas’ plug-in so that it overwhelmingly clicked on sites aligned with a particular political party. If enough people downloaded the plug-in, would algorithms become convinced that one party was more popular than it actually was?
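In code, the difference between benign and malicious noise can be as small as a weighting parameter. The sketch below is hypothetical — nothing reported here suggests any of these tools work this way — but it shows how a plug-in could skew its decoy clicks toward one side instead of choosing sites uniformly.

```python
import random

# Hypothetical site lists, for illustration only.
PARTY_A_SITES = ["https://a-example-1.test", "https://a-example-2.test"]
PARTY_B_SITES = ["https://b-example-1.test", "https://b-example-2.test"]

def biased_noise(visits, bias_toward_a=0.9, rng=random):
    """Generate decoy visits skewed toward one party's sites.

    bias_toward_a is the probability that each decoy visit goes to a
    Party A site; at 0.5 the noise is neutral, while at 0.9 it
    manufactures the appearance of a surge in interest.
    """
    sites = []
    for _ in range(visits):
        pool = PARTY_A_SITES if rng.random() < bias_toward_a else PARTY_B_SITES
        sites.append(rng.choice(pool))
    return sites
```

Aggregated across many users, traffic generated this way could look to an analytics pipeline like a genuine shift in public attention.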
That’s something else Grammatas said concerns her. “What I’m most scared about as a human being is the idea of nefarious data pollution,” she said. “I’m deeply concerned about how data pollution can be used in bad ways.”
An endless struggle over data between users and corporations might seem inevitable, but Nissenbaum sees another possibility: Nature’s arms race is eternal, but ours doesn’t have to be. “There’s more than one end to the story,” she said. “With predators, their very survival depends on [catching prey]. Maybe we can create a situation where the survival of these companies does not depend on such serious exploitation of data subjects.”
Corporations will never be able to create fully obfuscation-proof algorithms, Nissenbaum said. “As long as they want to engage you in some type of activity, they may remain vulnerable in a certain way,” she said. “Closing the vulnerabilities costs, and we want it to cost. The real end of the story is when they say, ‘We see people don’t want or like what we’re doing, and we see that you have some power to bother us and cost us; maybe the less costly thing is to sit down and talk about it.’
“That’s the ending I would like,” she said. “It’s like saying, ‘We don’t have to hunt you any more.’”

S.I. Rosenbaum can be reached at email@example.com. Follow her on Twitter @sirosenbaum.