In his latest Silicon Valley satire, “The Every,” Dave Eggers describes a powerful Big Tech company cruelly neutralizing the threat posed by Tom, a politician with an anti-monopoly agenda. The company’s leadership invites Tom to their campus and surrounds him with workers wearing extremely tight-fitting clothing. Unable to control his wandering eyes, Tom gazes at “phalluses . . . buttocks and breasts, abs and lats.” Eye-tracking cameras record it all, and once footage of his lecherous looks goes online, the backlash ends Tom’s career.
Eggers’s fiction hits remarkably close to potential reality in light of Mark Zuckerberg’s plan to focus on the metaverse — a more fully embodied version of the Internet that people will access through virtual reality and augmented reality devices. Because Zuckerberg foresees the metaverse going mainstream in the next five to 10 years, he has changed the company’s name to Meta Platforms. And in what could become a remarkable case of life imitating art, Meta plans on releasing a virtual reality headset that tracks users’ eyes.
Many are skeptical of Zuckerberg’s metaverse vision. For one thing, the timing of the metaverse hype looks like an attempt to shift the conversation away from all the bad press Facebook is receiving — a view whistleblower Frances Haugen espouses. Meanwhile, the technology still has a way to go to do what Zuckerberg envisions. Current virtual reality headsets are too clunky to be adopted widely. And if and when the technology is ready, people may resist diving into Meta’s world because they don’t trust Zuckerberg enough to give him control of something that can expand his power further.
At the same time, we should acknowledge Meta has too much business savvy to throw away $10 billion this year on a project that lacks a serious financial upside. After all, eye tracking is the Holy Grail of advertising. Moving beyond tracking clicks on the commercial Internet could someday generate a fortune. And let’s face it, distrust has never slowed Zuckerberg down.
Although multiple companies, including Microsoft and Snap, are contributing to the creation of a metaverse, Zuckerberg emphasizes that it will have a single “defining quality”: the feeling of presence. For example, Zuckerberg’s launch video opens with him waxing poetic about users jumping into a virtual reality home space that recreates aspects of their existing homes, contains creative objects that don’t exist in the real world, and offers a dramatic view of whatever the user finds moving.
Creating a sensorily rich experience like this in a digital environment requires visually stunning settings that are easy to navigate. And eye-tracking technology embedded in virtual reality headsets and augmented reality glasses will be crucial in making that possible. As Brittan Heller notes in “Reimagining Reality: Human Rights and Immersive Technology,” programmers can optimize the digital rendering of immersive virtual settings by mirroring how the human visual system works. If virtual reality headsets track our eyes, they can render high-quality images where we’re looking and lower-quality images in the regions our peripheral vision merely glimpses. This technique also can reduce “simulation sickness,” the virtual reality form of motion sickness that can leave folks feeling unsettled, disoriented, and even nauseated.
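The core idea, often called foveated rendering, can be sketched in a few lines. The function, names, and pixel thresholds below are illustrative assumptions for a single eye buffer, not any headset’s actual API; a real system would get gaze samples from the eye tracker and apply the tiers on the GPU.

```python
# Conceptual sketch of foveated rendering driven by eye tracking.
# All names and thresholds are illustrative, not a real headset API.
import math

def quality_for_pixel(px, py, gaze_x, gaze_y,
                      fovea_radius=120, mid_radius=300):
    """Pick a rendering-quality tier by distance from the gaze point.

    Pixels near where the eye is looking get full detail; the
    periphery, which human vision resolves poorly, gets cheaper,
    lower-resolution shading.
    """
    dist = math.hypot(px - gaze_x, py - gaze_y)
    if dist <= fovea_radius:
        return "high"    # full-resolution shading at the gaze point
    elif dist <= mid_radius:
        return "medium"  # reduced detail just outside the fovea
    return "low"         # coarse shading for peripheral vision

# Example: gaze currently at the center of a 1920x1080 eye buffer.
gaze = (960, 540)
print(quality_for_pixel(960, 540, *gaze))    # pixel under the gaze -> "high"
print(quality_for_pixel(1900, 1000, *gaze))  # far periphery -> "low"
```

The point of the sketch is the trade-off Heller describes: detail follows the gaze, so the system must know, moment to moment, exactly where you are looking.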
Eye-tracking technology is also likely to play an important role in augmented reality glasses like future versions of the ones Facebook has rolled out with Ray-Ban. When these glasses overlay digital elements like text and images on your view of the world, knowing where you’re looking will let them place those elements in a minimally distracting way. Imagine, for example, an augmented reality sports app that shows a player’s statistics only when the user is looking at that player.
Additionally, as Joseph Jerome and Jeremy Greenberg note in “Augmented Reality and Virtual Reality: Privacy and Autonomy Considerations in Emerging, Immersive Digital Worlds,” eye tracking can increase the user’s “capacity to direct and control content” in augmented and virtual reality. Perhaps someday, they suggest, it might empower us to act like the Marvel character Tony Stark, who uses “simple eye movements to bring up or activate capabilities in his Iron Man suits.” It’s less strenuous and less obtrusive to swing a sword in a fantasy role-playing game by blinking than by using your arm to pantomime it.
Eye tracking also will help people — or their digital avatars — make eye contact in the metaverse. This amounts to a significant advantage over current video communications platforms like Zoom, where you can’t really look someone in the eye: to appear to, you have to stare into the camera, and by looking away from the screen that shows the other person’s image, you miss seeing how they react. Eye contact, real or simulated, also will make computer-generated characters in video games seem more realistic. Probably with these goals in mind, Meta is developing a VR headset, code-named Project Cambria, that allows “your virtual avatar to maintain eye contact and reflect your facial expressions.”
Add up all of these technological rationales for eye tracking, and in a few years, the price of entering the metaverse may be that companies get to track our eyes.
Manipulation in the simulation
Some eye-tracking research is proprietary corporate information that isn’t available for public scrutiny. Reality Labs at Meta, for instance, appears to be engaged in serious biometrics research. So we can’t fully know what Meta and other companies hope to learn about us through eye tracking.
It’s clear, however, that we will be risking quite a bit. After all, eye tracking is studied across many scientific disciplines, including psychology, cognitive science, and neuroscience. That makes the possibilities daunting for what companies will try to do with detailed information about our eye movements, iris texture, pupil size and reactions, and much more.
Jacob Leon Kröger, Otto Hans-Martin Lutz, and Florian Müller, tech researchers in Germany, unearthed some of these potential applications by reviewing the leading eye-tracking literature along with relevant patents and available commercial products. In an article called “What Does Your Gaze Reveal About You? On the Privacy Implications of Eye Tracking,” they pointed to several ways that eye tracking purportedly can reveal sensitive personal information: our identity (through iris recognition); our state of mind (such as whether we’re focused or distracted); our personality traits (including neuroticism); our ethnicity (by correlating cultural biases with behavior); our level of skill at given tasks (based on factors like how long we look at something before acting); our age (based on factors like how visual stimuli affect pupils); our gender (based on assumptions about the types of content different demographics find interesting); our preferences and aversions (including sexual arousal to specific behaviors and body parts); our moods and emotions (through several ocular measures, including pupil size); our levels of fatigue (through measures including blink rate and duration); whether we’re intoxicated, drug-impaired, or have a substance abuse disorder (including by examining eye and gaze properties); and aspects of our physical and mental health (including by treating specific patterns of eye movements as signs of Alzheimer’s disease or schizophrenia). Eye-tracking researchers believe the technology not only will support real-time inferences but also has predictive value, indicating things like whether someone will become depressed.
Kröger, Lutz, and Müller offer two disturbing observations. The first is that eye tracking may not actually reveal all of these characteristics; we shouldn’t assume every eye-tracking study is methodologically sound. Among other complications, many of the inference methods were “tested under controlled laboratory conditions and lack evaluation in real-world scenarios,” the authors note. Unfortunately, these limitations won’t necessarily hold companies back, and we shouldn’t assume they will interpret eye-tracking data only in scientifically valid ways. Inaccurate inferences probably will be made in the metaverse, and they’ll likely have discriminatory effects. As a cautionary tale, consider the debate over AI that claims to infer our emotions from video. Some employers already use flawed software that studies job applicants’ micro-expressions to help determine whether they’ll “be a good employee.”
The second problem is that we can’t control many of the things our eyes do, such as “stimulus-driven glances, pupil dilation, ocular tremor, and spontaneous blinks.” And in the cases where eye control is possible, we readily can become too tired to maintain it. Given these constraints, Kröger, Lutz, and Müller depressingly conclude, “it can be very difficult or even impossible for eye tracking users to consciously prevent the leakage of personal information.”
In other words, it looks as if a flaw that has long plagued Facebook will intensify in the metaverse. The flaw is that the company does things without truly obtaining user consent. As Woodrow Hartzog and I have argued, when Facebook had facial recognition technology on its platform, it didn’t come close to alerting people about all the dangers the technology poses. In the metaverse, expect something similar to happen with eye tracking. It’s notable that while Facebook recently stopped the controversial use of facial recognition to tag people in photos, Meta reserves the right to use that technology in the metaverse. And that’s to say nothing about the risk that eye-tracking technologies will be turned on non-consenting people — a possible outcome if smart glasses acquire outward-facing eye-tracking capabilities.
Protecting society from eye exploitation in the metaverse will require many approaches, including new privacy regulations. What should those be? A range of possible restrictions and limitations should be debated, but one of them is already clear: We should ban targeted advertising based on eye-tracking data.
In the short term, as hype about the metaverse builds, I’d like to see more counterprogramming. To fight fire with fire, we need designers to create virtual and augmented reality experiences that perform the public service of highlighting the problems with eye tracking.
For inspiration, designers might consider a memorable virtual reality scene in “The Matrix.” Keanu Reeves’s character, Neo, is being trained to fight a hostile computer system and gets distracted when a beautiful woman wearing a red dress catches his eye. We quickly learn the woman is an enemy agent in disguise. The two take-home lessons are: Don’t trust what you see in a simulation when its creator doesn’t have your best interests at heart, and your wandering eyes can be a vulnerability in an untrustworthy environment.
By contrast, we have Zuckerberg’s promotional metaverse video. It features him playing a fun game of cards with friends in virtual reality. The scenario is a self-serving idealization because it ignores the corporate surveillance that would take place on the back end if we were there with our companions. A more responsible scene would remind us that poker players shield their eyes with dark glasses to avoid giving away valuable information. In Zuckerberg’s metaverse, will we be able to prevent our eyes from giving Meta great insight into our souls?
Evan Selinger is professor of philosophy at the Rochester Institute of Technology and an affiliate scholar at Northeastern University’s Center for Law, Innovation, and Creativity. Follow him on Twitter @evanselinger.