IDEAS

Q&A: The battle over face surveillance is about to heat up

The rapid expansion of facial recognition technology could unlock a dystopian future. A leading activist argues that the only solution is to ban it.

Evan Greer of Fight for the Future argues that facial recognition technology should be prohibited, largely because there's no meaningful way for people to grant their consent. Kayana Szymczak

In recent years, privacy activists and civil rights groups have persuaded governments in several cities, including Boston, to block police and other municipal agencies from using facial recognition technology as a surveillance tool. But now, the campaign to stop facial surveillance is expanding. Privacy, civil rights, and human rights organizations are calling for a ban in the private sector — which would prevent stores, sports arenas, and other businesses from automatically identifying people based on scans of their faces.

The technology appears to be spreading quickly. Amazon requires delivery drivers to sign a “biometric consent” form to keep their jobs. Banks are rolling out facial recognition to identify customers and employees. The technology has cropped up in shopping malls, bars, and even churches.

One of the groups leading the charge against the technology is Fight for the Future, a digital rights advocacy group founded in 2011. Evan Greer, the organization’s Boston-based director, has helped persuade music venues not to use the technology at concerts and festivals. She has also galvanized students to protest its use on college campuses.

I’ve known Greer for some time through my own research on technology and privacy. Last year, we co-authored an op-ed about facial recognition at schools. Now that the movement against facial recognition technology is expanding, I want to better understand her thinking and strategy. While the initial protests against facial surveillance focused primarily on shutting it down in particular contexts, the goal now is to persuade lawmakers and the public to embrace an outright ban. That step, total prohibition, is rarely taken with technologies. Our conversation has been edited and condensed.

What’s the worst that can happen if the private sector continues to use facial recognition systems?

A casino’s slot machine scans your face to detect when you’re getting frustrated, triggering a small payout to keep you gambling and losing money. A private surveillance firm offers Big Oil companies a paid service to identify and track the movements of youth climate protesters, looking for ways to discredit them. A retail store decides to scan the faces of everyone approaching the building, barring anyone with a criminal record from entering.

Inside the store, it uses facial recognition to show you targeted advertising based on the products you look at but never buy — or even personalized pricing based on a perception of your income once they’ve identified you. You get ejected from a sporting event because a flawed algorithm decides you look intoxicated based on your eye movements. You don’t get hired for your dream job because a discriminatory algorithm decides you don’t look trustworthy. A college student is brutalized and falsely arrested outside their dormitory after a racially biased facial recognition algorithm mistakenly matches them with someone on a campus watch list and alerts police.

A misogynist startup offers a paid service for stalkers and abusers to scour the Internet for photos or video of their prey, or for creeps to scan through porn sites looking for people they know. An anti-choice group uses facial recognition to publicly identify people entering abortion clinics and subjects them to online harassment. An evangelical group uses it to out LGBTQ people in places where doing so can get them killed.

An employee at a Subway store in Los Angeles stands before a PopID camera in 2020. The system can identify workers by their faces and take their temperature to screen their health. TAG CHRISTOF/NYT

How likely is it these horrible things will occur?

These nightmare scenarios are terrifying precisely because they are so plausible. One of the greatest dangers of private and corporate use of facial recognition is that seemingly mundane uses, like using your face to pay for a burrito or board a plane, normalize the practice of handing incredibly sensitive biometric information to corporations. This paves the way for ever more invasive, discriminatory, and abusive uses of our data.

This is the future we are headed toward if lawmakers don’t draw a line in the sand now and ban this uniquely dangerous technology. We are at a pivotal moment in human history.

Is banning the technology the only way to prevent appalling outcomes?

Facial recognition is more like biological weapons than it is like alcohol or cigarettes. Its use, whether by law enforcement or private corporations, poses such a threat to the future of liberty that it cannot be effectively regulated. It must be banned.

Setting rules of the road for its use will only serve to hasten the widespread adoption and commercial spread of a technology that is fundamentally incompatible with basic human rights and democracy.

The Electronic Frontier Foundation, one of the leading defenders of digital rights, disagrees with you. It doesn’t support bans on facial recognition in the private sector. Instead, it recommends strict regulations that protect our right to give or withhold consent, limit how long facial data can be retained, and require that the data be stored securely. Why wouldn’t this work?

Tech companies have repeatedly made a mockery of the concept of consent. For consent to be meaningful it must be informed, and the vast majority of people don’t understand the risks associated with handing their biometric information to a private company. An opt-in consent framework also fails to address power imbalances in society.

If an employer requires its workforce to agree to facial recognition surveillance to have a job, that’s not meaningful consent — especially at a time when many people are desperate to put food on the table. If opting out of facial recognition at the airport makes you seem “suspicious” or could make you late for your flight, many people will be afraid to do it, regardless of what the law says. There will always be edge cases and exceptions in a framework like this, and they will disproportionately harm marginalized people.

So you’re saying stores shouldn’t even be able to use facial recognition systems on security footage, to deter crime and catch criminals?

Yes. We need to stop trying to solve systemic problems like poverty and injustice with dystopian technology. Address the reasons that drive people to steal from stores rather than equip stores with ever more invasive, racist, and discriminatory technology.

Evan Selinger is a professor of philosophy at the Rochester Institute of Technology and an affiliate scholar at Northeastern University’s Center for Law, Innovation, and Creativity. Follow him on Twitter @evanselinger.