In September, the FBI announced that it had achieved “full operational capability” of its Next Generation Identification system—a billion-dollar project to replace the bureau’s old fingerprinting system with the world’s biggest biometric database. For the first time, the system makes it possible to link multiple kinds of biometric identification—including voice features, palm prints, and even DNA profiles—and it combines civil and criminal information within one master database.
But perhaps most controversially, it will also use state-of-the-art facial recognition technology, allowing the government to identify suspects across a gigantic database of images collected from mug shots, surveillance cameras, employment background checks, and digital devices seized with a search warrant. The technology itself is still evolving rapidly; for example, the National Institute of Justice is developing 3-D binoculars and cameras that allow facial recognition and capture in real time.
There’s something unsettling about the notion that the government is actively trying to recognize its citizens by face: It suggests that the simple liberty of going out in public anonymously could become a thing of the past. The new system has come under fire from privacy rights advocates who fear that the federal databases will eventually be cross-referenced against other data, connecting your face to your medical, financial, legal, and driver’s license records. And there is no real way to opt out; as Jennifer Lynch, a senior staff attorney for the Electronic Frontier Foundation, testified during a 2012 US Senate hearing on facial recognition technology, “Americans cannot participate in society without exposing their faces to public view.” Even Eric Schmidt, executive chairman of Google (which has secured several patents to boost facial recognition accuracy for its products), has said that he finds the technology a little “creepy.”
If all this sounds novel, it’s actually just the latest outgrowth of an art and science that has been under development for more than 150 years. As various techniques for recognizing human faces have been developed and improved, they have often encountered hesitations like these—and, ultimately, moved right along anyway.
Concerns about privacy have largely faltered before the stated goal of most facial recognition programs: to recognize danger and to keep society safe. The belief that crime can be defeated through technological means propels innovations in the field ever onward. Today, companies are experimenting with more commercial uses of the technology as well—for example, to develop “ultra-targeted advertising.”
With visions like that now on the horizon, here is a brief tour of how we got here: the moments when facial recognition shot ahead, and a few moments of concern along the way.
1. THE ‘ANGEL COPIER’
From its very origins in 1839, the camera was perceived as a tool for tracking down miscreants. The Pinkerton National Detective Agency, founded in 1850, claimed to be the first organization to photograph people it apprehended. In England, a regular system of prison photography was introduced in 1852, both to make prisoners easier to find if they escaped and to enable record-sharing with other police stations.
Prison photography had many critics in Victorian society; there were frequent cases of mistaken identity, for one thing. But given that one alternative was to brand prisoners instead, defenders argued that photography was a more humane method: Among some Victorian police, the camera was known as the “angel copier.” The 1873 engraving here depicts a prisoner resisting being photographed.
2. MEASURING HUMANITY
Alphonse Bertillon, a police official in Paris in the late 19th century, originated many principles of facial recognition, developing a sophisticated system that combined and stored bodily measurements and mug shots into a manually searchable database. Unlike scientists of the day who believed it was possible to read a person’s criminal “type,” Bertillon tried to identify individual criminals with precise measurements, and published guides on how to measure and classify their body parts; an 1896 English-language edition included this table of ears. He also pioneered the “Bertillon card,” which paired photos of the suspect’s full face and profile with his or her name, measurements, and other information.
The Bertillon mug shot was picked up by police worldwide, becoming an iconic trope in Hollywood and the art of Andy Warhol and Marcel Duchamp. Still, as Bertillon himself noted, locating one face among many images is extremely difficult “if you have no other means but your eyes to search for the photograph among the thousands in an ordinary collection.”
3. CAMERAS EVERYWHERE
The rise of cheaper, more portable cameras and popular amateur photography by 1900 meant that ordinary people began taking photographs of themselves and others in public—sometimes without their subjects’ consent. A flood of lawsuits was lodged against photographers who circulated or publicly exhibited people’s portraits without permission. “Amateur Photographic Pest,” an 1890 cartoon in the satirical magazine Punch, captures the invasions of privacy often associated with photography in an age when, as Punch noted, the camera seemed to thrust its “prying” eye into everything.
4. HAVE YOU SEEN THIS MAN?
Before the development of the halftone process made it easier to reproduce photographs on paper, police had to glue photographs of conspirators onto their “Wanted” posters. (One of the earliest such posters, in 1865, bore the portrait photograph of Abraham Lincoln’s assassin, John Wilkes Booth.) The first FBI “identification order,” signed by FBI assistant director Frank Burke, was circulated in 1919, with an attached photostat of a portrait of a man wanted for Army desertion. Meanwhile, in cases where no photo of the suspect was available, law enforcement relied on hand-drawn sketches to represent witnesses’ or victims’ memories of a suspect’s face.
5. RECOGNIZED BY COMPUTER
The first experiments with semi-automated computer-based facial recognition were made during the mid-1960s by Woodrow Wilson Bledsoe, a pioneer of artificial intelligence, who devised a system for noting key facial landmarks on each picture (for example, the width of the mouth or the distance between the eyes). In the 1980s and 1990s, research by mathematicians Michael Kirby and Lawrence Sirovich at Brown University and computer scientists Matthew Turk and Alex Pentland at MIT led to a linear algebra-based system called Eigenfaces, which can plot a human face efficiently by focusing on the ways it deviates from the average. This screenshot shows some of the faces that can be created by adjusting those variables.
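For readers curious about the math, the Eigenfaces idea can be sketched as principal component analysis over flattened images: average the faces, then find the directions in which real faces deviate most from that average. The following Python is an illustrative toy, not the researchers’ original code, and it uses random arrays in place of real face images; the function and variable names are my own.

```python
import numpy as np

def eigenfaces(images, k):
    """Compute the mean face and the top-k eigenfaces.

    images: (n_images, n_pixels) array; each row is one flattened face.
    Returns (mean_face, components), where components is (k, n_pixels).
    """
    mean_face = images.mean(axis=0)
    centered = images - mean_face  # each face as its deviation from the average
    # SVD of the centered data yields the principal components ("eigenfaces"),
    # ordered from most to least important.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean_face, vt[:k]

def project(image, mean_face, components):
    """Summarize a face by its coordinates along each eigenface."""
    return components @ (image - mean_face)

# Demo with synthetic "images": 20 faces of 64 pixels each.
rng = np.random.default_rng(0)
faces = rng.normal(size=(20, 64))
mean_face, comps = eigenfaces(faces, k=5)
weights = project(faces[0], mean_face, comps)
# The face is now described by just 5 numbers instead of 64 pixel values;
# adding the weighted eigenfaces back to the mean gives an approximation.
reconstruction = mean_face + weights @ comps
```

The compression is the point: matching a face then means comparing a handful of coordinates rather than every pixel, which is what made searching large photo collections tractable.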
6. BIOMETRICS TO GO
By the 1980s, a loose collective of companies and academics had begun working in the field of automated personal identification, or biometrics. The tragedy of 9/11 created an opening for biometrics developers to expand their sales in America, a nation that saw itself as vulnerable precisely because it could not identify its enemy. The wars in Afghanistan and Iraq were a windfall for the biometrics industry, with occupying armies using portable systems to identify and track members of the local populations. As with many innovations produced by the Department of Defense, high-tech facial recognition technology is now being used by domestic police as well, including at large-scale American events and gatherings.
7. FACIAL IDENTIFICATION GOES COMMERCIAL
Today, biometrics is big business: The global market grew from $400 million in 2000 to $5 billion in 2011, and is projected to increase to $23 billion by 2019. A major driver is the spread of digital camera technology (notably through smartphones) and Internet photo sharing sites, where users willingly post and tag images of themselves.
As this huge trove of data accumulates, companies are experimenting with ways to use facial recognition technology to drive sales and attention. In September 2009, for example, Coca-Cola Zero launched a Facial Profiler app on Facebook that scanned photos for people who looked like you. The tagline for the project was: “If Coke Zero has Coke’s taste, is it possible someone out there has your face?” Increasingly, we have the technology to find out.
Jennifer Tucker is an associate professor of history and science in society at Wesleyan University, and the author of “Nature Exposed: Photography as Eyewitness in Victorian Science.” She is working on a book about Victorian facial recognition, photographic evidence of identity, and the law.