Duron Harmon of the New England Patriots: three-time Super Bowl champion, or candidate for a police lineup? How about Brad Marchand? Stanley Cup winner or a guy with an arrest record? And is that Chris Sale, World Series star, or somebody awaiting trial?

Apparently, Amazon can’t tell the difference.

Among the Internet titan’s many technology businesses is a leading facial-recognition software system called Rekognition, which Amazon has marketed to police agencies for use in their investigations. And according to the Massachusetts chapter of the American Civil Liberties Union, Rekognition mistakenly matched 27 well-known athletes from Boston sports teams to a database of mugshots of real people who had been arrested. Among the misidentified: Harmon, Marchand, and Sale.

“Even athletes who are immediately identifiable to people across New England can be misidentified by facial-recognition technology that is in use in Massachusetts right now,” said Kade Crockford, director of the Technology for Liberty Program at the ACLU of Massachusetts.

The civil rights group ran a test of Amazon Rekognition as part of its campaign to have Massachusetts impose regulations on the use of facial-recognition technology, releasing the results ahead of a State House hearing Tuesday on legislation that would impose a moratorium on the use of such software.

“We are not asking the Legislature to ban this technology,” Crockford said. “We are asking the Legislature to press pause.”

The ACLU test is similar to one it conducted last year, which found that Amazon software mistakenly matched 26 California state legislators to mugshots in a database of 25,000 photos of people who’d been arrested.

This time, the testers compared photos of 188 New England athletes from the Boston Bruins, Boston Celtics, Boston Red Sox, and New England Patriots with a database of 20,000 mugshots. The software delivered 27 false positives, a misidentification rate of roughly one in seven.

Two Boston Celtics made the list: Tacko Fall and Gordon Hayward. Rekognition also singled out six Red Sox, including Chris Sale and Hector Velazquez; five Bruins, including Sean Kuraly and Marchand; and 14 Patriots, including Stephen Gostkowski, James White, Phillip Dorsett, and Harmon.

In a statement provided by the ACLU, Harmon said: “If it misidentified me, my teammates, and other professional athletes in an experiment, imagine the real-life impact of false matches. This technology should not be used by the government without protections.”

But Amazon is fighting back, accusing the ACLU of “knowingly misusing and misrepresenting Amazon Rekognition to make headlines.” The company complained that the ACLU ran the test improperly, saying the advocacy group used less precise settings that produce a higher failure rate.

Amazon Rekognition includes an adjustable setting, called a confidence threshold, that determines how certain the software must be before reporting a match. The ACLU ran its tests at an 80 percent confidence threshold, the software's default setting. But Rekognition's online documentation says that public safety agencies such as police forces should use a stricter 99 percent threshold, to reduce the risk of misidentification.
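To see why the threshold matters, here is a minimal sketch of how threshold filtering works in any face-matching system. The mugshot IDs and similarity scores below are invented for illustration, not real Rekognition output, and the filtering function is a stand-in for the service's own matching logic (in Amazon's API, the knob corresponds to parameters such as `SimilarityThreshold`):

```python
# Hypothetical similarity scores (0-100) that a face-matching system
# might return when comparing one probe photo against a mugshot
# database. These IDs and scores are illustrative only.
candidate_matches = [
    {"mugshot_id": "A1", "similarity": 99.4},
    {"mugshot_id": "B2", "similarity": 92.1},
    {"mugshot_id": "C3", "similarity": 85.7},
    {"mugshot_id": "D4", "similarity": 80.3},
]

def matches_above(candidates, threshold):
    """Keep only candidates whose similarity meets the threshold."""
    return [c for c in candidates if c["similarity"] >= threshold]

# At the 80 percent default, all four candidates count as "matches".
print(len(matches_above(candidate_matches, 80.0)))  # 4
# At the 99 percent setting recommended for police work, only the
# strongest candidate survives; the weaker, more error-prone ones
# are discarded.
print(len(matches_above(candidate_matches, 99.0)))  # 1
```

A lower threshold returns more candidate matches, and therefore more false positives; a higher one trades recall for precision, which is the crux of the dispute between Amazon and the ACLU.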

“When used with the recommended 99 percent confidence threshold and as one part of a human driven decision, facial recognition technology can be used for a long list of beneficial purposes, from assisting in the identification of criminals to helping find missing children to inhibiting human trafficking,” Amazon said.

Amazon also said it favors regulating government use of facial-recognition software.

Crockford acknowledged the ACLU used the 80 percent threshold, but noted it’s the default setting and there’s no guarantee a police agency would use the stricter setting.

Facial recognition has become common in consumer products, such as smartphones that unlock themselves by recognizing their owners. It’s also used for checking in airline passengers and identifying public school students. But the systems have flaws: They’re much less accurate, for example, at identifying people with dark skin. And since people have no way of knowing when they’re being scanned, government agencies could routinely monitor citizens’ activities without their permission — a practice already standard in China.

The ACLU favors further study of the technology.

But Woodrow Hartzog, a professor of law and computer science at Northeastern University, believes the temptation to abuse facial recognition is so irresistible that he favors a ban.

“I can’t foresee any scenario where the benefits even come close to outweighing its significant risks and inevitable, unacceptable abuses,” he said.


Hiawatha Bray can be reached at hiawatha.bray@globe.com. Follow him on Twitter @GlobeTechLab.