Hiawatha Bray | Tech Lab

When software doesn’t play fair

Amazon boxes were stacked for delivery in Manhattan. (Mike Segar/Reuters/File)

When it comes down to a fight between politicians and geeks, I usually root for the ones with the thickest eyeglasses. Usually, but not always. Sometimes, the pols have a point.

Consider the case of Amazon.com, which has been browbeaten by the mayors of Boston, Chicago, and New York into providing its Prime same-day delivery service in low-income neighborhoods previously denied this benefit. A recent report in Bloomberg Businessweek revealed that Amazon had frozen out ZIP codes in neighborhoods such as Roxbury and Chicago's South Side, areas home to large numbers of black residents.

Amazon told Bloomberg that too few customers lived in the areas, or that making deliveries there would cost too much. There’s no evidence the company was motivated by racial animus; it was motivated by math. Amazon holds billions of sales records collected from millions of customers, and it doesn’t make a move without crunching those numbers. It’s “big data” at its biggest, the kind of statistical analysis used these days to make key decisions at every major company on Earth.

But there’s more to life than big data. Sometimes, software should be overridden by common sense. After all, single moms in poor neighborhoods might derive more benefit from same-day service than pampered millennials in trendy ZIP codes. So a chastened Amazon has expanded its same-day service offerings to the neglected neighborhoods.

It was a minor embarrassment, easily resolved. But bigger controversies are brewing. Even the fairest software can produce biased results. And it’s not clear that we can do much about it.

The issue caught fire three years ago. That’s when Latanya Sweeney, a computer scientist and professor of government at Harvard University, discovered that Google searches of her name were usually accompanied by ads for companies offering criminal background checks. An African-American woman of spotless character, Sweeney got curious. She soon found that names that sounded like they belonged to black people, like Latanya, were 25 percent more likely to trigger ads for criminal records than names that sounded white, say, Kristen.

Is Google’s search software racist? Not likely. It merely contains algorithms — step-by-step instructions — that alter the program’s performance, depending on how humans use it. Imagine thousands of searches like “Latanya criminal records.” The searchers might be racist white people, or black people trying to check out a new best friend. Either way, if Google gets many such requests, background-check ads will appear whenever someone types “Latanya.” Google’s algorithms learn from our behavior, whatever its motive.
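To see how that kind of feedback loop can form, here is a toy sketch, in Python, of an ad picker that learns from nothing but click counts. It is not Google's actual system; the search terms, categories, and counts below are invented purely for illustration.

```python
from collections import Counter

# Toy illustration (not Google's real system): an ad server that tallies which
# ad categories users clicked after each search word, then shows the
# most-clicked category for future searches containing that word.
click_history = Counter()

def record_click(search_term, ad_category):
    """Log that a user who searched `search_term` clicked an ad of `ad_category`."""
    for word in search_term.lower().split():
        click_history[(word, ad_category)] += 1

def pick_ad(search_term, categories):
    """Pick the ad category with the most past clicks for the words in this search."""
    def score(cat):
        return sum(click_history[(w, cat)] for w in search_term.lower().split())
    return max(categories, key=score)

# If many past searches like "latanya criminal records" ended in clicks on
# background-check ads, the name alone starts to trigger those ads.
for _ in range(100):
    record_click("latanya criminal records", "background_check")
for _ in range(20):
    record_click("latanya", "local_news")

print(pick_ad("latanya", ["background_check", "local_news"]))  # -> background_check
```

Nothing in that sketch asks who is doing the searching or why; the bias rides in on the click counts.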

Similar cases abound. Last year, computer scientists at Carnegie Mellon University found that Google’s advertising service seems to discriminate by gender. Internet users who identified themselves as female were less likely to see ads for high-paying jobs than male users. Meanwhile, researchers at the University of Maryland found that Google searches of Democratic presidential candidates produced more flattering results than those of Republicans. Searching, say, “Hillary Clinton” delivers more upbeat Web pages than “Donald Trump.”

Again, Google is probably just responding to its users’ behavior. If more men click on high-paying job ads, they’ll see more such ads and women will see fewer. If most Donald Trump searchers click on anti-Trump sites, they’ll get still more of them. It’s the algorithms’ way of reflecting reality.

Trouble is, algorithms also create reality. If Amazon chooses not to offer same-day service in a marginal neighborhood, the neighborhood becomes a little more marginal. If Web searches of Latanya are consistently accompanied by ads for background checks, a potential employer might decide she’s not worth the risk. And if the top five results of a Hillary Clinton search are always positive, could that even sway voters?

In addition, the software is largely beyond our control. Users have no idea how it works, and even its creators aren't sure what those algorithms will do next, because the software constantly adjusts its behavior in response to our requests. Amazon's same-day delivery decision was one big, visible mistake, but the software behind our social networks, search engines, and advertising services could be feeding us biased results millions of times a day.

What to do? Christian Sandvig, an associate professor of communication studies and information at the University of Michigan, says academics, journalists, and the Federal Trade Commission should run algorithm audits, a high-tech version of the methods used to identify racial discrimination in housing. Computers could run simulated queries on Google, Facebook, and other popular services in a bid to spot biased responses.
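A rough sketch of what such an audit might look like in code, assuming an auditor can capture the ads returned for a query (the `fetch_ads` helper below is hypothetical and would have to be wired to the service being tested):

```python
# A minimal sketch of a paired-query algorithm audit, modeled on
# housing-discrimination audits: submit matched queries that differ only in a
# protected attribute (here, a first name) and compare what comes back.

def fetch_ads(query):
    """Hypothetical placeholder: return the list of ad texts shown for `query`."""
    raise NotImplementedError("wire this to the service being audited")

def audit(name_pairs, flag_phrase="criminal record"):
    """Count how often a flagged ad appears for each name in matched pairs."""
    results = {}
    for name_a, name_b in name_pairs:
        hits_a = sum(flag_phrase in ad.lower() for ad in fetch_ads(name_a))
        hits_b = sum(flag_phrase in ad.lower() for ad in fetch_ads(name_b))
        results[(name_a, name_b)] = (hits_a, hits_b)
    return results

# Example: audit([("Latanya", "Kristen")]) would report how many
# background-check ads each matched name drew in the same session.
```

The point of the pairing is the same as in a housing audit: if the only difference between the two queries is the name, a lopsided count is hard to explain away.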

Specialists say there is no law that explicitly prohibits algorithmic bias, and Sandvig doubts that any such law could be enforced. But public exposure and political pressure worked with Amazon, and it might be our only defense against digital prejudice.


Hiawatha Bray can be reached at hiawatha.bray@globe.com. Follow him on Twitter @GlobeTechLab.
