DANTE RAMOS

Prevent discrimination? Facebook algorithms alone aren’t enough

There’s nothing good about classified ads that contain illegal “whites only” clauses, or subtler coded language that still tells black and Latino job or housing applicants to get lost. But at least the authors’ intentions are easy to spot.

That’s more than you can say about an innocuous-looking Facebook ad that minority users never even get to see.

The world’s dominant social-networking site soaks up advertising dollars by the billion because of its impressive ability to target ads to highly specific demographics. Yet the news organization ProPublica recently reported something troubling: Facebook allows advertisers to avoid certain users based on their “ethnic affinity.” Testing out the system, reporters placed an ad for a housing-related event and, with a few clicks, could keep the ad from being shown to anyone whom Facebook deemed to have a black, Latino, or Asian-American affinity. In theory, the owner of a big apartment complex, or a company seeking employees for a new location, could do the same.

Civil rights groups are appalled, and rightly so, because protecting access to housing, employment, temporary lodging, and transportation for Americans of all colors has been a decades-long battle. Hard-won gains in brick-and-mortar America won’t translate into equal opportunity on the Internet — unless the public, lawmakers, and tech firms themselves demand as much.

“It’s just a 21st-century twist on an old problem that’s still with us,” said Oren Sellstrom, the litigation director of the Boston-based Lawyers’ Committee for Civil Rights and Economic Justice, when I asked him about ProPublica’s Facebook exposé.

The promise of Silicon Valley is that everyone benefits when information flows freely and sources of friction in the economy disappear. But in some ways, disruptive new technologies are perpetuating old inequities. Earlier this year, Sellstrom’s organization took on Airbnb over the refusal of some home-sharing hosts to rent to minority guests. Amazon rolled out same-day delivery across the Boston area — except in much of Roxbury.

And just this past week, a study conducted by researchers at MIT, the University of Washington, and Stanford concluded that Uber drivers in Boston cancel on riders with African-American-sounding names such as Rasheed and Keisha twice as often as they cancel on riders named Brendan or Allison. The New Economy, this research suggests, still has plenty to learn about guaranteeing everybody the same quality of service.

The Old Economy has its own problems, of course. While traditional taxis are generally required to pick up all customers and serve all neighborhoods, minority residents — especially black men — know there’s a gulf between what the law requires and the way many drivers behave.

Uber and its rival Lyft have, or can get, the data necessary to identify drivers who disproportionately cancel on minority passengers. As new laws in Massachusetts and elsewhere bring these transportation networks out of a legal gray zone, regulators can push them to tune up their software to prevent driver misconduct.

Until recently, tech companies have avoided much of the legal scrutiny to which old-school businesses routinely submit. Newspapers have been held liable for publishing classified ads with discriminatory language. In contrast, Craigslist has mostly avoided that responsibility by portraying itself as a mere bulletin board for other people’s postings.

Facebook — whose algorithms serve up a tailored experience for each user — probably couldn’t use that excuse, but offending ads will be harder to detect from the outside. “With Craigslist, it’s unfortunate that [discriminatory] stuff occurs,” says Robert Terrell, executive director of the Fair Housing Center of Greater Boston. “But at least we know where all the bad guys are.”

On the upside, Facebook’s ability to sort users into categories is far from perfect. With the help of a Google Chrome browser extension developed by ProPublica, I checked into what Facebook knows about me. The social network ascribes no ethnic affinity to me, but it does think — for reasons I cannot fathom — that I’m interested in fish farming, good-guy professional wrestlers, and Puffy AmiYumi, a Japanese pop duo I’d never heard of until Googling their name just now.

Then again, the social network has correctly guessed which electronic devices I use, what my hobbies are, and much more. Today, it’s a rough portrait; soon enough, it’ll be uncanny.

So it’s all the more urgent for Facebook, and for the other tech giants now transforming the way we do business, to reckon with how much power they exert. They’re starting to do so: Airbnb launched an antibias initiative; Amazon relented on same-day delivery in Roxbury. Facebook, responding to the ProPublica report, says the discriminatory use of ethnic affinities violates its policies. “If we learn of advertising on our platform that involves this kind of discrimination,” the company’s head of multicultural affairs declared, “we will take aggressive enforcement action.”

These firms should get used to more scrutiny from regulators and civil rights groups. But as they mature, it’s also incumbent on companies like Facebook to anticipate the most obvious abuses of their technology, rather than waiting for someone else to complain. Fairness doesn’t just happen algorithmically. It takes human judgment and human intervention.


Dante Ramos can be reached at dante.ramos@globe.com. Follow him on Facebook: facebook.com/danteramos or on Twitter: @danteramos.