HIAWATHA BRAY | TECH LAB

Is digital justice colorblind?

By Globe Staff 

For a defendant standing before a judge, hoping to be granted bail, a piece of software might play a significant role in the decision. And that could be bad news, especially if the defendant is black.

On Monday, ProPublica ran a disturbing story asserting that COMPAS, a “risk assessment” program used in courtrooms throughout the United States, produces outcomes that are biased against African-American defendants. A judge using COMPAS might deny bail to a black suspect who ought to get it or spring a white suspect who needs a time-out, according to the online investigative news site.

It’s one example of the problems that may arise when we let statistics and computer algorithms do our thinking for us. But there are plenty of others.

Academic researchers have found evidence that Google’s search technology can inadvertently generate results that are biased against black-sounding names, women, and conservative politicians. Under pressure from politicians, Amazon.com agreed to provide same-day delivery service to black neighborhoods throughout the United States after a data-driven decision that excluded those zip codes spawned national outrage. And on Monday, Facebook announced an overhaul of its Trending Topics software, in response to claims of built-in liberal bias in its selection of news stories.

COMPAS and other such risk-assessment programs were created with the best of intentions: They’re supposed to ensure that decisions are based on data rather than human prejudices, and they could help reduce the number of defendants languishing behind bars before their trials.

Northpointe Inc., the company that sells COMPAS, flatly denies that its software is biased and points to a 2010 Florida State University study that found no racial bias in it.

“I think that they misapplied various statistical methods,” Jeffrey Harmon, general manager of the Michigan company, said of ProPublica. “The results from their analysis are incorrect.”

Who to believe? It’d be a lot easier if we could see exactly how the software works. But we can’t because Northpointe won’t open its code to outsiders. And when it’s software that can determine a person’s freedom, I think total transparency should be mandatory.

Using data to predict criminal behavior isn’t a bad idea. Judges and parole boards have to do it all the time, though their judgments are inevitably tainted by biases and errors. In the 1920s, courts began using statistical analyses of past criminal behavior to boost their batting average. In those days, the models often took race into account.

Nowadays computers scour millions of criminal cases, in search of risk factors — a suspect’s neighborhood, his prior arrests, alcohol consumption, and employment status, among others. Questions about race are off limits.

The software uses historical data to build profiles of high-, medium-, or low-risk suspects. For instance, the program might conclude that a suspected petty thief with a job, a spouse, and no prior arrests can safely be granted bail, while an unemployed burglar with two previous arrests poses more of a threat. A judge still makes the final decision, but a digital risk assessment could tip the scales.
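Northpointe's actual model is secret, but the general idea can be sketched in a few lines of code. The factors, weights, and cutoffs below are invented purely for illustration; they are not Northpointe's.

```python
# Toy illustration of a risk-assessment score. The factors, weights, and
# cutoffs are hypothetical; COMPAS's real model is a trade secret.

def risk_assessment(prior_arrests, employed, married):
    """Combine a few factors into a score and a low/medium/high label."""
    score = 2 * prior_arrests          # each prior arrest adds to the score
    score += 0 if employed else 3      # unemployment adds risk
    score += 0 if married else 1       # no spouse adds a little risk

    if score >= 6:
        label = "high"
    elif score >= 3:
        label = "medium"
    else:
        label = "low"
    return score, label

# The employed, married petty thief with no record scores low;
# the unemployed burglar with two prior arrests scores high.
print(risk_assessment(prior_arrests=0, employed=True, married=True))    # (0, 'low')
print(risk_assessment(prior_arrests=2, employed=False, married=False))  # (8, 'high')
```

Real systems fit their weights to millions of historical cases rather than picking them by hand, which is precisely how patterns in that history can work their way into the scores.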

Dozens of these risk assessment programs are used in courts throughout the country. COMPAS, which stands for Correctional Offender Management Profiling for Alternative Sanctions, is one of the more popular programs. It’s used by the Massachusetts Department of Correction.

Meanwhile the state’s probation department is implementing other programs to assess defendants before trial.

To test COMPAS, ProPublica researchers compared its predictions about 10,000 criminal defendants in Broward County, Fla., to their actual behavior in the two years after their arrests. About 61 percent of the time, COMPAS accurately predicted who would break the law again. But it flunked at predicting who would commit violent crimes, getting it right only 20 percent of the time.

ProPublica also claimed that COMPAS was frequently wrong about the risks posed by black and white offenders. Blacks were more likely than whites to be mistakenly identified as high-risk suspects; whites were more likely to be mistakenly marked as low risk. So a judge using COMPAS might deny bail to a relatively harmless black suspect while granting it to a far more dangerous white defendant.
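ProPublica's full methodology is more elaborate, but the heart of its bias check is simple: for each racial group, count how often people who stayed clean were labeled high risk, and how often people who reoffended were labeled low risk. A rough sketch, using made-up file and column names:

```python
# Rough sketch of the error-rate comparison ProPublica describes: for each
# group, how often non-reoffenders were labeled high risk (false positives)
# and how often reoffenders were labeled low risk (false negatives).
# The file and column names are hypothetical placeholders.

import pandas as pd

df = pd.read_csv("defendants.csv")  # columns: race, high_risk (0/1), reoffended (0/1)

for race, group in df.groupby("race"):
    stayed_clean = group[group["reoffended"] == 0]
    reoffended = group[group["reoffended"] == 1]

    fpr = stayed_clean["high_risk"].mean()    # labeled high risk but did not reoffend
    fnr = 1 - reoffended["high_risk"].mean()  # labeled low risk but did reoffend

    print(f"{race}: false positive rate {fpr:.0%}, false negative rate {fnr:.0%}")
```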

Is COMPAS really prejudiced? To find out, independent experts could study the program’s raw source code. But as with most commercial software, the raw code for COMPAS is a trade secret. That’s fine for Microsoft Office, but shouldn’t criminal justice software be held to a different standard?

That’s Eric Loomis’s argument. In 2013, Loomis was arrested in connection with a drive-by shooting in Wisconsin. After pleading guilty to several charges, he was sentenced to six years in prison. Loomis objected to the sentence, because the judge based it partly on Loomis’s COMPAS score. Loomis, who happens to be white, was ranked “high risk” by COMPAS, which seems reasonable under the circumstances. But because they have no access to the inner workings of the software, Loomis and his lawyers can’t challenge its accuracy. It’s like a witness that they can’t cross-examine.

A Wisconsin appeals court concluded that Loomis might have a point. It kicked the case to the state’s supreme court, which will rule on the matter sometime this year.

For some COMPAS applications, total transparency may not be so important. For instance, the Massachusetts Department of Correction uses COMPAS to profile people after they’ve been sentenced to prison, to design support programs that’ll keep them out of trouble.

But if software is helping put people into jail, it ought to be an open book.

“It has to be possible for citizens to understand how government decisions are arrived at,” said Eben Moglen, a law professor at Columbia University and founder of the Software Freedom Law Center. “If it’s done by math, they ought to be able to check the math.”


Hiawatha Bray can be reached at hiawatha.bray@globe.com
Follow him on Twitter @GlobeTechLab.