Software Used to Predict Crime Can Now Be Scoured for Bias
Predictive-policing startup CivicScape published its code online, allowing anyone to help ensure that the algorithm doesn’t unfairly target certain groups of people.
Predictive policing, the idea that software can foresee where crime will take place, is being adopted across the country despite being riddled with issues. These algorithms have been shown to disproportionately target minorities, and private companies won't reveal how their software arrives at its predictions.
In an attempt to stand out from the pack, predictive-policing startup CivicScape has released its algorithm and data online for experts to scour, according to Government Technology magazine. The company's GitHub page is already populated with its code, as well as a variety of documents detailing how its algorithm interprets police data and what variables are included when predicting crime.
“By making our code and data open-source, we are inviting feedback and conversation about CivicScape in the belief that many eyes make our tools better for all,” the company writes on Github. “We must understand and measure bias in crime data that can result in disparate public safety outcomes within a community.”
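Measuring the kind of bias the company describes often starts with something simple: comparing how often a model flags different kinds of neighborhoods. The sketch below is purely illustrative, using made-up prediction output and group labels rather than anything from CivicScape's repository, to show what a basic disparate-impact check can look like.

```python
from collections import defaultdict

# Hypothetical output of a predictive model: one record per neighborhood,
# with a demographic-majority label and whether it was flagged as a hotspot.
predictions = [
    {"neighborhood": "A", "group": "majority_white",    "flagged": True},
    {"neighborhood": "B", "group": "majority_minority", "flagged": True},
    {"neighborhood": "C", "group": "majority_white",    "flagged": False},
    {"neighborhood": "D", "group": "majority_minority", "flagged": True},
]

# Count neighborhoods and flagged neighborhoods per demographic group.
totals, flagged = defaultdict(int), defaultdict(int)
for p in predictions:
    totals[p["group"]] += 1
    flagged[p["group"]] += p["flagged"]

# Flag rate per group, then the ratio between groups. A ratio far from 1.0
# suggests one set of neighborhoods is targeted much more often than the other.
rates = {g: flagged[g] / totals[g] for g in totals}
ratio = rates["majority_minority"] / rates["majority_white"]
print(rates, ratio)
```

Metrics like this don't settle the fairness question on their own, but publishing the code makes it possible for outside researchers to run such checks themselves.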
Algorithms are playing an increasing role in our lives and society. They determine much of what we see online, and governments are relying on them to help make policy decisions. What's troubling is how little we know about how they work.
That’s why opponents of predictive policing, such as the American Civil Liberties Union, have made transparency their primary concern. The ACLU has argued that when these algorithms are developed behind closed doors and then applied to the public, citizens can’t accurately understand how they’re being policed—meaning they can’t hold police accountable for potentially discriminatory practices.
“The natural tendency to rush to adopt new technologies should be resisted until a true understanding is reached as to their short and long term effects,” the ACLU wrote in August 2016. “Vendors must be subject to in-depth, independent, and ongoing scrutiny of their techniques, goals, and performance.”
While CivicScape can’t control what police departments do after buying the software, Anne Milgram, chair of the company’s board of directors and former New Jersey attorney general, told GovTech that they’re going to develop it in the open.
“We don’t want to say, ‘Trust us, and we’re going to build an algorithm behind closed doors,’” Milgram said.
CivicScape says it does not use race or ethnicity data to make predictions, though it acknowledges that other variables can serve as indirect indicators of race and bias its software. The software also filters out low-level drug crimes, because enforcement of those offenses has been shown to be heavily skewed against African Americans.
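The company hasn't laid out that preprocessing step in this article, but the general idea of dropping protected attributes and excluding skewed offense categories before training can be sketched in a few lines. The column names and offense labels below are hypothetical, not drawn from CivicScape's code.

```python
import pandas as pd

# Hypothetical incident records; column names are illustrative only.
incidents = pd.DataFrame({
    "offense":   ["burglary", "drug_possession", "assault", "robbery"],
    "latitude":  [41.88, 41.79, 41.85, 41.90],
    "longitude": [-87.63, -87.60, -87.65, -87.62],
    "race":      ["A", "B", "A", "B"],        # protected attribute
    "ethnicity": ["X", "Y", "X", "Y"],        # protected attribute
})

# 1. Drop protected attributes so they never enter the model directly.
PROTECTED = ["race", "ethnicity"]
features = incidents.drop(columns=PROTECTED)

# 2. Exclude offense categories whose enforcement is known to be heavily
#    skewed, such as low-level drug possession.
EXCLUDED_OFFENSES = {"drug_possession"}      # illustrative list
training_data = features[~features["offense"].isin(EXCLUDED_OFFENSES)]

print(training_data)
```

Dropping the columns alone doesn't remove bias: location and other remaining variables can still act as proxies for race, which is exactly the "indirect indicators" problem the company says it is watching for.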
While this startup might be the first to publicly reveal the inner workings of its algorithm and data practices, that's no guarantee that predictive policing can be made fair and transparent across the board.
“Lots of research is going on about how algorithms can be transparent, accountable, and fair,” the company writes. “We look forward to being involved in this important conversation.”