
DEF CON American police and the judiciary are increasingly relying on software to catch, prosecute and sentence criminal suspects, but the code is untested, unavailable to suspects’ defense teams, and in some cases provably biased.

In a presentation at the DEF CON hacking conference in Las Vegas, delegates were given the example of the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS) system, which is used by trial judges to decide sentencing times and parole guidelines.

“The company behind COMPAS acknowledges that gender is a factor in its decision-making process and that, because men are more likely to be recidivists, they are less likely to be recommended for probation,” explained Jerome Greco, digital forensics staff attorney for the Legal Aid Society.

“Women [are] thus more likely to get probation, and there are higher sentences for men. We don’t know how the data is swaying it or how significant gender is. The company is hiding behind trade secrets legislation to stop the code being checked.”
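The company won’t disclose how heavily gender is weighted, which is exactly why the opacity matters. The sketch below is entirely hypothetical – the weights, threshold, and function are invented for illustration, since COMPAS’s actual model is a trade secret – but it shows how a single hidden coefficient on a protected attribute can flip a recommendation between two otherwise identical defendants.

```python
# Entirely hypothetical sketch -- COMPAS's real model is secret, which
# is the crux of Greco's complaint. All weights here are invented.
# The point: one undisclosed coefficient on a protected attribute can
# flip the outcome between two otherwise identical defendants.

def risk_score(prior_offences: int, age: int, is_male: bool) -> float:
    score = 0.3 * prior_offences - 0.02 * age  # made-up weights
    if is_male:
        score += 0.5                           # hidden gender weight
    return score

PROBATION_THRESHOLD = 0.5  # recommend probation below this score

for is_male in (False, True):
    s = risk_score(prior_offences=3, age=30, is_male=is_male)
    outcome = "probation" if s < PROBATION_THRESHOLD else "custody"
    print(f"male={is_male}: score={s:.2f} -> {outcome}")
```

With these invented numbers, the two defendants differ only by gender, yet one is recommended for probation and the other for custody – and without access to the code, neither defendant’s lawyers could ever see why.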

These so-called advanced systems are often trained on biased data sets, Greco said. Facial recognition software, for example, is typically trained on images of predominantly white men, which academic research has shown makes it less accurate at correctly matching people of color.

“Take predictive policing software, which is used to make decisions for law enforcement about where to patrol,” Greco said. “If you use an algorithm based on data from decades of racist policing you get racist software. Police can say ‘It’s not my decision, the computer told me to do it,’ and racism becomes a self-feeding circle.”
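That “self-feeding circle” is straightforward to model. The toy simulation below is a hypothetical sketch, not any deployed product: two districts have identical true crime rates, but the historical arrest record is skewed two to one. Because patrols follow the record and arrests can only be recorded where patrols go, the skew in the data never washes out.

```python
# Toy simulation (not any vendor's actual software) of the feedback
# loop Greco describes. Both districts have the same true crime rate,
# but the historical record is skewed 2:1. Patrols are allocated in
# proportion to recorded arrests, and arrests are only recorded where
# patrols go -- so the initial bias in the data sustains itself.
import random

random.seed(1)

TRUE_CRIME_RATE = 0.1                      # identical in both districts
arrest_record = {"district_a": 120, "district_b": 60}

for year in range(1, 6):
    total = sum(arrest_record.values())
    for district, recorded in list(arrest_record.items()):
        patrols = round(200 * recorded / total)   # follow the data
        # every patrol has the same chance of witnessing a crime
        arrest_record[district] += sum(
            random.random() < TRUE_CRIME_RATE for _ in range(patrols)
        )
    print(f"year {year}: {arrest_record}")
```

Run it and district_a keeps roughly twice district_b’s arrest count every year, even though the underlying crime rates are equal: the software never learns it is wrong, because it only collects data where it already looks.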

It’s not just manufacturers who are fighting disclosure around their crime-fighting tools – the police are too. The use of stingray devices, which mimic cellphone towers to intercept and analyse phone traffic, was kept quiet for years – the New York Police Department used such a device over 1,000 times between 2008 and 2015*, and never mentioned it, Greco said.

While the use of stingrays is now public knowledge, the equipment has been upgraded, and newer kit can also analyse mobile messages and data streams, he said. Police are also using password-cracking code for mobile phones that hasn’t been independently assessed – and cannot be, because it is only ever sold to law enforcement, he claimed.

“Software needs an iterative process of debugging and improvement,” said Dr Jeanna Matthews, associate professor and fellow of Data and Society at Clarkson University. “There’s a huge advantage to independent third-party testing, and it needs teams incentivised to find problems, not those with an incentive to say everything’s fine.” ®

* The statistic is backed by information obtained from the NYPD via a Freedom of Information Law request by the New York Civil Liberties Union.

Source: The Register