
Over 1,000 Experts Call Out "Racially Biased" AI Designed To Predict Crime Based On Your Face

James Felton

Senior Staff Writer

James is a published author with four pop-history and science books to his name. He specializes in history, strange science, and anything out of the ordinary.


Image credit: Shutterstock / metamorworks

Have you ever messed up so badly at work that 1,000 experts banded together to tell your publisher to stop, rescind the offer of publication, and thoroughly explain itself? No? Well, spare a thought for Harrisburg University, which finds itself in exactly this situation today.

In a paper due to appear in an upcoming Springer Nature book, Transactions on Computational Science & Computational Intelligence, a team from Harrisburg University outlined a system they created, claiming in a press release (which has since been removed from the internet): "With 80 percent accuracy and with no racial bias, the software can predict if someone is a criminal based solely on a picture of their face. The software is intended to help law enforcement prevent crime."


Alarmed by the many immediate problems with, and repercussions of, using "criminal justice statistics to predict criminality," experts from a wide range of technical and scientific fields, including statistics, machine learning, artificial intelligence, law, history, and sociology, responded in an open letter, stating categorically:

"Let’s be clear: there is no way to develop a system that can predict or identify 'criminality' that is not racially biased — because the category of 'criminality' itself is racially biased," adding "data generated by the criminal justice system cannot be used to “identify criminals” or predict criminal behavior. Ever."

The authors of the letter write that research like this rests on the assumption that data on criminal arrests and convictions are "reliable, neutral indicators of underlying criminal activity," rather than a reflection of the policies and practices of the criminal justice system, and all the historical and current biases within it.

"Countless studies have shown that people of color are treated more harshly than similarly situated white people at every stage of the legal system, which results in serious distortions in the data," the group calling themselves the Coalition for Critical Technology write.


"Thus, any software built within the existing criminal legal framework will inevitably echo those same prejudices and fundamental inaccuracies when it comes to determining if a person has the 'face of a criminal.'"

Essentially – as with so many other forms of technology – the system will replicate the inherent racial biases of the data it has been fed. It would identify the face of someone whom the police may profile, a jury may convict, and a judge may sentence – all of which is tainted by prejudice.

The letter points out that such "police science" has long been used to justify racially discriminatory practices. Despite these ideas being "debunked numerous times throughout history, they continue to resurface under the guise of cutting-edge techno-reforms, such as 'artificial intelligence.'"

The letter asserts that any AI system claiming to predict criminal behavior from physical characteristics is a continuation of the long-discredited pseudosciences of phrenology and physiognomy. As well as providing Europeans with a "scientific" justification for racist beliefs in their superiority over non-white peoples, the authors state, phrenology and physiognomy were and are "used by academics, law enforcement specialists, and politicians to advocate for oppressive policing and prosecutorial tactics in poor and racialized communities."


The Coalition for Critical Technology asks that Springer Nature condemn the use of criminal justice statistics to predict criminality and acknowledge its role in "incentivizing such harmful scholarship in the past."

A statement from Harrisburg University says that "All research conducted at the University does not necessarily reflect the views and goals of this University," and that the faculty is "updating the paper to address concerns raised."

