
Thousands Of People Labeled As "Criminals" By Facial Recognition Software Were Actually Innocent

John Gomez/Shutterstock

Last year, British police departments in South Wales and Leicestershire started trialing facial recognition technology to track down suspected criminals when they are out and about. In theory, this should cut the amount of time spent looking for and identifying lawbreakers. In reality, it is a bit of a mess.

That's because the facial recognition technology is not actually that good at recognizing faces.


Take one example. During the 2017 Champions League final between Real Madrid and Juventus in Cardiff, over 2,000 fans were mistakenly identified as potential offenders – of 2,470 individuals flagged by the system, 2,297 (92 percent) were "false positives".

The software relies on cameras to scan and identify faces in a crowd and check them against a photo bank of custody images. If there's a match, the operator on shift reviews it and either disregards it or, if they agree with the algorithm, dispatches an intervention team to question the suspect. A big problem, however, is that these custody images are often poor quality and blurry. This means you only have to vaguely resemble a person in one of them to be flagged on the system as a possible felon.
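To make that matching step concrete, here is a minimal sketch of how such a watch-list check could work – not the actual NEC system South Wales Police use. Each face is reduced to a numeric "embedding", every crowd face is compared against the custody gallery, and anything scoring above a similarity threshold is queued for the human operator. The threshold, embedding size, and names below are illustrative assumptions.

```python
import numpy as np

EMBEDDING_DIM = 128      # assumed size of a face embedding vector
MATCH_THRESHOLD = 0.6    # illustrative similarity cut-off; real systems tune this carefully


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def flag_candidates(probe, custody_gallery):
    """Return (custody ID, score) pairs scoring above the threshold.

    Every hit still goes to the operator on shift, who either disregards it
    or dispatches an intervention team - the algorithm alone never decides.
    """
    hits = []
    for record_id, gallery_embedding in custody_gallery.items():
        score = cosine_similarity(probe, gallery_embedding)
        if score >= MATCH_THRESHOLD:
            hits.append((record_id, score))
    # Blurry, low-quality custody images shrink the gap between genuine matches
    # and lookalikes, which is how vaguely similar passers-by end up flagged.
    return sorted(hits, key=lambda h: h[1], reverse=True)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gallery = {f"custody_{i:04d}": rng.normal(size=EMBEDDING_DIM) for i in range(1000)}
    # Simulate a passer-by who merely resembles one of the custody images.
    probe_face = gallery["custody_0042"] + rng.normal(scale=0.3, size=EMBEDDING_DIM)
    for record_id, score in flag_candidates(probe_face, gallery):
        print(f"flag for operator review: {record_id} (similarity {score:.2f})")
```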

In a statement, South Wales Police admitted that "no facial recognition system is 100% accurate".

This is a bit of an understatement. There have been not one but several instances where false positives have vastly outnumbered true positives, including an Anthony Joshua fight where 46 fans were incorrectly identified and a Wales vs Australia rugby match where 43 were.


"I think the false positive rates are disappointingly realistic," Martin Evison a forensic science professor at Northumbria University, told Wired

"If you get a false positive match, you automatically make a suspect of somebody that is perfectly innocent."

There are also concerns about privacy, particularly as there is so little legal oversight of this type of technology. Big Brother Watch, a UK civil rights group, is planning a campaign against facial recognition, which it intends to bring to parliament later this month.

"Not only is real-time facial recognition a threat to civil liberties, it is a dangerously inaccurate policing tool," the group tweeted. 


However, others argue that this type of mass surveillance is needed to keep the public safe in crowded spaces.

“We need to use technology when we’ve got tens of thousands of people in those crowds to protect everybody, and we are getting some great results from that,” Chief Constable Matt Jukes told the BBC.

“But we don’t take the use of it lightly and we are being really serious about making sure it is accurate.”

It hasn't been a total failure. South Wales Police claim the technology has helped catch and arrest 450 criminals since its launch in June 2017, and that no one has been wrongly arrested.


“With each deployment of the technology we have gained confidence in the technology and has enabled the developers at NEC to integrate our findings into their technology updates,” a spokesperson for South Wales Police explained.

