
A police department that relies on facial recognition software has admitted that the system has a false positive rate of over 90 percent. In practice, that means nearly every person the system flags as a suspect is an innocent person who may be interrogated by police, or worse, because the technology wrongly identified them.

According to a report from the Guardian, the South Wales Police scanned the crowd of more than 170,000 people who attended the 2017 Champions League final soccer match in Cardiff and falsely identified thousands of innocent people. The cameras flagged 2,470 people as possible criminals, but 2,297 of them were innocent and only 173 were genuine matches, a false positive rate of roughly 92 percent.
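For readers who want to check the arithmetic, here is a minimal sketch (in Python, purely illustrative, with variable names of my own choosing) of how the false positive rate follows from the counts above.

```python
# False positive rate arithmetic for the 2017 Champions League final in Cardiff,
# using the counts reported above. Variable names are illustrative only.

total_alerts = 2470   # people the cameras flagged as possible matches
true_matches = 173    # flags that corresponded to actual persons of interest

false_alerts = total_alerts - true_matches        # 2,297 innocent people flagged
false_positive_rate = false_alerts / total_alerts

print(f"False alerts: {false_alerts}")                    # 2297
print(f"False positive rate: {false_positive_rate:.1%}")  # ~93.0%, reported as 92 percent
```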

According to a Freedom of Information request filed by Wired, these figures are typical for the facial recognition software used by the South Wales Police: data released by the force showed false positive rates of 87 percent and 90 percent at other events. Further, it is not clear how many of the correctly identified suspects were wanted for anything more serious than nonviolent offenses.
