New Data Shows Police State Facial Recognition Is WRONG Over 90% Of The Time



By John Vibes

A police department that relies on facial recognition software has admitted that the system's alerts are wrong more than 90 percent of the time. In other words, nearly every person the system flags as a suspect is actually an innocent person who may be stopped and questioned by police, or worse, because the technology misidentified them.

According to a report from the Guardian, the South Wales Police scanned a crowd of more than 170,000 people attending the 2017 Champions League final in Cardiff and falsely identified thousands of innocent people. The cameras flagged 2,470 people as matches against a criminal watch list, but only 173 of those matches were genuine; the other 2,297 alerts pointed at innocent people, a false positive rate of roughly 92 percent.
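The arithmetic behind that figure is straightforward, and it also shows why any system scanning a large crowd for a small number of suspects will generate mostly false alarms. The short Python sketch below recomputes the Cardiff proportion from the reported numbers, then works through a hypothetical base-rate example; the sensitivity, false alarm rate, and number of wanted persons in the crowd are illustrative assumptions, not figures from the police data.

```python
# Recompute the Cardiff figure from the numbers reported by the Guardian.
total_alerts = 2470   # people flagged by the cameras
false_alerts = 2297   # alerts that turned out to be innocent people
true_alerts = total_alerts - false_alerts  # 173 genuine matches

proportion_false = false_alerts / total_alerts
# Prints 93.0%; press reports rounded this to a "92 percent" rate.
print(f"Share of Cardiff alerts that were wrong: {proportion_false:.1%}")

# Why this happens: the base-rate problem. Even a fairly accurate system
# produces mostly false alarms when almost everyone in the crowd is innocent.
# All numbers below are illustrative assumptions, not South Wales Police data.
crowd_size = 170_000      # attendance at the Cardiff final (reported)
wanted_in_crowd = 200     # ASSUMPTION: watch-listed people actually present
sensitivity = 0.90        # ASSUMPTION: chance a wanted person is flagged
false_alarm_rate = 0.015  # ASSUMPTION: chance an innocent person is flagged

true_hits = wanted_in_crowd * sensitivity                    # ~180
false_hits = (crowd_size - wanted_in_crowd) * false_alarm_rate  # ~2,547
print(f"Expected genuine alerts: {true_hits:.0f}")
print(f"Expected false alerts:   {false_hits:.0f}")
print(f"Share of alerts that are wrong: {false_hits / (true_hits + false_hits):.1%}")
```

With those assumed numbers, about 93 percent of alerts would point at innocent people, roughly matching the Cardiff outcome: when nearly everyone scanned is innocent, even a modest false alarm rate swamps the genuine matches.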

According to data obtained through a Freedom of Information request filed by Wired, these are typical numbers for the facial recognition software used by the South Wales Police: the department's own figures showed false positive rates of 87 percent and 90 percent at other events. It is also unclear how many of the people correctly identified were wanted for anything more than nonviolent offenses.

The South Wales Police responded to these findings with the following statement:

Of course no facial recognition system is 100 percent accurate under all conditions. Technical issues are normal to all face recognition systems which means false positives will continue to be a common problem for the foreseeable future. However since we introduced the facial recognition technology no individual has been arrested where a false positive alert has led to an intervention and no members of the public have complained.

In relation to the false positives this is where the system incorrectly matches a person against a watch list. The operator considers the initial alert and either disregards it (which happens on the majority of cases) or dispatches an intervention team as the operator feels that the match is correct. When the intervention team is dispatched this involves an officer having an interaction with the potentially matched individual. Officers can quickly establish if the person has been correctly or incorrectly matched by traditional policing methods i.e. normally a dialogue between the officer/s and the individual.

The South Wales Police don't see a problem with subjecting misidentified people to searches and questioning, but these findings are raising obvious red flags for privacy advocates.

Silkie Carlo, director of the civil liberties group Big Brother Watch, is launching a campaign against facial recognition this month.

“These figures show that not only is real-time facial recognition a threat to civil liberties, it is a dangerously inaccurate policing tool. South Wales’ statistics show that the tech misidentifies innocent members of the public at a terrifying rate, whilst there are only a handful of occasions where it has supported a genuine policing purpose,” Carlo said.

The FBI released similar numbers in 2016, when the agency admitted that its facial recognition database consists mostly of innocent people, since its searches draw on driver's license and passport photos in addition to mug shots. A Georgetown Law study that year estimated that roughly half of American adults appear in a law enforcement facial recognition database. Another 2016 study found that facial recognition software disproportionately misidentifies people with dark skin.

Facial recognition is becoming part of everyday life whether people realize it or not, especially in major cities. The rollout of these systems has been largely accepted by the general public on the premise that they are highly sophisticated tools with a very high rate of accuracy. In reality, they seem to be an expensive and elaborate excuse to search innocent people.

Police departments across the world continue to expand their facial recognition databases and install cameras in every place they can. Just last week, The Free Thought Project reported that police will soon be scanning everything in their path with facial recognition software built into their body cameras.

John Vibes is an author and researcher who organizes a number of large events, including the Free Your Mind Conference. He also runs a publishing company that offers a censorship-free platform for both fiction and non-fiction writers. You can contact him and stay connected to his work at his Facebook page. John recently won a three-year battle with cancer and will be working to help others through his experience; if you wish to contribute to his treatment, consider subscribing to his podcast. This article first appeared at The Free Thought Project.
