Commenting on Essex Police pausing live facial recognition as a result of accuracy and bias risks, Jake Hurfurt, Head of Research and Investigations at Big Brother Watch, said:
“Almost a year ago, Big Brother Watch warned that Essex Police’s failure to check the accuracy of its live facial recognition (LFR) algorithm would put the rights of thousands of people at risk. These warnings have now been borne out, but only after 2.5 million people across the county have had their faces scanned by intrusive technology with known bias and accuracy issues.
“LFR as a tool of general mass surveillance has no place in a democracy like Britain, but if police are going to use it the very least the public can expect is that it doesn’t racially discriminate against people.
“It’s deeply concerning that Essex Police appear to have taken this facial recognition company’s claims at face value and deployed a system they had not even tested.
“LFR for everyday policing is authoritarian, inaccurate, and ineffective in equal measure. These concerns are borne out both by this report on Essex Police’s use of the tech and by a Cambridge study which found no noticeable impact on crime during or after its deployment.
“Police across the country must take note of this fiasco. Essex Police have serious questions to answer about how and why this tech was used so widely. AI surveillance that is experimental, untested, inaccurate or potentially biased has no place on our streets.
“The government should urgently reconsider its shocking plans to squander millions of pounds of public money on a five-fold increase of LFR across the country.”
NOTES