The vast majority of matches recorded by the Metropolitan Police from its deployments of live facial recognition over the years have been false.
- 85%, or six out of every seven matches, have been false.
- 15%, or one in seven alerts, were correct matches.
Not all of the matches the Met claims to be true have been confirmed as definite true matches, meaning the false match figure may be even higher. In Big Brother Watch’s observations of LFR deployments in London, we have seen a number of people trigger an alert who were not then stopped by officers, yet these matches have sometimes been recorded as true without additional verification.
The 85.7% figure expresses false matches as a percentage of the total number of facial recognition matches obtained by the Met Police since its first deployment in 2016. There have been 175 matches in total, of which 150 have been false and 25 have been recorded as true.
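The calculation behind this figure is straightforward. As a sketch (in Python, which this page does not otherwise use), using the cumulative totals quoted above:

```python
# False match rate: false matches as a share of ALL matches,
# using the Met's cumulative totals since 2016.
total_matches = 175
false_matches = 150
true_matches = total_matches - false_matches  # 25 recorded as true

false_match_rate = false_matches / total_matches
print(f"False match rate: {false_match_rate:.1%}")  # prints "False match rate: 85.7%"
```

This is roughly six out of every seven matches, consistent with the headline figure.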
Professor Peter Fussey, from the University of Essex, used similar methodology to calculate the accuracy rate of the Live Facial Recognition deployments he assessed in the Independent Report On The London Metropolitan Police Service’s Trial Of Live Facial Recognition. The study, commissioned by the Met Police, found that in the limited number of deployments it observed, 63.64% of matches leading to a stop were inaccurate (14 of 22 total matches), and just 36.36% (8 of 22) were accurate.
Similarly, South Wales Police has returned false matches for 2,825 of its 3,140 LFR flags, giving it a false match rate of 90%.
The Metropolitan Police chooses to use different metrics which present Live Facial Recognition as much more accurate than it is.
The False Positive Identification Rate (FPIR) used by the Met Police measures the number of false matches against the total number of faces scanned, with the figure quoted by the Met Police being 1 in 6,000, or 0.017%. Because this figure is calculated independently of the number of true matches, it allows the Metropolitan Police to overstate the algorithm’s accuracy.
These figures are correct as of 16 May 2023 and will vary slightly as new deployment figures are published; however, the methodology used to calculate inaccuracy remains the same.
At an LFR deployment at Oxford Circus in February 2020, the LFR cameras scanned 8,600 faces and generated seven false alerts and one genuine alert. By the Met Police’s logic, the inaccuracy (FPIR) rate would be 0.081% – despite seven of the eight matches (87.5%) being incorrect.
Clearly, the Met Police’s figure misrepresents the frequency of false matches: it would allow a deployment that biometrically scanned 100,000 people and generated 17 false matches, with no true ones, to report an inaccuracy rate of just 0.017% – rather than showing the reality that every match was wrong.
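The gap between the two metrics can be shown with the Oxford Circus figures quoted above (8,600 faces scanned, seven false alerts, one genuine alert). A minimal sketch, again in Python:

```python
# Two ways of measuring the same deployment (Oxford Circus, Feb 2020).
faces_scanned = 8_600
false_alerts = 7
true_alerts = 1

# The Met's metric: false alerts as a share of all faces scanned.
fpir = false_alerts / faces_scanned
# The metric used here: false alerts as a share of all alerts generated.
false_match_rate = false_alerts / (false_alerts + true_alerts)

print(f"FPIR: {fpir:.3%}")                          # prints "FPIR: 0.081%"
print(f"False match rate: {false_match_rate:.1%}")  # prints "False match rate: 87.5%"
```

Both numbers describe the same eight alerts; the FPIR simply dilutes the seven errors across every face that passed the camera.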
| Where | Date | True Positives | False Positives |
|---|---|---|---|
| Charles III Coronation | 06/05/2023 | 2 | 0 |
| Highbury and Islington Station | 20/04/2023 | 1 | 0 |
| Romford Town Centre | 14/02/2019 | 3 | 13 |
| Romford Town Centre | 31/01/2019 | 3 | 7 |
| Central London (Piccadilly and Leicester Square) | 17/12/2018 – 18/12/2018 | 2 | 12 |
| Port of Hull docks (with Humberside Police) | 13/06/2018 – 14/06/2018 | 0 | 0 |
| Remembrance Day 2017 | 11/11/2017 | 1 | 6 |
| Notting Hill Carnival 2017 | 26/08/2017 – 27/08/2017 | 1 | 95 |
| Notting Hill Carnival 2016 | 28/08/2016 – 29/08/2016 | 0 | 1 |
Have any questions about these statistics, or any other details about our campaign to stop facial recognition? Get in touch at email@example.com.