Landmark legal challenges launched against facial recognition after police and retailer misidentifications

Big Brother Watch Team / May 24, 2024

  • An anti-knife crime community worker was wrongly flagged, detained, and questioned by police for almost 30 minutes following a misidentification by the Met’s live facial recognition system

  • A teenage girl was misidentified by live facial recognition and wrongly accused of being a shoplifter in a Home Bargains store, where she was searched, removed from the store and told she was barred from a number of shops across the UK

  • In a UK first, both are bringing separate legal challenges to seek to roll back facial recognition surveillance and “defend people’s rights”

  • Director of Big Brother Watch and co-claimant against the Met Police, Silkie Carlo, warned that facial recognition is “inaccurate and dangerously out of control” and “must be banned”

An anti-knife crime community worker from London and a young woman from Manchester have launched separate legal challenges against the Metropolitan Police and surveillance company Facewatch respectively after being “grossly mistreated” as a result of live facial recognition cameras wrongly flagging them as criminals.

38-year-old Londoner Shaun Thompson was returning home from a volunteering shift with Street Fathers, a community organisation that provides a positive male presence to young people and tackles knife crime, when he was wrongly flagged as a person on the Metropolitan Police’s facial recognition database outside London Bridge station. He was held for almost 30 minutes by officers who repeatedly demanded scans of his fingerprints and threatened him with arrest, despite him showing multiple identity documents evidencing that he was not the individual on the facial recognition database.

In a separate incident in February, 19-year-old “Sara” (not her real name) was shopping in Home Bargains in Manchester when she was wrongly flagged as a suspected shoplifter by automated facial recognition cameras. She was searched, publicly thrown out of the store, told by staff that she was a thief, and banned from shops and supermarkets up and down the country using the same spying software, operated by UK firm Facewatch.

Facewatch was investigated by the UK data watchdog, the Information Commissioner, who concluded last year that the firm had breached data protection laws on eight counts. However, the Policing Minister Chris Philp, who had private meetings with the firm whilst it was under investigation, has “lobbied” regulators to act “favourably” towards the company and vowed he would “do everything possible to promote (its) widespread use”.

Sara says she was left feeling anxious after being wrongly, publicly accused of being a criminal. After taking legal action with the support of civil liberties campaign group Big Brother Watch, Facewatch admitted she had been misidentified by their facial recognition system. Big Brother Watch said this case is the “tip of the iceberg” and that more and more people are seeking help from the campaign group after being falsely accused as a result of live facial recognition.

Home Bargains as well as a number of other shops including Southern Co-op supermarkets, Flannels and Sports Direct have installed live facial recognition cameras in stores across the country. The technology allows shop security workers to covertly take photos from CCTV feeds of people they suspect have shoplifted or been “antisocial” in stores and add them to facial recognition blacklists, causing an automated alert to staff if a so-called “subject of interest” walks into the store.

The system can also be used to share photos of “subjects of interest” with other companies that buy access to the facial recognition software – even though facial biometric data is as sensitive as passport data. Shoppers’ photos can be shared without their knowledge within an eight-mile radius of the store where they are taken in London, or up to a 46-mile radius in rural locations.

In Europe, data protection authorities have fined supermarkets for using live facial recognition, stating it is unlawful under the GDPR and violates individuals’ data rights. In the US, multiple cities and states have banned the use of live facial recognition after misidentifications, while the FTC has barred companies from using the technology in their stores. Earlier this year, the European Union agreed to impose stringent restrictions on the use of facial recognition by law enforcement under the AI Act. Research shows that the technology can be “dangerously inaccurate”, particularly with people of colour and women.

The Metropolitan Police has been using live facial recognition sporadically in London since 2016, but following Government plans for a significant expansion of the surveillance, the force is on track to increase deployments by 600% in 2024 compared to 2023. Critics have raised concerns about which people are added to the police’s so-called “watchlists”, which include not only suspects of crime but victims, people thought to pose a “risk of harm” to themselves, and associates of any of those people. Police have previously populated watchlists with protestors not wanted for any offences whatsoever, and people with mental health issues not suspected of any offences.

Police use of facial recognition is not enabled by any specific piece of legislation and has not been authorised by parliament. Police forces have written their own policies about how and where it can be used. Last year, 65 parliamentarians and 130 civil society organisations called for an urgent stop to the use of live facial recognition, citing concerns about injustice, democratic rights, and an insufficient legislative basis to monitor the public with the technology.

The Government recently announced that it will soon launch a facial recognition ‘strategy’, and is encouraging police forces and retailers to use the controversial technology.


Silkie Carlo, director of Big Brother Watch and co-claimant against the Metropolitan Police, said:

“These legal challenges are a landmark step towards protecting the public’s privacy and freedom from Orwellian live facial recognition. Live facial recognition surveillance turns our faces into barcodes and makes us a nation of suspects who, as we’ve seen in these disturbing cases, can be falsely accused, grossly mistreated and forced to prove our innocence to authorities. It is a total inversion of our most basic civil liberties.

“Facial recognition is inaccurate and dangerously out of control in the UK. No other democracy in the world spies on its population with live facial recognition in the cavalier and chilling way the UK is starting to, and it is alarming that the Government is seeking to expand its use across the country.

“Shaun and Sara’s stories are proof that facial recognition surveillance poses a real threat to the public’s rights and should be urgently banned. It’s vital we roll back facial recognition and defend people’s rights with this groundbreaking legal action.”

Sara*, who is bringing legal action against Facewatch and Home Bargains, said:

“I was visiting Home Bargains and had only just got into the entrance of the store when a member of staff came up to me and told me I had to leave the store immediately because I was a thief. I had my bag searched and was escorted out of the shop, and was told due to facial recognition I was banned from shops across the whole country.

“I have never stolen in my life and so I was confused, upset and humiliated to be labelled as a criminal in front of a whole shop of people.

I think it is wrong and dystopian to treat every shopper like a potential criminal who needs to have their face scanned. Shops should be banned from using this technology.”

Shaun Thompson, who is bringing a joint judicial review against the Metropolitan Police, said:

“I work with Street Fathers which helps the youth of London. I patrol to help keep kids safe and protected and to get knives off the streets. I was coming home from a street patrol in Croydon, when I was pulled out of the street at London Bridge due to facial recognition. They were telling me I was a wanted man, trying to get my fingerprints and trying to scare me with arrest, even though I knew and they knew the computer had got it wrong.

“Instead of working to get knives off the streets like I do, they were wasting their time with technology when they knew it had made a mistake.

“I was angry that I had been stopped by a flawed technology that misidentified me as someone else and was treated as though I was guilty.

“I’m bringing this legal challenge because I don’t want this to happen to other people. Facial recognition is like stop and search on steroids and doesn’t make communities any safer. It needs to be stopped.”



  • Spokespeople are available for interview – please contact 07730439257 or

  • Big Brother Watch is at the forefront of the UK campaign to stop live facial recognition surveillance – our campaign page, with details of our latest research and investigations, including facial recognition inaccuracy statistics, is available at

  • We are crowdfunding to enable this legal action