Police and private companies in the UK have been quietly rolling out facial recognition surveillance cameras, taking ‘faceprints’ of millions of people — often without their knowledge. This is an enormous expansion of the surveillance state — and it sets a dangerous precedent worldwide. We must stop this dangerously authoritarian surveillance now.
Our director Silkie Carlo is challenging the Met Police's use of live facial recognition surveillance alongside Shaun Thompson, a community worker who was misidentified by this Orwellian technology at London Bridge.
This is a David vs Goliath fight and the fight against live facial recognition expansion needs you now more than ever.
Support Big Brother Watch
With your support we will:
Back groundbreaking legal action against police and shops' use of facial recognition
Demand politicians roll back live facial recognition
Give legal advice & support to people affected by live facial recognition
Every penny you donate will go directly towards our work fighting the spread of this Orwellian surveillance technology and protecting your privacy.
Tens of millions of people in the UK have been scanned by live facial recognition surveillance and this number is increasing daily. Police forces, shopping centres, concerts, museums, stadiums and bars have all been found to use this surveillance.

Research shows that facial recognition discriminates against women and people of colour. 80% of people misidentified by facial recognition in London in 2025 were Black.

For a decade, police have used live facial recognition technology without any specific laws passed by Parliament. The UK's human rights regulator (Equality and Human Rights Commission) believes the Met Police’s use of live facial recognition is unlawful.

The Home Office has launched a consultation on the use of facial recognition and similar technologies.
Now is your chance to make your voice heard and tell the Home Office about the dangers facial recognition poses to our civil liberties.
You can use the tool we have provided below to tailor your response using your own background, experiences, skills, and concerns. The Home Office is more likely to consider responses that are unique instead of those that are copied and pasted directly.
You must respond by 12/02/2026.
We’re sending a message to the Home Secretary and Met police chief Mark Rowley that facial recognition has no place in British policing.
We’re urging them to immediately stop using authoritarian live facial recognition, and if enough of us join the call, we can make them rethink. Sign (and share!) here:
SIGN HERE
SIGN THE PETITION
Tell the Met Commissioner and Minister for Policing to stop using facial recognition surveillance now.
SUBSCRIBE
Don't miss out on live facial recognition alerts! Subscribe to our newsletter to stay informed on campaign actions.
Got a tip? If you have any information about facial recognition surveillance in the UK that might help our investigation, please let us know. Email us at info@bigbrotherwatch.org.uk or text us on Signal at +44 7514913266
Our investigation into facial recognition in the UK is ongoing and we put information into the public domain as soon as we can. We work with media partners, including the BBC, to widely publish our findings. We have also passed on all of our findings to the data regulator, the ICO, and pressed them to investigate.
From unravelling the stories of two people who were wrongly accused of being criminals following facial recognition misidentifications to documenting the first ever trial of a city-wide facial recognition zone in Cardiff, we're on the ground collecting evidence to fight against the expansion of mass facial recognition surveillance in the UK.
When we are tipped off about future facial recognition deployments we attend to demonstrate, hand out leaflets, and observe any police/security action. It’s important we observe the police, so we can record and intervene in incidents like this. Want to join in? Sign up to our email updates.
We’re taking the campaign to the heart of power. We’ve launched two groundbreaking reports in Parliament, hosting high-profile MPs and Peers. We’ve since supported MPs tabling questions in parliament, we’ve circulated briefings for debates, and we’ve exposed the dangers of facial recognition in evidence for parliamentary committees and government working groups. (Find it all here.) We’ll continue pushing MPs and Peers to speak out against the use of facial recognition wherever we can.
We’re leading a national campaign against facial recognition, bringing together the country’s rights and race equality groups and politicians from across the political spectrum to stand together against this surveillance expansion. In September 2023, we issued a call for an urgent stop to facial recognition surveillance by police and private companies supported by over 180 tech experts and organisations.
We’re leading the fight back against the use of facial recognition surveillance in the UK – and our campaign is built on extensive expertise and analysis. We’ve published two groundbreaking reports, in 2018 and 2023, that lay out where, when and how facial recognition is being used and what needs to change.
Facial recognition works by rapidly creating a biometric “faceprint” of your face – sensitive data that uniquely identifies you, much like a fingerprint – and comparing it against a database to find similar matches.
Some facial recognition companies even check faces against internet data in real-time, trawling the entire internet to identify any photos of you, posted anywhere.
We have investigated the Metropolitan Police’s use of live facial recognition since ‘trials’ began in 2016.
Big Brother Watch has attended dozens of London deployments, where we have witnessed multiple misidentifications and alarming uses of the technology, including the repeated targeting of Notting Hill Carnival and the prevention of people with suspected mental health issues, who were not wanted for arrest, from attending a Remembrance Sunday event.
In February 2024 Shaun Thompson, a Black anti-knife crime community volunteer, was misidentified by live facial recognition technology. Shaun is now bringing a legal challenge, alongside our Director Silkie Carlo, against the Metropolitan Police.
In 2025, an unprecedented network of fixed live facial recognition cameras was installed across Croydon town centre. The same year, the UK's human rights regulator said that it believes the Met Police’s use of LFR is unlawful. Force webpage
South Wales Police is the UK's national lead on facial recognition, supported by millions of pounds in Home Office funding. Its use was subject to a legal challenge by Cardiff resident, Dr. Ed Bridges, and in 2020, the Court of Appeal found the technology had not been used in accordance with the law.
Since February 2025, South Wales Police have been turning Cardiff city centre into a zone of surveillance by installing a network of live facial recognition cameras around the city centre for sporting events. Force webpage
The force began experimenting with live facial recognition in October 2023. Big Brother Watch uncovered that Essex police failed to consider the potentially discriminatory impacts of its deployments. Force webpage
The force started using live facial recognition on 21 October 2025, deploying this Orwellian technology in Sale, Bolton, and Manchester city centres. Force webpage
The force first deployed live facial recognition technology at the Bedford River Festival in 2024, scanning the faces of over 400,000 people in just two days. Force webpage
The force began experimenting with biometric surveillance in Portsmouth, Southampton, Basingstoke and Winchester during four deployments in September 2024, taking over 320,000 biometric face scans. Force webpage
The force piloted live facial recognition for the first time in February 2025 in Ipswich town centre. They borrowed vans from Essex police to trial the technology, scanning over 47,000 people in 6 hours and arresting only 1 in every 80 people with whom they engaged. Force webpage
The force deployed LFR for the first time in Crawley Town Centre on 13 November - no arrests were made, despite the police taking biometric face scans from over 23,000 innocent members of the public. Force webpage
The force used LFR for the first time on 13 November 2025 in Redhill Town Centre and 60% of those flagged for further inquiries by the police were not arrested. Force webpage
The force first used facial recognition surveillance in Oxford in December 2025 and has since deployed the technology in High Wycombe, Milton Keynes and Reading. Force webpage
The force experimented with live facial recognition for the first time in Leeds City Centre over four deployments between November and December 2025. The force made a single arrest, despite dozens of officers spending over 13 hours of policing time and scanning over 31,000 faces. Force webpage
The force deployed live facial recognition on three occasions in 2024, including at a football match, scanning over 40,000 people and making no arrests. The cost of these deployments was £18,625.11 (obtained via Freedom of Information request). Force webpage
Northamptonshire Police first trialled LFR in July 2023, as part of the policing of the F1 Grand Prix at Silverstone Park. Prior to the deployment, the force said it would be used to combat "unlawful protest".
Our Freedom of Information request found that of the 790 people on the watchlist, just 234 people were “wanted for arrest, either on a warrant and/or suspicion of criminal activity”, making it likely that many people on the watchlist were innocent protesters. There were no positive alerts and no arrests made, despite over 400,000 people being scanned. The force deployed the technology again at the 2024 Grand Prix and again made no arrests. Force webpage
Asda launched a live facial recognition trial in five of its stores in the Greater Manchester area in 2025. The trial represents the first time a UK-wide supermarket has installed live facial recognition. In response, we sent the world’s largest digital ad van to tour the five stores, warning shoppers that Asda is “rolling back your privacy”.
We also filed a legal complaint with the Information Commissioner’s Office, arguing that Asda’s use of live facial recognition cameras in its supermarkets is “unlawful”.
Our legal complaint alleges that Asda’s facial recognition system, provided by surveillance firm FaiceTech, is processing “data with a high degree of risk to data subjects’ rights for private benefit”.
In June 2025, supermarket chain Iceland announced that it is deploying live facial recognition in two of its shops in the north of England, with plans to expand it further later in the year.
In September 2025, Sainsbury's announced a trial of live facial recognition technology in two of its stores in London and Bath, and in January 2026, it was reported that the retailer would expand the technology to five additional retail stores.
The Southern Co-op has been using live facial recognition since 2021 in the south of England. 35 stores across Portsmouth, Bristol, Hove, Bournemouth and London are using the technology to spy on shoppers. To our knowledge, this is the first supermarket in the UK to permanently install facial recognition. The technology is provided by Facewatch.
In 2022, we issued a legal complaint to the Information Commissioner’s Office about Facewatch and the Southern Co-op’s use of this technology.
Frasers Group, owned by Mike Ashley, also use Facewatch’s live facial recognition technology in a number of its stores, which include Flannels, House of Fraser, Sports Direct and USC. In 2023, we wrote to Mike Ashley with nearly 50 parliamentarians, calling on him to stop scanning customers.
PimEyes is an online facial recognition search engine, which trawls the open internet for facial images. The technology allows anyone to upload an image of a person to their website, which is then processed using facial recognition technology to find potential matches from an index of billions of photos from the internet.
PimEyes places no limits on the type of images that may be used for search and has no safeguards to prevent people using their service to extract a library of photos of someone other than themselves, including children. In 2022, we submitted a legal complaint to the Information Commissioner’s Office about the risk PimEyes poses to data rights and privacy.
| Company | Supplier |
|---|---|
| Birdworld | Facewatch |
| Booths | |
| Budgens | Facewatch |
| B&M | Facewatch |
| Cadbury Garden Centre | |
| Costcutter | Facewatch |
| EAT 17 | Facewatch |
| Flannels | Facewatch |
| Gatwick Airport | |
| Gieves and Hawkes | Facewatch |
| Gordons Wine Bar | Facewatch |
| Haskins Garden Centres | Facewatch |
| Heathrow airport | |
| Hobbycraft | Facewatch |
| Home Bargains | Facewatch |
| Iceland | Facewatch |
| Ladbrokes/Coral | |
| Lawrences Garages | Facewatch |
| Leicester Racecourse | Facewatch |
| Londis | Facewatch |
| Luton Town Football Club Shop | Facewatch |
| Manchester Airport | |
| Middlesbrough Empire | OYN-X |
| Mole Avon Country Stores | Facewatch |
| Nisa Local | Facewatch |
| Palm Beach Casino | Facewatch |
| QD Stores | Facewatch |
| River Island | Facewatch |
| Rowans Bowling Alley | Herta Security |
| Ruxley Manor | Facewatch |
| RWB Auctions | Facewatch |
| Sainsbury's | Facewatch |
| Southern Co-Op | Facewatch |
| SPAR | Facewatch |
| Sports Direct | Facewatch |
| Symposium | Facewatch |
| The Gym | |
| Tian Tian Market | Facewatch |
| USC | Facewatch |
| Village Wholefoods, Clapham | Facewatch |
| Welcome Break | |
| Whitehall Garden Centres | Facewatch |
The London council ran a secret trial of facial recognition surveillance for 3 days in 2019, without the knowledge of residents. It used 4 cameras in public spaces, including a train station and the borough’s main shopping street.
Read more
We discovered that The Broadway shopping centre trialled facial recognition in August 2018, working with Customer Clever, Omega Security and Axis Communications. Customer Clever tweeted that the trial would identify “problem” individuals and “provide demographic information to the shopping centre”.
We found evidence that facial recognition has been used at the Amex football stadium. This appears to be confirmed for the Albion v Crystal Palace match in January 2018.
POLICE USE OF LIVE FACIAL RECOGNITION
Live facial recognition (LFR) matches faces on live surveillance camera footage against a police watchlist in real time. Before a deployment, the police will prepare a watchlist, which can comprise police-originated images, such as custody images from the police national database, and non-police-originated images, for example publicly available, open-source images or information shared by other public bodies.
At the deployment, a camera will capture a live video feed, from which the LFR software will detect human faces, extract the facial features and convert them into a biometric template to be compared against those held on the watchlist. The software generates a numerical similarity score to indicate how similar a captured facial image is to any face on the watchlist. Police set a threshold for these scores, and any matches above this threshold are flagged to the police, who may then decide to stop the individual.
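The matching step described above can be sketched in a few lines. This is an illustrative toy only: real LFR systems derive biometric templates with neural networks, and the function names, the similarity measure (cosine similarity) and the example vectors here are all invented for illustration, not taken from any actual police system.

```python
import math

def cosine_similarity(a, b):
    """Similarity score between two biometric templates (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(captured, watchlist, threshold):
    """Compare a captured face template against every watchlist template.

    Returns (name, score) pairs at or above the operator-set threshold,
    highest score first; an empty list means no alert is raised.
    """
    alerts = [
        (name, cosine_similarity(captured, template))
        for name, template in watchlist.items()
    ]
    return sorted(
        [(name, score) for name, score in alerts if score >= threshold],
        key=lambda pair: pair[1],
        reverse=True,
    )

# Hypothetical 3-dimensional templates (real systems use hundreds of dimensions).
watchlist = {
    "person_a": [0.9, 0.1, 0.3],
    "person_b": [0.2, 0.8, 0.5],
}
captured_face = [0.88, 0.15, 0.28]

# The threshold is a policy choice: lowering it flags more faces,
# which also means more false positives.
print(match_against_watchlist(captured_face, watchlist, threshold=0.95))
```

The key point the sketch makes concrete is that a "match" is never a certainty, only a similarity score crossing a threshold the operator chose, which is why misidentifications like those described on this page occur.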
It is important to draw a distinction between live facial recognition and retrospective facial recognition. Live facial recognition uses a real-time video feed to biometrically scan the faces of members of the public almost instantaneously. Retrospective facial recognition (“RFR”) uses facial recognition software on either still images or video recordings taken in the past and compares the detected faces to photographs of known individuals held on photo databases.
Whilst RFR may have limited use as a forensic investigatory tool, subject to strict regulation, live facial recognition amounts to constant generalised surveillance. The British public would not be comfortable with fingerprint scanners on our high streets; there is no reason we should accept being subjected to a biometric police line-up either.
It is noteworthy that LFR is most enthusiastically embraced by authoritarian regimes, like Russia and China, whilst other democratic countries have taken measures to restrict its use. Several US states and cities have implemented bans and restrictions on the use of LFR and the EU has implemented the AI Act, which prohibits the use of LFR for law enforcement purposes, except in the most serious and strictly defined cases with a requirement of judicial authorisation. States must pass domestic law in order to use LFR and each use must be reported to the data protection authority.
This is a far cry from the UK’s unregulated approach and lack of oversight.
Explore the full list of questions and answers here.
SHOPS' USE OF LIVE FACIAL RECOGNITION
Live facial recognition (LFR) matches faces on live surveillance camera footage against a watchlist in real time. Companies add individuals whom they want to exclude from a store to a tailored watchlist, which generally comprises images taken from the customer’s previous visits to a store. Facial recognition software companies also offer ‘National Watchlists’, compiled from images and reports of incidents of crime and disorder uploaded by their customers across the UK.
A camera placed at the entrance of a store will then capture a live video feed, from which the LFR software will detect human faces, extract the facial features and convert them into a biometric template to be compared against those held on the watchlist. The software generates a numerical similarity score to indicate how similar a captured facial image is to any face on the watchlist. Any matches above this pre-set threshold are flagged to shop staff, who then deal with the individual in line with the retailers’ policy.
In the retail context, LFR is often touted as a solution to combat shoplifting and anti-social behaviour. Following the ICO’s investigation of Facewatch, the regulator held that in order to comply with data protection legislation and human rights law, retailers could only place individuals on a watchlist where they are serious or repeat offenders. The evidence we have collated demonstrates that, in practice, members of the public are placed on retailers’ watchlists for very trivial reasons, including accusations of shoplifting goods valued at as little as £1. This shows not only that LFR companies are failing to comply with regulatory decisions, but also that the technology is being used disproportionately, as it is not targeted only at the most harmful perpetrators.
In the Justice and Home Affairs Committee Inquiry on ‘Tackling Shoplifting,’ Paul Gerrard, Public Affairs and Board Secretariat Director at The Co-op Group, gave oral evidence that the company has no plans to implement LFR because it “cannot see what intervention it would drive helpfully.” Gerrard highlighted the ethical implications of employing a mass surveillance tool in a shop, as well as the heightened risk of violence and abuse to retail employees who have to confront shoppers if the LFR system flags them. His evidence reflects our position that there is no place for this invasive software from both the perspective of shoppers and retail workers.
There is also a significant divergence between the level of intrusion associated with traditional security systems and facial recognition surveillance. LFR is an invasive form of biometric surveillance, which is linked to a deeply personal identifying feature (i.e., an individual’s face) and is deployed in public settings, often without the consent or knowledge of the person being subjected to checks. Additionally, unlike “traditional” blacklists held by shops, which might comprise photographs of known local offenders, LFR could flag an individual in a shop they have never previously visited, producing a far greater magnitude of surveillance.
The private use of live facial recognition creates a new zone of privatised policing. It emboldens staff members to make criminal allegations against shoppers, without an investigation or any set standard of proof, and ban them from other stores employing the software. Clearly, when errors are made, this has profound implications for the lives of those accused, with little recourse for challenging the accusations. The lack of oversight and safeguards means that vulnerable individuals, including young people and those with mental health issues, are particularly at risk of being included on watchlists and leaves the door open to discriminatory and unfair decisions with significant impacts.
Explore the full list of questions and answers here.