Live facial recognition: what to do if you are stopped by facial recognition cameras

Big Brother Watch Team / July 12, 2023

If you find yourself incorrectly identified by a facial recognition camera, Big Brother Watch can help.

Get in touch via email or our social media accounts, where we can offer advice, legal support and potentially assist you in seeking redress. As facial recognition technology becomes more commonly used across the UK, wrongful stops are increasingly likely.

Live facial recognition is a mass surveillance tool. Police forces in London and Cardiff are using live facial recognition technology, as are shops such as the Southern Co-op, Sports Direct, House of Fraser and Flannels. Facial recognition cameras have been deployed at shopping centres, concerts, football matches and rugby matches, where thousands of innocent people have been scanned by AI-powered surveillance. Facial recognition technology is notoriously inaccurate, and many studies have found that it is less accurate on darker skin tones, leading to people of colour being disproportionately misidentified.

Facial recognition technology scans the faces of people who pass in front of a camera and compares them against a watchlist. As well as problems with the accuracy of the algorithms that power facial recognition, people have been wrongly stopped because they were mistakenly placed on a watchlist. We have witnessed and received reports of people being wrongly placed on police and retailer watchlists and stopped by this technology, and we have supported those affected with legal advice and representation.

Police use of facial recognition in the UK

The Metropolitan Police and South Wales Police have used live facial recognition in London and Cardiff, at gigs, sporting events, Notting Hill Carnival, royal events and more. There have been many incorrect facial recognition matches by the police, leading to innocent people being stopped, having their fingerprints scanned, being asked for ID and being held until they can prove their innocence. Data from the Met and South Wales Police shows that over 80% of facial recognition flags have been incorrect since the technology was first deployed in 2017.

Other police forces have also experimented with facial recognition, including Humberside Police, South Yorkshire Police, Leicestershire Police and Greater Manchester Police. Northamptonshire Police used live facial recognition vans borrowed from South Wales Police to monitor the Grand Prix at Silverstone, explicitly stating that the technology was being used to identify protesters. We are particularly concerned that facial recognition could be increasingly used at protests, given the chilling effect it has on freedom of expression and assembly.

Shops using facial recognition in the UK

Major retailers, such as Frasers Group and the Southern Co-op, are using facial recognition provided by the UK company Facewatch. Facewatch have also provided their technology to Costcutter, Budgens, Eat 17 and Spar stores. This privatised policing will likely see many people falsely flagged by facial recognition CCTV and wrongly barred from stores. People have already been wrongly stopped by facial recognition cameras in supermarkets, leading to embarrassing and distressing encounters. Big Brother Watch has supported those affected by wrongful facial recognition stops, advised people on their rights and provided legal support.

As facial recognition surveillance is used more widely, more people will be harmed by this intrusive and inaccurate technology. If you are wrongly stopped by facial recognition, whether through a misidentification or through being wrongly included on a watchlist, we strongly encourage you to contact us, as we can provide you with advice, support and legal assistance.

After the incident, make a record of what happened and, if you are comfortable doing so, please email it to us or reach out to us on social media, where a member of our team can advise you on next steps.