Britain’s obsession with surveillance is reaching new heights. Several of the UK’s largest retailers have quietly installed facial recognition checkpoints on their doorways and inside their shops. It means that automated identity checks are taking place on our high streets without customers even being aware of it.
The cameras look like any other CCTV cameras, except they take a biometric scan of every customer’s face, like at a passport e-gate. The facial recognition scans are then compared against a private database run by the software company Facewatch. The database is populated by various shops’ CCTV images of people security staff think may be involved in ‘crime or disorder’. You won’t be informed if your photo is taken and added to a watchlist; neither is a police report required for your face to be added. The system operates entirely separately from law enforcement.
Many retailers have started to use Facewatch, including Budgens, Morrisons Daily, Sports Direct, Flannels and Home Bargains. As shoplifting rates rise, more companies are installing Facewatch’s software. According to the British Retail Consortium, only 8 per cent of incidents of violence and abuse are prosecuted, and it is estimated that only a third of such incidents are even reported, since shops are giving up on the police (they aren’t the only ones).
While I understand that security staff have a difficult job to do, it would be plainly dangerous to let them replace the police, judges and juries. Yet that is effectively what has happened. Members of the public are being technologically blacklisted from major retailers on the say of security guards, often without knowing what they have been accused of or how to clear their name. The truth is, they can’t.
The story of one woman whom I will call ‘Anna’ gives a chilling insight into how Facewatch’s surveillance system works. Anna is a quiet woman in her sixties with physical health problems. She was taken to the shops by her adult children several months ago, when she was picked out and stopped by staff who told her to leave. When she asked why, security refused to tell her but said she had been flagged by their facial recognition cameras and the system said she was banned. The family were escorted out and shown a tiny sticker on the store’s entrance – obscured by racks of garden plants – indicating that Facewatch’s facial recognition was in operation.
With the support of her children, Anna contacted Facewatch to ask why she had been blacklisted. For her request to even be processed, she had to send a legal ‘Subject Access Request’ to the company with copies of her photo ID. Given that the company had already covertly taken and misused photos of her, this seemed like an absurd and insulting demand, yet it was a legal requirement to find out what data they held on her.
It turned out that a security guard had suspected her of taking goods worth £1 – a mere £1 – from a store some weeks earlier and had then added her photo to the database. When asked if the company had any video evidence of the alleged incident, the answer was no. Anna says that’s because it never happened. It certainly would be out of character for a lady in her sixties to launch a criminal career by taking an item worth so little from a shop where she regularly spends considerable amounts.
Nevertheless, Facewatch insists she is a legitimate target of their hi-tech system and that she will remain blacklisted from this retailer and other shops in the area that use its surveillance system. Anna doesn’t know where she can and can’t go without risking the public humiliation of being flagged by facial recognition cameras and accused of being a thief. She is stuck in limbo: treated like a criminal and with no ability to prove her innocence.
As well as the risks of relying on security guards’ hunches to build a hi-tech privatised policing system, the technology itself is also not free from error. ‘Sara’ is a teenager who was flagged by a facial recognition camera in a Home Bargains store she had never visited before. She was accosted by staff who proceeded to search her bag before throwing her out of the store. They told her she was banned from shops and supermarkets using the surveillance system. After Sara went through the same laborious complaint process as Anna, it transpired that on this occasion Facewatch was willing to admit that there was a system fault: Sara simply looked like someone else on the database.
One of the problems of relying on security guards’ intuition is that they risk falling prey to prejudice and stereotypes. Both Anna and Sara are women of colour. It’s perhaps less likely that a middle-class white woman would be added to a blacklist for an unevidenced £1 theft allegation.
Facewatch collects photos of what it calls ‘subjects of interest’ using ‘local intelligence’. This type of language reflects a deeper transformation taking place in society. Shop staff are now ‘frontline workers’ armed with body-worn cameras. I remember when checkout staff were people you had a chat with and who offered to pack your shopping for you. Now, we check out our shopping under the watch of a camera and a screen broadcasting potential transgressions to other customers. Suspicion is creeping into everyday life and is being normalised to help protect corporate profits.
Live facial recognition in shops is considered unlawful across most of the democratic world. A Spanish supermarket was fined €2.5 million by its data watchdog for using it. The UK has the same data laws as the EU, in the form of the GDPR, but the UK Information Commissioner has let surveillance tech spread, under pressure from ministers in the last government to act in Facewatch’s favour. What faith can we have in the regulators anyway? The last Biometrics Commissioner left his post to become a director at Facewatch.
– Silkie Carlo, Big Brother Watch Director