Update: Big Brother Watch’s complaint to the ICO on retailer facial recognition

Big Brother Watch Team / June 28, 2023

In July 2022, Big Brother Watch filed a legal complaint with the Information Commissioner’s Office against the Southern Co-op and Facewatch over the supermarket’s use of Facewatch’s live facial recognition software in its stores. Facewatch is a UK-based start-up that also provides the technology to the Frasers Group.

The legal complaint outlined how Facewatch’s system involves highly invasive processing of personal data, creating a biometric profile of every store visitor. Facewatch’s watchlists of blacklisted individuals are created by store employees and shared between Facewatch users in the area. There is no criminal threshold for being placed on a watchlist, and Facewatch does not receive information from the police, meaning innocent people are at serious risk of being wrongly flagged by the technology. Our legal complaint also raised the potential for bias and discrimination in the algorithms that power the system, given the widely documented tendency of facial recognition technology to misidentify people of colour.1

Following this, the Information Commissioner’s Office announced it had launched an investigation into Facewatch. The investigation concluded in March 2023 and found that Facewatch’s policies had breached data protection law, vindicating our legal complaint. Correspondence between the ICO and Facewatch (obtained through our Freedom of Information request) reveals that Facewatch had breached data protection law on a considerable number of points.

“We concluded that Facewatch’s processing of personal data failed to balance the legitimate interest of Facewatch and their subscribers against the rights and freedoms of individuals. The ICO advised Facewatch that the following data protection legislation had been breached:

  • Article 5(1)(a) – lawfulness, fairness and transparency;
  • Article 5(1)(b) – purpose limitation;
  • Article 5(1)(e) – storage limitation;
  • Article 6 – lawfulness of processing;
  • Article 9 – processing of special categories of data;
  • Article 10 – processing of personal data relating to criminal convictions and offences;
  • Recital 38 – the rights of children; and
  • Schedule 1, Part 2, s10 of the DPA 18.”

In response, Facewatch was forced to overhaul its approach to data processing, though the specific steps it took were redacted from the correspondence. The ICO also outlined a considerable number of further steps Facewatch should consider, including regularly reviewing its policies, data protection impact assessments, risk matrix, and legitimate interest for processing data.

However, after relaying these findings to Facewatch, the ICO told us that “no further regulatory action is required.”2 It is deeply disappointing that the ICO decided to take nothing more than advisory action over such a significant number of serious data protection breaches. We maintain that Facewatch’s data processing does not meet data protection and human rights standards.

In a (now deleted) post on its website and social media pages, Facewatch claimed that the “ICO Judgement Clears the Way for Facewatch” and that the ICO has “concluded that Facewatch is fully compliant with UK Data Protection law.”3 Facewatch also marketed itself with banners at events that similarly read “ICO Judgement Clears the Way for Facewatch”, with the ICO’s logo underneath.

Facewatch stand at Retail Risk London, shared on LinkedIn

As we have set out above, we now know from our FOI requests that this is untrue. Further, the ICO explicitly stated in its blog announcing the decision: “Nor does this decision [not to take regulatory action] give a green light to the blanket use of this technology.”4 The ICO later made a public statement confirming that “we had not provided blanket approval of the company” and asking Facewatch to revise their public statement and remove the ICO’s logo from their promotional materials.5

We remain deeply concerned about the use of live facial recognition on the British high street, and are disappointed by the ICO’s permissive approach to upholding the rights of data subjects. Other data protection authorities, also operating under the GDPR, have come to far more robust conclusions about the use of live facial recognition in retail environments. In Spain, the data protection regulator fined a supermarket chain €2,520,000 for using live facial recognition in stores, finding that it had no lawful basis for the data processing and that the processing was neither necessary nor proportionate.6

We continue to urge retailers to drop their use of invasive live facial recognition technology, and we encourage anyone impacted by it to get in touch.


  1. Facial recognition fails on race, government study says – BBC News, 20th December 2019
  2. Correspondence from the ICO to AWO Legal, 30th March 2023, available on request
  3. ICO Judgement Clears the Way for Facewatch – Facewatch website, 31st March 2023, archived by the Wayback Machine, accessed 15th June 2023
  4. Blog: Balancing people’s privacy rights with the need to prevent crime – Stephen Bonner, the Information Commissioner’s Office website, 31st March 2023, accessed 21st June 2023
  5. The Information Commissioner’s Office, Twitter, 22nd May 2023, accessed 21st June 2023
  6. Spanish DPA Fines Supermarket Chain 2,520,000 EUR for Unlawful Use of Facial Recognition System – Privacy and Information Security Law Blog, 30th July 2021