Financial Times – Law Society warns that police algorithms ‘put justice at risk’

The Law Society has released a new report and an interactive map compiling evidence on the growing role algorithms play in the criminal justice system. The report criticises the police’s use of facial recognition and mobile extraction technology, warning that “an uncritical reliance on technology could lead to wrong decisions that threaten human rights and undermine public trust in the justice system.”

Big Brother Watch submitted written and oral evidence to the Law Society’s Technology and the Law Policy Commission, and spoke at the launch event on Tuesday.

Big Brother Watch director, Silkie Carlo, said:

This report is a stark exposé of the risks authorities pose to justice in the UK by their dangerously rapid uptake of AI and algorithms.

People’s rights and the very foundations of British justice are being quietly eroded by high tech surveillance, mass data profiling and experimental predictive analytics in our criminal justice system. Millions of people will be affected by this, but few are informed about it. The growth in the use of this technology by our authorities has been recklessly undemocratic.

The Law Society is right that oversight and better frameworks can help mitigate risks, but the bigger question is: why is the public now faced with these risks at all? How has this country allowed individual councils and police forces to introduce Minority Report-style AI and predictive analytics in secrecy, whilst small NGOs like ours are left to expose and challenge them?

Authorities have been let completely off the leash where technology is concerned, and it is high time they were reined back in.

It is not possible to mitigate the risks of intrusive mass surveillance tools like live facial recognition – they simply have no place in a democratic society.

The emergence of these new technologies and mass data collection practices warrants a much deeper consideration of what sort of country we want to be, rather than solely how much of this current mess we can clear up.

Read more in the Financial Times.