The Independent – Police testing technology to ‘assess the risk of someone committing a crime’ in UK

Big Brother Watch Team / July 17, 2019

Police are developing a new machine-learning analytics system to predict the risk of someone committing a crime in the future. The system uses police crime data, commercial marketing profile data, and even health and education information, and will be used to inform pre-emptive police action before someone has committed any crime.

Our Legal and Policy Officer, Griff Ferris, said:

“It’s shocking that the Home Office is squandering millions from the public purse on this dystopian predictive policing system. It poses a major risk of criminalising innocent people and lays waste to the presumption of innocence.

“This AI system is being fed data that reflects a policing strategy in which black people are stopped and searched 9 times more than white people across the country, to supposedly predict whether people are likely to commit a crime in future.

“The Home Office’s support for this system is even more surprising in light of the fact that West Midlands Police’s own Ethics Committee refused to approve its use because of serious legal and ethical concerns. A further independent review commissioned by the police questioned whether it was ethical to use data to pre-emptively intervene in people’s lives, and criticised the reliability and legitimacy of the entire system.

“There is no public appetite for Minority Report style policing in this country. It should be scrapped immediately.”

