BIG BROTHER WATCH WRITES TO UN SPECIAL RAPPORTEUR ON EXTREME POVERTY AND HUMAN RIGHTS AHEAD OF HIS UK VISIT
Big Brother Watch has written to the UN Special Rapporteur on Extreme Poverty and Human Rights to raise the alarm about the hidden use of automation and predictive analytics for decisions about benefits entitlements and social care, and the negative human rights impact on the UK’s poorest people. The submission follows the more than 1,000 Freedom of Information requests we sent to authorities about their uses of AI, algorithms and big data in decision-making.
In our submission to the UN Special Rapporteur, we explain that authorities’ lack of transparency and inept legal frameworks mean that the rapid adoption of new technologies is engaging human rights in ways that are difficult to analyse or challenge.
“Claimants already have to deal with a frequently changing and punitive assessment process – now they are being affected by complex technological systems they rarely know about, cannot understand, and are not able to challenge.”
The new tools being used by authorities include risk analysis and profiling; automated fraud detection; predictive analytics, spanning from late rent payments to ‘child abuse’ and ‘youth offending’; and benefit entitlement calculations – and have previously even included voice analytics to detect claimants who are ‘lying’.
These new technologies are largely supplied by private contractors and are fuelled by big datasets, sometimes involving hundreds of millions of data items and including sensitive fields such as health data, education records and ethnicity.
Human rights and welfare
The UK’s welfare system touches on a spectrum of rights: the right to life, the right to health, the right to be free from inhuman or degrading treatment, freedom from discrimination, the right to education, the rights of children, the right to fair work, access to justice, and the right to peaceful enjoyment of property.
In the context of welfare estimations and decisions, the stakes could not be higher. This myriad of rights is engaged and people’s lives, health and social integration are often at risk.
Welfare and social care decisions should be transparent, accessible and challengeable for officials and claimants alike – not just for highly trained lawyers, but for the members of the public actually affected by those decisions.
Loopholes in the law
As it stands, UK citizens can be subject to purely automated decisions by the authorities – even where those decisions engage their rights. This is a step change and could have huge implications with the dawn of new and emerging technologies.
Big Brother Watch lobbied for vital protections during the passage of the Data Protection Act 2018 that would ensure that citizens would always be entitled to human decisions where their rights are engaged. However, the Government rejected this safeguard.
Crucially, citizens should be notified when an automated decision has been made about them and given a right to appeal – but a loophole in the new Act allows authorities to circumvent those minimal safeguards if there is merely tokenistic human involvement in the decision. In our submission to the Special Rapporteur, we raised concerns that this new legal framework is:
“leaving claimants vulnerable to welfare decisions that are for all intents and purposes automated decisions, without individuals being notified of this fact or their right to appeal.”
We raised additional concerns about the Digital Economy Act 2017, which permits mass data sharing between public authorities and private companies for “public benefit”. In fact, the Act allows broad data sharing to improve citizens’ “physical and mental health”, “emotional well-being”, “the contribution made by them to society”, and “their social and economic well-being”, enabling new forms of paternalistic intrusion on the private lives of those who are most vulnerable. We have raised concerns that data sharing to evaluate and improve the “contribution” one makes to society risks amplifying the punitive potential of some welfare sanctions, such as work schemes that have adversely impacted those with disabilities and ill health.
In fact, the Act also allows data sharing between the state and private companies to prevent or detect crime or anti-social behaviour, for criminal investigations, for legal proceedings, for “safeguarding vulnerable adults and children”, for HMRC purposes, or if required by EU obligations. We told the Special Rapporteur:
“In effect, personal data can be shared across government departments to investigate, penalise or otherwise intrude on the lives of those in receipt of welfare, pensioners, and some of the country’s most vulnerable people.”
There is currently no mechanism in place for authorities to report their use of algorithms, despite the UK’s claim to be a world leader in technological innovation. Indeed, our new legal frameworks fail to require that vital transparency.
We issued over 1,000 Freedom of Information requests in an attempt to provide that transparency, but we were unable to gain a comprehensive picture of authorities’ use of technologies in welfare. We explore some of the reasons for this in our submission, including a lack of shared definitions and understandings of new technologies within authorities; their reluctance to share information; their use of highly integrated and complex systems; and their agnosticism towards the functions and impact of new systems.
New technologies are being rapidly and enthusiastically adopted by local authorities, partly due to the climate of austerity and partly due to the general trend of technological solutionism. Our research reveals agnosticism within authorities towards the inner workings of the technologies they use, sometimes with a wilful blindness to the data fed to the tools, and frequently with apathy towards their impact on citizens.
Human rights are engaged in complex ways by new and emerging technologies. The impact must be assessed – but transparency is currently being obstructed, and our rights are at risk. Big Brother Watch looks forward to engaging with the UN Special Rapporteur further and shining a light on hidden new technologies.
You can read about Monitoring, Suspicion and Welfare in our report, The State of Surveillance in 2018 (p.49).