Secretive DWP Welfare Algorithms Put Millions’ Rights at Risk
Millions of people across the UK are profiled every year by biased algorithms run by the Department for Work and Pensions (DWP), with these high-tech tools threatening data rights on a mass scale while officials fight to keep details out of public view.
‘Suspicion by Design’, a new report from the civil liberties campaign group Big Brother Watch, details the massive expansion of AI- and algorithm-supported decision-making at the heart of the benefits system, and lays out the key questions the government refuses to answer about the digital welfare state.
Key Findings
- Around one million people were profiled last year by the Universal Credit Advances machine learning model, which is riddled with algorithmic bias.
- The DWP went to court to try to keep details on the model’s data risks secret.
- New machine learning models in development by the DWP contain significant potential for discrimination.
- The DWP refuses to meet its obligations to publish details about its algorithms.

Internal DWP documents obtained by Big Brother Watch show that the Universal Credit Advances model, used to risk-score almost a million Advances claims each year, displays consistent, statistically significant bias. Fairness analyses of the Advances model and a string of other pilot tools have found algorithmic disparities by age, nationality, relationship status, and reported illnesses – all the more concerning because these characteristics can act as proxies for ethnicity, marital status, and disability.

Automation is also being used to select people for the Targeted Case Review scheme, under which the DWP demands documents from Universal Credit recipients with minimal explanation. More than two million Universal Credit recipients are expected to be reviewed by the end of this decade. One man on the receiving end of a review told Big Brother Watch that officials’ demands for evidence placed him under serious “psychological strain”, exacerbated by the DWP’s lack of explanation.
Despite the scale of these automation-driven reviews, the DWP refuses to be transparent about how people are flagged, or why. Documents providing vital detail about data and equalities protections are either withheld entirely or heavily redacted when published by the Department.
As well as laying out how the DWP is creating a suspicion machine at the heart of the welfare state, the report addresses key accountability questions the government refuses to answer and exposes the transparency backsliding taking place inside Whitehall. It examines three major DWP programmes using algorithms and automation, and outlines how transparency has regressed over time – with examples of previously disclosed information now withheld.
Vital details necessary for affected individuals, civil society, and the wider public to question the mass-scale use of algorithms, data profiles, and AI are shielded from public scrutiny. Explanations of potential data-rights risks, the scale of algorithmic bias, sources of information, and even details as minor as coding languages were all deemed too sensitive for public consumption.
Big Brother Watch is calling on the government to commit to much greater transparency about how it uses high-risk data tools to influence decisions about people’s lives, and to halt the use of any tool where unexplained bias exists.
Jake Hurfurt, Head of Research and Investigations at Big Brother Watch and the report’s lead author, said: “The DWP’s ongoing rollout of high-tech algorithmic tools, which its own assessments have found to be riddled with bias, is alarming. This becomes even more concerning when the DWP hides behind a wall of secrecy and refuses to disclose key information that would allow affected individuals and the public to understand how automation is used to affect their lives, and the risks of bias and threats to privacy involved.
Instead of pressing forward, the DWP should take a step back and pause the use of any model containing unexplained disparities, and it must become more transparent about how it uses high-tech tools. It is wrong to subject millions of innocent people to shadowy automated or algorithmic decisions while refusing to explain how these work.”
NOTES:
- Read the report ‘Suspicion by Design: What we know about the DWP’s algorithmic black box and what it tries to hide’
- Spokespeople are available for interview. Please direct enquiries or requests for interviews to info@bigbrotherwatch.org.uk or 07730439257