Computer says no: Impact of automated decision-making on human life

Big Brother Watch Team / June 8, 2025

Computers are making life-changing decisions about healthcare, welfare and education with minimal or no human input. Automated decisions could become even more common under the Data (Use and Access) Bill, which the Labour Government has just passed.

We’d like to hear from you – if you were impacted by the A-levels scandal or by NHS algorithms deciding whether you can get an organ transplant, get in touch with us at info@bigbrotherwatch.org.uk

Healthcare – Life on the line

Organ transplants:

Algorithms are deciding whether or not patients receive organ transplants. An NHS algorithm has assigned liver transplants based on a computer-generated score called the ‘Transplant Benefit Score’ (TBS).

This is what patients face under this algorithm:

  • No knowledge of the algorithm’s existence
  • No human involvement in the decision
  • No appeals process

Algorithms like this have proven biases relating to age, gender and race. In this case, the computer favoured older patients, leaving younger patients at a disadvantage.

Among them was Sarah Meredith, a 31-year-old patient who was denied a life-saving liver transplant by this algorithm. Despite being gravely ill, she was assigned a score too low to warrant a transplant. Sarah finally got the call she was waiting for last year, after her family had campaigned for four years: she was assigned a liver directly by surgeons in Cambridge – no computer involved.

Sarah’s story shines a light on how solely machine-led decisions can cost patients years of waiting before they receive treatment.

Welfare – Penalising the poor

Dutch childcare scandal:

Tens of thousands of families were wrongly flagged as fraud risks by an algorithm that factored in nationality and socioeconomic status. While low-risk individuals were approved automatically, those flagged faced intense scrutiny and crippling debt. The stress of the investigations resulted in family separations and, in some cases, suicide.

Chermaine Leysner was billed a staggering £84,165 by the Dutch tax authority after an algorithm flagged her for childcare fraud. The scandal drove Chermaine into depression and cost her nine years of her life.

Margreet ten Pas, a mother of six, is another victim of this scandal. Although she was among the first to be compensated, the payment was not nearly enough for her to lead a life of dignity. On top of the psychological trauma, she was left isolated, living for years off nettles and chestnuts foraged in the woods.

This scandal reveals the truly destructive impact automated systems can have on innocent people.

UK Housing Benefit algorithms:

Closer to home, our investigation found that a DWP system wrongly flagged over 200,000 innocent housing benefit claimants for potential fraud. The algorithm uses age, gender and even the number of dependents someone has as predictive characteristics to decide which claimants to target.

Once flagged, an intrusive, high-stress Full Case Review is triggered. Thousands of people have seen their benefits suspended or terminated because they did not respond to a review in time—a review potentially triggered with little human input.

Maya (not her real name), a Wandsworth resident and mother of three, was wrongly cut off from her housing benefit in 2021. Her lawyers believe this was due to the Housing Benefit algorithm.

Not receiving benefit payments meant not being able to pay rent and bills. Around £20,000 in rent arrears built up over three years, eventually leading Wandsworth Council to attempt to evict her from her home.

After three years of severe anxiety and sleepless nights fighting this injustice, Maya finally received good news. The council admitted it had wrongly cut off her benefit and paid her £19,281.06 in benefits.

Maya’s story reveals the cruel, dehumanising nature of automated decisions in the welfare system and their toll on life.

Education – Marked down by machines

A-level grading scandal:

During the pandemic, an algorithm was used to assign school exam grades in the UK. Thousands of students received lower grades than expected, especially those in lower-performing schools, while private schools saw an increase in top grades. After the backlash, the government scrapped the algorithm.

This automated decision had a serious impact on students’ mental health and caused anxiety over their future.

Ophelia Gregory, a student from Kent, was among the thousands of students impacted by this algorithm. She organised a student protest in London, calling on the Government to make a U-turn on machine-allocated grades and award students their teacher-predicted grades.

Law enforcement – Policing using AI

Durham Constabulary’s risk tool:

The now-defunct tool used 34 categories of data – including sensitive postcode data – to assess the risk of reoffending. The police could use such algorithmic tools to make automated decisions based on where people live, inferred emotions, or even regional accents.

Our investigation into the dataset supplied by Experian, a credit reference agency, revealed astoundingly detailed profiles of neighbourhoods, households and even individuals.

This greatly expands the scope for bias, discrimination and opaque decision-making.

Why this matters now

The Government has passed legislation that weakens protections against solely machine-led decisions impacting all aspects of our lives.

Stripping away the few safeguards we currently have makes the risk of another Horizon-style catastrophe even greater.

The new legislation will weaken the safeguards that protect all of us from being subjected to solely automated decisions, increasing the risk of unfair, opaque and discriminatory outcomes from computers.

Earlier this year, we brought together 30 rights groups to warn against machine-made decisions in policing. We also briefed politicians on the impact of automated decisions on the public and urged them to reject the watering down of safeguards.

The fight is not over. We want to hear from you if you were affected by the A-levels scandal or if your chance of getting an organ transplant was decided by a computer.

Get in touch at info@bigbrotherwatch.org.uk to share your experiences. Together we can shine a light on automated decisions impacting human lives.

DONATE TO BIG BROTHER WATCH