The Telegraph – Report shows facial recognition’s racial and gender bias

Big Brother Watch Team / January 7, 2020

A new study by a US government agency has found that facial recognition technology misidentifies women and Black and Asian people at significantly higher rates. Black women were the least likely to be identified correctly, and in some cases the software was up to 100 times more likely to misidentify a person from an ethnic minority than a white man. Given how widely this technology is being adopted by police forces around the world, these findings raise serious concerns about false arrests.

The Director of Big Brother Watch, Silkie Carlo, said: “This authoritative report confirms that facial recognition has dangerous inaccuracies, particularly for ethnic minorities, women and older people. The fact that live facial recognition is used for public surveillance in the UK means the British public are effectively being used as guinea pigs in a dystopian mass surveillance experiment. No other country in Europe has live facial recognition in use on the same scale. It’s imperative that the Government puts an urgent stop to live facial recognition if we’re to avoid descent towards a total surveillance state.”

The Telegraph – ‘Racist and sexist’ facial recognition cameras could lead to false arrests