Police face recognition software misidentifies over 2,000 people as ‘criminals’


This video says about itself:

German police tests face recognition software | DW English

25 August 2017

Authorities trial new surveillance cameras with facial recognition technology at a train station in Berlin. Supporters say the new system is needed to prevent future terrorist attacks. But not everyone agrees.

From the BBC:

2,000 wrongly matched with possible criminals at Champions League

4 May 2018

More than 2,000 people were wrongly identified as possible criminals by facial scanning technology at the 2017 Champions League final in Cardiff.

South Wales Police used the technology as about 170,000 people were in Cardiff for the Real Madrid v Juventus game.

But 2,297 – or 92% – of the 2,470 potential matches with custody pictures were wrong … according to data on the force’s website. …
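The reported rate can be checked directly from the two figures the BBC gives (2,470 potential matches, 2,297 of them wrong); note the exact ratio comes out at about 93%, which the article rounds to 92%. A minimal sketch:

```python
# Figures from the BBC report on the 2017 Champions League final in Cardiff
total_matches = 2470   # potential matches against custody pictures
false_matches = 2297   # matches that turned out to be wrong

# False-positive share of all matches flagged by the system
false_positive_pct = 100 * false_matches / total_matches
print(f"{false_positive_pct:.1f}% of flagged matches were wrong")
```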

But civil liberties campaign group Big Brother Watch called for the system to be scrapped, adding it was “outrageous” that more than 2,000 people at the event had been wrongly identified.

“Not only is real-time facial recognition a threat to civil liberties, it is a dangerously inaccurate policing tool”, said director Silkie Carlo.

“The tech misidentifies innocent members of the public at a terrifying rate, leading to intrusive police stops and citizens being treated as suspects.”

Misleading, incompetent and authoritarian: the [British] Home Office’s defence of facial recognition: here.

It looks like this British police software is as ‘reliable’ as the fraudulent software with which Volkswagen and other car corporations mismeasured exhaust pollution. And as dangerous to human rights as the ‘Palantir’ Big Brother software of United States billionaire Peter Thiel, a Donald Trump supporter and opponent of women’s suffrage.

Face recognition at Dutch private businesses: here.

An AI Lie Detector Is Going to Start Questioning Travelers in the EU: here.