This video says about itself:
German police tests face recognition software | DW English
25 August 2017
Authorities trial new surveillance cameras with facial recognition technology at a train station in Berlin. Supporters say the new system is needed to prevent future terrorist attacks. But not everyone agrees.
According to the BBC, police face recognition software misidentified over 2,000 people at a football match in Britain as ‘criminals’.
It looks like that British police software is as ‘reliable’ as the fraudulent software of Volkswagen and other car corporations mismeasuring exhaust pollution. And as dangerous to human rights as the ‘Palantir’ Big Brother software of United States billionaire Peter Thiel, a Donald Trump supporter and opponent of women’s suffrage.
From daily The Independent in Britain today:
Facial recognition to be deployed by police across London, sparking human rights concerns
The Independent on the scene as police trial controversial new technology
Lizzie Dearden, Home Affairs Correspondent
Millions of people face the prospect of being scanned by police facial recognition technology that has sparked human rights concerns.
The controversial software, which officers use to identify suspects, has been found to be “staggeringly inaccurate”, while campaigners have branded its use a violation of privacy.
But Britain’s largest police force is set to expand a trial across six locations in London over the coming months.
Police leaders claimed that officers make the decision to act on potential matches with police records, and that images which do not spark an alert are immediately deleted.
But last month The Independent revealed the Metropolitan Police’s software was returning “false positives” – images of people who were not on a police database – in 98 per cent of alerts.
The technology, which has previously been used at Notting Hill Carnival and Remembrance Sunday services, was used on thousands of shoppers in Stratford, east London.
Scotland Yard said the Stratford operation would be “overt” and that members of the public passing the cameras would be handed leaflets, but The Independent did not observe any information being proactively given out.
Amid the throngs of shoppers, the majority of those passing through a line of police officers straddling a bridge appeared not to see posters saying facial recognition technology was being used.
Sophia Pharaoh said she felt “uncomfortable” knowing the software was in use at the busy intersection, which sits between two shopping centres near Stratford Tube and railway station, adding: “It’s an invasion of privacy and there’s no way around it.”
Hannah Couchman, an advocacy and policy officer at Liberty who monitored the trial in Stratford, described the technology as “lawless”.
“There’s no dedicated legislation, there’s no guidance, there’s no good practice,” she said. “It’s staggeringly inaccurate and this sort of technology has been shown in America to be actively biased and to misidentify women and black people.
“Liberty believes the use of this technology in a public place is not compatible with privacy, and has a chilling effect on society.”
Liberty has threatened legal action against South Wales Police over its facial recognition programme, while campaign group Big Brother Watch is attempting a case against the Met.
The latest trial came as the government announced the creation of a new oversight and advisory board for facial recognition in law enforcement, which could be expanded to ports, airports, custody suites and police mobile devices.
The Home Office’s strategy on biometrics, which also include fingerprints and DNA, said the board would make recommendations on policy changes and oversight arrangements for technology that is currently being purchased ad hoc by police forces.
Meaning that if a computer says that your face looks vaguely like someone suspected of terrorism, you may be shot dead as a ‘terrorist’ without trial. Look at what happened to innocent Brazilian Jean Charles de Menezes in London.
Wednesday, July 25, 2018. Big Brother Watch launches fight against ‘Orwellian’ facial recognition: here.