Facial recognition in Great Britain: Innocent suspects

In London and Manchester, two people were wrongly treated as criminals due to facial recognition AI. Both are now taking legal action.

Facial recognition AI in use

(Image: Trismegist san/Shutterstock.com)


In the UK, several people have been wrongly identified as criminals due to errors in facial recognition software. One of them works for the anti-violence organization "Street Fathers". On his way home from work, he was stopped by police who had mistakenly identified him as a wanted person. He was detained for 20 minutes and had to give his fingerprints, even though he was able to prove his identity with several documents. This was reported by the BBC.

The Metropolitan Police use cameras to capture thousands of facial images and match them with people on a police watch list. According to the BBC, the Metropolitan Police have already made 192 arrests this year using the system.
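In broad strokes, live systems of this kind typically reduce each camera image to an embedding vector and compare it against the embeddings of everyone on the watch list; anything above a similarity threshold raises an alert. The sketch below is a generic illustration of that matching step, not a description of the Met's actual pipeline; the embedding dimension, the threshold and all names are illustrative assumptions.

```python
import numpy as np

# Generic watch-list matching with face embeddings (illustrative only).

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_against_watchlist(face, watchlist, threshold=0.6):
    """Return (name, score) of the best match above the threshold, else (None, score)."""
    best_name, best_score = None, -1.0
    for name, reference in watchlist.items():
        score = cosine_similarity(face, reference)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else (None, best_score)

# Toy data: random vectors stand in for embeddings from a face model.
rng = np.random.default_rng(0)
watchlist = {"wanted_person_a": rng.normal(size=128)}
passerby = rng.normal(size=128)  # an unrelated face
print(check_against_watchlist(passerby, watchlist))  # most likely no match
```

Where the threshold sits determines the trade-off: lowering it catches more wanted people, but also flags more innocent passers-by.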

In another case, a woman in Manchester was accused of theft after the facial recognition software "Facewatch", installed in a branch of the discount chain Home Bargains, falsely identified her as a suspect. As a result, she was also banned from other stores. Facewatch is used in other stores across the UK as well to identify shoplifters.

Facewatch is used in many places in the UK.

(Image: Facewatch Limited)

The Metropolitan Police told the BBC that the probability of the software falsely flagging a passer-by is one in 33,000. Measured only against the people the system actually flags, however, the error rate is much higher.
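That gap is a base-rate effect: when almost everyone walking past the cameras is innocent, even a tiny per-face false-match rate can produce a large share of wrong alerts. A minimal sketch of the arithmetic, using the article's 1-in-33,000 figure and otherwise purely hypothetical numbers:

```python
# Base-rate sketch: only the 1/33,000 figure comes from the article;
# scan volume, wanted count and hit rate are hypothetical assumptions.

FALSE_MATCH_RATE = 1 / 33_000  # per innocent passer-by (from the article)
SCANNED_FACES = 100_000        # assumed faces scanned (hypothetical)
WANTED_PRESENT = 2             # assumed wanted people among them (hypothetical)
HIT_RATE = 0.9                 # assumed chance a wanted person is spotted (hypothetical)

false_alerts = (SCANNED_FACES - WANTED_PRESENT) * FALSE_MATCH_RATE
true_alerts = WANTED_PRESENT * HIT_RATE
share_false = false_alerts / (false_alerts + true_alerts)

print(f"expected false alerts: {false_alerts:.2f}")          # ~3.03
print(f"expected true alerts:  {true_alerts:.2f}")           # 1.80
print(f"share of alerts that are false: {share_false:.0%}")  # ~63%
```

Under these made-up assumptions, most alerts would point at innocent people, even though any individual passer-by is very unlikely to be flagged.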

According to the BBC, Michael Birtwhistle, a researcher at the Ada Lovelace Institute, describes the current legal situation as the "Wild West": the technology is so new that the rules governing its use remain unclear. According to the civil rights organization Big Brother Watch, Facewatch has committed eight breaches of data protection law. Despite this, Chris Philp, the Minister of State responsible at the Home Office, is campaigning to expand the use of the controversial technology further.

Civil rights groups have serious concerns about the accuracy of live facial recognition. Reports keep surfacing that facial recognition algorithms work unreliably, particularly for Black people, as in the case of Uber Eats. These cases illustrate the consequences that errors in facial recognition systems can have for those affected. Big Brother Watch is therefore calling for an immediate halt to the technology's use, citing injustice, civil rights violations and the lack of a legal framework. The organization wants to help those affected take legal action against the Home Bargains branch and the Metropolitan Police, where the software is used. According to Big Brother Watch, those affected each intend to take legal action independently, so that the same does not happen to others.

Silkie Carlo, Director of Big Brother Watch, warns in a press release that facial recognition is becoming the norm and urges caution. "Live facial recognition surveillance turns our faces into barcodes and turns us into […] suspects who […] can be falsely accused, grossly mistreated and forced to prove our innocence to the authorities," says Carlo.

"In Europe, data protection authorities have fined supermarkets for using live facial recognition", writes Big Brother Watch. The EU wants to severely restrict the use of facial recognition by law enforcement agencies with AI regulation. In the USA, several cities and states have also banned the use of live facial recognition following misidentifications. According to Big Brother Watch, the UK government recently announced its intention to introduce a facial recognition strategy soon.

(mack)