In Depth

Is facial recognition technology racist?

Concerns raised after Amazon face ID system confused 28 members of the US Congress with police suspects

Facial recognition technology is an increasingly common part of modern life, used for everything from unlocking iPhones to advanced surveillance. But concerns are being raised that face ID technology can exhibit racial biases, with potentially serious ramifications.

Commercial face recognition software has repeatedly been shown to be “less accurate” on people with darker skin, reports Gizmodo. Civil rights advocates worry about the “disturbingly targeted” ways face-scanning can be used by police.

The most recent example of this involved Amazon and its new technology Rekognition.

The facial recognition tool, which Amazon sells to web developers, wrongly identified 28 members of the US Congress – a disproportionate number of them people of colour – as police suspects from mugshots, reports Reuters.

This is not the first incident of its kind. As face ID technology becomes increasingly widespread, more and more companies have found that their algorithms have a racial bias.

Earlier this year, Google came under fire for failing to entirely fix a racist algorithm that was originally pointed out in 2015 by software engineer Jacky Alciné. He noticed that the image recognition algorithms in Google Photos were classifying his black friends as gorillas. Instead of fixing its facial recognition technology, Google blocked its image recognition algorithms from identifying gorillas altogether.

A similar issue occurred with Apple’s Face ID system for unlocking its phones.

Last December, it was found that Apple’s Face ID tech couldn’t tell two Chinese women apart. Apple boasts that its Face ID technology is the most advanced in the world and says the probability that a random person could successfully use it to unlock a smartphone is “approximately 1 in 1,000,000”, according to the Inquirer. But the company was forced to issue a refund to a Chinese woman who reported that her co-worker was able to unlock her iPhone X using the face-scanning tech, “despite having reconfigured the facial recognition settings multiple times”.

The woman, known as Yan, was issued a refund and given a new phone – but encountered the same problem again.

This raises the question: why do these issues occur, and can they be solved?

Gizmodo reports that MIT researchers Joy Buolamwini and Timnit Gebru have found that darker-skinned faces are “underrepresented” in the datasets used to train facial recognition systems. This leaves the software “more inaccurate” when looking at dark faces.
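The mechanism the researchers describe can be illustrated with a toy simulation. The sketch below is not any vendor’s actual pipeline: it invents a simple nearest-centroid “face matcher” over synthetic 2-D embeddings, and compares a group with many training images per person against a group with only one. The fewer the training samples, the noisier each person’s stored template, and the lower the identification accuracy for that group.

```python
# Minimal sketch (hypothetical model, not a real face ID system) of how
# under-representation in training data yields a per-group accuracy gap.
import math
import random

random.seed(0)

SPACING, SIGMA = 4.0, 1.5          # identity spacing vs per-sample noise
# Nine synthetic "identities" laid out on a grid in feature space.
IDENTITIES = [(x * SPACING, y * SPACING) for x in range(3) for y in range(3)]

def sample(centre):
    """One noisy 'face embedding' observed for an identity."""
    return (random.gauss(centre[0], SIGMA), random.gauss(centre[1], SIGMA))

def estimate_centroids(n_train):
    """Average n_train noisy samples per identity (the 'training' step)."""
    centroids = []
    for c in IDENTITIES:
        pts = [sample(c) for _ in range(n_train)]
        centroids.append((sum(p[0] for p in pts) / n_train,
                          sum(p[1] for p in pts) / n_train))
    return centroids

def accuracy(centroids, n_test=40):
    """Nearest-centroid identification accuracy on fresh noisy samples."""
    correct = total = 0
    for true_idx, c in enumerate(IDENTITIES):
        for _ in range(n_test):
            p = sample(c)
            pred = min(range(len(centroids)),
                       key=lambda i: math.dist(p, centroids[i]))
            correct += (pred == true_idx)
            total += 1
    return correct / total

# Well-represented group: 100 training images per person.
acc_majority = accuracy(estimate_centroids(100))
# Under-represented group: a single training image per person.
acc_minority = accuracy(estimate_centroids(1))

print(f"well-represented group accuracy: {acc_majority:.0%}")
print(f"under-represented group accuracy: {acc_minority:.0%}")
```

Both groups face the same test-time noise; the gap comes entirely from how well each person’s template is estimated, which is one reason collecting more diverse training data is the first remedy usually proposed.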

Solving the issues of racial bias will require not only technical interventions, but also “hard limits” on how and when face-scanning can be used to protect vulnerable communities.

Even then, the researchers say, fair face recognition will be “impossible without addressing racism in the criminal justice system it will inevitably be used in”.
