Home Office's facial recognition software 'systemically racist'

Home Office’s facial recognition software is branded ‘systemically racist,’ as study finds it is twice as likely to reject a picture of a black woman as one of a white man

  • Passport check technology is twice as likely to reject black women as white men
  • Concerns have been raised over the software, which is used by the Home Office
  • Black women were told their pictures were of poor quality 22 per cent of the time

Facial recognition software used to check passports has been called ‘systemically racist,’ after a study found it was twice as likely to reject a picture of a black woman as one of a white man.

An investigation into the technology, which is used by the Home Office, found 22 per cent of dark-skinned women were told their pictures were poor quality.

That compares with just nine per cent of light-skinned men.

Student Elaine Owusu, 22, from London, was wrongly told her mouth appeared open on five different photos she tried to use.


She claimed the error is evidence of systemic racism.

Ms Owusu told the BBC: ‘If the algorithm can’t read my lips, it’s a problem with the system, and not with me.’

Black men were slightly more likely (15 per cent) to have their picture flagged than white women (14 per cent), according to the BBC.

The software is used by the Home Office as part of its online application process, determining whether photographs are appropriate for a passport.

An investigation has revealed the technology has flaws.

It can flag people’s eyes as being closed when they are not, or mistake people’s lips for an open mouth.

The Home Office launched the service in 2016, despite being aware of the issues.

A recent study by the BBC ran more than 1,000 photographs of politicians from around the world through the checker.

Black women were the most likely group to have a picture rejected on the Home Office website, followed by black men, and then white women

It revealed women with the darkest skin tone were four times more likely to receive a poor quality grade than their lighter-skinned colleagues.

A Home Office spokesman said: ‘The indicative check [helps] our customers to submit a photo that is right the first time.

‘Over nine million people have used this service and our systems are improving. We will continue to develop and evaluate our systems with the objective of making applying for a passport as simple as possible for all.’