An Uber driver who lost his job when automated face-scanning software failed to recognise him is accusing the firm of indirect race discrimination in a legal test case.
The black driver, who worked on the Uber platform from 2016 until April 2021, has filed an employment tribunal claim alleging his account was illegally deactivated when facial-verification software used to log drivers on to the ride-hailing app decided he wasn’t who he said he was.
The Independent Workers' Union of Great Britain (IWGB), which is backing the action, claimed at least 35 other drivers had had their registration with Uber terminated as a result of alleged mistakes with the software since the start of the pandemic. It is calling for Uber to scrap the “racist algorithm” and reinstate terminated drivers.
An Uber spokeswoman said the firm “strongly refutes the completely unfounded claims” and said it is “committed to fighting racism and being a champion for equality—both inside and outside our company.”
She said the checks were “designed to protect the safety and security of everyone who uses the app by ensuring the correct driver is using their account”.
Drivers can choose human verification of their picture and when technology is chosen, “there is always a minimum of two human expert reviews prior to any decision to remove a driver”, she said.
Uber has used the software since April 2020. In 2019 Microsoft, which makes it, conceded that facial recognition technology didn’t work as well for people of colour and could fail to recognise them.
Studies of several facial recognition software packages have shown that error rates when recognising people with darker skin have been higher than among lighter-skinned people, although Microsoft and others have been improving performance. Uber said its software does not rely on scanning large numbers of faces, which has been blamed for introducing error. Rather it verifies an already uploaded picture of the driver against their freshly submitted selfie.
In London, nine out of 10 private hire drivers are black or black British, Asian or Asian British or of mixed race, according to a recent survey by TfL.
“Uber’s continued use of a facial recognition algorithm that is ineffective on people of colour is discriminatory,” said Henry Chango Lopez, general secretary of the IWGB. “Hundreds of drivers and couriers who served through the pandemic have lost their jobs without any due process or evidence of wrongdoing.”
A Nigerian driver who worked on the Uber Eats platform in Manchester until he was locked out in March, after several failed attempts to log in using the facial verification software, said his family had faced “serious suffering” as a result.
Abiodun Ogunyemi, a married father of three, said he had run up debts so large he couldn’t afford his son’s bus fare to get to school. He says the photo on Uber’s records did not feature the longer hair or beard he currently has, but he has a distinctive scar over one eye and the rest of his face is visible.
“I feel the algorithm is discriminatory to people of colour,” he said. “I know about five black people the same thing has happened to.”
Uber said anyone who is removed from the platform can appeal the decision with an additional human review.
On 10 April the driver in the test case, who asked not to be named, tried to log on for work by submitting a photo through the app, but received a message from Uber saying he had failed to verify his identity and was locked out of the system for 24 hours. He submitted a second photo after that period, but it didn’t work either.
According to his claim, four days later his account was deactivated and he was sent a message stating: “Our team conducted a thorough investigation and the decision to end the partnership has been made on a permanent basis. The matter is not subject to further review.”
His case is also being backed by the Black Lives Matter organisation which said in a statement: “The gig economy, which already creates immense precarity for Black key workers, is now further exacerbated by this software.”
Microsoft declined to comment on an ongoing legal case.