Courier sues Uber Eats over ‘racist’ facial recognition dismissal

A former Uber Eats courier has brought legal action against the food delivery company, alleging he was unfairly dismissed because of the company’s “racist” facial recognition software.

Uber Eats drivers are required to take a selfie before starting a shift to verify their identity. However, Pa Edrissa Manjang said that the Uber Eats app told him to take multiple selfies a day because the software incorrectly thought he was someone else.

“Your algorithm, by the looks of things, is racist,” said Manjang, who is Black.

On Wednesday an employment tribunal rejected Uber’s motion to dismiss the discrimination claim, which means the case will move to a full hearing.

An Uber spokesperson told UKTN that “automated facial verification was not the reason for Mr Manjang’s temporary loss of access to his courier account”.

The spokesperson added that Uber Eats’ Real-Time ID Check is used for safety and security purposes.

“The system includes robust human review to make sure that we’re not making decisions about someone’s livelihood in a vacuum, without oversight,” the spokesperson said.

Manjang told the Guardian that Uber doesn’t “value” or “respect” its delivery drivers and that it acts in an “aggressive manner”.

“So, as a result, for the people that work for them, we are just numbers,” said Manjang.

AI bias

It’s not the first time that Uber has faced accusations that its facial recognition software is racially biased. In October last year, Abiodun Ogunyemi filed an employment tribunal case after Uber’s automated face-scanning software failed to recognise him, leading to his dismissal.

AI experts say that facial recognition systems generally are not as accurate at identifying people from ethnic minority backgrounds. They say that this is often because the data used to train the algorithm tends to skew towards white men, which makes the software more accurate at identifying people from that demographic compared to others.

Uber is not the only company facing legal issues over the use of facial recognition technology. Yesterday, Big Brother Watch filed a complaint against Southern Co-op over the use of the tech in 35 of its stores.

Clearview AI was fined £7.5m by the Information Commissioner’s Office (ICO) earlier this year for illegally collecting billions of images. The ICO is also currently investigating whether AI systems exhibit racial bias.

It comes at a time when the use of digital identification applications is gradually increasing across the country. Some cinemas, for example, have begun accepting digital IDs as a method of proving age.