Eek.
A Dutch court has ordered Uber to reinstate and pay compensation to six workers who claim to have been fired as a result of mistakes in Uber’s algorithms.
This so-called 'robo-firing' is an example of workplace surveillance technology gone wrong, according to the ADCU, which brought the case against Uber.
What are the implications of workplace surveillance for workers' rights?
A district court in Amsterdam has ordered Uber to reinstate six drivers who it determined were unfairly dismissed by the company’s algorithms. It has also ordered Uber to pay the drivers compensation totaling more than €100,000.
Uber, which has its European headquarters in Amsterdam, failed to contest the so-called ‘robo-firing’ case — which was brought by the App Drivers and Couriers Union (ADCU) supported by Worker Info Exchange — so a default judgement was entered in favor of the drivers.
The six drivers – five British and one Dutch – were fired from Uber for alleged fraudulent actions, including account sharing. Uber relies on real-time facial recognition ID software and location data to authenticate its drivers, with the aim of keeping its platform safe for users.
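Uber has not published the details of these checks, but as a purely hypothetical illustration of the kind of automated signal involved, the sketch below combines an assumed face-match score with a simple location-consistency test to flag possible account sharing. The thresholds, field names and helper function are illustrative assumptions, not a description of Uber’s actual system.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

# Hypothetical inputs: a selfie-match score from a face-recognition model, plus
# the distance between the login location and the driver's last known trip.
# None of this reflects Uber's real implementation.

@dataclass
class IdentityCheck:
    face_match_score: float        # 0.0-1.0 similarity to the registered driver photo
    login_lat: float
    login_lon: float
    last_trip_lat: float
    last_trip_lon: float
    minutes_since_last_trip: float

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def flag_possible_account_sharing(check: IdentityCheck,
                                  face_threshold: float = 0.85,
                                  max_plausible_speed_kmh: float = 120.0) -> bool:
    """Return True if the automated signals suggest someone else may be driving.

    A flag is only a signal for review; acting on it automatically is the kind
    of decision GDPR Article 22 constrains.
    """
    # Signal 1: the live selfie does not look enough like the registered driver.
    if check.face_match_score < face_threshold:
        return True
    # Signal 2: the account logged in somewhere the driver could not plausibly
    # have reached since their last completed trip.
    distance = haversine_km(check.login_lat, check.login_lon,
                            check.last_trip_lat, check.last_trip_lon)
    hours = max(check.minutes_since_last_trip / 60.0, 1e-6)
    return (distance / hours) > max_plausible_speed_kmh
```

In a system like this, everything turns on the quality of the signals and the thresholds chosen.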
However, the six drivers claim their dismissal was unfair and based on mistakes made by Uber’s technology and algorithms. The ADCU also argued that Uber had failed to provide the drivers with proper supporting evidence explaining their dismissal.
The court agreed and ruled that the drivers had been fired “based solely on automated processing, including profiling”, citing Article 22(1) of the General Data Protection Regulation (GDPR). This article protects individuals against purely automated decisions that have a legal or similarly significant impact on their lives. The ADCU believes this is the first case to invoke that Article of GDPR and overturn an algorithm-driven ‘robo-firing’.
The court therefore ordered Uber to reactivate the drivers’ accounts within one week.
Uber must also pay the six drivers a penalty of €5,000 for every day it fails to comply with the reinstatement order, up to a maximum of €50,000.
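Article 22(1) does not outlaw automated fraud detection; it requires that a decision with such significant effects not rest on the algorithm alone. As a rough illustration only, and again not a description of Uber’s internal process, the sketch below shows a deactivation step that refuses to act unless a named human reviewer has examined the flag and recorded reasons the driver can later challenge. All class and field names are assumptions.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class FraudFlag:
    driver_id: str
    reason: str                              # e.g. "possible account sharing"
    evidence: List[str] = field(default_factory=list)
    raised_at: datetime = field(default_factory=datetime.now)

@dataclass
class ReviewDecision:
    reviewer_id: str                         # a named human reviewer, not a service account
    upheld: bool
    rationale: str                           # written reasons the driver can see and contest
    decided_at: datetime = field(default_factory=datetime.now)

def deactivate_account(flag: FraudFlag, review: Optional[ReviewDecision]) -> bool:
    """Deactivate a driver account only when a human reviewer has upheld the flag.

    Acting on the flag alone would be a decision "based solely on automated
    processing" with significant effects -- the situation Article 22(1) GDPR
    protects against.
    """
    if review is None:
        raise ValueError(
            f"No human review recorded for driver {flag.driver_id}; "
            "an automated flag cannot trigger deactivation by itself."
        )
    if not review.rationale.strip():
        raise ValueError("The reviewer must record reasons the driver can challenge.")
    return review.upheld
```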
Worker Info Exchange director James Farrar said: “For the Uber drivers robbed of their jobs and livelihoods, this has been a dystopian nightmare come true. They were publicly accused of ‘fraudulent activity’ on the back of poorly governed use of bad technology.
“In the aftermath of the recent UK Supreme Court ruling on worker rights, gig economy platforms are hiding management control in algorithms.”
Uber claims it was not aware of the case before this week – the deadline for it to contest the suit was 29 March – and therefore argues that the ADCU did not follow correct procedure, according to reporting by TechCrunch.
It also told ITV that no accounts are deactivated by Uber without a human examining the algorithm’s decision.
The company now plans to contest this judgement, citing a ruling on a similar case from the same Dutch court that ruled in its favor in late March.
In that ruling, the court did not find that Uber’s automated routing and scheduling systems had a “legal or similarly significant” effect on drivers under European Union law.
In responding to that March case, an Uber spokesperson told TechCrunch:
“The Court also confirmed that Uber’s processes have meaningful human involvement.
“Safety is the number one priority on the Uber platform, so any account deactivation decision is taken extremely seriously with manual reviews by our specialist team.”
At the time of publication, Uber had not replied to UNLEASH’s request for comment on the recent Amsterdam court ruling.
When Uber dismissed one of the UK drivers, London-based Abdifatah Abdalla, the ride-hailing company also alerted Transport for London (TfL), and Abdalla lost his private hire license from the transport authority. That meant he could no longer drive for rival ride-hailing apps Kapten and Bolt either, so he lost all of his income and was forced to take a lower-paid delivery job.
As reported by the Guardian, TfL said Uber claimed Abdalla had been removed because there were suggestions someone else was using his account. Abdalla insists that he did not share his log-in details and does not know how Uber could have found someone else had logged into his account.
However, earlier this week, a City of London magistrates’ court ordered TfL to reinstate Abdalla’s license. As reported by ITV, the chair of the bench criticized TfL for its willingness to accept Uber’s evidence without investigating the dismissal itself.
ADCU president Yaseen Aslam responded: “I am deeply concerned about the complicit role Transport for London has played in this catastrophe.
“They have encouraged Uber to introduce surveillance technology as a price for keeping their operator’s license.”
Uber first introduced workplace electronic surveillance and ID systems in 2019 when it was seeking to regain its license from TfL, which would allow it to operate in London.
TfL has been concerned about the company’s failures to protect passengers from risk, particularly that Uber’s platform allowed unauthorized drivers to upload their photos to other drivers’ accounts, and allowed dismissed or suspended drivers to create new Uber accounts and accept trips.
A TfL spokesperson said: “The safety of the travelling public is our top priority and where we are notified of cases of driver identity fraud, we take immediate licensing action so that passenger safety is not compromised.
“We always require the evidence behind an operator’s decision to dismiss a driver and review it along with any other relevant information as part of any decision to revoke a licence.
“All drivers have the right to appeal a decision to remove a licence through the magistrates’ court.”
There is evidently a need to balance protecting Uber’s users and passengers with protecting workers’ rights against unfair dismissal by technology.
“This case is a wake-up call for lawmakers about the abuse of surveillance technology now proliferating in the gig economy,” added Farrar.
Employee surveillance is a hot topic at the moment, particularly in the gig economy.
For example, at the end of March, Amazon hit the headlines when it required drivers to consent to the use of Netradyne AI-powered cameras in delivery vehicles.
This has led to scrutiny by five US senators, including Elizabeth Warren and Bernie Sanders, who have called on the company’s CEO Jeff Bezos to provide more details about its latest employee surveillance initiative.
In a letter to Bezos, the senators wrote: “We need a better understanding of how your company will protect against potential new safety hazards stemming from increased worker surveillance”.
In a statement, Amazon spokesperson Deborah Bass said: “Netradyne cameras are used to help keep drivers and the communities where we deliver safe.
“Don’t believe the self-interested critics who claim these cameras are intended for anything other than safety.”
In addition, the UK’s Trades Union Congress (TUC) has expressed general concern that the use of artificial intelligence (AI) and other monitoring technology is creating discriminatory, dehumanized workplaces.
Research by the TUC found that workers believed AI was flawed and made mistakes, eroding trust between employees and employers.
Therefore, returning to the Uber case, the TUC has called for all workers to have the legal right to a human review of decisions made by AI technologies, and the opportunity to challenge those decisions.
"*" indicates required fields
"*" indicates required fields