
Historic digital rights win for WIE and the ADCU over Uber and Ola at Amsterdam Court of Appeal

  • Uber and Ola found to violate driver rights in robo-firing of workers.

  • Court REJECTS Uber and Ola arguments that disclosure of information about fraud allegations made against the workers would undermine platform security & expose trade secrets.

  • Uber and Ola Cabs ordered to provide information to workers on automated decision making relating to work allocation and fares including dynamic pay & pricing.

  • Court rules that secret worker profiling and management assessments are personal data and must be disclosed.

  • Ruling a bittersweet victory, as the UK government advances a bill through parliament this month that strips workers of the rights successfully claimed in this case.


In a series of historic and wide-ranging digital rights rulings, the Court of Appeal in Amsterdam has found in favour of workers and against Uber and Ola Cabs. Worker Info Exchange brought the cases in support of members of the App Drivers & Couriers Union in Great Britain and a driver based in Portugal.

The cases were brought under the GDPR which guarantees everyone the right to demand access to their personal data processed by any organisation and to receive meaningful information about the processing of such data. In addition, the GDPR gives everyone certain protections from automated decision making where there are significant negative consequences.

The first case involved four drivers who were found to be effectively robo-fired by Uber without recourse.

The second case involved the denial of access to personal data upon requests made to Uber by six drivers.

The third case involved the denial of access to personal data upon requests made to Ola Cabs by three drivers.


Unfair automated decision making

The drivers faced spurious allegations of ‘fraudulent activity’ by Uber and were dismissed without appeal. When the drivers requested an explanation for how Uber systems had surveilled their work and wrongly determined they had engaged in fraud, Uber stonewalled them.


The court found that the limited human intervention in Uber’s automated decisions to dismiss workers was not "much more than a purely symbolic act". The decision to dismiss the drivers was taken remotely at an Uber office in Krakow and the drivers were denied any opportunity to be heard. The court noted that Uber had failed to make “clear what the qualifications and level of knowledge of the employees in question are. There was thus insufficient evidence of actual human intervention.”


The court found that the drivers had been profiled and performance managed by Uber: “This example illustrates, in the court's view, that it involves automated processing of personal data of drivers whereby certain personal aspects of them are evaluated on the basis of that data, with the intention of analysing or predicting their job performance, reliability and behaviour.”


Uber and Ola Cabs must reveal how automated decision making is used to determine pay and allocation of work

The court ordered that Uber must explain how driver personal data and profiling are used in Uber’s upfront, dynamic pay and pricing system. Similarly, the court ordered Uber to transparently disclose how automated decision making and worker profiling are used to determine how work is allocated amongst a waiting workforce. Last year, a Harvard Business Review paper called for dynamic pricing systems to be closely regulated due to the risk of exploitation and tacit collusion.

In ruling in favour of the drivers, the court noted that Uber’s dynamic pricing, “taken as a whole, affects the drivers to a considerable extent. This system is applied to every passenger they carry. These are therefore successive decisions, each with financial consequences that determine the income they can earn.”

Ola Cabs was also ordered to disclose meaningful information about how worker earnings profiles and so-called ‘fraud probability scores’ are used in automated decision making for work and fare allocation.

The court also ruled that internally held profiles relating to drivers, and the associated performance-related tags, must be disclosed to drivers.


Abuse of rights?

The court rejected Ola’s far-reaching argument that the requests for data and the involvement of Worker Info Exchange and the ADCU trade union amounted to an abuse of the data protection rights of the individual appellants.


Platforms cannot rely on ‘trade secret’ arguments to deny transparency to workers

The court rejected arguments by both Uber and Ola Cabs that explaining the allegations and the automated decision making negatively affecting workers would threaten their right to protect trade secrets. The court ruled that such claims were entirely disproportionate to the negative effect of unexplained automated dismissal and disciplining of workers.


Significance in the UK and beyond


Beyond the data protection violations, the evidence of performance profiling is a strong indication that Uber workers are indeed employees, yet across Europe they are still denied employment protections, including protection from unfair dismissal.


Uber and Ola Cabs have relied on technology and automated processes for driver workforce management to such an extent that they have singularly failed to convince the court that decisions relating to dismissals, pay and work allocation involved any meaningful human intervention.


It is clear that in their so-called fraud detection procedures, Uber and Ola Cabs relied on a high degree of automation without the proper checks and balances of due process, including an appeals process. For the innocent drivers represented in this case, and for many others, such solely automated decision making can have devastating effects, as the court recognised:

"In the opinion of the court of appeal, it is evident that these decisions affect the [appellant] to at least a considerable extent, as they have the consequence that the drivers can no longer provide for their income through the use of the Uber Driver app and can thus no longer recoup the investments they have made. In addition, the allegations in this case are serious and may also have repercussions (criminal or otherwise) have for [appellant] et al, in particular for their further or future activities, such as, for example, with regard to their taxi licence. Similarly, in its email to [appellant] dated 4 August 2020, Uber added: In certain circumstances we may also notify the police if your activity could constitute a criminal offence. Moreover, the present decisions also entail a legal consequence for [appellant] et al, as Uber thereby terminated the agreement with them.”


Ironically, this month the UK government will introduce the Data Protection and Digital Information Bill for its second reading in parliament. If passed, the bill will strip workers of protections against abusive employer automated decision making, as successfully claimed in this case. In addition, workers will face a higher hurdle to access their personal data and to receive an explanation of how it is processed, the importance of which has also been illustrated by this case.


James Farrar, Director of Worker Info Exchange said:


"This ruling is a huge win for gig economy workers in Britain and right across Europe. The information asymmetry & trade secrets protections relied upon by gig economy employers to exploit workers and deny them even the most basic employment rights for fundamentals like pay, work allocation and unfair dismissals must now come to an end as a result of this ruling. Uber, Ola Cabs and all other platform employers cannot continue to get away with concealing the controlling hand of an employment relationship in clandestine algorithms.
Too many workers have had their working lives and mental health destroyed by false claims of fraudulent activity without any opportunity to know precisely what allegations have been made let alone answer them. Instead, to save money and avoid their responsibility as employers, platforms have built unjust automated HR decision making systems with no humans in the loop. Left unchecked, such callous systems risk becoming the norm in the future world of work. I’m grateful for the moral courage of the courts expressed in this important ruling.

The ruling comes as a bittersweet victory considering that the UK government plans to strip workers of the very protections successfully claimed in this case. Lawmakers must learn important lessons from this case, amend the bill and protect these vital rights before it is too late. Similarly, the Council of the European Union must not hesitate in passing the proposed Platform Work Directive as passed by the European Parliament."


Azeem Hanif, Chair of ADCU Nottingham and member of the ADCU National Executive Committee said:


"I am gratified to see that the courts have resisted Uber and Ola Cabs to uphold the rights of trade unions to assist workers in their demands for algorithmic transparency at work individually and for data aggregation for the purposes of collective bargaining."


Anton Ekker of Ekker Law, who represented the drivers in this case, said:


"Today’s judgments are a big win for Uber and Ola drivers and for all people working in the platform economy. Transparency about data processing on Uber and Ola's platforms is essential for drivers to do their jobs properly and to understand how Uber makes decisions about them. The practical and legal objections raised by Uber and Ola were largely rejected by the Amsterdam Court of Appeals.

Of great importance, in addition, is the Court's finding that several automated processes on Uber and Ola's platforms qualify as automated decision-making within the meaning of Article 22 GDPR. These include assigning rides, calculating prices, rating drivers, calculating 'fraud probability scores' and deactivating drivers' accounts in response to suspicions of fraud. The judgments clearly establish that drivers are entitled to information on the underlying logic of these decisions."


BACKGROUND:

Rulings in Dutch can be accessed here:


Dutch rulings translated to English here: (Please note that this is an unofficial translation)

Proposed Data Protection and Digital Information Bill due for second reading in Parliament this month. The bill will gut protections from automated decision making and raise the hurdle to access rights – both successfully claimed in this case: https://bills.parliament.uk/bills/3430


