Gig Companies Are Manipulating Their Workers. Dark Patterns Laws Should Step In

By Kathryn Taylor

February 7, 2023

Have you ever clicked on what you thought was a deal for concert tickets or a hotel room, only to find after reaching the final checkout page that the total price was far higher than advertised due to additional fees? Tried to unsubscribe from an email list, but the button was hidden in a tiny, lightly colored font? Given a website your email address in exchange for a discount, only to be asked for your phone number before you could unlock your offer code? Like all online consumers today, your answer is almost certainly yes, meaning you have been the target of what are called “dark patterns”—user-facing web designs that trick you into doing things you would not otherwise have done, such as spending more money or giving up more data than you intended.1 Recent privacy laws in three states, multiple proposed federal bills, and ramped-up enforcement activity by the FTC have sought to protect consumers from some of the most frustrating dark patterns. While a welcome improvement, these efforts have left open a glaring—and rapidly expanding—gap: they fail to address the dark patterns being deployed in the workplace.

In a landmark new study of the gig economy, U.C. Hastings Law Professor Veena Dubal has exposed a constellation of technology-driven practices that companies deploy to subvert worker autonomy and keep wages low, varied, and difficult to predict or calculate.2 For example, ride-share companies like Uber do not pay workers an hourly wage, but rather pay them on a variable per-ride basis, supplemented with bonuses that are available to various drivers at various times. According to drivers, what their companies call “bonuses” (also called “offers,” “surges,” or “quests”) are actually essential wages, without which their income would be intolerably low.3 But these bonuses are unpredictable in their timing, often appearing only as a manipulation tactic to get drivers to stay online when they otherwise wouldn’t. Uber has been known to offer a bonus to workers at the very moment they try to close out for the day, to entice them to keep driving. Further, according to a driver named Domingo whom Professor Dubal interviewed, even when drivers are aware of an upcoming bonus and are actively working to earn it, their employer can use that against them. Once, when Domingo was one trip away from hitting 96 rides in a week, for which he had been promised a $100 bonus, Uber suddenly stopped sending him rides, presumably to keep him in the available driver pool for longer that evening.4

Together, these kinds of practices amount to what Dubal has termed “algorithmic wage discrimination,” where “individual workers are paid different hourly wages—calculated with ever-changing formulas using granular data on location, individual behavior, demand, supply, and other factors—for broadly similar work.”5 The powerful enabler of this data-intensive approach is the recent explosion of workplace surveillance. From deploying productivity-tracking software to analyzing employee messaging platforms to mandating AI-enabled video monitoring to engaging in live location tracking, employers are gaining a staggering amount of insight into workers’ actions and mindsets. In the highly technologized gig economy, this means companies like Uber and Amazon enjoy the lion’s share of information and power, while their workers are kept guessing at the algorithmically determined rules of their workplace. Algorithmic wage discrimination is ultimately a means of control that keeps workers on an unsteady and opaque financial footing, subjects them to the whims of their employer, and pits them against one another as they wonder why they received different pay for the same work.6

This financially and psychologically devastating state of play must be remedied through legislation. While Dubal’s article broadly calls for legislative action, I am specifically proposing that dark patterns laws be part of the solution. This is because, according to many of Dubal’s study subjects, the feeling of being tricked, or of having to gamble in order to earn money, is core to the experience of algorithmic wage discrimination.7 When these feelings are caused by deliberate technology design, dark patterns are afoot. Waiting until the moment a driver is clocking out to reveal how much they could earn in a day by completing a given number of additional rides is likely a dark pattern. Blocking a driver from receiving any rides while making it look like they could receive one at any moment, in an effort to delay them from reaching their bonus, is likely a dark pattern. Another Uber practice mentioned in Dubal’s article—notifying drivers of a “surge” at a particular location to induce them to go there, when no surge was in fact occurring—is certainly a dark pattern; the company is using deception to trick drivers into moving locations when they otherwise would not have.8 Dark patterns clearly offer a useful lens through which to assess the practices of companies that engage in algorithmic wage discrimination.

At both the state and federal level, new laws and enforcement efforts should prohibit employers from deploying dark patterns against employees in the technologies they use to do their jobs. It would be difficult and of questionable utility for these laws to prescriptively list each possible interface choice that employers should avoid. Rather, employment dark patterns laws should aim to eliminate employee-facing interface designs that embody any of the following commonly recognized attributes of dark patterns:

  • Asymmetric (making options that are detrimental or less appealing to employees more visible while making more beneficial options harder to access);
  • Restrictive (overly reducing or eliminating choices that should be available to employees);
  • Disparate in treatment (purposely disadvantaging a subset of employees);
  • Covert (hiding the mechanism of influence over employees);
  • Deceptive (inducing false beliefs about some aspect of the employment); or
  • Information hiding (obscuring or delaying access to information essential to the employment).9

Practices that have these characteristics create a sense that “the structures and functions of the machine boss are designed to take advantage of [workers] by providing the illusion of agency,” while in reality holding them to a demoralizingly and detrimentally low pay rate.10 While there are many aspects of the algorithmically governed workplace that dark patterns efforts cannot remedy, the elements of overt deception and manipulation that pervade the stories in Dubal’s study are well within reach.

There is growing precedent for applying consumer-oriented policies and authorities to the employment context. In addition to many components meant to protect consumers, the White House’s 2022 “Blueprint for an AI Bill of Rights” included multiple calls to protect job applicants and employees from algorithmic discrimination. The FTC, traditionally tasked with protecting consumers from unfair and deceptive corporate practices, recently proposed a rule that would ban employers from imposing noncompete clauses on their workers. The inclusion of employment in these traditionally consumer-centric domains could indicate an appetite for expanding the heretofore consumer-oriented conception of dark patterns to cover the workplace.

Employees are not a separate population from consumers; in fact, most of us occupy both roles on a daily basis. If we deserve to be protected from technologically driven ills in our private lives, we certainly deserve to be protected at work, where much more is often on the line, both financially and psychologically. Bringing employment under the purview of a consumer-protective concept like dark patterns is more than appropriate.


Kathryn Taylor, J.D. Class of 2023, N.Y.U. School of Law.

Suggested Citation: Kathryn Taylor, The Next Round of Dark Patterns Laws Should Address Algorithmic Wage Discrimination, N.Y.U. J. Legis. & Pub. Pol’y Quorum (2023).

  • 1
    The formal definition of “dark patterns” is nascent. Harry Brignull, the user experience expert who coined the term in 2010, defined them simply as design choices that “trick users into [doing] things they wouldn’t otherwise have done.” The California Privacy Rights Act, the first state law to cover the subject, defines a dark pattern as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice.” Instead of demanding a single definition, recent scholarship has homed in on a set of core themes that unite most conceptions of dark patterns, such as that they are usually “deceptive” and “covert.” Arunesh Mathur, Jonathan Mayer & Mihir Kshirsagar, What Makes a Dark Pattern…Dark? Design Attributes, Normative Considerations, and Measurement Methods, 2-4 Proc. 2021 CHI Conf. on Hum. Factors in Computing Sys. (May 2021), https://dl.acm.org/doi/pdf/10.1145/3411764.3445610.
  • 2
    Veena Dubal, On Algorithmic Wage Discrimination (Jan. 19, 2023), https://ssrn.com/abstract=4331080.
  • 3
    Id. at 17.
  • 4
    Id. at 36-37.
  • 5
    Id. at 5.
  • 6
    Id. at 27-28.
  • 7
    Id. at 2, 36-41.
  • 8
    Id. at 37.
  • 9
    See Mathur et al., supra note 1, at 6.
  • 10
    See Dubal, supra note 2, at 28.