Scale AI under fire in suit filed by former worker alleging unlawful business practices
A new class action lawsuit alleges poor working conditions and “exploitive” behavior by AI data processing company Scale AI, claiming that workers responsible for generating much of its product were misclassified by the company as independent contractors rather than full employees.
Scale AI’s core business centers on using human input to label and shape AI responses to queries, helping to make those responses more accurate and usable. For instance, a worker might label images from a car’s lidar sensor to help train an AI model to identify objects more accurately.
To get this kind of human input, according to a complaint filed Tuesday in the Superior Court of California, Scale AI outsources work through services like Outlier, where named plaintiff Steve McKinney worked until June. Tasks for Scale AI, the complaint alleges, were assigned algorithmically, with payments reduced or denied for projects that exceeded a designated time limit. McKinney’s suit said that this amounted to a bait-and-switch on promised compensation. In addition, it noted, workers were not paid for peripheral functions such as reviewing project guidelines, seeking clarification, or attending required training webinars.
Moreover, the complaint said, the disturbing subject matter of many prompts, which included suicidal ideation and violence, combined with Scale AI’s restrictions on break times and outside research to create a grueling, authoritarian workplace in which workers could be terminated for complaining about working conditions, payments, or company processes.
Additionally, the suit said that McKinney and the many others in his position were misclassified under California law as independent contractors rather than employees. Generally speaking, employers have fewer legal responsibilities to independent contractors than to full employees, who are more likely to be covered by state and federal laws on overtime pay, among other protections.
California’s legal standard for deciding which workers are independent contractors and which are employees is fairly strict, and is referred to as the ABC test, for its three-pronged structure. According to the California Labor and Workforce Development Agency, workers are employees unless they are free from the control and direction of the hiring entity, are doing work outside the usual course of the hiring entity’s business, and are “customarily engaged” in an independent business of the type they’re being hired for. None of those conditions, the lawsuit argues, is met in the case of McKinney and the other Scale AI workers in his position.
The suit, which describes Scale AI as the “sordid underbelly propping up the generative AI industry,” was filed on McKinney’s behalf by the Clarkson Law Firm, based in Malibu, California. The firm has been at the forefront of civil litigation against the tech industry where AI is concerned, appearing for multiple plaintiffs in cases involving copyright, privacy, and other issues.
Ryan Clarkson, the firm’s managing partner, said that the rapid growth of generative AI as a business has had corrosive effects on tech workers around the world.
“Scale AI has built its business on a model of exploitation, relying on thousands of workers from across the globe to be paid less than a living wage to train AI applications for hours on end,” he said in a statement. “These workers operate under strict company control and are being cheated out of labor code protections. It’s unlawful and unacceptable.”
Scale AI’s marketing materials advertise that it works with some of the biggest players in the AI space, including Microsoft, Meta, Alphabet, and Nvidia, although none of those companies had responded to requests for comment on the matter by the time this article was published. Earlier this year, Scale AI shut down its Remotasks subsidiary in several countries, including Nigeria, Kenya, and Pakistan, without notice to its regular gig workers there.