Key Takeaways From the Mobley v. Workday Lawsuit (Explained in Plain Language)

  • February 2023: Derek Mobley filed a lawsuit against Workday, alleging its automated resume screening tool discriminates based on race, age, and disability status.
  • January 2024: The initial suit was dismissed because it did not sufficiently allege that Workday qualifies as an “employment agency” under anti-discrimination law.
  • February 20th, 2024: The amended lawsuit reframes Workday’s role as an “agent” of its customers, aiming to clarify its liability under anti-discrimination law.
  • The case highlights the need for AI governance controls and processes.
  • The case will be fascinating to watch as it underscores ongoing legal and societal challenges in ensuring AI hiring tools do not perpetuate bias, with broader implications for regulatory oversight and the technology’s ethical use.
  • Update: April 9th, 2024: The U.S. Equal Employment Opportunity Commission (EEOC) told the court that Workday should face claims that its algorithm-based applicant screening system is biased.
  • Update: July 12th, 2024: A federal judge has allowed a lawsuit against Workday to proceed, stating:

    “Workday’s software is not simply implementing in a rote way the criteria that employers set forth, but is instead participating in the decision-making process by recommending some candidates to move forward and rejecting others. Given Workday’s allegedly crucial role in deciding which applicants can get their ‘foot in the door’ for an interview, Workday’s tools are engaged in conduct that is at the heart of equal access to employment opportunities.”

What Does the Mobley v. Workday Lawsuit Allege Regarding Automated Resume Screening Tools?

The Mobley v. Workday lawsuit alleges that the company’s automated resume screening tool discriminates based on race, age, and disability status, highlighting the broader issue of bias within AI-based hiring processes.

Details of the Mobley v. Workday Lawsuit

On February 20, 2024, Derek Mobley filed an amended lawsuit against Workday, claiming that the company’s automated resume screening tool, which can evaluate resumes and reject applicants without human oversight, discriminates on the basis of race, age, and disability status.

Mobley filed the first version of the suit in February 2023 in the United States District Court for the Northern District of California. The plaintiff, who is African American, over 40, and disabled, claimed to have applied for more than 80 jobs at companies believed to use Workday’s screening tool, and said his application was rejected every time.

The suit cites multiple examples of bias in AI-based screening tools (such as the Amazon tool that learned to favor male candidates over female candidates) and alleges that, based on Mobley’s experience, Workday’s resume screening tools display similar bias.

Mobley seeks a declaration that Workday’s tools violate anti-discrimination laws, an injunction preventing Workday from using discriminatory hiring algorithms, and monetary damages on behalf of himself and others who share his protected traits.

In January 2024, a judge dismissed the case because the original lawsuit did not offer enough evidence to classify Workday as an “employment agency” subject to liability under anti-discrimination law. The dismissal was not a statement that Workday’s software did not discriminate, but a point of legalese about how anti-discrimination law applies to the case. The Court also noted gaps in Mobley’s claims, including missing details about his protected traits and his qualifications for the jobs he sought.

The amended lawsuit claims Workday acts as an “agent” of its customers rather than an “employment agency.” This is a point of legalese that clarifies Workday’s liability under anti-discrimination law.

Update April 9, 2024

On April 9th, the EEOC weighed in on the Workday case, filing an amicus brief in support of the plaintiff.

In the brief, the U.S. Equal Employment Opportunity Commission (EEOC) asserted that Workday should face claims of bias in its algorithm-based applicant screening system.

The EEOC argued that Workday’s software might enable discriminatory practices by allowing employers to exclude applicants from protected categories, violating Title VII of the Civil Rights Act of 1964.

Update: July 12, 2024

A federal judge has allowed a discrimination lawsuit against Workday’s AI hiring software to proceed.

Workday’s motion to dismiss was denied in large part, with the judge ruling that the company’s software acts as an agent of employers in the hiring process.

“Workday’s software is not simply implementing in a rote way the criteria that employers set forth, but is instead participating in the decision-making process by recommending some candidates to move forward and rejecting others,” the judge said. “Given Workday’s allegedly crucial role in deciding which applicants can get their ‘foot in the door’ for an interview, Workday’s tools are engaged in conduct that is at the heart of equal access to employment opportunities.”

However, the judge agreed with Workday that the plaintiff had not convincingly argued that Workday functions as an employment agency, noting that the software does not recruit, solicit, or procure employees for companies. The court also sided with Workday on the question of intent, finding that the plaintiff did not offer specific allegations of intentional discrimination, even though he showed that the tools had a disparate impact.

The case continues.

AI Bias Lawsuits Intensify

This is not the first lawsuit related to the use of AI in employment, and it won’t be the last.

As companies continue to leverage AI-enabled tools for hiring, a combination of regulations and standards will continue to evolve to ensure responsible AI use.

New York City Local Law 144 (LL144), while perhaps limited in its scope and enforcement, requires anyone using automated employment decision tools in hiring or promotion to conduct an independent bias audit and publish the results publicly.
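For context on what such an audit reports: the headline numbers under LL144 are selection rates and impact ratios for each demographic category. The Python sketch below illustrates that arithmetic with hypothetical group labels and counts; it is an illustration of the calculation, not a compliance tool.

```python
# Minimal sketch of the impact-ratio arithmetic behind a bias audit.
# Group labels and counts below are hypothetical illustration data.

from typing import Dict


def impact_ratios(applicants: Dict[str, int], selected: Dict[str, int]) -> Dict[str, float]:
    """Selection rate of each group divided by the highest group's selection rate."""
    rates = {g: selected[g] / applicants[g] for g in applicants if applicants[g] > 0}
    top_rate = max(rates.values())
    return {g: rate / top_rate for g, rate in rates.items()}


# Hypothetical applicant pool: how many applied vs. how many advanced to interviews.
applicants = {"group_a": 400, "group_b": 300, "group_c": 250}
selected = {"group_a": 120, "group_b": 60, "group_c": 45}

for group, ratio in impact_ratios(applicants, selected).items():
    # Ratios well below 1.0 (for example, under the EEOC's informal
    # "four-fifths" 0.8 benchmark) are a common signal for closer review.
    print(f"{group}: impact ratio = {ratio:.2f}")
```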

Similarly, the EEOC has made it clear that employers and agencies using algorithm-based hiring tools remain liable for discrimination caused by those tools. As we have seen, such tools can still be biased, and their use does not provide a legal loophole for employers. The agency has taken action in similar cases before, such as the $365,000 settlement with a tutoring company that used software to automatically reject candidates because of their age.

As a provider of AI Governance technology and, perhaps more importantly, as humans who believe in ethical AI use, we will be watching this case very closely.

Stay tuned.
