Workday Hit With Lawsuit Claiming Its AI Shuts Out Black, Disabled And Older Jobseekers
Plaintiffs claim repeated rejections despite strong qualifications, with some denials occurring almost instantly

A high-stakes lawsuit against HR software giant Workday could reshape the rules for how companies use artificial intelligence in recruitment. The case raises major questions about bias, fairness, and accountability in algorithm-driven hiring.
The suit alleges that Workday's AI-based system has systematically rejected qualified candidates, particularly those from marginalised groups, revealing troubling flaws in how these tools are used to determine who gets hired.
Legal Battle Over Algorithmic Discrimination
Following a ruling by a California district judge last Friday, Workday is now facing a collective action lawsuit over claims that its job-screening algorithms are discriminatory.
The lawsuit was first filed last year by Derek Mobley, who says Workday's system rejected him from more than 100 jobs he applied for over seven years. He attributes this to his race, age, and disabilities. Four additional plaintiffs have joined, all of them over 40, claiming repeated rejections after applying for hundreds of jobs through Workday's platform. In many cases, these rejections arrived within minutes or hours of submission.
The Algorithm's Alleged Bias
Court filings state that Workday's AI 'disproportionately disqualifies individuals over forty (40) from securing gainful employment' during its ranking and screening process. The case is now proceeding as a collective action, which functions similarly to a class action and could set a precedent for how companies deploy AI in recruitment.
AI tools are increasingly used by HR teams to manage the overwhelming number of job applications. Many experts are concerned that these technologies may reinforce bias and unfairly filter out suitable candidates.
Corporate Denial Versus Civil Rights Concerns
Workday strongly denies the allegations. A spokesperson described the judge's decision as a 'preliminary, procedural ruling ... that relies on allegations, not evidence.' The company added, 'We continue to believe this case is without merit. We're confident that once Workday is permitted to defend itself with the facts, the plaintiff's claims will be dismissed.'
Civil rights advocates disagree. The ACLU has warned that hiring algorithms can automate discrimination. Without transparency or regulation, AI systems may replicate existing inequalities in ways that are difficult to detect or challenge.
In one well-known case from 2018, Amazon scrapped a resume-screening tool after discovering it favoured male candidates over female ones.
Real People, Repeated Rejections
Mobley, a Morehouse College graduate with nearly a decade of experience in IT, finance, and customer service, said he was repeatedly turned away without ever being offered an interview. According to court documents, he submitted one application at 12:55 a.m. and was rejected at 1:50 a.m., less than an hour later.
Another plaintiff, Jill Hughes, reported hundreds of rejections shortly after applying. Many arrived outside regular business hours, suggesting that her applications may not have been reviewed by a person. She also said some rejection emails mistakenly claimed she did not meet the job's minimum qualifications.
How AI Learns and Inherits Bias
Experts explain that hiring algorithms are often trained using company-specific historical data. If a business has a workforce that is mostly white, male, or under 40, the algorithm may learn to favour candidates with similar characteristics—even if it was never explicitly instructed to do so.
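The mechanism experts describe can be illustrated with a deliberately simplified sketch. This is a hypothetical toy model, not Workday's actual system: it trains a scorer purely on a company's past hiring outcomes, using an invented "graduation-year bucket" feature as a proxy for age. Because the historical workforce skewed young, the learned scores penalise older applicants even though age itself is never an input.

```python
# Hypothetical sketch of bias inheritance -- NOT any vendor's real system.
# A scorer trained only on historical hiring outcomes learns to favour
# whichever groups the company hired before. "grad_year_bucket" is an
# invented proxy feature correlated with age.

from collections import defaultdict

# Toy historical records: (grad_year_bucket, hired). The company mostly
# hired recent graduates, so the label is correlated with the proxy.
history = [
    ("recent", 1), ("recent", 1), ("recent", 1), ("recent", 0),
    ("older", 0), ("older", 0), ("older", 0), ("older", 1),
]

def train(records):
    """Learn P(hired | bucket) from past outcomes."""
    counts = defaultdict(lambda: [0, 0])  # bucket -> [hires, total]
    for bucket, hired in records:
        counts[bucket][0] += hired
        counts[bucket][1] += 1
    return {b: hires / total for b, (hires, total) in counts.items()}

def score(model, bucket):
    """Rank a new applicant by the learned hire rate for their bucket."""
    return model.get(bucket, 0.0)

model = train(history)
print(score(model, "recent"))  # 0.75 -- favoured
print(score(model, "older"))   # 0.25 -- penalised, though age was never a feature
```

The point of the sketch is that no rule ever mentions age: the disparity emerges entirely from the training data, which is why such bias can be hard to detect or challenge from the outside.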
According to CNN, the Workday case has come at a critical moment, as more companies turn to automated screening to deal with high volumes of applicants. The outcome could establish essential legal standards for AI accountability in hiring and force employers to rethink how they use these technologies.
© Copyright IBTimes 2025. All rights reserved.