πŸ” The AI Lawsuit Every Employer Should Be Watching: What Mobley v. Workday Means for the Future of Hiring

 πŸ” The AI Lawsuit Every Employer Should Be Watching: What Mobley v. Workday Means for the Future of Hiring

By Purciarele Group 8.4.2025



Artificial intelligence has made hiring faster. But has it made it better?

At a glance, AI promises efficiency—scanning resumes, flagging keywords, sorting candidates. But what happens when those algorithms quietly shut out qualified people? What happens when the “smart tech” introduces the very bias it was supposed to eliminate?

That’s not just a tech glitch.
That’s a human problem.
And it’s now at the center of a major lawsuit that should have every business owner, HR professional, and tech vendor paying close attention.


⚖️ The Case: Bias in the Bots?

In Mobley v. Workday, Inc., Derek Mobley—a job seeker over the age of 40—alleges that he applied to more than 100 jobs through platforms using Workday’s AI screening tools. He never received a single interview request. He believes the software flagged and disqualified him (and others) based on age, in violation of the Age Discrimination in Employment Act (ADEA).

Here’s where things get serious: In May 2025, a federal judge ruled the case could move forward as a nationwide collective action. And most notably, in an earlier 2024 ruling, the court decided Workday (the software provider) could plausibly be treated as an agent of the employers using its tools, meaning both the vendor and the hiring company might share legal liability.


🚨 Why This Lawsuit Matters (Even If You Don’t Use Workday)

This isn’t just about one candidate. Or one platform. This is about how employers use tech in hiring—and whether they're accountable for what it does.

Here’s why it matters:

  • 🧾 Vendors Can Be Held Liable
    The court ruling opens the door for software vendors to be treated like co-decision makers. If they create a tool that leads to discrimination, they might share the blame with you.

  • ⚖️ Disparate Impact Counts
    Mobley isn’t claiming intentional ageism. He’s claiming “disparate impact”—meaning the outcome was biased, even if the intent wasn’t. And the law sees that as discrimination.

  • πŸ€– AI Is Not a Legal Shield
    Just because “the computer said no” doesn’t mean you’re protected. Bias that comes from an algorithm is still bias. And courts—and the EEOC—are watching.

  • πŸ“‹ The EEOC Is Scrutinizing AI Hiring Tools
    The Equal Employment Opportunity Commission has already warned that AI hiring tools can violate existing anti-discrimination law. This case adds fuel to the fire—and puts HR departments and business owners on notice.


πŸ’‘ What You Should Do Right Now

If you’re using AI or even just a resume-sorting platform, now’s the time to take a hard look at what’s happening behind the scenes.

✅ Start Here:

1. Conduct a Bias Audit
Have your hiring tools reviewed—by real people—to spot patterns in rejection rates by age, race, gender, or disability. Not sure where to start? That’s where we come in.

2. Keep a Human in the Loop
AI can assist. But it should never replace thoughtful review by your HR team. Empower your people to question software-driven outcomes.

3. Document, Document, Document
If you do use AI-assisted hiring, keep clear records of decisions, overrides, and how final calls were made. Transparency is your best defense.

4. Vet Your Vendors
Ask your software providers the hard questions:

  • Have you audited your tool for bias?

  • Do we get documentation of how decisions are made?

  • What happens if we’re sued based on your system’s decisions?

5. Train Your Team
HR shouldn’t just know how to use hiring tools—they need to understand the risks too. Make sure your team knows what to look for.


πŸ“‰ Beyond Hiring: AI Lawsuits Are Expanding

Mobley v. Workday may be the headline—but it’s not the only case turning heads.

  • πŸŽ™️ Voice actors are suing AI firms for unauthorized voice cloning

  • πŸ“ˆ Workers are challenging performance review algorithms

  • ⚖️ Disability advocates are warning about algorithmic exclusion in interviews and tasks

This is just the beginning. The courts are catching up—and companies that ignore these signals could be next in line.


🀝 Final Word from Purciarele Group

Let’s be clear:
AI isn’t going anywhere. But people still matter more.

At Purciarele Group, we don’t just tick compliance boxes—we help you build hiring systems that are ethical, human-first, and legally sound. Whether you need help vetting your vendor, conducting a bias audit, or rethinking your process, we’re here for you.

We’re real people, doing real work—custom to your business.

Because hiring shouldn’t be left to chance (or to unchecked algorithms).
It should reflect who you are—and the kind of workplace you’re building.


πŸ“© Ready to future-proof your hiring process?
Contact us today for a confidential consult. Let’s protect your people and your business, together.



#HRCompliance #AIHiring #WorkplaceBias #MobleyvWorkday #FutureOfWork #HiringEthically #PurciareleGroup #PeopleFirstHR #HRMatters #AIAudits #EEOCLaws #WeLoveHRSoYouDontHaveTo
