AI & Employment: Educator & Facilitator Guide
For workforce development educators, law school clinics, union trainers, HR professionals, civics and government teachers, and community organizers. Facilitation guide for the ATS scanner, disparate impact discussion framework, and a worker advocacy simulation.
How to use the ATS scanner in a group.
The tool works best as a shared investigation — with the group exploring each annotation together rather than individually.
Opening framing (3 minutes)
- Ask the room: "How many of you have applied for a job in the last two years? Of those, how many do you think had a human read your resume before it was rejected?" Let the surprise land.
- Frame the tool: "We're going to look at what software sees when it reads a resume — not what it should see, but what it actually does. Some of it is about formatting. Some of it is about documented discrimination."
- Turn on Facilitator Mode before beginning — discussion prompts will appear at each annotation.
Key discussion questions by section
Contact & Identity
"If you knew your name was causing you to receive 30% fewer callbacks, what would you do? What would it mean if you had to?" — Then: the burden of adaptation is itself a form of discrimination.
Work Experience (gap)
"Who is most likely to have a two-year employment gap in this room? Caregivers, parents, formerly incarcerated people, chronically ill people. What do those groups have in common demographically?"
Education
"If an ATS weights Harvard over Howard — and Howard is a historically Black university — is that neutral? Who designed these criteria? Who do they serve?"
Formatting
"A resume formatted to impress a human may be rejected by software before any human sees it. Who has access to the coaching that teaches you how to format for ATS? Who doesn't?"
Video / AI Interviews
"HireVue scores facial expressions and vocal tone. It was trained on data from a predominantly white, college-educated hiring base. What does it not know? Who is penalized for what it doesn't know?"
Closing (5 minutes)
- Name the legal structure: "The EEOC says disparate impact is illegal. That means if an AI tool produces racially disparate outcomes, the employer is liable — even if no one intended to discriminate. This is not a hypothetical. iTutorGroup paid $365,000."
- Connect individual tactics to collective action: "Formatting your resume better helps you. NYC Local Law 144 helps everyone. Both matter — and they're not the same work."
- End with a concrete action: file an EEOC inquiry (free, 5 minutes online), look up whether your city has AI hiring legislation pending, or join Coworker.org if you're facing algorithmic management.
Teaching disparate impact theory.
Disparate impact is the most important legal concept for understanding AI employment discrimination — and it is poorly understood by most people outside legal contexts.
Disparate impact in plain language
Under Title VII (Griggs v. Duke Power, 1971), an employer can be found liable for discrimination even if they had no discriminatory intent — if an employment practice produces a statistically significant disparate impact on a protected class, and the practice is not justified by business necessity.
The EEOC's 2023 guidance explicitly applies this framework to AI. If an employer uses an AI screening tool that rejects Black applicants at a significantly higher rate than white applicants — even if the algorithm has no explicit racial input — the employer is potentially liable under Title VII.
The three-step teaching frame
- Step 1: Show the disparity (statistical evidence that a tool produces different outcomes by race/gender/age)
- Step 2: Business necessity defense — the employer must show the practice is required for job performance
- Step 3: Less discriminatory alternative — even if there's business necessity, is there an equally valid tool with less disparate impact?
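Step 1's "show the disparity" is commonly operationalized with the EEOC's four-fifths (80%) rule: a group's selection rate below 80% of the highest group's rate is treated as evidence of adverse impact. A minimal sketch, using hypothetical applicant counts and group labels, can make the arithmetic concrete for participants:

```python
# Four-fifths (80%) rule sketch. The applicant numbers and group names
# below are hypothetical, chosen only to illustrate the calculation.

def selection_rate(hired, applied):
    """Fraction of applicants in a group who were selected."""
    return hired / applied

def impact_ratios(groups):
    """Each group's selection rate divided by the highest group's rate.

    groups maps a group label to a (hired, applied) tuple.
    """
    rates = {g: selection_rate(h, a) for g, (h, a) in groups.items()}
    top = max(rates.values())
    return {g: r / top for g, r in rates.items()}

groups = {"Group A": (60, 200), "Group B": (30, 200)}
for g, r in impact_ratios(groups).items():
    # Under the EEOC's guideline, a ratio below 0.8 is an adverse
    # impact indicator -- not proof of liability on its own.
    flag = "ADVERSE IMPACT INDICATOR" if r < 0.8 else "ok"
    print(f"{g}: impact ratio {r:.2f} ({flag})")
```

Here Group A's selection rate is 30% and Group B's is 15%, so Group B's impact ratio is 0.50 and is flagged. Worth stressing in discussion: the four-fifths rule is an evidentiary rule of thumb, not the end of the legal analysis — Steps 2 and 3 still follow.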
Case study analysis: Mobley v. Workday
Derek Mobley applied to 100+ positions using Workday's ATS and was rejected from all of them. He sued Workday — not the employers — arguing the ATS itself was a discriminatory tool under Title VII, ADEA, and ADA.
The case raises a critical legal question: can a software vendor be sued as an employment agency under civil rights law? If yes, the accountability structure for AI hiring tools changes fundamentally. Workday's initial motion to dismiss was denied in 2024 — the case is proceeding.
Analysis question 1
If Workday is liable as an 'employment agency,' what does that mean for every software vendor whose tools influence hiring decisions?
Analysis question 2
Workday argues it does not make hiring decisions — employers do. Under what circumstances is this defense valid? When does it fail?
Analysis question 3
What evidence would Mobley need to show to prove disparate impact? How would he get that data, given Workday controls it?
Drafting NYC Local Law 144 for your city.
A policy exercise: participants draft an AI hiring bias audit ordinance for their own city, building on the NYC model and adapting it to local context.
Analyze the NYC law
Read the key provisions of NYC Local Law 144: the bias audit requirement, the independent auditor standard, the publication requirement, the penalty structure ($375/day/violation), and the exemptions. Identify what the law does and does not require.
Identify gaps in the NYC model
NYC LL144 has documented gaps: the bias audit definition is narrow; the 'impact ratio' standard doesn't require equal outcomes; third-party auditors are paid by the employers they audit; penalties are low. Map what is missing, measured against the strongest possible version of the law.
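The "doesn't require equal outcomes" gap is easiest to see in the arithmetic itself. For tools that output numeric scores, the DCWP rules implementing LL144 define a category's "scoring rate" as the fraction of its members scoring above the sample median, and the impact ratio as each scoring rate divided by the highest one. A sketch with hypothetical scores and group labels — note that the law requires publishing these ratios but sets no threshold they must meet:

```python
# LL144-style impact ratio sketch for a score-based tool.
# Scores and group names are hypothetical; the scoring-rate definition
# (fraction above the sample median) follows the DCWP rules.
from statistics import median

def scoring_rate(scores, cutoff):
    """Fraction of a group's scores above the cutoff."""
    return sum(s > cutoff for s in scores) / len(scores)

def ll144_impact_ratios(scores_by_group):
    """Scoring rate of each category over the highest category's rate."""
    all_scores = [s for scores in scores_by_group.values() for s in scores]
    cutoff = median(all_scores)
    rates = {g: scoring_rate(s, cutoff) for g, s in scores_by_group.items()}
    top = max(rates.values())
    return {g: r / top for g, r in rates.items()}

scores = {
    "Group A": [72, 81, 65, 90, 77, 84],
    "Group B": [58, 62, 70, 55, 66, 74],
}
for g, r in ll144_impact_ratios(scores).items():
    # LL144 requires publishing this number; it does not require
    # the employer to do anything about it.
    print(f"{g}: impact ratio {r:.2f}")
```

In this hypothetical, Group B's impact ratio is 0.20 — far below the EEOC's four-fifths benchmark — yet the published audit alone would satisfy the ordinance. A stronger municipal draft could attach consequences to the ratio itself.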
Draft an ordinance for your city
Drafting elements to address: scope (which employers, which tools), audit standard (what constitutes a 'bias audit'), auditor independence, disclosure requirements (to whom? in what form?), enforcement mechanism and penalty level, and worker rights to challenge adverse decisions.
Stakeholder negotiation
Assign roles: employer/HR vendor coalition (arguing cost burden and trade secrets), worker/civil rights coalition (arguing enforcement and transparency), city council staff (arguing administrability). Negotiate to a version that could pass — and name what you had to give up.
Educator references.
Want CPAI to deliver this workshop in your community?
We partner with unions, workforce development programs, legal aid organizations, and community colleges.