
AI Hiring Discrimination Against People With Disabilities

Authors: Juyoun Han and Patrick Lin
Published: 2020/11/25 - Updated: 2026/02/06
Publication Type: Informative
Category Topic: Laws and Rights - Related Publications


Synopsis: This research examines how artificial intelligence recruitment technologies perpetuate employment discrimination against people with disabilities through facial recognition software, voice analysis tools, and automated screening systems. Authored by disability rights attorney Juyoun Han and law researcher Patrick Lin, the analysis draws on EEOC case data showing that disability discrimination accounts for 33.4% of employment claims, then demonstrates how AI hiring platforms amplify this bias by training algorithms on non-diverse employee pools that systematically exclude candidates with speech disorders, mobility impairments, mental health conditions, and sensory disabilities. The paper offers practical value to job seekers, HR professionals, and legal advocates by documenting emerging state regulations such as Illinois's AI Video Interview Act and New York City's proposed bias audit requirements, and by explaining ADA protections that prohibit disability-related inquiries during recruitment. It is essential reading for anyone navigating the intersection of disability rights and workplace technology - Disabled World (DW).

Introduction

Disability Bias in AI Hiring Tools and Legal Protection

People with disabilities face significant disadvantages in the workforce. According to the U.S. Equal Employment Opportunity Commission (EEOC), of all the employment discrimination cases filed in 2019, the most common claims involved disability-based discrimination (33.4%), closely followed by race and gender-based discrimination. Today, a new form of employment discrimination causes concern: Artificial Intelligence ("AI") bias.

Main Content

What is Artificial Intelligence?

Artificial intelligence is a branch of computer science that develops computers and machines to imitate intelligent human behavior. General examples of AI in our daily lives might include "Siri" or "Alexa." AI is also integrated into assistive technologies such as Seeing AI, AVA, Voiceitt, and smart wheelchairs, to name a few.

How is Artificial Intelligence Used in Hiring, and How Does it Impact People with Disabilities?

AI is also widely used in hiring and recruiting.

According to Glassdoor, AI hiring tools are widely used across different industries, from "Allstate to Hilton to Five Guys Burgers." A common example is LinkedIn, a website that connects job seekers with employers and recruiters. For job seekers, LinkedIn's AI suggests positions they may be interested in based on their profile and work experience, and it also suggests connections to potential employers.

Other AI hiring tools include text-searching technology that screens high-volume job applications, facial analysis technology that scans applicants' facial expressions and body language during video interviews, and voice scanning technology that evaluates a job applicant's speech, tone, and word choices.
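
To make the "text-searching" category concrete, the sketch below shows how a simplistic keyword screener might rank a high volume of applications. It is an illustration only: the keywords, resumes, and threshold are hypothetical and are not drawn from any vendor's product. It shows how rigid automated matching can drop a qualified candidate who simply describes the same experience in different words.

    # Minimal sketch of the "text-searching" screening category described above.
    # All keywords, resumes, and thresholds are hypothetical illustrations,
    # not any vendor's actual system.

    REQUIRED_KEYWORDS = {"customer service", "scheduling", "microsoft excel"}

    def keyword_score(resume_text: str) -> float:
        """Return the fraction of required keywords found in the resume text."""
        text = resume_text.lower()
        hits = sum(1 for kw in REQUIRED_KEYWORDS if kw in text)
        return hits / len(REQUIRED_KEYWORDS)

    def screen(resumes: dict[str, str], threshold: float = 0.67) -> list[str]:
        """Keep only applicants whose keyword score meets the threshold."""
        return [name for name, text in resumes.items()
                if keyword_score(text) >= threshold]

    resumes = {
        # Describes the same skills in different words - and is screened out.
        "Applicant A": "Supported clients by phone and email; coordinated calendars; built spreadsheets.",
        "Applicant B": "Customer service lead; handled scheduling; advanced Microsoft Excel reporting.",
    }

    print(screen(resumes))  # ['Applicant B'] - rigid matching drops a qualified candidate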

However, despite its convenience, AI can also be biased on the basis of race, gender, and disability status, and it can exacerbate systemic employment discrimination. For instance, researchers have found that assessing applicants' facial movements and voices may "massively discriminate against many people with disabilities that significantly affect facial expression and voice: disabilities such as deafness, blindness, speech disorders, and surviving a stroke." Online personality tests and web-based neuroscience games used in AI hiring tools may also screen out people with mental illnesses.

Generally, AI hiring tools are programmed to identify an employer's preferred traits based on the employer's existing pool of employees.

If disabled people are not represented in the employer's current pool of employees, the AI hiring tool may learn to screen out job candidates with disabilities. Essentially, AI treats "underrepresented traits as undesired traits." As a result, "people with disabilities - like other marginalized groups - risk being excluded," says Alexandra Givens, president and CEO of the Center for Democracy & Technology. To overcome this bias, AI hiring tools need to be trained on more diverse data that includes employees with disabilities. Disabled people are currently underrepresented in the workforce, so it is unsurprising that the technology replicates this pattern. "If an algorithm's training data lacks diversity, it can entrench existing patterns of exclusion in deeply harmful ways," Givens wrote in an article for Slate.
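
The dynamic Givens describes can be illustrated with a small, hypothetical model. In the sketch below, a scoring function "learns" preferred traits simply by counting how often each trait appears among current employees; traits that never appear in that pool - here, the communication style of a hypothetical Deaf applicant - pull an applicant's score down even when experience is identical. The traits, applicants, and scores are invented for illustration and do not describe any real hiring product.

    # Minimal sketch of the dynamic described above: a hiring model "learns" the
    # employer's preferred traits from its existing employee pool, so traits that
    # are absent from that pool end up treated as undesirable. All traits and
    # scores are hypothetical illustrations, not a real vendor model.

    from collections import Counter

    # Hypothetical training data: the employer's current employees, none of whom
    # exhibit disability-related traits.
    current_employees = [
        {"steady eye contact", "fluent rapid speech", "five years experience"},
        {"steady eye contact", "fluent rapid speech", "team lead"},
        {"steady eye contact", "five years experience", "team lead"},
    ]

    # "Training": count how often each trait appears among current employees.
    trait_counts = Counter(t for emp in current_employees for t in emp)

    def score(applicant: set[str]) -> float:
        """Average, over the applicant's traits, of how common each trait is
        in the current employee pool; unseen traits contribute zero."""
        pool_size = len(current_employees)
        return sum(trait_counts.get(t, 0) / pool_size for t in applicant) / len(applicant)

    # Two equally experienced applicants; one has traits the pool never exhibits
    # (e.g., a Deaf applicant who communicates through an interpreter and text).
    applicant_a = {"steady eye contact", "fluent rapid speech", "five years experience"}
    applicant_b = {"uses sign language interpreter", "text-based communication", "five years experience"}

    print(round(score(applicant_a), 2))  # 0.78 - mirrors the existing pool
    print(round(score(applicant_b), 2))  # 0.22 - underrepresented traits read as "undesired"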

Seeking Solutions through Legal Advocacy

The ADA limits an employer's ability to make disability-related inquiries at the recruiting stage. AI hiring tools that enable employers to obtain information about an applicant's disability and to screen out qualified candidates could expose employers to liability under the ADA and under state and local human rights laws. According to Bloomberg, the U.S. Equal Employment Opportunity Commission is already investigating at least two claims involving discriminatory decisions made by an AI tool in hiring, promotion, and other workplace actions.

State and local governments are proposing and enacting laws that regulate the use of AI hiring tools and scrutinize any discriminatory effects such tools may cause. Illinois has pioneered the AI Video Interview Act, which requires employers to notify job applicants that AI will be used to analyze their video interviews, explain how the technology works, and obtain the applicants' consent. New York City is reviewing a proposed bill that would require sellers of AI hiring tools to undergo an annual "bias audit." While we wait for lawmakers to enact laws that promote AI accountability, advocates will seek action in the courts to tackle discrimination arising from AI hiring tools.
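
One concrete measure a "bias audit" of the kind New York City is considering might examine is whether a tool selects candidates from different groups at substantially different rates. The sketch below is a simplified, hypothetical illustration of that idea; it uses the four-fifths (80%) benchmark from the federal Uniform Guidelines on Employee Selection Procedures as a flagging threshold, all counts are invented, and an actual audit would follow whatever methodology the final law prescribes.

    # Simplified sketch of one calculation a "bias audit" might perform: comparing
    # selection rates across groups and flagging impact ratios below the informal
    # four-fifths (80%) guideline. The groups and counts are hypothetical.

    def selection_rate(selected: int, applicants: int) -> float:
        """Fraction of a group's applicants that the tool selects."""
        return selected / applicants

    def impact_ratio(group_rate: float, reference_rate: float) -> float:
        """Ratio of a group's selection rate to the highest group's rate."""
        return group_rate / reference_rate

    # Hypothetical outcomes from an AI screening tool: (selected, applicants).
    outcomes = {
        "applicants without disclosed disabilities": (120, 400),
        "applicants with disclosed disabilities": (9, 60),
    }

    rates = {group: selection_rate(s, n) for group, (s, n) in outcomes.items()}
    reference = max(rates.values())

    for group, rate in rates.items():
        ratio = impact_ratio(rate, reference)
        flag = "REVIEW" if ratio < 0.8 else "ok"
        print(f"{group}: selection rate {rate:.0%}, impact ratio {ratio:.2f} [{flag}]")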

Juyoun Han is a lawyer at Eisenberg & Baum LLP and leads the firm's Artificial Intelligence Fairness & Data Privacy department. As a litigator, Juyoun has represented Deaf and Hard of Hearing clients in courts across the country, advocating for individuals with disabilities to be treated equally in the workplace, hospitals, law enforcement, and prisons. Patrick Lin is a second-year law student at Brooklyn Law School, where he is vice president of Legal Hackers and a staff member of the Brooklyn Law Review. Before law school, Patrick worked on data management and regulatory compliance as a technology consultant.

Insights, Analysis, and Developments

Editorial Note: The proliferation of AI hiring systems represents a technological inflection point for disability employment rights, where the promise of efficiency collides with the reality of algorithmic bias. While companies rapidly adopt these tools to process thousands of applications, the research presented here reveals a troubling pattern: AI doesn't eliminate human prejudice - it automates and scales it. The fact that disability discrimination already leads all EEOC complaint categories should serve as a warning that allowing opaque algorithms to make hiring decisions without rigorous oversight will deepen existing inequities rather than resolve them. What makes this analysis particularly valuable is its dual focus on legal remedies and technical solutions, acknowledging that meaningful progress requires both stronger enforcement of anti-discrimination laws and fundamental changes to how AI systems are trained. As state and local governments begin crafting regulations, the challenge ahead involves ensuring that accessibility isn't treated as an afterthought to be patched in later, but rather as a core design principle from the start - Disabled World (DW).

Attribution/Source(s): This quality-reviewed publication was selected for publishing by the editors of Disabled World (DW) due to its relevance to the disability community. Originally authored by Juyoun Han and Patrick Lin and published on 2020/11/25, this content may have been edited for style, clarity, or brevity.

Related Publications

New bill seeks to end age and disability discrimination in federal jury service, expanding civic inclusion for seniors and people with disabilities.

Landmark settlement protects San Diego's unhoused living in vehicles, offering ticket forgiveness, safe parking, and disability accommodations.

New York Supreme Court rules in favor of Access-A-Ride users seeking equal fare discounts from MTA, ending discriminatory pricing for paratransit riders.

