Disability Bias in AI Hiring Tools and Legal Protection

Authors: Juyoun Han and Patrick Lin
Published: 2020/11/25 - Updated: 2023/09/25
Publication Type: Informative
Peer-Reviewed: Yes
Topic: Lawyers and Rights

Synopsis: Artificial Intelligence (AI) bias in job hiring and recruiting is raising concern as a new form of employment discrimination. Despite its convenience, AI can be biased based on race, gender, and disability status and can be used in ways that exacerbate systemic employment discrimination.

Introduction

People with disabilities face significant disadvantages in the workforce. According to the U.S. Equal Employment Opportunity Commission (EEOC), of all the employment discrimination charges filed in 2019, the most common involved disability-based discrimination (33.4%), closely followed by discrimination based on race and gender. Today, a new form of employment discrimination causes concern: Artificial Intelligence ("AI") bias.

What is Artificial Intelligence?

Artificial intelligence is a branch of computer science that develops computers and machines to imitate intelligent human behavior. Familiar examples of AI in daily life include "Siri" and "Alexa." AI is also integrated into assistive technologies such as Seeing AI, AVA, Voiceitt, and smart wheelchairs, to name a few.

How is Artificial Intelligence Used in Hiring, and How Does it Impact People with Disabilities?

AI is also widely used in hiring and recruiting. According to Glassdoor, AI hiring tools are used across many industries, from "Allstate to Hilton to Five Guys Burgers." A common example can be found on LinkedIn, a website that connects job seekers with employers and recruiters: for job seekers, LinkedIn's AI suggests jobs they may be interested in based on their profile and experience, and it also suggests connections to potential employers.

Other AI hiring tools include text-searching technology that screens high-volume job applications, facial analysis technology that scans applicants' facial expressions and body language during video interviews, and voice scanning technology that evaluates a job applicant's speech, tone, and word choices.
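
To make the text-screening approach concrete, here is a minimal, hypothetical sketch of a keyword-based resume screener of the kind described above. The keywords, threshold, and sample resumes are invented for illustration and do not reflect any vendor's actual product:

    # Hypothetical keyword-based resume screener (illustrative sketch only;
    # real vendor tools are far more sophisticated). The keywords, the
    # threshold, and the sample resumes below are all invented.

    REQUIRED_KEYWORDS = {"python", "sql", "project management"}

    def screen_resume(resume_text, min_matches=2):
        """Return True if the resume mentions enough required keywords."""
        text = resume_text.lower()
        matches = sum(1 for keyword in REQUIRED_KEYWORDS if keyword in text)
        return matches >= min_matches

    resumes = [
        "Analyst with Python, SQL, and project management experience.",
        "Self-taught developer with strong Python skills.",
    ]
    for resume in resumes:
        print(screen_resume(resume), "-", resume)

The first resume passes and the second is rejected, even though the second candidate may be equally qualified: the screener sees only keywords, not ability, which is why such tools can mechanically exclude applicants whose resumes or assessments deviate from the expected pattern.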

However, despite its convenience, AI can also be biased based on race, gender, and disability status and can be used in ways that exacerbate systemic employment discrimination. For instance, researchers have found that tools assessing facial movement and voice may "massively discriminate against many people with disabilities that significantly affect facial expression and voice: disabilities such as deafness, blindness, speech disorders, and surviving a stroke." Online personality tests and web-based neuroscience games used in AI hiring may also screen out people with mental illnesses.

Generally, AI hiring tools are programmed to identify an employer's preferred traits based on the employer's existing pool of employees.

If disabled people are not represented in the employer's current pool of employees, the AI hiring tool may learn to screen out job candidates with a disability. Essentially, AI would treat "underrepresented traits as undesired traits." As a result, "people with disabilities - like other marginalized groups - risk being excluded," says Alexandra Givens, president and CEO of the Center for Democracy & Technology. To overcome bias, AI hiring tools need to be trained on more diverse data, including data that reflects employees with disabilities. Disabled people are currently underrepresented in the workforce, and, unsurprisingly, the technology emulates this pattern. "If an algorithm's training data lacks diversity, it can entrench existing patterns of exclusion in deeply harmful ways," Givens wrote in an article for Slate.
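
As a minimal, hypothetical sketch of this failure mode (using invented data and a simple logistic regression, not any real vendor's model), consider a classifier trained on an employer's historical hiring decisions in which a proxy trait for disability appears almost exclusively among rejected candidates:

    # Hypothetical sketch of how unrepresentative training data teaches a
    # model to penalize an underrepresented trait. The data, features, and
    # model choice are all invented for illustration.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Each row: [years_of_experience, proxy_trait], where proxy_trait = 1
    # might indicate, e.g., use of assistive technology during assessment.
    X = np.array([
        [5, 0], [6, 0], [4, 0], [7, 0],  # historically hired (trait absent)
        [2, 0], [1, 0],                  # not hired (trait absent)
        [6, 1], [5, 1],                  # qualified but not hired (trait present)
    ])
    y = np.array([1, 1, 1, 1, 0, 0, 0, 0])  # 1 = hired in the training data

    model = LogisticRegression().fit(X, y)

    # Two candidates with identical experience, differing only in the trait:
    print(model.predict_proba([[6, 0]])[0, 1])  # high predicted "hire" score
    print(model.predict_proba([[6, 1]])[0, 1])  # sharply lower score

Because the trait never co-occurs with a hire in the training data, the model learns to treat it as undesired, exactly the dynamic Givens describes; retraining on data that includes successful disabled employees would weaken that learned penalty.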

Seeking Solutions through Legal Advocacy

The Americans with Disabilities Act (ADA) limits an employer's ability to make disability-related inquiries at the recruiting stage. AI hiring tools that enable employers to gain information about an applicant's disability and to screen out qualified candidates could expose employers to liability under the ADA and under state and local human rights laws. According to Bloomberg, the EEOC is already investigating at least two claims involving an AI tool's discriminatory decisions in hiring, promotion, and other workplace decisions.

State and local governments are proposing and enacting laws that regulate the use of AI hiring tools and scrutinize any discriminatory effects those tools may cause. Illinois pioneered the Artificial Intelligence Video Interview Act, which requires employers to notify applicants that AI will be used to evaluate their video interviews, explain how the AI works, and obtain the applicants' consent. New York City is reviewing a proposed bill that would require sellers of AI hiring tools to undergo an annual "bias audit." While we wait for lawmakers to enact laws that promote AI accountability, advocates will seek action in the courts to tackle discrimination arising from AI hiring tools.

Juyoun Han is a lawyer at Eisenberg & Baum LLP who leads the firm's Artificial Intelligence Fairness & Data Privacy department. As a litigator, Juyoun has represented Deaf and Hard of Hearing clients in courts across the country, advocating for individuals with disabilities to be treated equally in the workplace, hospitals, law enforcement, and prisons. Patrick Lin is a second-year law student at Brooklyn Law School, where he is vice president of Legal Hackers and a staff member of the Brooklyn Law Review. Before law school, Patrick worked on data management and regulatory compliance as a technology consultant.

Attribution/Source(s): This peer-reviewed publication was selected for publishing by the editors of Disabled World (DW) due to its relevance to the disability community. Originally authored by Juyoun Han and Patrick Lin and published on 2020/11/25, this content may have been edited for style, clarity, or brevity. For further details or clarifications, Juyoun Han and Patrick Lin can be contacted at EandBLaw.com. Note: Disabled World does not provide any warranties or endorsements related to this article.
