Disability Bias in AI Hiring Tools and Legal Protection

Authors: Juyoun Han and Patrick Lin
Published: 2020/11/25 - Updated: 2023/09/25
Publication Type: Informative - Peer-Reviewed: Yes

Synopsis: Artificial Intelligence (AI) bias in job hiring and recruiting is raising concern as a new form of employment discrimination. Despite its convenience, AI can be biased based on race, gender, and disability status, and it can be used in ways that exacerbate systemic employment discrimination. A typical example of AI can be found on LinkedIn, a website that connects job seekers with employers and recruiters.

Main Digest

People with disabilities face significant disadvantages in the workforce. According to the U.S. Equal Employment Opportunity Commission (EEOC), disability-based discrimination was among the most common claims in the employment discrimination charges filed in 2019, appearing in 33.4% of them, closely followed by race- and gender-based claims. Today, a new form of employment discrimination is causing concern: Artificial Intelligence ("AI") bias.

What is Artificial Intelligence?

Artificial intelligence is a branch of computer science that develops computers and machines capable of imitating intelligent human behavior. Everyday examples of AI include "Siri" and "Alexa." AI is also integrated into assistive technologies such as Seeing AI, AVA, Voiceitt, and smart wheelchairs, to name a few.

How is Artificial Intelligence Used in Hiring, and How Does it Impact People with Disabilities?

AI is also widely used in hiring and recruiting. According to Glassdoor, AI hiring tools appear across different industries, from "Allstate to Hilton to Five Guys Burgers." A common example is LinkedIn, a website that connects job seekers with employers and recruiters: for job seekers, LinkedIn's AI suggests jobs they may be interested in based on their profiles and experience, and it suggests connections to potential employers as well.

Other AI hiring tools include text-searching technology that screens high-volume job applications, facial analysis technology that scans applicants' facial expressions and body language during video interviews, and voice scanning technology that evaluates a job applicant's speech, tone, and word choices.
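
As a rough illustration of the first of these tools, the sketch below shows how a crude keyword-based screener might rank applications. It is a hypothetical toy in Python, not any vendor's actual algorithm; the keyword list and resume snippets are invented:

```python
# Hypothetical sketch of a keyword-based resume screener (not a real
# vendor's algorithm). It counts how many required keywords appear in
# each application and ranks candidates by that count alone.
REQUIRED = {"python", "sql", "leadership"}  # invented job keywords

def keyword_score(resume_text: str) -> int:
    """Number of required keywords present in the resume text."""
    words = set(resume_text.lower().split())
    return len(REQUIRED & words)

applications = {
    "candidate_a": "Led a team; built Python and SQL pipelines; leadership award",
    "candidate_b": "Managed analysts and designed database reporting systems",
}

for name, text in sorted(applications.items(),
                         key=lambda kv: keyword_score(kv[1]), reverse=True):
    print(name, keyword_score(text))
```

Even in this toy version, candidate_b scores zero despite describing comparable experience in different words; rigid filters of this kind can disadvantage applicants whose work histories or phrasing do not match a template.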

However, despite its convenience, AI can also be biased based on race, gender, and disability status, and it can be used in ways that exacerbate systemic employment discrimination. For instance, researchers have found that hiring assessments that analyze facial movement and voice may "massively discriminate against many people with disabilities that significantly affect facial expression and voice: disabilities such as deafness, blindness, speech disorders, and surviving a stroke." Online personality tests and web-based neuroscience games used in AI hiring tools may likewise screen out people with mental illnesses.

Generally, AI hiring tools are programmed to identify an employer's preferred traits based on the employer's existing pool of employees.

If disabled people are not represented in the employer's current pool of employees, the AI hiring tool may learn to screen out job candidates with a disability. Essentially, AI would treat "underrepresented traits as undesired traits." As a result, "people with disabilities - like other marginalized groups - risk being excluded," says Alexandra Givens, president and CEO of the Center for Democracy & Technology. To overcome bias, AI hiring tools need to be trained on more diverse data that includes employees with disabilities. Disabled people are currently underrepresented in the workforce, and, unsurprisingly, the technology emulates this imbalance. "If an algorithm's training data lacks diversity, it can entrench existing patterns of exclusion in deeply harmful ways," Givens wrote in an article for Slate.
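
To make the mechanism concrete, here is a minimal sketch in Python of how this feedback loop can arise. It is a toy model under invented assumptions, not any vendor's actual system: past hiring decisions excluded people with a certain trait (say, a resume gap related to disability), and a model trained on those decisions learns to penalize the trait regardless of skill:

```python
# Toy demonstration (invented data, not any real hiring system) of how a
# model trained on biased historical decisions treats an underrepresented
# trait as an undesired trait.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 5_000
skill = rng.normal(size=n)            # job-relevant qualification score
trait = rng.random(n) < 0.10          # e.g., a resume gap related to disability

# Historical labels: qualified applicants were hired, EXCEPT those with
# the trait, whom past human screeners excluded regardless of skill.
hired = (skill > 0) & ~trait

model = LogisticRegression().fit(np.column_stack([skill, trait]), hired)

# Two equally skilled candidates, differing only in the trait:
candidates = np.array([[1.0, 0.0],    # without the trait
                       [1.0, 1.0]])   # with the trait
print(model.predict_proba(candidates)[:, 1])  # second probability is far lower
```

The toy model has never "seen" the trait among successful hires, so it scores the trait down even though the trait says nothing about ability; this is precisely the pattern of entrenched exclusion Givens describes.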

Seeking Solutions through Legal Advocacy

The ADA limits an employer's ability to make disability-related inquiries at the recruiting stage. Employers whose AI hiring tools gather information about an applicant's disability or screen out qualified candidates could face liability under the ADA as well as state and local human rights laws. According to Bloomberg, the U.S. Equal Employment Opportunity Commission is already investigating at least two potential claims and lawsuits involving an AI tool's discriminatory decisions in hiring, promotion, and other workplace matters.

State and local governments are proposing and enacting laws that regulate the use of AI hiring tools and scrutinize any discriminatory effects such tools may cause. Illinois pioneered the Artificial Intelligence Video Interview Act, which requires employers to notify job applicants that AI hiring tools will be used, explain how the tools work, and obtain the applicants' consent. New York City is reviewing a proposed bill that would require sellers of AI hiring tools to undergo an annual "bias audit." While we wait for lawmakers to enact laws that promote AI accountability, advocates will seek action in the courts to tackle discrimination arising from AI hiring tools.
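
One widely used benchmark that such a "bias audit" might apply (an assumption for illustration, since the bill's methodology is not detailed here) is the adverse-impact ratio from the EEOC's "four-fifths" guideline: a group's selection rate divided by the highest group's selection rate, flagged when it falls below 0.8. The numbers and group labels below are invented:

```python
# Illustrative adverse-impact ("four-fifths") calculation with invented
# numbers; a real audit would use an employer's actual selection data.
groups = {
    "disclosed disability": 6 / 100,      # selection rate: selected / applicants
    "no disclosed disability": 30 / 100,
}

best_rate = max(groups.values())
for name, rate in groups.items():
    ratio = rate / best_rate
    flag = "  <- below the 4/5 threshold" if ratio < 0.8 else ""
    print(f"{name}: selection rate {rate:.2f}, impact ratio {ratio:.2f}{flag}")
```

An audit along these lines would at least surface the kind of disparate screening effects described above, though it cannot by itself explain why a tool produces them.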

Juyoun Han is a lawyer at Eisenberg & Baum LLP who leads the firm's Artificial Intelligence Fairness & Data Privacy department. As a litigator, Juyoun has represented Deaf and Hard of Hearing clients in courts across the country, advocating for individuals with disabilities to be treated equally in the workplace, hospitals, law enforcement, and prisons. Patrick Lin is a second-year law student at Brooklyn Law School, where he is vice president of Legal Hackers and a staff member of the Brooklyn Law Review. Before law school, Patrick worked on data management and regulatory compliance as a technology consultant.

Attribution/Source(s):

This peer-reviewed publication titled "Disability Bias in AI Hiring Tools and Legal Protection" was selected for publication by Disabled World's editors due to its relevance to our readers in the disability community. While the content may have been edited for style, clarity, or brevity, it was originally authored by Juyoun Han and Patrick Lin and published 2020/11/25 (Edit Update: 2023/09/25). For further details or clarifications, you can contact Juyoun Han and Patrick Lin directly at EandBLaw.com. Please note that Disabled World does not provide any warranties or endorsements related to this article.

