Disability Bias in AI Hiring Tools and Legal Protection
Authors: Juyoun Han and Patrick Lin | Contact: EandBLaw.com
Synopsis: Artificial intelligence (AI) bias in hiring and recruiting raises concern as a new form of employment discrimination. Despite its convenience, AI can be biased on the basis of race, gender, and disability status, and can be used in ways that exacerbate systemic employment discrimination.
People with disabilities face significant disadvantages in the workforce. According to the U.S. Equal Employment Opportunity Commission (EEOC) (1), of all the employment discrimination cases filed in 2019, the most common claims involved disability-based discrimination (33.4%), closely followed by race and gender-based discrimination. Today, a new form of employment discrimination causes concern: Artificial Intelligence ("AI") bias.
What is Artificial Intelligence?
Artificial intelligence(2) is a branch of computer science that develops computers and machines capable of imitating intelligent human behavior. Familiar examples of AI in daily life include "Siri" and "Alexa." AI is also integrated into assistive technologies such as Seeing AI, AVA, Voiceitt, and smart wheelchairs, to name a few.
How is Artificial Intelligence Used in Hiring, and How Does it Impact People with Disabilities?
AI is also widely used in hiring and recruiting. According to Glassdoor, AI hiring tools(3) are used across industries, from "Allstate to Hilton to Five Guys Burgers." A common example can be found on LinkedIn, a website that connects job seekers with employers and recruiters. For job seekers, LinkedIn's AI(4) suggests jobs they may be interested in based on their profiles and work experience, and suggests connections to potential employers as well. Other examples of AI hiring tools include text-searching technology that screens high volumes of job applications, facial-analysis technology that scans applicants' facial expressions and body language during video interviews, and voice-scanning technology that evaluates a job applicant's speech, tone, and word choices.
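To make the text-searching tools mentioned above concrete, here is a minimal sketch of keyword-based application screening. The keywords and threshold are hypothetical, and real applicant-tracking systems are far more sophisticated; the point is only that a simple filter decides who a human recruiter ever sees.

```python
# Illustrative only: a toy keyword screener, not any vendor's actual product.
REQUIRED_KEYWORDS = {"python", "sql", "project management"}  # hypothetical criteria

def screen_resume(text: str, required=REQUIRED_KEYWORDS, min_hits: int = 2) -> bool:
    """Return True if the resume mentions at least min_hits required keywords."""
    text = text.lower()
    hits = sum(1 for kw in required if kw in text)
    return hits >= min_hits

resume = "Experienced analyst skilled in Python and SQL reporting."
print(screen_resume(resume))  # True: matches "python" and "sql"
```

A filter like this never explains its rejections, which is part of why the notice-and-consent laws discussed later in this article matter.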
However, despite its convenience, AI is also capable of being biased on the basis of race, gender, and disability status, and can be used in ways that exacerbate systemic employment discrimination. For instance, researchers have found that tools assessing facial movement and voice may "massively discriminat[e] against many people with disabilities(5) that significantly affect facial expression and voice: disabilities such as deafness, blindness, speech disorders, and surviving a stroke." Online personality tests and web-based neuroscience games used in AI hiring tools may likewise screen out people with mental illnesses.(6)
Generally, AI hiring tools are programmed to identify an employer's preferred traits based on the employer's existing pool of employees. That means that if people with disabilities are not represented in the employer's current pool of employees, the AI hiring tool may learn to screen out job candidates with disabilities. Essentially, AI would treat "underrepresented traits as undesired traits." As a result, "people with disabilities - like other marginalized groups - risk being excluded," says Alexandra Givens(7), president and CEO of the Center for Democracy & Technology. To overcome bias, AI hiring tools need to be trained on more diverse data that includes employees with disabilities. Currently, people with disabilities are underrepresented in the workforce, and unsurprisingly, technology emulates this phenomenon. "If an algorithm's training data lacks diversity, it can entrench existing patterns of exclusion in deeply harmful ways," Givens wrote in an article for Slate.(8)
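The "underrepresented traits as undesired traits" mechanism can be sketched in a few lines. This toy model (not any vendor's actual algorithm; all features and numbers are hypothetical) scores candidates by similarity to the current workforce. Because no current employee has an employment gap, an equally qualified candidate with a gap, such as one taken for medical leave, automatically scores lower.

```python
# Toy illustration of similarity-to-workforce scoring and its bias.
def average_profile(employees):
    """Average each numeric feature across the current employee pool."""
    keys = employees[0].keys()
    return {k: sum(e[k] for e in employees) / len(employees) for k in keys}

def score(candidate, profile):
    """Negative squared distance: higher means 'more like current employees'."""
    return -sum((candidate[k] - profile[k]) ** 2 for k in profile)

# Hypothetical training pool: nobody has an employment gap, so the model
# implicitly learns that gaps are "undesirable."
employees = [{"years_exp": 5, "employment_gap": 0},
             {"years_exp": 6, "employment_gap": 0},
             {"years_exp": 4, "employment_gap": 0}]
profile = average_profile(employees)

typical = {"years_exp": 5, "employment_gap": 0}
equally_qualified = {"years_exp": 5, "employment_gap": 1}  # gap from medical leave
print(score(typical, profile) > score(equally_qualified, profile))  # True
```

Nothing in the code mentions disability, yet the outcome is discriminatory, which is why auditing outcomes rather than source code is the focus of the legal remedies discussed below.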
Seeking Solutions through Legal Advocacy
The Americans with Disabilities Act (ADA) limits an employer's ability to make disability-related inquiries at the recruiting stage. AI hiring tools that give employers information about an applicant's disability and screen out qualified candidates could face liability under the ADA as well as state and local human rights laws. According to Bloomberg, the EEOC is already investigating at least two potential claims(9) and lawsuits involving an AI tool's discriminatory decisions in hiring, promotion, and other workplace decisions.
State and local governments are proposing and enacting laws that regulate the use of AI hiring tools and scrutinize any discriminatory effects such tools may cause. Illinois pioneered the Artificial Intelligence Video Interview Act, which requires employers to notify job applicants of, explain, and obtain their consent to the use of AI hiring tools. New York City is reviewing a proposed bill(10) that would require sellers of AI hiring tools to undergo an annual "bias audit." While we wait for lawmakers to enact laws promoting AI accountability, advocates will seek action in the courts to tackle discrimination arising from AI hiring tools.(11)
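One metric a "bias audit" of the kind contemplated above might compute is the impact ratio: each group's selection rate divided by the highest group's selection rate. Under the EEOC's long-standing four-fifths guideline, a ratio below 0.8 can indicate adverse impact. The numbers below are hypothetical, and real audits involve far more than this one statistic.

```python
# Sketch of an impact-ratio calculation (hypothetical data, illustrative only).
def impact_ratios(groups):
    """groups: {name: (selected, applicants)} -> {name: impact ratio}."""
    rates = {g: selected / applicants for g, (selected, applicants) in groups.items()}
    top = max(rates.values())  # selection rate of the most-selected group
    return {g: rate / top for g, rate in rates.items()}

audit = impact_ratios({
    "non-disabled applicants": (50, 100),  # 50% selected
    "disabled applicants": (10, 40),       # 25% selected
})
print(audit["disabled applicants"])  # 0.5, well below the 0.8 guideline
```

A result like 0.5 would not by itself prove discrimination, but it is the kind of red flag an annual audit is designed to surface.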
Juyoun Han is a Lawyer at Eisenberg & Baum LLP who leads the firm's Artificial Intelligence Fairness & Data Privacy department. As a litigator, Juyoun has represented Deaf and Hard of Hearing clients in courts across the country, advocating for individuals with disabilities to be treated equally at the workplace, hospitals, law enforcement, and in prisons. Patrick Lin is a second-year law student at Brooklyn Law School, where he is vice president of Legal Hackers and a staff member of the Brooklyn Law Review. Prior to law school, Patrick worked on data management and regulatory compliance as a technology consultant.
- 1 - https://www.eeoc.gov/newsroom/eeoc-releases-fiscal-year-2019-enforcement-and-litigation-data
- 2 - https://www.merriam-webster.com/dictionary/artificial%20intelligence
- 3 - https://www.washingtonpost.com/technology/2019/10/22/ai-hiring-face-scanning-algorithm-increasingly-decides-whether-you-deserve-job/
- 4 - https://engineering.linkedin.com/blog/2018/10/an-introduction-to-ai-at-linkedin
- 5 - https://benetech.org/wp-content/uploads/2018/11/Tech-and-Disability-Employment-Report-November-2018.pdf
- 6 - https://www.abajournal.com/news/article/do_job_personality_tests_discriminate_eeoc_probes_lawyers_complaint_filed_o
- 7 - https://cdt.org/staff/alexandra-reeve-givens/
- 8 - https://slate.com/technology/2020/02/algorithmic-bias-people-with-disabilities.html
- 9 - https://news.bloomberglaw.com/daily-labor-report/punching-in-workplace-bias-police-look-at-hiring-algorithms
- 10 - https://legistar.council.nyc.gov/LegislationDetail.aspx?ID=4344524&GUID=B051915D-A9AC-451E-81F8-6596032FA3F9
- 11 - https://www.eandblaw.com/services/a-i-fairness-and-data-privacy/
Primary Information Source(s):
Disability Bias in AI Hiring Tools and Legal Protection | Juyoun Han and Patrick Lin (EandBLaw.com). Disabled World makes no warranties or representations in connection therewith. Content may have been edited for style, clarity or length.
Cite This Page (APA): Juyoun Han and Patrick Lin. (2020, November 25). Disability Bias in AI Hiring Tools and Legal Protection. Disabled World. Retrieved September 26, 2021 from www.disabled-world.com/disability/legal/ai-hiring.php