Software-based Screening Tools May Violate ADA
Use with Caution
By Amelia J. Holstrom, Esq. and Trevor Brice, Esq.
Over the past several years, employers have turned to software-based recruitment and employment screening tools to evaluate applicants and employees. The software, which relies on artificial intelligence and algorithms to make decisions, often helps employers assess more applicants in a shorter period of time, select individuals for interviews, or evaluate current employees for raises or advancement.
But could the use of this software be creating legal liability for your business? Maybe.
In May, the Equal Employment Opportunity Commission (EEOC), the federal agency that enforces federal employment anti-discrimination laws, issued guidance to employers titled “The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees.” The guidance addresses three main ways in which software-based screening tools may violate the Americans with Disabilities Act (ADA) if employers are not careful.
First, the EEOC guidance reminds employers that a software-based screening tool may violate the ADA if it does not include a process for individuals to request accommodations that may be necessary for an individual with a medical condition to use the software or to be rated fairly and accurately by it. Under the ADA, employers are required to provide reasonable accommodations to applicants and employees. For example, it may be a reasonable accommodation to allow a visually impaired applicant or employee to be evaluated through a non-computer-based screening tool.
Second, the EEOC warns employers that, without proper safeguards, a software-based screening tool may unintentionally (or intentionally) screen out individuals with disabilities. The EEOC specifically referenced “chatbot” screening tools, which are designed to engage applicants in online conversations through texts and emails. A chatbot might be programmed with an algorithm that rejects all applicants who mention during the conversation that they have a gap in their employment history. If that gap is due to a medical condition, the chatbot may unlawfully screen out the applicant because of their disability, even though the individual would be capable of performing, with or without an accommodation, the essential functions of the position for which they were applying.
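To make that risk concrete, the short Python sketch below illustrates the kind of blanket rule the EEOC describes. It is purely hypothetical: the applicant data, function names, and gap-detection rule are illustrative assumptions, not taken from the EEOC guidance or from any actual vendor's product.

from dataclasses import dataclass

@dataclass
class Applicant:
    name: str
    chat_responses: list[str]  # free-text answers collected by the chatbot

def naive_screen(applicant: Applicant) -> bool:
    """Return True if the applicant passes this (deliberately flawed) automated screen."""
    for response in applicant.chat_responses:
        # Rejects anyone who mentions an employment gap, regardless of the reason --
        # including gaps caused by a medical condition, which is the ADA risk at issue.
        if "gap in my employment" in response.lower():
            return False
    return True

# Example: a qualified applicant whose gap was medical is still screened out.
candidate = Applicant(
    name="Hypothetical Applicant",
    chat_responses=["I had a gap in my employment while recovering from surgery."],
)
print(naive_screen(candidate))  # prints: False

Because the rule never considers the reason for the gap, it rejects applicants whose gaps stem from medical conditions just as readily as anyone else, which is exactly the screening-out problem the guidance warns about.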
Finally, the EEOC guidance reminds employers that if a software-based screening tool asks questions that require applicants or employees to disclose medical conditions or other disability-related information, it may constitute an unlawful disability-related inquiry that violates the ADA.
The guidance also cautions employers that they can be liable for discrimination caused by software-based screening tools, even if the employer did not create the tool. In other words, utilizing software developed by an outside vendor does not insulate an employer from liability.
Although the EEOC highlighted several issues that might make the use of software-based screening tools problematic under the ADA, it also provided employers with guidance on steps they can take to help mitigate their risk, including, but not limited to: making it clear how an individual may request an accommodation related to the screening tool or the use of the software; promptly and appropriately responding to all requests for such accommodations; thoroughly questioning the methodology used by the software the business uses, including asking the software provider whether it was developed with individuals with disabilities in mind and what the provider did to make the interface accessible to individuals with disabilities; and asking the software provider whether it attempted to determine if any algorithm used by the software disadvantages individuals with disabilities.
Employers should not expect the concerns raised by the EEOC over the use of software-based screening tools to stop at the ADA. Just weeks before issuing this guidance, the EEOC filed a lawsuit against iTutorGroup Inc., Shanghai Ping’An Intelligent Education Technology Co. Ltd., and Tutor Group Ltd., alleging that the companies’ online recruitment software was programmed to automatically reject female applicants over age 55 and male applicants over age 60, in violation of the Age Discrimination in Employment Act.
Given the growing use of software-based screening tools, it is imperative that employers thoroughly evaluate their own software and their vendor-provided software for any possible discriminatory bias and seek legal advice with regard to their evaluation whenever appropriate.
Amelia Holstrom is a partner with the Springfield-based law firm Skoler Abbott, and Trevor Brice is an associate with Skoler Abbott; (413) 737-4753.