Artificial Intelligence and the Americans with Disabilities Act

In recent years, the use of artificial intelligence ("AI") software in the recruitment and hiring of employees has grown rapidly. AI software has gained traction as a valuable tool for streamlining the hiring process, and employers have embraced it to assist with a range of employment-related decisions, even as regulators have been slower to accept the practice. With AI's increased prevalence, federal agencies such as the Equal Employment Opportunity Commission ("EEOC") have stepped in to help employers navigate the use of AI while complying with Title I of the Americans with Disabilities Act ("ADA"). Although AI software can be a helpful tool in making crucial employment decisions, employers need to understand its inherent risks and how to comply with the ADA while avoiding potential liability.

Title I of the ADA, a federal civil rights law, prohibits covered entities, most notably employers, from discriminating on the basis of disability. While most employers use AI software developed by a third party, they assume liability for any discrimination caused by the software and are therefore responsible for ensuring that its use does not discriminate against applicants with disabilities. AI software may violate the ADA in several ways, including when:

  1. The employer does not provide a "reasonable accommodation" needed for an applicant to be rated fairly and accurately by the algorithm.
  2. The algorithm "screens out" an applicant whose disability lowers their performance on a selection criterion or prevents them from meeting it.
  3. An assessment includes disability-related inquiries before a conditional offer of employment.

Ensuring a “Reasonable Accommodation”

When utilizing AI software in the hiring process, employers must ensure that applicants with disabilities have a way to meet the standards set by any AI-driven testing or evaluation process. Employers are not required to lower production or quality standards for a specific position or to eliminate an essential job function, but they must provide reasonable accommodations that give applicants with disabilities a fair opportunity to meet qualification standards. Employers can do so by supplying testing materials in an alternative format or by offering an alternative method of evaluation.

Avoiding “Screening Out”

Before implementing AI tools, and periodically thereafter, employers should review them for possible biases. While many developers advertise their AI software as "bias-free," employers should confirm that this covers not only race, sex, and religious bias but also any form of disability discrimination. In particular, employers should verify that the software does not measure personality, cognitive, or neurocognitive traits in a way that may screen out people with certain cognitive, intellectual, or mental health-related disabilities. One way to do this is to conduct an independent bias audit of any AI tools in use, testing for disparate treatment of individuals based on protected characteristics. Although AI tools are not yet subject to specific federal regulations, it is prudent for employers to begin conducting bias audits, as individual states and localities have already started regulating employer use of AI tools, including a recently passed New York City law (effective January 1, 2023) that mandates bias audits for all automated employment decision tools.

Preventing Disability-Related Inquiries

While employers are permitted, and sometimes required, to make health-related inquiries, they must avoid using AI software to do so before extending a conditional offer of employment. Such pre-offer inquiries may violate the ADA even when the individual does not have a disability. Questions that would violate the ADA include those that directly or indirectly elicit information about a disability or seek information about an applicant's physical or mental health. To ensure compliance, employers should avoid any inquiries that could be perceived as health-related or that would qualify as a medical examination until after a conditional offer of employment has been made.

Conclusion

Used correctly, AI software is an exciting and valuable tool for employers looking to increase hiring consistency and efficiency while methodically narrowing down large applicant pools. When taking advantage of this new technology, however, employers must ensure that the AI software does not violate the ADA. They can do so by providing reasonable accommodations so that applicants with disabilities are rated fairly, preventing intentional or unintentional screening out, and avoiding disability-related inquiries before a conditional offer of employment.

DarrowEverett LLP remains dedicated to supporting our clients' use of emerging technologies and will continue to monitor this evolving topic to help our clients navigate relevant hurdles and comply with applicable laws.

_____________________________________

This alert should not be construed as legal advice or a legal opinion on any specific facts or circumstances. This alert is not intended to create, and receipt of it does not constitute a lawyer-client relationship. The contents are intended for general informational purposes only, and you are urged to consult your attorney concerning any particular situation and any specific legal question you may have. We are working diligently to remain well informed and up to date on information and advisements as they become available. As such, please reach out to us if you need help addressing any of the issues discussed in this alert, or any other issues or concerns you may have relating to your business. We are ready to help guide you through these challenging times.

Unless expressly provided, this alert does not constitute written tax advice as described in 31 C.F.R. §10, et seq. and is not intended or written by us to be used and/or relied on as written tax advice for any purpose including, without limitation, the marketing of any transaction addressed herein. Any U.S. federal tax advice rendered by DarrowEverett LLP shall be conspicuously labeled as such, shall include a discussion of all relevant facts and circumstances, as well as of any representations, statements, findings, or agreements (including projections, financial forecasts, or appraisals) upon which we rely, applicable to transactions discussed therein in compliance with 31 C.F.R. §10.37, shall relate the applicable law and authorities to the facts, and shall set forth any applicable limits on the use of such advice.