Even in a job market where employers may turn to artificial intelligence to sift through hundreds of resumes to find the right candidate, human decision-makers remain key to avoiding potential disability discrimination by tools like automated interviewing software, experts say.
Management-side lawyers and AI experts told Law360 that they anticipate not only more state-level regulation of AI in employment in the coming months but also increased action from the plaintiffs bar, including class action lawsuits alleging disparate impact discrimination. Employers must carefully vet the AI tools they use in the workplace by questioning vendors, as well as training and consulting their own employees on any automated processes they use, experts said.
In one example of the potential dangers of such tools, the ACLU last month lodged complaints with the Colorado Civil Rights Division and the U.S. Equal Employment Opportunity Commission alleging that tax preparation company Intuit used discriminatory AI video interviewing software developed by HireVue, which was also named in the complaints, to screen candidates for jobs.
The ACLU said the software discriminated against a deaf job applicant who speaks Native American English with a deaf accent. Her accommodation requests for human-generated closed captioning during an automated remote interview went nowhere, and the system didn't transcribe her speech well, the organization said.
Duane Morris partner Alex Karasik said that while aggressive regulation or litigation from the federal level seems unlikely in the near future, employers still need to be proactive about potential AI bias because states and the plaintiffs bar are homing in.
"Even though AI-related technologies are streamlining employment processes exponentially by the day, there still is a required human element," said Karasik. "Because a human needs to be able to understand when these unique one-off situations may come up, where an applicant or employee needs an accommodation. And a human needs to have the agility to adapt and apply that accommodation request appropriately and lawfully."
Here are three ways employers should keep humans involved while considering and deploying AI-powered interviewing and hiring software.
Understanding the Technology
Employers purchasing artificial intelligence tools to use during the hiring or promotion process should do their due diligence to ensure they know how the tools work. That will likely involve conversations with the vendor or developer of the interview tool, experts said.
Before they buy, employers should be sure the vendor can answer questions about how the tool works and what data it was trained on.
"If employers aren't understanding what the technology is doing or how the technology works, there could be a significant risk of AI-related litigation," said Karasik.
"Are [the tools] doing cognitive tests? Are they reading facial recognition? Are they using these AI-related technologies to conduct personality evaluations of applicants? Understanding exactly what the technology is doing is paramount," he said.
Some of the questions employers should ask vendors include whether they have conducted bias audits of the tools, and what the audit results showed. Karasik also recommended asking vendors how their tools can accommodate applicants with disabilities, including specific questions about how the tool might account for something like a speech impairment.
If the tool can't handle that, the employer needs to know it will be responsible for providing a reasonable accommodation to the applicant, he said.
Many of these tools use predictive AI, a type of AI that identifies patterns in past data and uses them to make predictions. Employers should therefore ask what formulas and data a tool uses to make those predictions, Karasik said. If it was trained mostly on speakers with one kind of accent, for example, it may not accurately transcribe the speech of someone with a deaf accent.
Developers may be reluctant to turn over the exact algorithms they've built into the tool, but the more the employer can find out about how the tool works, the better, he added.
"I think that training and testing go hand in hand," Karasik said. "The vendors need to be training the tools with the different potential variations or outcomes, such as somebody with a speech impediment, someone with an accent, someone that may not speak the same way as the initial person that was trained on the tool."
Vance Knapp, a partner at Fisher Phillips, added that employers can further shield themselves from liability through their written agreements with the vendors. The agreements should include written acknowledgments that the tool is accessible to people with disabilities, or can be used with a reasonable accommodation, he said.
"And you want the vendor to commit to periodic internal audits to make sure that their tools remain in compliance with federal and state antidiscrimination laws," he added.
These kinds of tools can't be set-it-and-forget-it, Knapp warned, for a number of reasons, among them the constantly changing legal landscape and evolving claims by the plaintiffs bar.
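For the periodic audits Knapp describes, one widely used screen is the EEOC's "four-fifths" rule of thumb: if one group's selection rate falls below 80% of the highest group's rate, the tool's outcomes deserve a closer look. The sketch below applies that check to made-up screening counts; the group names and numbers are purely illustrative.

```python
# Minimal sketch of a disparate-impact screen using the EEOC "four-fifths"
# rule of thumb. The group names and counts below are invented for
# illustration; a real audit would use actual screening outcomes.

screened = {"group_a": 100, "group_b": 50}  # candidates the tool evaluated
advanced = {"group_a": 48, "group_b": 12}   # candidates it passed through

rates = {g: advanced[g] / screened[g] for g in screened}
highest = max(rates.values())

for group, rate in rates.items():
    impact_ratio = rate / highest
    status = "FLAG: below four-fifths" if impact_ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.2f}, "
          f"impact ratio {impact_ratio:.2f} ({status})")
```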
Making Sure the Process Is Accessible to All
Knapp said an AI screening tool can help employers sort through what might be thousands of resumes for a few positions, even conducting initial interviews using automated voice-recognition technology.
But experts agree the law is clear: "There should always be a way to request an accommodation under the ADA," said Hilke Schellmann, a New York University investigative journalism professor and author of the book "The Algorithm: How AI Decides Who Gets Hired, Monitored, Promoted, and Fired and Why We Need to Fight Back Now."
"Some of these automated tools are just not accessible, or not sufficiently accessible," she said. "There are many ways people have disabilities, and I think [the tools] aren't always calibrated" for that, she said.
Those disabilities could include a wide range of impairments, including deafness or being hard of hearing, a stutter or other speech impediment, or impaired motor skills, Knapp said.
And the way to request an accommodation must be clear to users, Knapp said. The ACLU alleged in its complaints against Intuit, for instance, that the company had provided only a phone number applicants could call for technical assistance, and no contact information for requesting accommodations. If the allegations are true, that's unacceptable, experts said.
Though nothing at the federal level requires employers to disclose that they're using artificial intelligence to evaluate job applicants, jurisdictions like New York City and Illinois do have such requirements, and disclosure is generally considered a best practice, said Karasik.
"Creating platforms for reasonable accommodations, both in the interview process, the promotion process and really in any employment decision-making process, is essential for businesses to comply with antidiscrimination laws," he said. "Businesses should be transparent about the technology they're using, as well as accommodations for individuals who may not be able to adequately and fairly utilize that technology."
Reviewing Accommodation Requests
Of course, there's no way for either a human or an AI tool to anticipate the exact disability any applicant might have, Fisher Phillips' Knapp pointed out. That's why he advises employers to get a human involved every time an applicant requests an accommodation.
"Eventually, I think they're going to come up with an AI tool that will be completely interactive, that will be able to take into account visual, auditory, speech and impediments." But we're not there yet, Knapp added.
"Because we can't necessarily know every sort of potential disability that could be out there, or some other language impediment, barrier, or things like that, at this stage you have to be able to give the person an option to request a human reviewer," he said.
In an ideal world, the humans who are made aware that an applicant has requested a disability accommodation would not ultimately be involved in the hiring decision, said Knapp, though he acknowledged that's often not possible.
Where it gets tricky is when an applicant must inform the employer that they are deaf in order to get closed captioning or a sign language interpreter during the interview, Knapp said. He advises clients to disclose that information to decision-makers on a "need-to-know" basis.
Karasik, meanwhile, suggested that before rolling out any new tool, employers convene internal "AI committees" that include employees from human resources, tech and the legal department, and possibly rank-and-file workers, to help prepare for a broad range of potential situations.
"For instance, if you have individuals who can provide insight on what their experience was like interviewing with these tools, someone that maybe has a disability or somebody that has a different perspective," he said. "I think it's good to get as many different perspectives in the room as possible on those committees."
--Additional reporting by Grace Elletson. Editing by Bruce Goldman and Nick Petruncio.