Job Screening Service Halts Facial Analysis of Applicants
Job hunters may now need to impress not just prospective bosses but artificial intelligence algorithms too, as employers screen candidates by having them answer interview questions on video that a machine then assesses.
HireVue, a leading provider of software for vetting job candidates based on an algorithmic assessment, said Tuesday it is killing off a controversial feature of its software: analyzing a person’s facial expressions in a video to discern certain characteristics.
Job seekers screened by HireVue sit in front of a webcam and answer questions. Their behavior, intonation, and speech are fed to an algorithm that assigns certain traits and qualities.
HireVue says that an “algorithmic audit” of its software conducted last year shows it does not harbor bias. But the nonprofit Electronic Privacy Information Center (EPIC) had filed a complaint against the company with the Federal Trade Commission in 2019.
HireVue CEO Kevin Parker acknowledges that public outcry over the use of software to analyze facial expressions in video was part of the calculation. “It was adding some value for customers, but it wasn’t worth the concern,” he says.
The algorithmic audit was performed by an outside firm, O’Neil Risk Consulting and Algorithmic Auditing, which did not respond to requests for comment.
Alex Engler, a fellow at the Brookings Institution who has studied AI hiring, says the idea of using AI to determine someone’s ability, whether it is based on video, audio, or text, is far-fetched. He says it is also problematic that the public cannot vet such claims.
“There are parts that machine learning can probably help with, but fully automated interviews, where you’re making inferences about job performance—that’s terrible,” he says. “Modern artificial intelligence can’t make those inferences.”
HireVue says that about 100 companies, including GE, Unilever, Delta, and Hilton, use its technology. The software requires job applicants to respond to a series of questions in a recorded video. The company’s software then analyzes various characteristics including the language they use, their speech, and, until now, their facial expressions. It then provides an assessment of the applicant’s suitability for a job, as well as a measure of traits including “dependability,” “emotional intelligence,” and “cognitive ability.”
Parker says the company helped screen more than 6 million videos last year, although sometimes this involved simply transcribing answers for an interviewer rather than performing an automated assessment of candidates. He adds that some clients let candidates opt out of automated screening. And he says HireVue has developed ways to avoid penalizing candidates with spotty internet connections, automatically referring those candidates to a human.
AI experts warn that algorithms trained on data from previous job applicants may perpetuate existing biases in hiring. Lindsey Zuloaga, HireVue’s chief data scientist, says the company screens for bias with respect to gender, race, and age by collecting that demographic information alongside its training data and checking the algorithm’s results across those groups for disparities.
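To make that kind of check concrete, here is a minimal sketch, assuming hypothetical candidate records labeled with a demographic group and a pass/fail screening outcome. It compares selection rates across groups and computes an adverse-impact ratio in the spirit of the EEOC’s “four-fifths rule,” a common yardstick in hiring audits. This is not HireVue’s actual method, and every name and data point below is invented.

```python
"""Sketch of a simple group-disparity check for screening outcomes.

Illustrative only: compares per-group selection rates and flags any
group whose rate falls below 80 percent of the highest group's rate
(the "four-fifths rule"). Not HireVue's method; data is hypothetical.
"""
from collections import defaultdict

# Hypothetical records: (demographic group, passed_screen) pairs.
candidates = [
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False),
]

def selection_rates(records):
    """Return the fraction of candidates in each group who passed."""
    passed = defaultdict(int)
    total = defaultdict(int)
    for group, ok in records:
        total[group] += 1
        passed[group] += int(ok)
    return {g: passed[g] / total[g] for g in total}

rates = selection_rates(candidates)
best = max(rates.values())
for group, rate in rates.items():
    ratio = rate / best  # adverse-impact ratio vs. best-performing group
    flag = "potential adverse impact" if ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.2f}, impact ratio {ratio:.2f} ({flag})")
```

A real audit would go well beyond a rate comparison this simple, testing for statistical significance and for disparities in intersectional subgroups, among other things.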
But she acknowledges that it may be more difficult to know if the system is biased on factors such as income or education level, or if it could be affected by something like a stutter.
“I am surprised they are dropping this, as it was a keystone feature of the product they were marketing,” says John Davisson, senior counsel at EPIC. “That is the source of a lot of concerns around biometric data collection, as well as these bold claims about being able to measure psychological traits, emotional intelligence, social attitudes, and things like that.”
The use of facial analysis to determine emotion or personality traits is controversial; some experts warn that the underlying science is flawed.
Lisa Feldman Barrett, a professor at Northeastern University who studies emotion, says a person’s face does not on its own reveal emotion or character. “Just by looking at someone smiling, you can’t really tell anything about them except maybe that they have nice teeth,” she says. “It is a bad idea to make psychological inferences, and therefore determine people’s outcomes, based on facial data alone.”