New York City Proposes Regulating Algorithms Used in Hiring
In 1964, the Civil Rights Act barred the humans who made hiring decisions from discriminating on the basis of sex or race. Now, software often contributes to those hiring decisions, helping managers screen résumés or interpret video interviews.
That worries some tech experts and civil rights groups, who cite evidence that algorithms can replicate or magnify biases shown by people. In 2018, Reuters reported that Amazon scrapped a tool that filtered résumés based on past hiring patterns because it discriminated against women.
Legislation proposed in the New York City Council seeks to update hiring discrimination rules for the age of algorithms. The bill would require companies to disclose to candidates when they have been assessed with the help of software. Companies that sell such tools would have to perform annual audits to check that their people-sorting tech doesn’t discriminate.
The proposal is part of a recent movement at all levels of government to place legal constraints on algorithms and software that shape life-changing decisions—one that may shift into a new gear when Democrats take control of the White House and both houses of Congress.
More than a dozen US cities have banned government use of face recognition, and New York state recently passed a two-year moratorium on the technology’s use in schools. Some federal lawmakers have proposed legislation to regulate face algorithms and automated decision tools used by corporations, including for hiring. In December, 10 senators asked the Equal Employment Opportunity Commission to police bias in AI hiring tools, saying they feared the technology could deepen racial disparities in employment and hurt economic recovery from Covid-19 in marginalized communities. Also last year, a new law took effect in Illinois requiring consent before using video analysis on job candidates; a similar Maryland law restricts use of face analysis technology in hiring.
Lawmakers are more practiced in talking about regulating new algorithms and AI tools than implementing such rules. Months after San Francisco banned face recognition in 2019, it had to amend the ordinance because it inadvertently made city-owned iPhones illegal.
The New York City proposal, introduced by Democratic council member Laurie Cumbo, would require companies that use what are termed automated employment-decision tools—software that helps screen candidates or set terms such as compensation—to disclose their use of the technology. Vendors of such software would be required to conduct a “bias audit” of their products each year and make the results available to customers.
The proposal faces resistance from some unusual allies, as well as unresolved questions about how it would operate. Eric Ellman, senior vice president for public policy at the Consumer Data Industry Association, which represents credit- and background-checking firms, says the bill could make hiring less fair by placing new burdens on companies that run background checks on behalf of employers. He argues that such checks can help managers overcome a reluctance to hire people from certain demographic groups.
Some civil rights groups and AI experts also oppose the bill—for different reasons. Albert Fox Cahn, founder of the Surveillance Technology Oversight Project, organized a letter from 12 groups including the NAACP and New York University’s AI Now Institute objecting to the proposed law. Cahn wants to regulate hiring tech, but he says the New York proposal could allow software that perpetuates discrimination to get rubber-stamped as having passed a fairness audit.
Cahn wants any law to define the technology covered more broadly, not let vendors decide how to audit their own technology, and allow individuals to sue to enforce the law. “We didn’t see any meaningful form of enforcement against the discrimination we’re concerned about,” he says.
Others have concerns but still support the New York proposal. “I hope that the bill will go forward,” says Julia Stoyanovich, director of the Center for Responsible AI at New York University. “I also hope it will be revised.”