Auditors test recruitment algorithms for bias, but big questions remain

ORCAA and HireVue focused their audit on one product: HireVue's hiring assessments, which many companies use to evaluate recent college graduates. In this case, ORCAA did not examine the technical design of the tool itself. Instead, the company interviewed stakeholders (including a job candidate, an AI ethicist, and several nonprofits) about potential problems with the tools and gave HireVue recommendations for improving them. The final report is published on HireVue's website, but it can be read only after signing a non-disclosure agreement.

Alex Engler, a fellow at the Brookings Institution who has studied AI hiring tools and is familiar with both audits, considers the Pymetrics audit the more substantial of the two: "There's a big difference in the depth of the analysis that was enabled," he says. But again, neither audit looked at whether the products actually help companies make better hiring choices. And both were paid for by the companies being audited, which creates "a bit of a risk that the auditor is influenced by the fact that this is a client," Kim says.

For these reasons, critics say, voluntary audits are not enough. Data scientists and accountability experts are now pushing for broader regulation of AI recruiting tools, as well as standards for auditing them.

Filling the gaps

Some of these measures are starting to appear in the United States. In 2019, Senators Cory Booker and Ron Wyden, along with Representative Yvette Clarke, introduced the Algorithmic Accountability Act, which would make bias audits mandatory for all large companies using AI, though the bill has not passed.

Meanwhile, there is some movement at the state level. Illinois's AI Video Interview Act, which came into effect in January 2020, requires companies to notify applicants when they use AI in video interviews. Cities are taking action, too: in Los Angeles, city council member Joe Buscaino proposed a fair-hiring motion for automated systems in November.

New York City's bill, in particular, could serve as a model for cities and states across the country. It would make annual audits mandatory for vendors of automated hiring tools, and companies using the tools would also have to tell candidates what characteristics their system used to reach a decision.

But the question of what these annual audits would actually look like remains open. For many experts, an audit of the kind Pymetrics did would not go very far toward determining whether these systems discriminate, since that audit did not check for intersectional bias or assess whether the tool accurately measures the traits it claims to measure for people of different races and genders.
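To see why intersectionality matters here, consider a minimal sketch of a bias check on hiring outcomes. It applies the EEOC "four-fifths rule" (flagging any group whose selection rate falls below 80% of the best-off group's) to race-by-gender intersections rather than to race or gender alone. The data, group labels, and function names are entirely hypothetical illustrations, not drawn from any real vendor's tool or from the audits discussed above.

```python
from collections import defaultdict

def selection_rates(records):
    """Selection rate per (race, gender) intersection.

    records: iterable of (race, gender, selected) tuples --
    hypothetical audit data for illustration only.
    """
    counts = defaultdict(lambda: [0, 0])  # group -> [selected, total]
    for race, gender, selected in records:
        group = (race, gender)
        counts[group][1] += 1
        counts[group][0] += int(selected)
    return {g: sel / tot for g, (sel, tot) in counts.items()}

def four_fifths_violations(rates):
    """Groups whose selection rate is below 80% of the highest rate
    (the four-fifths rule, applied to intersections)."""
    top = max(rates.values())
    return {g: r for g, r in rates.items() if r < 0.8 * top}

# Toy data: by race alone or gender alone, every group clears the
# 80% bar -- but the (B, F) intersection does not.
data = (
    [("A", "M", True)] * 8 + [("A", "M", False)] * 2 +
    [("A", "F", True)] * 7 + [("A", "F", False)] * 3 +
    [("B", "M", True)] * 7 + [("B", "M", False)] * 3 +
    [("B", "F", True)] * 5 + [("B", "F", False)] * 5
)
rates = selection_rates(data)
print(four_fifths_violations(rates))  # → {('B', 'F'): 0.5}
```

In this toy data, race B's overall selection rate is exactly 80% of race A's, and women's is exactly 80% of men's, so marginal checks pass; only the intersectional check surfaces the disadvantaged subgroup. That gap between marginal and intersectional testing is the experts' point.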
