A new report from the nonprofit Upturn analyzes some of the most prominent hiring algorithms on the market and finds that, by default, such algorithms are prone to bias.
When software is trained on past hiring data, that data can be biased or unrepresentative, and the software inherits those biases. Simply removing gender and race from the data does not keep bias out: other features, such as distance from the office, can correlate strongly with those sensitive attributes. Amazon ran into exactly this problem with a hiring algorithm it tried to develop. Hiring managers, meanwhile, can give too much credence to an algorithm's recommendations.
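The proxy effect described above can be sketched with synthetic data. Everything in this example is invented for illustration, not drawn from the report: the group labels, the commute-distance distributions, and the cutoff that mimics biased historical hiring.

```python
import random

# Hypothetical synthetic applicants (not from the Upturn report): `group`
# stands in for a protected attribute, and commute distance is a proxy for it.
random.seed(0)
applicants = []
for _ in range(10_000):
    group = random.choice(["A", "B"])
    # Assumption for illustration: group A tends to live farther from the office.
    distance = random.gauss(20 if group == "A" else 10, 5)
    applicants.append((group, distance))

# A "blind" screening rule mimicking biased history: it never sees `group`,
# but favoring short commutes still disadvantages group A through the proxy.
def blind_screen(distance):
    return distance < 12

def selection_rate(target_group):
    picks = [blind_screen(d) for g, d in applicants if g == target_group]
    return sum(picks) / len(picks)

print(f"selection rate, group A: {selection_rate('A'):.2f}")
print(f"selection rate, group B: {selection_rate('B'):.2f}")
```

Even though the rule never touches the protected attribute, the two groups end up with very different selection rates, which is the core point about proxy variables: dropping a sensitive column is not the same as removing its influence.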
To make the use of hiring AI fairer, the report recommends:
- allowing independent auditing of employer and vendor software
- having governments update their regulations to cover predictive hiring software
- scrutinizing ad and job platforms in more detail to analyze their growing influence on hiring