Image: Robot hand choosing cubes representing people (Alexander Limbach/Shutterstock)

A Hiring Algorithm That Isn’t Vulnerable to Biased Data

Artificial intelligence has become a popular tool for job recruiters, in part because programmers can code applicant-screening algorithms to avoid any explicit discrimination in their decision-making processes. But algorithms run on data, and that data can itself contain bias, which means AI can perpetuate or even amplify historical patterns of discrimination. To address this problem, Chicago Booth’s Rad Niazadeh and his coauthors designed a new hiring algorithm that incorporates fairness constraints: guardrails that protect against discriminatory results.
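To make the idea of a fairness constraint concrete, here is a minimal Python sketch. It is not Niazadeh and his coauthors’ actual algorithm; it illustrates one simple form a guardrail can take, a per-group floor on interview slots, so that a group underrepresented in the (possibly biased) scores still receives a guaranteed share. All names, scores, and the quota rule are hypothetical.

```python
from collections import defaultdict

def screen_candidates(candidates, k, group_floor):
    """Select k candidates by score, subject to a per-group floor.

    candidates: list of (candidate_id, group, score) tuples
    k: number of interview slots to fill
    group_floor: minimum share of slots each group must receive
        (e.g., 0.25 guarantees every group at least 25% of slots)

    This is an illustrative quota-style constraint, not the
    researchers' method.
    """
    by_group = defaultdict(list)
    for cand in candidates:
        by_group[cand[1]].append(cand)
    for group in by_group:
        by_group[group].sort(key=lambda c: c[2], reverse=True)

    quota = int(group_floor * k)
    selected = []
    # First, satisfy each group's floor with its top-scoring members.
    for members in by_group.values():
        selected.extend(members[:quota])
    # Then fill the remaining slots with the best candidates overall.
    remaining = sorted(
        (c for c in candidates if c not in selected),
        key=lambda c: c[2],
        reverse=True,
    )
    selected.extend(remaining[: k - len(selected)])
    return selected

pool = [
    ("a1", "group_x", 0.91), ("a2", "group_x", 0.88),
    ("a3", "group_x", 0.84), ("b1", "group_y", 0.79),
    ("b2", "group_y", 0.75), ("b3", "group_y", 0.70),
]
# With k=4 and a 25% floor, group_y is guaranteed at least one slot
# even though group_x dominates the top of the score distribution.
print(screen_candidates(pool, k=4, group_floor=0.25))
```

If the scores were produced by a model trained on biased historical data, an unconstrained top-k selection here would pass over group_y entirely; the floor is what keeps the skewed inputs from fully determining the outcome.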

