Amazon's Use of Artificial Intelligence for Hiring
Charlie Ye, Erika Kaczynski, Drew Busch, Julia Merlin

Introduction

Graph shows the ratio of men to women at large tech companies. Source: Reuters
Since 2014, Amazon.com Inc's (AMZN.O) machine-learning specialists had been developing computer programs to automate the search for new hires, but by 2015 they realized that their system was not rating applicants in a gender-neutral way. As a result, many applicants' resumes, especially those of women, did not make it past the algorithm's initial screening step. The tech industry has historically been male-dominated: prevailing ideas that men make better software developers than women led to more male applicants and more hiring decisions favoring men, a pattern further perpetuated by male interviewers and management. Amazon was no different, which became clear when the company decided to lean on AI to assist its hiring process.
Findings and Results

Flowchart shows the process of building an artificial intelligence algorithm. Source: Jonathan Kanevsky
For an AI system to make reliable predictions on new data, it must first be "trained" on pre-existing data for which the developers already know the outcomes. From this training data, the system learns patterns and then applies those patterns to new data to draw conclusions.
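As a rough illustration of this train-then-predict pattern, here is a minimal Python sketch using scikit-learn. The resumes, outcomes, and the screener itself are all fabricated for illustration; this is not Amazon's actual system.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Historical resumes with known outcomes serve as the training data
# (1 = hired, 0 = rejected; all examples are made up).
past_resumes = [
    "executed software projects in java",
    "built web apps, led code reviews",
    "managed retail inventory and schedules",
    "organized community volunteer events",
]
past_outcomes = [1, 1, 0, 0]

# The model learns which resume features correlate with past decisions...
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(past_resumes)
model = LogisticRegression().fit(X, past_outcomes)

# ...and applies those learned patterns to score a new, unseen resume.
new_resume = vectorizer.transform(["built web apps in java"])
print(model.predict_proba(new_resume)[0, 1])  # estimated probability of "hire"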
Amazon's AI was trained on patterns observed in resumes submitted to the company over the previous 10 years. This means the AI assumes that past hires are a good representation of how the company should hire in the future. Biases can arise not only from the gender ratio of applicants but also from the human employees who made the original hiring decisions: since their actions are used to train the AI, their biases carry through to the final model.
The lack of diversity and the existing biases in this data caused specific problems that steered hiring decisions toward male applicants. The model placed little weight on the technical terms commonly recognized in the industry and instead latched onto more arbitrary features of resumes to screen out applicants. In particular, the word "women's," as in "women's tennis club captain," was scored negatively, while subtler word choices such as "executed" and "captured," which appeared primarily on male applicants' resumes, were favored.
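A toy sketch (again with fabricated data, assuming scikit-learn) makes this mechanism concrete: because "women's" appears only on resumes that the historical process rejected, the trained model assigns that token a negative weight, while words like "executed" and "captured" come out positive.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Fabricated training data that encodes a biased historical outcome:
# resumes mentioning "women's" were rejected by past human screeners.
resumes = [
    "executed data pipeline projects",
    "captured requirements and executed releases",
    "women's tennis club captain, built web apps",
    "women's coding society lead, java developer",
]
labels = [1, 1, 0, 0]  # 1 = hired, 0 = rejected

vec = CountVectorizer()
clf = LogisticRegression().fit(vec.fit_transform(resumes), labels)

# Inspect the learned weight for each token: "women" comes out negative
# and "executed"/"captured" positive, because the model has faithfully
# reproduced the bias baked into the labels.
for token, weight in zip(vec.get_feature_names_out(), clf.coef_[0]):
    print(f"{token:14s} {weight:+.3f}")

Notice that nothing in the code mentions gender; the bias enters entirely through the historical labels, which is part of why it can go unnoticed until the resulting scores are examined.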
Image explains how bias can be programmed into a machine. Source: Thought Leadership
Implications and Broader Impacts
Source: Personnel Today
Today, almost all large companies are adopting machine-learning algorithms in some part of their hiring process, as well as in other important decision-making processes such as loans, mortgages, and insurance claims. Biases like those in Amazon's hiring algorithm can extend beyond gender to other demographics, including race and sexual orientation, and it is increasingly important to identify and address these issues.
"Bias is all of our responsibility. It hurts those discriminated against, of course, and it also hurts everyone by reducing people’s ability to participate in the economy and society. It reduces the potential of AI for business and society by encouraging mistrust and producing distorted results. " Brittany Presten.
Infographic on the cycle of discrimination. Source: World Economic Forum
To help reduce bias and discrimination in their algorithms, companies can take specific steps and perform proper oversight. First, they should strive to choose the correct model. Second, they should carefully select their dataset, making sure not to rely solely on internal data that may already contain biases and lack diversity, as Amazon's hiring dataset did. Finally, during data processing they should deliberately work to remove existing bias, taking into account any historical discrimination the data may encode; one form such oversight can take is sketched below.
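One simple oversight step is an audit of selection rates by demographic group. The sketch below uses hypothetical numbers and the widely cited "four-fifths rule" heuristic as the threshold; it is illustrative, not a legal compliance test.

# Hypothetical applicant counts; the 0.8 threshold is the common
# "four-fifths rule" heuristic for flagging possible adverse impact.
def selection_rate(selected: int, total: int) -> float:
    return selected / total

male_rate = selection_rate(selected=120, total=400)   # 0.30
female_rate = selection_rate(selected=45, total=300)  # 0.15

# Disparate-impact ratio: lower group's rate divided by higher group's.
ratio = min(male_rate, female_rate) / max(male_rate, female_rate)
print(f"male {male_rate:.2f}, female {female_rate:.2f}, ratio {ratio:.2f}")

if ratio < 0.8:
    print("Potential adverse impact: review the model and its training data.")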
Source: TalentLyft
References
British Medical Journal. (2021). Ridding AI and machine learning of bias involves taking their many uses into consideration [Infographic].
Dastin, J. (2018, October 10). Amazon scraps secret AI recruiting tool that showed bias against women. Reuters. Retrieved November 15, 2021, from https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G.
Huang, H. (2018, October 10). Global Headcount [Graph]. Reuters. https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G
Manyika, J., Silberg, J., & Presten, B. (2019, October 25). What Do We Do About the Biases in AI? Harvard Business Review. Retrieved November 10, 2021, from https://hbr.org/2019/10/what-do-we-do-about-the-biases-in-ai
Stackhouse, J. (2019, March 15). How do you Program Bias into a Machine? [Infographic]. https://thoughtleadership.rbc.com/ai-for-good-battling-bias-before-it-becomes-irreversible/
TalentLyft. (2020, November 16). Major Learning Lessons from AI [Illustration]. https://www.talentlyft.com/en/blog/article/414/the-ai-recruitment-evolution-from-amazons-biased-algorithm-to-contextual-understanding