Amazon used AI to promote diversity. Too bad it’s plagued with gender bias.

Algorithms reflect societal biases in more ways than one.
By Rachel Kraus
The often gendered work we assign our robots shows what we value, and what we don't. Credit: Getty Images/Blend Images

AI may have sexist tendencies. But, sorry, the problem is still us humans.

Amazon recently scrapped an employee recruiting algorithm plagued with problems, according to a report from Reuters. Ultimately, the applicant screening algorithm did not return relevant candidates, so Amazon canned the program. But in 2015, Amazon had a more worrisome issue with this AI: it was down-ranking women.

The algorithm was only ever used in trials, and engineers manually corrected for the problems with bias. However, the way the algorithm functioned, and the existence of the product itself, speak to real problems of gender disparity in tech and non-tech roles, and the devaluation of work perceived as female.

Amazon created its recruiting AI to automatically return the best candidates out of a pool of applicant resumes. It discovered that the algorithm would down-rank resumes when they included the word "women's," and even the names of two women's colleges. It would also give preference to resumes that contained what Reuters called "masculine language," or strong verbs like "executed" or "captured."

These patterns emerged because the engineers trained their algorithm on candidates' resumes submitted over the previous ten years. And, lo and behold, the most attractive candidates in that pool were mostly men. Essentially, the algorithm found evidence of gender disparity in technical roles, and optimized for it; it neutrally replicated a societal and endemic preference for men wrought from an educational system and cultural bias that encourages men, and discourages women, from pursuing STEM roles.
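To see how this happens mechanically, consider a minimal sketch (not Amazon's actual system, whose details are not public): a text scorer that learns word weights from historical hiring outcomes. The tiny training set below is invented, but it mirrors the dynamic Reuters described: if past hires skewed male, words correlated with women's resumes pick up negative weights even though gender is never an explicit feature.

```python
from collections import Counter
from math import log

# Hypothetical historical data: resumes of past hires (mostly men,
# reflecting a skewed pipeline) vs. past rejections. All text invented.
hired = [
    "executed project captured market led team",
    "executed strategy captured clients managed budget",
    "led engineering captured growth executed roadmap",
]
rejected = [
    "women's chess club captain organized outreach",
    "women's college graduate coordinated volunteers",
]

def word_weights(pos_docs, neg_docs, smoothing=1.0):
    """Smoothed log-odds weight per word: positive means the word is
    associated with past hires, negative with past rejections."""
    pos = Counter(w for d in pos_docs for w in d.split())
    neg = Counter(w for d in neg_docs for w in d.split())
    vocab = set(pos) | set(neg)
    pos_total = sum(pos.values()) + smoothing * len(vocab)
    neg_total = sum(neg.values()) + smoothing * len(vocab)
    return {
        w: log((pos[w] + smoothing) / pos_total)
         - log((neg[w] + smoothing) / neg_total)
        for w in vocab
    }

weights = word_weights(hired, rejected)

def score(resume):
    # Sum the learned weights; unseen words contribute nothing.
    return sum(weights.get(w, 0.0) for w in resume.split())

# "Masculine" verbs from past hires get positive weights, while
# "women's" inherits a penalty purely from the historical skew.
print(score("executed analytics led platform"))
print(score("women's college graduate executed analytics"))
```

Nothing in this code mentions gender, yet the second resume scores lower solely because its wording resembles historically rejected applicants. That is the "neutral replication" at issue: optimizing for past outcomes encodes past bias.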

Amazon emphasized in an email to Mashable that it scrapped the program because it was ultimately not returning relevant candidates; it dealt with the sexism problem early on, but the AI as a whole just didn't work that well.

However, the very creation of hiring algorithms — not just at Amazon, but across many companies — still speaks to another sort of gender bias: the devaluing of female-dominated Human Resources roles and skills.

According to the U.S. Department of Labor (via the workforce analytics company Visier), women occupy nearly three-quarters of H.R. managerial roles. This is great news for overall female representation in the workplace. But the disparity exists thanks to another sort of gender bias.

There is a perception that H.R. jobs are feminine roles. The Globe and Mail writes in its investigation of sexism and gender disparity in HR:

The perception of HR as a woman's profession persists. This image that it is people-based, soft and empathetic, and all about helping employees work through issues leaves it largely populated by women as the stereotypical nurturer. Even today, these "softer" skills are seen as less appealing – or intuitive – to men who may gravitate to perceived strategic, analytical roles, and away from employee relations.

Amazon and other companies that pursued AI integrations in hiring wanted to streamline the process, yes. But automating a people-based process shows a disregard for people-based skills that are less easy to mechanically reproduce, like intuition or rapport. Reuters reported that Amazon's AI identified attractive applicants through a five-star rating system, "much like shoppers rate products on Amazon"; who needs empathy when you've got five stars?

In Reuters' report, these companies suggest hiring AI as a complement or supplement to more traditional methods, not an outright replacement. But the drive to automate a process owned by a female-dominated division in the first place is the other side of the coin of the algorithm's preference for "male language": where verbs like "executed" and "captured" are subconsciously favored, ones like "listened" or "provided" are shrugged off as inefficient.

The AI explosion is underway. That's easy to see in every evangelical smartphone or smart home presentation of just how much your robot can do for you, including Amazon's. But it also means society is opening itself up to an even less inclusive world. AI can double down on discriminatory tendencies in the name of optimization, as we see with Amazon's recruiting AI (and others). And because AI is built and led by humans (often, mostly male humans), those humans may unintentionally transfer their unconscious sexist biases into business decisions, and into the robots themselves.

So as our computers get smarter and permeate more areas of life and work, let's make sure not to lose what's human — alternately termed what's "female" — along the way.

UPDATE 10/11/2018, 2:00 p.m PT: Amazon provided Mashable with the following statement about its recruiting algorithm.

“This was never used by Amazon recruiters to evaluate candidates.”

Rachel Kraus

Rachel Kraus is a Mashable Tech Reporter specializing in health and wellness. She is an LA native, NYU j-school graduate, and writes cultural commentary across the internetz.
