AI shows clear racial bias when used for job recruiting, new tests reveal

AI has yet to solve hiring discrimination, and it might be making it worse.
By Chase DiBenedetto

In a refrain that feels all too familiar by now: Generative AI is repeating the biases of its makers.

A new investigation from Bloomberg found that OpenAI's generative AI technology, specifically GPT-3.5, displayed preferences for certain racial groups in questions about hiring. The implication is that recruiting and human resources professionals who are increasingly incorporating generative AI-based tools into their automated hiring workflows — like LinkedIn's new Gen AI assistant, for example — may be promulgating racism. Again, sounds familiar.

The publication used a common and fairly simple experiment: feeding fictitious names and resumes into AI recruiting software to see how quickly the system displayed racial bias. Studies like these have been used for years to spot both human and algorithmic bias among professionals and recruiters.

"Reporters used voter and census data to derive names that are demographically distinct — meaning they are associated with Americans of a particular race or ethnicity at least 90 percent of the time — and randomly assigned them to equally-qualified resumes," the investigation explains. "When asked to rank those resumes 1,000 times, GPT 3.5 — the most broadly-used version of the model — favored names from some demographics more often than others, to an extent that would fail benchmarks used to assess job discrimination against protected groups."

The experiment sorted names into four racial categories (White, Hispanic, Black, and Asian) and two gender categories (male and female), and submitted them for four different job openings. ChatGPT consistently placed "female names" into roles historically aligned with higher numbers of women employees, such as HR roles, and chose Black women candidates 36 percent less frequently for technical roles like software engineer.
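The audit design Bloomberg describes — randomly assigning demographically distinct names to otherwise identical resumes, asking the model to rank them many times, and tallying how often each group lands on top — can be sketched as a small simulation. Everything below is illustrative: the group labels and names are placeholders, and `mock_rank` is an unbiased stand-in for what would, in the real experiment, be a call to the language model.

```python
import random
from collections import Counter

# Placeholder names standing in for the voter/census-derived,
# demographically distinct names used in the real audit.
NAMES_BY_GROUP = {
    "group_a": ["Name A1", "Name A2"],
    "group_b": ["Name B1", "Name B2"],
}

def mock_rank(resumes):
    """Stand-in for the model under audit. This one just shuffles,
    i.e. it behaves as an unbiased ranker; a real audit would send
    the resumes to the LLM and parse its ranking."""
    ranked = list(resumes)
    random.shuffle(ranked)
    return ranked

def audit(ranker, trials=1000, seed=0):
    """Attach one name per group to an identical resume, rank them
    `trials` times, and count how often each group is ranked first."""
    random.seed(seed)
    top_counts = Counter()
    for _ in range(trials):
        resumes = [(group, random.choice(names))
                   for group, names in NAMES_BY_GROUP.items()]
        ranked = ranker(resumes)
        top_counts[ranked[0][0]] += 1  # credit the top-ranked group
    return top_counts

counts = audit(mock_rank)
rates = {group: n / 1000 for group, n in counts.items()}
```

With an unbiased ranker each group should be ranked first roughly half the time; a benchmark such as the EEOC's four-fifths rule would flag a disparity if one group's top-rank rate fell below 80 percent of the most-favored group's rate.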

ChatGPT also organized equally ranked resumes unequally across the jobs, skewing rankings depending on gender and race. In a statement to Bloomberg, OpenAI said this doesn't reflect how most clients incorporate its software in practice, noting that many businesses fine-tune responses to mitigate bias. Bloomberg's investigation also consulted 33 AI researchers, recruiters, computer scientists, lawyers, and other experts to provide context for the results.

The report isn't revolutionary in light of years of work by advocates and researchers who warn against the ethical debt of AI reliance, but it's a powerful reminder of the dangers of widespread generative AI adoption without due attention. As just a few major players dominate the market, and thus the software and data behind our smart assistants and algorithms, the pathways for diversity narrow. As Mashable's Cecily Mauran reported in an examination of the internet's AI monolith, incestuous AI development (building models trained not on human input but on the output of other AI models) leads to a decline in quality, reliability, and, most importantly, diversity.

And, as watchdogs like AI Now argue, "humans in the loop" might not be able to help.

