AI figured out the word people text when their suicide risk is high

It's not what you think.
By Rebecca Ruiz

If you were asked to guess the words people use when they're most at risk for suicide, you'd be right to think of obvious nouns and verbs like die, overdose and, yes, the word suicide itself.

So when Crisis Text Line, a free mental health support service, built an algorithm to flag high-priority texts, it included those among the 50 words meant to indicate that the person messaging desperately needed help.

But when Crisis Text Line started using artificial intelligence to analyze the 22 million messages about emotional distress in its database last summer, its researchers made a surprising discovery: The word ibuprofen was 16 times more likely to predict the person texting would need emergency services than the word suicide.

Another highly predictive type of content wasn't even a word but a crying face emoji. When people included that sad character in their messages, Crisis Text Line supervisors were 11 times more likely to call 911 for assistance. In total, Crisis Text Line has integrated 9,000 new words or word combinations that indicate high risk -- and expects to add more in the future.
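The triage idea the article describes can be sketched, very roughly, as scoring each incoming message against weighted risk terms and sorting the waiting queue by score. Everything below is invented for illustration: the terms, the weights (loosely echoing the 16x and 11x figures above), and the threshold logic are not Crisis Text Line's actual model or vocabulary, which are not public.

```python
# Hypothetical sketch of risk-term triage -- NOT Crisis Text Line's actual
# model. Terms and weights are invented for illustration only.
RISK_WEIGHTS = {
    "suicide": 1.0,
    "ibuprofen": 16.0,   # article: 16x more predictive than "suicide"
    "\U0001F622": 11.0,  # crying face emoji: 11x more likely to need 911
}

def risk_score(message: str) -> float:
    """Sum the weights of every known risk term found in the message."""
    text = message.lower()
    return sum(w for term, w in RISK_WEIGHTS.items() if term in text)

def triage(queue: list[str]) -> list[str]:
    """Order waiting texters so the highest-scoring messages come first,
    the way an emergency room moves its most urgent patients up."""
    return sorted(queue, key=risk_score, reverse=True)

messages = [
    "having a rough day",
    "i took a bottle of ibuprofen \U0001F622",
    "thinking about suicide a lot lately",
]
print(triage(messages)[0])  # the ibuprofen + emoji message ranks first
```

A production system would of course use a trained statistical model over the full conversation rather than a hand-built keyword list, but the queue-reordering effect Lublin describes works the same way: higher predicted risk, earlier response.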

Nancy Lublin, the nonprofit's CEO, says these unexpected data points have made a huge difference. Before AI detected the new words, volunteers responded to high-risk texters in less than two minutes. Now the average response time is down to 39 seconds. Lublin believes that's because the algorithm is much better at identifying those most at risk and sending them to the front of the line, much as an emergency room triages its most urgent patients first.

Julie Cerel, a clinical psychologist and president-elect of the American Association of Suicidology, says the practical implications of the technology are important. But she also believes the approach reflects a significant change in the way researchers and public health professionals try to prevent suicide.

"What this speaks to is we are finally listening to people who are suicidal and using what they’re telling us to figure out how to help them," she says.

In the past, it was practically impossible to comb through and code transcripts of conversations with suicide attempt survivors at anything like Crisis Text Line's scale. Now machine learning makes it possible for researchers to analyze digital conversations and look for signals that someone may be close to attempting suicide.

That approach has gained considerable momentum in the last year. Facebook recently announced that it's incorporating AI into its suicide-prevention efforts, and the research project OurDataHelps launched last spring by asking people to "donate" their social data so scientists could better understand suicide risk.

At Crisis Text Line, conversations that end in what's known as an active rescue are rare. Only 1 percent of exchanges require intervention by the authorities, and Lublin considers that the last line of defense. The goal, she says, is to help texters create a safety plan and encourage them to feel capable of handling a crisis. But sometimes that approach doesn't work, which is why Lublin wants the system to be as fast and as accurate as possible in singling out high-risk people.

Even though the new words and phrases Crisis Text Line identified might not seem immediately useful to doctors and therapists who work with patients in person, Cerel says they're evidence that people don't always choose the most obvious words to talk about suicidal feelings.

"It's a reminder to keep asking the question," says Cerel, "and make it clear we want to hear the answer."

If you want to talk to someone or are experiencing suicidal thoughts, text the Crisis Text Line at 741-741 or call the National Suicide Prevention Lifeline at 1-800-273-8255. Here is a list of international resources. 


Rebecca Ruiz
Senior Reporter

Rebecca Ruiz is a Senior Reporter at Mashable. She frequently covers mental health, digital culture, and technology. Her areas of expertise include suicide prevention, screen use and mental health, parenting, youth well-being, and meditation and mindfulness. Rebecca's experience prior to Mashable includes working as a staff writer, reporter, and editor at NBC News Digital and as a staff writer at Forbes. Rebecca has a B.A. from Sarah Lawrence College and a master's degree from U.C. Berkeley's Graduate School of Journalism.
