This emoji could mean your suicide risk is high, according to AI

Crisis Text Line used artificial intelligence to pinpoint the things people text when they're thinking of suicide.
By Rebecca Ruiz

Crisis Text Line used artificial intelligence to figure out the emoji and words people use when their suicide risk is high. Credit: CHRISTOPHER MINESES / MASHABLE

Since its founding in 2013, the free mental health support service Crisis Text Line has focused on using data and technology to better aid those who reach out for help.

Unlike helplines that offer assistance based on the order in which users dialed, texted, or messaged, Crisis Text Line has an algorithm that determines who is in most urgent need of counseling. The nonprofit is particularly interested in learning which emoji and words texters use when their suicide risk is high, so as to quickly connect them with a counselor. Crisis Text Line just released new insights about those patterns.
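The triage idea described above can be sketched as a priority queue. This is a minimal illustration under assumed names (`enqueue`, `next_texter`, and the risk scores are hypothetical), not Crisis Text Line's actual system: texters are connected in order of a model-assigned risk score rather than arrival order, with arrival order breaking ties.

```python
import heapq

# Risk-based triage sketch: heapq is a min-heap, so risk scores are
# negated to pop the highest-risk texter first.
queue = []

def enqueue(q, texter_id, risk_score, arrival):
    # Tuple order: highest risk first, then earliest arrival.
    heapq.heappush(q, (-risk_score, arrival, texter_id))

def next_texter(q):
    return heapq.heappop(q)[2]

enqueue(queue, "t1", 0.2, 1)
enqueue(queue, "t2", 0.9, 2)  # arrived later, but at higher risk
enqueue(queue, "t3", 0.2, 3)

first = next_texter(queue)  # "t2" jumps the line
```

The same tie-breaking rule preserves first-come-first-served order among texters with equal scores.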

Based on its analysis of 129 million messages processed between 2013 and the end of 2019, the nonprofit found that conversations containing the pill emoji, 💊, were 4.4 times more likely to end in a life-threatening situation than those containing the word suicide.


Other words that indicate imminent danger include 800mg, acetaminophen, Excedrin, and antifreeze; those are two to three times more likely than the word suicide to involve an active rescue of the texter. The loudly crying face emoji, 😭, is similarly high-risk. In general, the words that trigger the greatest alarm suggest the texter has a method or plan to attempt suicide or may be in the process of taking their own life.

Crisis Text Line has a list of 100 terms that are more high risk than the word suicide. Unexpected terms include vampire, which texters use to describe looking normal on the outside but feeling sick inside, or to say they've been called an "emotional vampire"; blvd, the abbreviation for boulevard, which shows up partly when texters name a location where they're at immediate risk of harm; and 11:11, a number that had no clear pattern.

This isn't Crisis Text Line's first attempt to understand how people communicate via text when they're suicidal. In 2017, the nonprofit used artificial intelligence to analyze 22 million messages; it found that the word ibuprofen was 16 times more likely than the word suicide to predict that the texter would need emergency services. Now that Crisis Text Line has far more messages to analyze than it did in 2017, the word ibuprofen remains high-risk, but is no longer as predictive as the pill emoji.

Bob Filbin, chief data scientist at Crisis Text Line, is hopeful that artificial intelligence is sharpening the nonprofit's ability to detect suicide risk faster and more accurately. When the algorithm flags a message as high-risk, the conversation is coded orange so counselors immediately know to ask whether the user has a plan or method in place, among other questions.

When Crisis Text Line set out to learn more about the content associated with suicide risk, it began in 2015 with a list of 50 words identified by academics as high-risk, checking to see whether texters used the same words in past conversations. Then it deployed an algorithm to see which words or emoji appeared uniquely in conversations that ended in an active rescue. That's when ibuprofen emerged as a top candidate. (All of Crisis Text Line's data is anonymized.)
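The underlying measurement, comparing how often conversations containing a given term end in an active rescue relative to conversations containing the baseline word "suicide," can be sketched with a toy risk-ratio calculation. The function name, data layout, and the sample conversations below are all invented for illustration; Crisis Text Line's actual pipeline is not public.

```python
from collections import defaultdict

def term_risk_ratios(conversations, baseline_term="suicide"):
    """For each term, compute how much more likely a conversation
    containing it ends in an active rescue, relative to conversations
    containing the baseline term. `conversations` is a list of
    (set_of_terms, ended_in_rescue) pairs."""
    counts = defaultdict(lambda: [0, 0])  # term -> [rescues, total]
    for terms, rescued in conversations:
        for term in terms:
            counts[term][1] += 1
            if rescued:
                counts[term][0] += 1
    base_rescues, base_total = counts[baseline_term]
    base_rate = base_rescues / base_total
    return {
        term: (rescues / total) / base_rate
        for term, (rescues, total) in counts.items()
        if term != baseline_term
    }

# Toy data: baseline rescue rate for "suicide" is 2/3; every 💊
# conversation ends in a rescue, giving a ratio of 1.0 / (2/3) = 1.5.
convos = [
    ({"💊", "suicide"}, True),
    ({"💊"}, True),
    ({"suicide"}, False),
    ({"suicide"}, True),
    ({"tired"}, False),
]
ratios = term_risk_ratios(convos)
```

A real analysis would also need minimum-frequency cutoffs and confidence intervals so that rare terms don't produce spuriously extreme ratios.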

The updated algorithm used to analyze the 129 million messages not only considered the most frequently used words, phrases, and emoji, but also looked at the context of the conversation. So if the phrase "kill myself" appears, for example, the algorithm is designed to consider whether the rest of the sentence further increases the risk (I want to kill myself) or negates it (I don't want to kill myself).
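The negation check described above can be illustrated with a crude rule-based sketch. The phrase list, the negation patterns, and the `flag_message` helper are hypothetical; the real system uses a trained model, not hand-written rules like these.

```python
import re

HIGH_RISK_PHRASES = ["kill myself", "end my life"]  # illustrative list
NEGATIONS = re.compile(r"\b(don't|do not|never|won't|not going to)\b")

def flag_message(message: str) -> bool:
    """Flag a message if it contains a high-risk phrase that is not
    preceded by a negation within the same sentence."""
    for sentence in re.split(r"[.!?]", message.lower()):
        for phrase in HIGH_RISK_PHRASES:
            idx = sentence.find(phrase)
            if idx != -1 and not NEGATIONS.search(sentence[:idx]):
                return True
    return False

flagged_plan = flag_message("I want to kill myself")
flagged_negated = flag_message("I don't want to kill myself")
```

Even this toy version shows why context matters: the same two words yield opposite decisions depending on what precedes them.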

Compared to the general population, Crisis Text Line users skew young, low-income, and high-risk, so the findings may not generalize to everyone, but the AI-generated insights are promising.

Lindsey C. McKernan, an assistant professor in the department of psychiatry and behavioral sciences at Vanderbilt University Medical Center who has written about using artificial intelligence in suicide prevention, said in an email that Crisis Text Line's findings could be helpful.

"New research on texting’s role in suicide prediction has the potential to provide us another window or 'sign' to attune to as a family member, friend, or clinician interacting with someone under these circumstances," McKernan wrote. (Common warning signs include giving away possessions or having sleep and mood changes.)

But it's important to remember that algorithms can be wrong sometimes, inaccurately classifying someone as high-risk or indicating that someone is low-risk when they're not, McKernan said.

"Examining texting patterns could give us one more piece of information to inform suicide risk predictions and a prevention strategy for de-escalation, particularly for younger individuals at risk of suicide," she wrote.

Filbin said texting allows people, especially younger users, a way to be vulnerable during difficult moments.

"Part of the reason why we see these words is because text is a digital medium where people end up being particularly honest, and honest faster," said Filbin. "Texting can be uniquely powerful for young people to talk about their crises."

In turn, artificial intelligence helps Crisis Text Line better understand when those emergencies require immediate care and attention.

If you want to talk to someone or are experiencing suicidal thoughts, Crisis Text Line provides free, confidential support 24/7. Text CRISIS to 741741 to be connected to a crisis counselor. You can call the National Suicide Prevention Lifeline at 1-800-273-8255. Here is a list of international resources.

Rebecca Ruiz
Senior Reporter

Rebecca Ruiz is a Senior Reporter at Mashable. She frequently covers mental health, digital culture, and technology. Her areas of expertise include suicide prevention, screen use and mental health, parenting, youth well-being, and meditation and mindfulness. Rebecca's experience prior to Mashable includes working as a staff writer, reporter, and editor at NBC News Digital and as a staff writer at Forbes. Rebecca has a B.A. from Sarah Lawrence College and a master's degree from U.C. Berkeley's Graduate School of Journalism.
