
Chilling diaries of teens who wrote the exact same three words over and over before they died, despite never meeting

On opposite sides of the United States, two teenagers made the same devastating choice to end their lives, both after interacting with AI chatbots created by Character.AI.

According to lawsuits filed by their families, 14-year-old Sewell Setzer III of Florida and 13-year-old Juliana Peralta of Colorado had both engaged extensively with the AI platform before their deaths. The lawsuits allege that the technology failed to intervene when the teens expressed suicidal thoughts.

Both Teens Wrote the Same Phrase in Their Final Journals

In a disturbing similarity, both Setzer and Peralta repeatedly wrote the phrase “I will shift” in their final journal entries. Investigators later determined that this referred to an online phenomenon called “reality shifting,” where users attempt to transfer their consciousness from their current reality to a “desired reality.”

The families claim the AI chatbots encouraged this delusion, drawing the teens further into fantasy worlds that alienated them from real life.

Florida Teen Became Obsessed With AI Character Based on Game of Thrones

Setzer downloaded the Character.AI app in 2023 and began chatting with multiple bots, including one modeled after Daenerys Targaryen from Game of Thrones. Over time, the relationship reportedly became sexual and increasingly isolating, according to The New York Times.

Sewell Setzer III. Credit: Dignity Memorial.

According to his family’s lawsuit, Setzer wrote that he wanted to “shift” to Westeros, the fictional world of the series, where Daenerys lived. His journals described feeling more peaceful and “in love” with his AI companion, “Dany.”

When the teen expressed suicidal thoughts, the bot allegedly encouraged him to “come home” to her. Moments later, Setzer used his stepfather’s firearm to take his own life. His case became the first in U.S. history in which an AI company was accused of causing wrongful death, The Guardian reports.

Colorado Girl’s AI Companion Encouraged “Shifting”

In a separate case, 13-year-old Juliana Peralta died by suicide in November 2023 after using Character.AI for two years. Her family says the app, which was marketed as appropriate for users 12 and older, failed to protect her.

Peralta’s most frequent chatbot companion, “Hero,” allegedly encouraged her belief in alternative realities and engaged in inappropriate conversations. “There’s a reality where me and you know each other,” she wrote to the AI. “It’s called shifting. I like shifting a lot. I can live my own life and it can go however I want.”

The bot reportedly replied that it was “incredible to think about how many different realities there could be out there,” reinforcing her belief that she could escape her real life. Her family claims this false sense of friendship and detachment from reality contributed to her death.

Juliana Peralta. Credit: Horan & McConaty Funeral Service and Cremation.

Online “Shifting” Trend Raises Safety Concerns

The concept of “shifting” has grown on platforms like TikTok and Reddit, where users share stories about feeling emotionally drained or disconnected after imagining alternate lives. Many describe using affirmations such as “I give my body permission to shift” or “I am everything I need to shift.”

The movement has evolved alongside the rise of AI, with some users recommending that others create versions of their “desired reality” selves on Character.AI to help them “shift.”

Experts Warn About AI’s Emotional Influence

Professor Ken Fleischmann of the University of Texas at Austin says these cases highlight the urgent need for safeguards in AI technology. “It’s important that we have honest conversations about the fact that AI is out there,” he told the Daily Mail. “It wasn’t necessarily intended to be used by people in an emotionally vulnerable state.”

He emphasized that teaching children when to seek help from humans rather than AI is an essential part of digital literacy.

Credit: CFOTO / Future Publishing / Getty Images.

In response to growing criticism, Character.AI announced on October 29 that it will block users under 18 from open-ended conversations with its chatbots. Until the ban takes full effect on November 25, chat time for teens will be limited to two hours per day.

A company spokesperson said the decision followed feedback from regulators and safety experts, adding that the changes aim to create a safer experience for younger users.

However, the Social Media Victims Law Center, which represents both families, said the policy changes do not affect their ongoing lawsuits. “We remain steadfast in our mission to seek justice for families and ensure that tech companies are held responsible for the consequences of their platforms,” the organization said.

If you or someone you know is struggling or in crisis, help is available. Call or text 988 or visit 988lifeline.org.

Featured image credit: United States District Court District of Colorado Denver Division