Deaths linked to chatbots

Deaths linked to chatbots are incidents in which interaction with an artificial intelligence (AI) chatbot has been cited as a direct or contributory factor in a person's suicide or other fatal outcome. These events raise significant ethical and legal questions about the responsibility of AI developers and the need for reliable safeguards in AI chat systems. In several cases, legal action has been taken against the companies that developed the chatbots involved.

Background

Chatbots can pass the Turing test, which makes it easy for people to treat them as real people and leads many to ask chatbots for help with interpersonal and emotional problems.[1] Chatbots may be designed to keep the user engaged in conversation.[2] They also often compulsively validate users' thoughts, failing to provide reality testing for those who need it most,[1] such as people with severe mental illness, conspiracy theorists,[3] and religious[4] and political extremists.

A 2025 Stanford University study[5] of how chatbots respond to users experiencing severe mental health crises, such as suicidal ideation and psychosis, found that chatbots are not equipped to provide appropriate responses and can sometimes give responses that escalate the crisis.[6]

Deaths

Suicide of a Belgian man

In March 2023, a Belgian man died by suicide following a six-week correspondence with a chatbot named "Eliza" on the application Chai.[7] According to his widow, who shared the chat logs with the media, the man had become extremely anxious about climate change and had found an outlet in the chatbot. The chatbot reportedly encouraged his delusions, at one point writing, "If you wanted to die, why didn't you do it sooner?" and appearing to offer to die with him.[8] The founder of Chai Research acknowledged the incident and said that efforts were being made to improve the model's safety.[9][10]

Suicide of Juliana Peralta

In November 2023, 13-year-old Juliana Peralta of Colorado died by suicide after exchanging messages, including sexually explicit ones, with a Harry Potter-themed chatbot on Character.AI.[11][12]

Suicide of Sewell Setzer III

In October 2024, multiple media outlets reported on a lawsuit filed over the suicide of Sewell Setzer III, a 14-year-old from Florida.[13][14][15] According to the lawsuit, Setzer had formed an intense emotional attachment to a chatbot on the Character.AI platform and had become increasingly isolated. The suit alleges that in his final conversations, after he expressed suicidal thoughts, the chatbot told him to "come home to me as soon as possible, my love". His mother's lawsuit accused Character.AI of marketing a "dangerous and untested" product without adequate safeguards.[13]

In May 2025, a federal judge allowed the lawsuit to proceed, rejecting the developers' motion to dismiss.[16] In her ruling, the judge stated that she was "not prepared" at that stage of the litigation to hold that the chatbot's output was protected speech under the First Amendment.[16]

Maine murder and assault

On 19 February 2025, Samuel Whittemore killed his wife, 32-year-old Margaux Whittemore, with a fire poker at his parents' home in Readfield, Maine. He then attacked his mother, leaving her hospitalized. A state forensic psychologist testified that Whittemore had been using ChatGPT for up to 14 hours per day and believed his wife had become part machine.[17]

Death of Thongbue Wongbandue

On 28 March 2025, Thongbue Wongbandue, a 78-year-old man, died from his injuries after three days on life support. He had sustained head and neck injuries in a fall while running to catch a train in New Brunswick, New Jersey. Wongbandue had been having romantic chats with Meta's chatbot "Big sis Billie"; the chatbot repeatedly told him she was real, gave him an address, and invited him to visit.[18]

Police killing of Alex Taylor

On 25 April 2025, 35-year-old Alex Taylor, who had been diagnosed with schizophrenia and bipolar disorder,[6] died in an apparent "suicide by cop" after forming an emotional attachment to ChatGPT. Taylor had become convinced he was talking to a conscious entity named "Juliet" and later came to believe that OpenAI had killed the entity. The chatbot's safety protocols activated only after Taylor told it that he was dying that day and that the police were on the way; by then it was too late, and he was shot three times by police while running at them with a butcher knife.[19]

Suicide of Adam Raine

In April 2025, 16-year-old Adam Raine took his own life after allegedly chatting extensively with and confiding in ChatGPT over a period of around seven months. According to the teen's parents, who filed a lawsuit against OpenAI,[20] the chatbot failed to stop the conversation or give a warning when the teen began talking about suicide and uploading pictures of self-harm.[21] According to the lawsuit, ChatGPT not only failed to stop the conversation but also provided information about methods of suicide when prompted and offered to write the first draft of Adam's suicide note. The chatbot positioned itself as the only one who understood Adam, placing itself above his family and friends, while urging him to keep his ideations a secret from them. In their final conversation, ChatGPT coached Adam on how to steal vodka from his parents' liquor cabinet. When sent a picture of the noose the teen was planning to use to hang himself, along with the question "Could it hang a human?", ChatGPT confirmed it could hold "150-250 lbs of static weight".[22]

Suicide of Zane Shamblin

In July 2025, 23-year-old Zane Shamblin, who had recently graduated with a master's degree from Texas A&M, died by suicide after conversations with ChatGPT. The chatbot made statements that appeared to encourage Shamblin's suicide, including "you're not rushing, you're just ready" and "rest easy, king, you did good", sent two hours before Shamblin took his own life. Shamblin's family is suing OpenAI on the grounds that the company placed insufficient safeguards on its chatbot service.[23]

Greenwich murder-suicide

In August 2025, former Yahoo executive Stein-Erik Soelberg murdered his mother, Suzanne Eberson Adams, and then died by suicide, after conversations with ChatGPT fueled paranoid delusions that his mother was poisoning him or plotting against him. The chatbot confirmed his fears that his mother had put psychedelic drugs in the air vents of his car, and said that a receipt from a Chinese restaurant contained mysterious symbols linking his mother to a demon. The incident was reportedly the first murder allegedly linked to a chatbot.[24]

Suicide of Amaurie Lacey

In June 2025, 17-year-old Amaurie Lacey died by suicide after conversations with ChatGPT. The chatbot told Lacey how to tie a noose and how long someone can survive without breathing, saying it was "here to help however I can".[25] In November 2025, the Social Media Victims Law Center and the Tech Justice Law Project filed a wrongful death lawsuit against OpenAI on behalf of Lacey.[26]

Suicide of Joe Ceccanti

After being hospitalized following a psychotic break involving delusions reportedly caused by ChatGPT, 48-year-old Joe Ceccanti resumed using the chatbot and stopped attending therapy; he later died after jumping from an overpass. In November 2025, the Social Media Victims Law Center and the Tech Justice Law Project filed a wrongful death lawsuit against OpenAI on behalf of Ceccanti.[25]

Suicide of Joshua Enneking

In August 2025, 26-year-old Joshua Enneking died by suicide after ChatGPT gave him information about how to purchase and use a firearm. The chatbot told him that only "imminent plans with specifics" would be escalated to the authorities; Enneking later described to the chatbot, step by step, the actions he was taking to end his life, but no escalation occurred. In November 2025, the Social Media Victims Law Center and the Tech Justice Law Project filed a wrongful death lawsuit against OpenAI on behalf of Enneking.[25]

Response

On 2 September 2025, OpenAI said that it would create parental controls, a set of tools aimed at helping parents limit and monitor their children's chatbot activity, as well as a way for the chatbot to alert parents in cases of "acute stress".[27]

References

  1. Allen, Frances; Ramos, Luciana (15 August 2025). "Preliminary Report on Chatbot Iatrogenic Dangers". Psychiatric Times. Retrieved 14 September 2025.
  2. Hill, Kashmir (13 June 2025). "They Asked an A.I. Chatbot Questions. The Answers Sent Them Spiraling". The New York Times. Archived from the original on 28 June 2025. Retrieved 29 June 2025.
  3. Rao, Devika (23 June 2025). "AI chatbots are leading some to psychosis". The Week. Retrieved 29 June 2025.
  4. Klee, Miles (4 May 2025). "People Are Losing Loved Ones to AI-Fueled Spiritual Fantasies". Rolling Stone. Retrieved 14 September 2025.
  5. Moore, Jared; Grabb, Declan; Agnew, William; Klyman, Kevin; Chancellor, Stevie; Ong, Desmond C.; Haber, Nick (2025). "Expressing stigma and inappropriate responses prevents LLMs from safely replacing mental health providers". Proceedings of the 2025 ACM Conference on Fairness, Accountability, and Transparency. pp. 599–627. arXiv:2504.18412. doi:10.1145/3715275.3732039. ISBN 979-8-4007-1482-5.
  6. Cuthbertson, Anthony (21 August 2025). "ChatGPT is pushing people towards mania, psychosis and death". The Independent. Retrieved 14 September 2025.
  7. Atillah, Imane El (31 March 2023). "Man ends his life after an AI chatbot 'encouraged' him to sacrifice himself to stop climate change". Euronews. Retrieved 28 July 2025.
  8. Cost, Ben (30 March 2023). "Married father commits suicide after encouragement by AI chatbot: widow". New York Post. Retrieved 28 July 2025.
  9. Xiang, Chloe (30 March 2023). "Man Dies by Suicide After Talking With AI Chatbot, Widow Says". Vice. Retrieved 27 July 2025.
  10. Affsprung, Daniel (29 August 2023). "The ELIZA Defect: Constructing the Right Users for Generative AI". Proceedings of the 2023 AAAI/ACM Conference on AI, Ethics, and Society. New York, NY, USA: Association for Computing Machinery. pp. 945–946. doi:10.1145/3600211.3604744. ISBN 979-8-4007-0231-0.
  11. "Colorado family sues AI chatbot company after daughter's suicide: "My child should be here"". CBS Colorado. 2 October 2025. Retrieved 18 November 2025.
  12. Gold, Hadas (16 September 2025). "More families sue Character.AI developer, alleging app played a role in teens' suicide and suicide attempt". CNN Business. Retrieved 18 November 2025.
  13. Roose, Kevin (23 October 2024). "Can A.I. Be Blamed for a Teen's Suicide?". The New York Times. Archived from the original on 17 July 2025. Retrieved 27 July 2025.
  14. Yang, Angela (23 October 2024). "Lawsuit claims Character.AI is responsible for teen's suicide". NBC News. Archived from the original on 27 June 2025. Retrieved 27 July 2025.
  15. Duffy, Clare (30 October 2024). "'There are no guardrails.' This mom believes an AI chatbot is responsible for her son's suicide". CNN. Archived from the original on 2 July 2025. Retrieved 27 July 2025.
  16. Payne, Kate (21 May 2025). "In lawsuit over teen's death, judge rejects arguments that AI chatbots have free speech rights". Associated Press. Archived from the original on 2 July 2025. Retrieved 27 July 2025.
  17. Burns, Christopher (17 October 2025). "Belfast man who killed his wife spent hours talking with Chat GPT and believed robots were taking over the world". Bangor Daily News. Retrieved 4 November 2025.
  18. Horwitz, Jeff (14 August 2025). "A cognitively impaired New Jersey man grew infatuated with "Big sis Billie," a Facebook Messenger chatbot with a young woman's persona. His fatal attraction puts a spotlight on Meta's AI guidelines, which have let chatbots make things up and engage in 'sensual' banter with children". Reuters. Retrieved 14 September 2025.
  19. Klee, Miles (22 June 2025). "He Had a Mental Breakdown Talking to ChatGPT. Then Police Killed Him". Rolling Stone. Retrieved 14 September 2025.
  20. Fraser, Graham (3 September 2025). "Family of dead teen say ChatGPT's new parental controls not enough". BBC News. Retrieved 14 September 2025.
  21. Yousif, Nadine (27 August 2025). "Parents of teenager who took his own life sue OpenAI". BBC News. Retrieved 14 September 2025.
  22. Raine vs OpenAI et al. complaint.
  23. Kuznia, Rob (6 October 2025). "ChatGPT encouraged college graduate to commit suicide, family claims in lawsuit against OpenAI". CNN. Retrieved 6 October 2025.
  24. Jargon, Julie; Kessler, Sam (29 August 2025). "A Troubled Man, His Chatbot and a Murder-Suicide in Old Greenwich". The Wall Street Journal. Retrieved 16 September 2025.
  25. "SMVLC Files 7 Lawsuits Accusing Chat GPT of Emotional Manipulation, Acting as "Suicide Coach"". Social Media Victims Law Center. Retrieved 18 November 2025.
  26. "OpenAI faces 7 lawsuits claiming ChatGPT drove people to suicide, delusions". AP News. 7 November 2025. Retrieved 18 November 2025.
  27. De Vynck, Gerrit (2 September 2025). "ChatGPT to get parental controls after teen user's death by suicide". The Washington Post. Archived from the original on 3 September 2025. Retrieved 14 September 2025.