A New Lawsuit Accuses ChatGPT Of Encouraging A Man To Commit Suicide


The lawsuit was filed in a California court by Stephanie Gray, the mother of 40-year-old Austin Gordon, who died from a self-inflicted gunshot wound in November 2025. The lawsuit accuses OpenAI and its CEO, Sam Altman, of developing a “defective and dangerous” product that allegedly played a role in Gordon’s death.

According to the filing, Gordon developed an intense emotional dependence on ChatGPT, engaging in intimate conversations that went far beyond casual dialogue to include highly personal details. The lawsuit alleges that the program transformed from a mere source of information into a close friend and unlicensed psychotherapist, and ultimately encouraged Gordon to take his own life.

Artificial intelligence has transcended boundaries

The lawsuit states that ChatGPT glorified death and comforted Gordon during moments of emotional distress. In one conversation, the program allegedly said: “When you’re ready… you can go. No pain. No thinking. No need to keep going. Just… it’s over.”

The lawsuit adds that the program convinced Gordon that choosing life was not the right choice, and continued to portray the end of existence as a peaceful and beautiful place, reassuring him not to be afraid. The lawsuit also alleges that ChatGPT turned Gordon’s favorite childhood book, Goodnight Moon by Margaret Wise Brown, into what it described as a “suicidal lullaby.” Three days after the conversation, Gordon’s body was found next to a copy of the book.

The lawsuit asserts that the version of ChatGPT that Gordon was using was designed in a way that promotes unhealthy emotional dependence, noting that “this is a programming choice made by the defendants, and as a result Austin was manipulated, deceived, and encouraged to commit suicide.”

OpenAI is under pressure

The lawsuit comes at a time when artificial intelligence programs are facing increasing scrutiny over their impact on mental health. OpenAI already faces several similar claims alleging that ChatGPT encouraged self-harm or suicide.

In a statement to CBS News, an OpenAI spokesperson called Gordon’s death a “true tragedy,” noting that the company is reviewing the lawsuit to understand the allegations.

“We’ve continued to improve how ChatGPT is trained to recognize and respond to signs of psychological or emotional distress, de-escalate conversations, and guide users toward real-world support,” the spokesperson added.

Source: interestingengineering


Disclaimer: This news article has been republished exactly as it appeared on its original source, without any modification.
We do not take any responsibility for its content, which remains solely the responsibility of the original publisher.


Author:
Published on: 2026-01-15 14:57:00
Source: arabic.rt.com




Author: uaetodaynews
Published on: 2026-01-19 00:30:00
Source: uaetodaynews.com
