OpenAI Makes Bizarre Demand of Family Whose Son Was Allegedly Killed by ChatGPT

OpenAI is fighting a lawsuit alleging that the AI company is responsible for the death of Adam Raine, a 16-year-old in California who took his life after extensive conversations with ChatGPT about his suicidal ideation and intent to die. Part of OpenAI’s defense strategy, according to The Financial Times? Requesting that the family cough up details of the teen’s funeral.

As the FT reports, OpenAI’s lawyers have pushed the Raine family to provide a list of people who attended Adam’s funeral, as well as material like eulogies and photos and videos captured at the service.

OpenAI asked for “all documents relating to memorial services or events in the honor of the decedent including but not limited to any videos or photographs taken, or eulogies given,” read a document reviewed by the FT, “as well as invitation or attendance lists or guestbooks.”

Per the report, lawyers for the family characterized the request as “unusual” and “intentional harassment,” hinting that OpenAI might seek to subpoena pretty much everyone in Adam’s orbit as it seeks to prove that its chatbot wasn’t responsible for the suicide of a teen.

“This goes from a case about recklessness to willfulness,” Jay Edelson, a lawyer for the family, told the FT. “Adam died as a result of deliberate intentional conduct by OpenAI, which makes it into a fundamentally different case.”

OpenAI didn’t immediately respond to a request for comment; it’s unclear why its lawyers believe these materials are necessary as the company seeks to prove that it shouldn’t be held liable for the 16-year-old’s tragic death.

Critics, meanwhile, have taken to social media to express disgust and bewilderment.

“This is absolutely sickening,” said musician and ethical AI training advocate Ed Newton-Rex.

Seán Ó hÉigeartaigh, an AI risk researcher and professor at the University of Cambridge, added a succinct “What on earth.”

“But, let’s trust [OpenAI CEO Sam Altman] and OpenAI to lead us into one perfect AGI future,” wrote Grady Booch, who leads software engineering at IBM Research.

Adam died by suicide in April of this year. After discovering him dead in his room, the teen’s parents found that he’d carried out extensive discussions about his suicidality with the chatbot, which had provided the teen with information about suicide methods — including how to effectively die by hanging, Adam’s ultimate cause of death — and repeatedly discouraged him from sharing his deadly thoughts with his parents and other trusted loved ones. The Raine family sued OpenAI in late August, arguing that ChatGPT is a negligent product released recklessly to the public, and that Adam’s death was the “predictable result of deliberate design choices.”

Reporting about the bizarre OpenAI request for funeral materials comes amid the news that the Raine family amended their legal complaint to include striking new allegations that OpenAI repeatedly loosened ChatGPT guardrails around self-harm and suicide talk in the year leading up to Adam’s death.

“Our deepest sympathies are with the Raine family for their unthinkable loss,” OpenAI said in a statement regarding the amendment. “Teen well-being is a top priority for us — minors deserve strong protections, especially in sensitive moments. We have safeguards in place today, such as surfacing crisis hotlines, re-routing sensitive conversations to safer models, nudging for breaks during long sessions, and we’re continuing to strengthen them. We recently rolled out a new GPT-5 default model in ChatGPT to more accurately detect and respond to potential signs of mental and emotional distress, as well as parental controls, developed with expert input, so families can decide what works best in their homes.”

More on OpenAI: OpenAI Faces New Allegations in Teen’s Death

The post OpenAI Makes Bizarre Demand of Family Whose Son Was Allegedly Killed by ChatGPT appeared first on Futurism.
