OpenAI says ChatGPT not to blame in teen’s death by suicide



OpenAI defended itself against a lawsuit accusing ChatGPT of coaching a 16-year-old to kill himself, saying the chatbot directed the teenager to seek help more than 100 times.

In a court filing Tuesday, the artificial intelligence startup called the death of California high school student Adam Raine "a tragedy" and said a "full reading of his chat history shows that his death, while devastating, was not caused by ChatGPT."

Raine told the chatbot that "for several years before he ever used ChatGPT, he exhibited multiple significant risk factors for self-harm, including, among others, recurring suicidal thoughts and ideations," according to the filing in San Francisco Superior Court.

Lawyers for OpenAI said that ChatGPT directed Raine to connect with "crisis resources and trusted individuals more than 100 times." In the weeks and days before his death, Raine "told ChatGPT that he repeatedly reached out to people, including trusted persons in his life, with cries for help, which he said were ignored," according to the filing.

In August, Raine’s family sued OpenAI and its chief executive officer, Sam Altman, over his death, alleging that ChatGPT guided the teenager through the process of tying a noose and offered to help him write a suicide note. In the wake of the suit, the company announced a slew of changes to ChatGPT, including controls that let parents limit the ways teenagers use the chatbot and receive alerts if it determines a teenager may be in distress.

In a statement, Raine family attorney Jay Edelson called OpenAI's filing "disturbing," saying the company "tries to find fault in everyone else, including, amazingly, by arguing that Adam himself violated its terms and conditions by engaging with ChatGPT in the very way it was programmed to act."

The lawsuit includes claims for wrongful death, product liability and negligence.

The case is Raine v. OpenAI Inc., CGC25628528, California Superior Court, San Francisco County. – Bloomberg

Those suffering from problems can reach out to the Mental Health Psychosocial Support Service at 03-2935 9935 or 014-322 3392; Talian Kasih at 15999 or 019-261 5999 on WhatsApp; Jakim’s (Department of Islamic Development Malaysia) family, social and community care centre at 0111-959 8214 on WhatsApp; and Befrienders Kuala Lumpur at 03-7627 2929 or go to befrienders.org.my/centre-in-malaysia for a full list of numbers nationwide and operating hours, or email sam@befrienders.org.my.
