OpenAI sued in US by families of Canada school shooting victims

A series of suits has been filed against chatbot makers since 2024, most of them targeting OpenAI and ChatGPT. — AFP

OpenAI is the target of new lawsuits over the mass shooting in Tumbler Ridge, British Columbia, alleging that the artificial intelligence company could have stopped the suspected killer from using its popular chatbot, ChatGPT, ahead of the attack.

One of the cases, filed Wednesday in federal court in San Francisco against OpenAI and its chief executive officer, Sam Altman, was brought by a 12-year-old girl who was shot in the attack and remains in intensive care, together with her mother. Another lawsuit was brought by the mother of a girl killed in the shooting.

According to the lawsuits, OpenAI knew from her ChatGPT use that Jesse Van Rootselaar, identified as the chief suspect behind the massacre in February at Tumbler Ridge Secondary School, was planning the attack, but made a "conscious decision not to warn authorities."

"ChatGPT played a role in the mass shooting and OpenAI could have, and should have, prevented it,” according to the complaints, which allege the startup wanted to avoid having to contact police each time OpenAI’s safety team spotted a ChatGPT user planning to carry out a violent act.

"The events in Tumbler Ridge are a tragedy. We have a zero-tolerance policy for using our tools to assist in committing violence," OpenAI said in a statement. "As we shared with Canadian officials, we have already strengthened our safeguards, including improving how ChatGPT responds to signs of distress, connecting people with local support and mental health resources, strengthening how we assess and escalate potential threats of violence, and improving detection of repeat policy violators." 

A series of suits has been filed against chatbot makers since 2024, most of them targeting OpenAI and ChatGPT. Most of the suits allege that extensive use of the technology has inflicted a range of harms on children and adults alike, fostering delusions and despair in some users and leading others to death by suicide and even murder-suicide.

On February 10, Van Rootselaar allegedly carried out the mass shooting in northeastern British Columbia, killing eight people: her mother and stepbrother, and six others at the school, five of whom were children. More than two dozen others were injured. Van Rootselaar, 18, was found dead after the shooting from what appeared to be a self-inflicted wound.

In the wake of the shooting, OpenAI said it had banned Van Rootselaar last June for violating its ChatGPT usage policy. Her account was flagged at the time for messages deemed to have potential for violence, but OpenAI did not alert police. The Wall Street Journal first reported on OpenAI's decision, saying concerned employees had urged the startup to report the situation to authorities.

Later in February, OpenAI revealed that the suspected killer created a second ChatGPT account it did not spot until her name was released by police; OpenAI told Canadian lawmakers that, under newly updated company rules, it would have referred Van Rootselaar to police.

Last week, Altman wrote in a letter published by Tumbler RidgeLines, a local news site, that he wanted to express his "deepest condolences to the entire community.”

"I am deeply sorry that we did not alert law enforcement to the account that was banned in June,” Altman wrote. "While I know words can never be enough, I believe an apology is necessary to recognise the harm and irreversible loss your community has suffered.”

The lawsuits come at a sensitive time for OpenAI, which is eyeing a much-anticipated public offering that’s poised to be one of the largest in history as the company approaches a trillion-dollar valuation. 

OpenAI is also trying to fend off claims by Elon Musk that it abandoned its founding mission as a nonprofit when it restructured last year as a for-profit entity. At a trial in California that started this week, Musk may ask a judge to order the conversion to be unwound. – Bloomberg

Those suffering from problems can reach out to the Mental Health Psychosocial Support Service at 03-2935 9935 or 014-322 3392; Talian Kasih at 15999 or 019-261 5999 on WhatsApp; Jakim’s (Department of Islamic Development Malaysia) family, social and community care centre at 0111-959 8214 on WhatsApp; and Befrienders Kuala Lumpur at 03-7627 2929 or go to befrienders.org.my/centre-in-malaysia for a full list of numbers nationwide and operating hours, or email sam@befrienders.org.my.

