On Wednesday, April 29, Lawyer Jay Edelson filed seven lawsuits against OpenAI in the Northern District of California.
The cases were filed on behalf of the families of Zoey Benoit, Abel Mwansa, Ticaria Lampert, Ezekiel Schofield, Kylie Smith, Shannda Aviugana-Durand and Maya Gebala.
The last case supersedes one filed back in March in Canadian court. The remaining six cases are newly filed, with the US-based Edelson law firm working alongside existing Canadian counsel on behalf of the Tumbler Ridge community.
According to the new filings, OpenAI “instructed the shooter on how to return to ChatGPT even after their account was flagged and terminated for mass shooting discussions. When OpenAI shuts down an account for dangerous behavior, it doesn’t actually ban the user — it tells them how to get back on,” says a release from Jay Edelson’s office. “OpenAI sends users an email instructing them to wait 30 days and open a new account with the same email address. In a public document, OpenAI also tells users that if they don’t want to wait, they can open a new account immediately using a different email address, including a sub-email of their previous account.”
This means, says the release, that OpenAI’s claim that the Tumbler Ridge shooter “bypassed” its security is simply not true. “The shooter did exactly what OpenAI instructed users to do when an account is shut down.”
Instead, the lawyer claims that OpenAI chose not to report, because “reporting one case would mean reporting thousands.
“The volume of dangerous incidents on the platform is so large that consistent reporting would reveal just how dangerous ChatGPT has become.”
They reference the fact that the Florida Attorney General recently announced a criminal investigation into OpenAI’s conduct in connection with another mass shooting, “underscoring a growing consensus that the company’s behavior may cross the line from civil to criminal liability.”
Rice Parsons Leoni and Elliott LLP, the firm that filed suit on behalf of Maya, mom Cia and sister Dehlia in March, says that litigating these cases in Canada poses challenges. “Damages for pain and suffering are capped at approximately $470,000 CAD. The largest punitive damages award ever made in Canadian history is $1.5 million CAD. With respect to the murdered children, their estates are not permitted to bring claims in British Columbia for damages against OpenAI, and in most cases the loved ones of wrongfully killed children are unable to recover any recompense under British Columbia’s Family Compensation Act.”
Indeed, the firm originally said it hoped to find people in Tumbler Ridge willing to sue OpenAI in BC court in order to force changes to the provincial law.
“As a general rule, the law does not permit cases on behalf of loved ones of those harmed or killed to proceed,” lawyers John Rice and Mallory Hogan wrote in an open letter to Tumbler Ridge on March 10. “Cases for loved ones of people who have suffered harms are especially challenging, and there is a significant body of case law that says wrongdoers don’t owe duties of care to the loved ones of the people they harmed, no matter how awful the harm they have caused. For example, ‘public policy’ arguments have made it hard for loved ones of those negligently killed by medical malpractice to advance their own legal claims, because we have a publicly funded health care system, and expanding lawsuits against doctors, nurses and hospitals on behalf of loved ones would be an extraordinary burden on taxpayers.
“However, we see a compelling legal argument here to say that an exception should be made for the loved ones of a mass casualty event because thankfully, these tragedies occur with much less frequency in Canada.
“On that basis, and given the egregious facts of this case, we could possibly make new law or develop factual arguments on existing law.”
The victims’ families seek justice, says Rice, and they want OpenAI’s conduct assessed in the jurisdiction the company calls home: the Northern District of California. What do the victims of the Tumbler Ridge mass shooting want? That no AI-predicted and AI-facilitated mass shooting ever happens again. Full stop.
“These cases are about holding OpenAI accountable for its wanton disregard for public safety and cavalier attitude towards the lives of our children and loved ones.”
Last year, the largest amount awarded in a wrongful death suit in the United States was $640 million. Jay Edelson has said that he expects damages from this case to be in the $1 billion range.
“ChatGPT played a role in the mass shooting and OpenAI could have, and should have, prevented it,” says one of the court filings. “Children watched classmates shot at point-blank range and a teacher killed in front of them. They hid in bathroom stalls and closets, praying the Shooter would not hear them. Some pulled the injured and the dead into their hiding places, careful not to leave trails of blood that could lead the Shooter to them. Parents were asked to identify their children by their clothing, because the gunshots had left little else to recognize. The survivors—students, teachers, and parents alike—are living with physical and psychological injuries that will never fully heal.”
There are more cases to come, says Edelson. “Over the next several weeks, a cross-border team comprised of Edelson PC and Vancouver-based Rice Parsons Leoni and Elliot LLP will be filing over two dozen cases on behalf of the victims of the Tumbler Ridge mass shooting. The lawsuits will be filed in waves.”
Trent is the publisher of Tumbler RidgeLines.

