Cia, Maya and Dahlia file lawsuit against OpenAI, makers of ChatGPT

A civil lawsuit was filed in Vancouver by Cia Edmunds and Dahlia Gebala, mother and sister of Maya Gebala, against OpenAI, the makers of ChatGPT.

The suit also names Maya as one of the plaintiffs.

According to the lawsuit, OpenAI has a charter that states: “We commit to use any influence we obtain over AGI’s deployment to ensure it is used for the benefit of all, and to avoid enabling uses of AI or AGI that harm humanity…” and “we are committed to doing the research required to make AGI safe.”

And yet, says the lawsuit, the defendants “knew or ought to have known that ChatGPT, particularly when powered by GPT-4o, was routinely utilized by users to provide mental health support and counselling; and further that users reported experiencing therapeutic benefits from using ChatGPT.”

Meanwhile, Jesse Van Rootselaar had at least two consecutive OpenAI accounts, and was under the age of 18. According to the company’s own documentation, children under the age of 18 need parental consent before using ChatGPT, while “the OpenAI Defendants took no steps to implement age verification or parental consent procedures.”

According to the filing, Van Rootselaar used ChatGPT for mental health support and counselling, “treating ChatGPT as a mental health counsellor, advisor and/or a pseudo-therapist.”

Beyond that, the lawsuit argues that Van Rootselaar “relied on and perceived ChatGPT to be a trusted confidante, collaborator, ally, and friend. At all material times, OpenAI knew that ChatGPT possessed extensive knowledge and capabilities—including the ability to provide detailed, actionable information on dangerous or harmful subjects like how to conduct a mass casualty event like the Tumbler Ridge Mass Shooting. OpenAI harvested such harmful information and data in an indiscriminate manner and then supplied such information and data to ChatGPT. OpenAI took no steps—adequate or at all—to avoid providing ChatGPT with such information and data, or impose any safeguards to prevent users from obtaining such information from ChatGPT.”

It goes on to argue that ChatGPT equipped Van Rootselaar with “information, guidance, and assistance to plan a mass casualty event like the Tumbler Ridge Mass Shooting, including informing the shooter about the various methods of carrying out a mass casualty event like the Tumbler Ridge Mass Shooting, the types of weapons to be used and describing precedents from other mass casualty events or historical acts of violence.”

And here lies the crux of the issue, according to the lawsuit: it was in late spring or early summer of last year that Van Rootselaar, still 17 years old, began to describe “various scenarios involving gun violence to ChatGPT over the course of several days.”

ChatGPT’s internal monitoring system flagged the posts as containing content “potentially in violation of ChatGPT’s user policies.”

These posts were routed to about a dozen human moderators, who identified the posts as “indicating an imminent risk of serious harm to others and recommended Canadian law enforcement be informed of the posts.”

These concerns were escalated to OpenAI leadership, with a request to inform Canadian law enforcement. “The OpenAI Defendants subsequently rebuffed their employees’ request to contact Canadian law enforcement about the … posts. Instead, the only step the OpenAI Defendants took in response … was to ban the shooter’s first OpenAI account.”

Van Rootselaar responded by opening a second OpenAI account, which OpenAI failed to detect and ban. “The shooter used their second OpenAI account to continue planning scenarios involving gun violence, including a mass casualty event like the Tumbler Ridge Mass Shooting, with ChatGPT, and to receive mental health counselling and pseudo-therapy from ChatGPT.”

OpenAI, says the filing, is therefore responsible for the way in which ChatGPT functioned as mental health counsellor and pseudo-therapist: “Accordingly, the OpenAI defendants, in knowingly and intentionally permitting ChatGPT to provide pseudo-psychological treatment to the shooter, owed a duty of care to report instances of clear and imminent risks of serious bodily harm or death posed to individuals identified with reasonable specificity in the gun violence ChatGPT posts or the shooter’s other ChatGPT chat logs. The OpenAI Defendants either knew or ought to have known that the shooter was utilizing ChatGPT to conduct long-range planning of a mass casualty event…and that the shooter posed a clear and imminent risk of serious bodily harm or death to individuals identified with reasonable specificity in the Gun Violence ChatGPT Posts or the shooter’s other ChatGPT chat logs.”

This conduct on OpenAI’s part led to the shooting of Maya Gebala, who sustained: “a catastrophic, traumatic brain injury, permanent cognitive and physical disability, right-sided hemiplegia, scarring and physical deformities, depression, anxiety, post-traumatic stress disorder, and such further injuries as will be proven at trial, all of which injuries have caused and continue to cause the plaintiff Maya Gebala pain, suffering, loss of enjoyment of life, permanent physical disability, loss of earnings, past and prospective, loss of income earning capacity, loss of opportunity to earn income and loss of housekeeping capacity, past and prospective.”

The lawsuit says Maya is a beneficiary under the Health Care Costs Recovery Act, and is obligated to claim the cost of her health care services, past and future, from OpenAI.

In addition, the lawsuit seeks damages on behalf of Maya’s sister Dahlia, who witnessed the shooting and then saw her sister “intubated and sedated in hospital with significant swelling and visible physical injuries,” causing “post-traumatic stress disorder, anxiety, depression, sleep disturbances, and such further injuries as will be proven at trial.”

Finally, the lawsuit seeks damages on behalf of mother Cia Edmunds, who also witnessed the tragedy, and has had to watch her daughter recover in the hospital.

The family is suing OpenAI, but no dollar amount is named in the lawsuit.

Trent Ernst
Trent is the publisher of Tumbler RidgeLines.
