Parents sue OpenAI over alleged ChatGPT link to Canada school shooting


The parents of a girl who was critically injured in a school shooting in Canada filed a civil lawsuit on Monday alleging that OpenAI had prior knowledge that the attacker was planning a mass casualty assault.

The lawsuit claims the company was aware of warning signs before the tragedy unfolded.

OpenAI has said it considered notifying law enforcement about the suspect’s activity but ultimately did not alert police.

Months later, on February 10, the individual carried out one of Canada’s deadliest school shootings in Tumbler Ridge, British Columbia.

OpenAI contacted authorities only after the shooter, Jesse Van Roostselaar, killed eight people and then herself last month.

The company said the attacker’s ChatGPT account had previously been shut down, but that she bypassed the restriction by operating a second account.

The legal action, submitted to the British Columbia Supreme Court, alleged that OpenAI had “specific knowledge of the shooter utilising ChatGPT to plan a mass casualty event like the Tumbler Ridge mass shooting”.

The lawsuit said the attacker treated OpenAI’s chatbot, ChatGPT, as a trusted confidante, collaborator and ally, and claimed the system “behaves willingly to assist users such as the shooter to plan a mass casualty event.”

A spokesperson for OpenAI did not immediately respond to requests for comment regarding the legal claim.

The filing further stated that because of the company’s actions, a student identified as Maya Gebala was shot three times at close range. One bullet struck her head, another hit her neck, and a third grazed her cheek. According to the lawsuit, she suffered a catastrophic brain injury that will leave her with permanent cognitive and physical disabilities.
