Growing Criminal Liability in Mass Shootings: AI and Parental Roles

The tragic mass shooting in Tumbler Ridge, British Columbia, on February 10 has brought to light a complex web of issues surrounding criminal liability in the age of artificial intelligence. Jesse Van Rootselaar killed eight people, including his mother, his half-brother, five students, and a teacher at Tumbler Ridge Secondary School. Van Rootselaar had a history of mental health problems and had previously attended the school.

OpenAI's Role and Delayed Response

On the same day as the shooting, OpenAI officials were in Victoria meeting with B.C. government officials. It was later revealed that the company had banned Van Rootselaar from its ChatGPT platform the previous summer. According to the Wall Street Journal, OpenAI staff had debated reporting the user to law enforcement as early as June 2025, but senior management concluded that the threat was not 'credible and imminent.'

OpenAI CEO Sam Altman has since apologized, stating, 'I am deeply sorry that we did not alert law enforcement to the account that was banned in June.' The apology amounts to an acknowledgment that informing police was the bare minimum required.


Legal and Ethical Questions

The shooting raises several critical issues: how artificial intelligence systems and corporate officials make life-and-death judgments about what constitutes a credible risk; the legal and ethical responsibilities of companies like OpenAI; the adequacy of support for individuals with severe mental health issues, especially in rural areas; and the need for a regulatory framework that addresses both criminal culpability and civil liability.

Florida's Criminal Investigation

While B.C. Premier David Eby has met with OpenAI, there is no indication that the province has sued the company. In contrast, Florida Attorney General James Uthmeier has opened a criminal investigation to determine if ChatGPT provided 'significant advice' to Phoenix Ikner, a 20-year-old suspect in a mass shooting at Florida State University that left two dead and six injured.

Parental Responsibility in the Spotlight

The case of Colin Gray, whose 14-year-old son Colt Gray killed four people in 2024, illustrates the growing scrutiny of parental responsibility. Colin Gray was convicted on 27 charges, including second-degree murder and involuntary manslaughter, for purchasing an AR-15-style rifle as a Christmas gift and allowing his son access to it despite warnings. The case is widely seen as a test of how far parental responsibility for mass shootings extends.

These developments indicate that criminal liability for mass shootings is evolving, with courts looking beyond the shooter to the wider circle of actors involved, including AI companies and parents.
