Tumbler Ridge tragedy ‘wake up call’ for Canada to hold big tech accountable

Digital security experts and safety advocates are demanding more robust online safety laws to protect Canadians, especially young people.

The tragedy in Tumbler Ridge, B.C., has raised serious questions in Canada about how digital platforms respond to credible warning signs of violence, after it was revealed that the suspect's online activity was not reported to law enforcement in a timely manner.

In June 2025, Tumbler Ridge shooter Jesse Van Rootselaar's ChatGPT account was banned for describing violent scenarios, according to the Wall Street Journal. Although internal systems flagged the communications as "furtherance of violent activities," OpenAI did not notify the RCMP at the time, saying the content did not meet its threshold for police referral, the report said.

OpenAI reached out to the RCMP after the attack on Feb. 10 that killed eight people.

A report from 404 Media also revealed Van Rootselaar had previously created a mass shooting simulator game on Roblox, which the company said it has since removed, along with his account.

Tumbler Ridge tragedy a wake-up call

David Shipley, digital security expert and founder of Canadian cybersecurity company Beauceron, told Metroland Media the Tumbler Ridge tragedy should be a wake-up call for Canada to pass meaningful regulation — not just to keep AI companies in check, but to hold big tech as a whole accountable for the harms these platforms can cause.

In a social media post, Shipley urged Solomon to “Get mad. Get informed. Then, get busy.” 

Shipley listed a number of recommendations, including: 

  1. Legal accountability for big tech. Shipley said platforms should be held liable for the harms they cause and for their failure to report to law enforcement. This includes holding companies accountable for “billions of scam ads,” non-consensual intimate images of children and others, and harm caused by AI hallucinations — instances where AI generates false but convincing information.

  2. Optional algorithms and health warnings. Shipley believes recommendation algorithms should be made optional, and that clear warnings about platform addiction should be mandated.

  3. Age-based access restrictions. He also says access to social media should be banned for children under 16. “It’s not a perfect solution, but doing nothing is condemning another generation to a wave of addictions, mental health challenges and lost relationships,” he said.

Read the Full Story at Inside Halton
