
Ottawa Meeting Focuses on Reporting Protocols and Public Safety
Canadian officials have expressed disappointment after senior representatives of OpenAI did not present new safety measures during a meeting in Ottawa, held in the aftermath of a mass killing in British Columbia involving a former ChatGPT user. The meeting was convened by Evan Solomon, the federal minister responsible for artificial intelligence, following confirmation that OpenAI had banned an account linked to the suspect but did not alert law enforcement at the time.
Account Banned But Not Reported to Police
Eighteen-year-old Jesse Van Rootselaar is suspected of killing eight people on February 10 in the town of Tumbler Ridge, British Columbia, before taking her own life. Police said Van Rootselaar, who was born male but identified as a woman and began transitioning six years ago, had a history of mental health problems. OpenAI stated that it had banned Van Rootselaar's ChatGPT account in 2025 after internal systems flagged misuse of its models in furtherance of violent activities. The company said it considered notifying authorities but determined that the activity did not meet its threshold for reporting, which requires an imminent and credible risk of serious physical harm.
The company added that the earlier suspension, which occurred last year for policy violations, also did not meet internal criteria for escalation to law enforcement.
Ministers Demand Stronger Escalation Framework
Following the Ottawa meeting, Solomon said the government had made clear that Canadians expect credible warning signs of serious violence to be escalated in a timely and responsible manner. He said that relying solely on an internal review is not sufficient when public safety is at stake. The minister also noted that OpenAI had not presented substantial new safety measures during the discussions.
According to Solomon, OpenAI indicated it would return with more concrete proposals tailored specifically to the Canadian context. Ministers from the departments of public safety, culture, and justice were also present at the meeting.
Solomon confirmed that the company is cooperating with Canadian police, though no details of the ongoing investigation were discussed.
OpenAI Commits to Additional Safeguards
In a statement, OpenAI said it has strengthened safeguards in recent months and revised its protocol for referring cases involving violent activity to law enforcement authorities. The company acknowledged that ministers emphasized the expectation for continued and concrete action. It committed to providing an update in the coming days outlining additional steps it plans to take, while continuing to support law enforcement and work with the Canadian government on AI safety measures.
Legislative Context and Renewed Push on Online Harm
The development comes as Canada revisits efforts to regulate harmful online content. In 2024, the Liberal government introduced draft legislation aimed at combating online hate. That initiative stalled after criticism that its scope was overly broad. Ministers have said they will attempt to reintroduce a revised bill this year.
Investigation Continues in Tumbler Ridge
The killings occurred in Tumbler Ridge, a small town of around 2,400 residents in British Columbia. Authorities continue to investigate the circumstances surrounding the February 10 incident. The case has intensified scrutiny on how artificial intelligence companies monitor and escalate potential warning signs, and on how governments expect technology firms to respond when violent misuse is detected.