Can Albertans Trust AI That Breaks Laws? Expert Weighs In

It was a classic case of Jurassic Park science, according to an Edmonton AI expert. The makers of ChatGPT were rushing to build a revolutionary technology that would affect everything from education to analytics to medicine. But in the hurry to get a functioning product out the door, and to see whether these systems could work at all, the creators did not stop to consider whether their process was breaking privacy laws.

That is the situation Canada's privacy commissioner, Philippe Dufresne, described on Wednesday, saying that leaders at OpenAI told regulators, "we felt we had to move." Sam Jenkins, an Edmonton expert in artificial intelligence, agreed: the AI industry was asking whether it could, not whether it should. Jenkins is a managing partner at Punchcard Systems in Edmonton, a company that helps other organizations implement digital systems such as artificial intelligence.

The Privacy Violations

"The fact is, many AI companies, including OpenAI, emerged from this research and technology culture that prioritized capability and scale before anything else," Jenkins said. The result was a dozen instances of OpenAI breaking Canadian privacy law, as a federal-provincial investigation revealed this week. The central privacy issue was the gathering of people's personal information from the internet, which could include everything from personal posts on social media to diary entries published on a blog.


OpenAI's Response and Alberta's Stance

OpenAI has since amended some of its practices and has been working with regulators. The company is now considered compliant with federal privacy laws. Alberta, however, has its own separate privacy legislation. Scraping the web for people's personal information "could never" comply with the province's rules, according to Alberta information and privacy commissioner Diane McLeod. When asked, McLeod said that if OpenAI were still applying this practice to the information of Albertans, it would be breaking the law.

So if the companies building AI continue to neglect Canadian law, how can people be expected to trust the technology? "This is about whether or not we as citizens can trust the systems that are in place to protect us when we're dealing with these multinational organizations," Jenkins added.

Understanding the Legal Framework

For many, it may come as news that a private company cannot legally scrape social media for data and use that information for whatever ends it wants. Regardless of whether it is posted to a public forum, a person's personal information is still protected under Alberta's Personal Information Protection Act (PIPA). PIPA requires that any private-sector body collecting a person's information have a legitimate reason and valid consent, and that the information be collected directly from that person. This differs from the federal rules, which allow a broader form of implied consent.

Jenkins emphasized that the core issue is trust. "When companies prioritize speed over compliance, they erode the public's confidence not only in their products but in the entire field of AI. For Albertans, the message is clear: personal data is protected by law, and any violation undermines the promise of responsible innovation."
