Michael Burns, a contributor to the National Post, recently returned from Australia with a critical observation for Canadian policymakers. While the nation mourned the Bondi Beach tragedy from late 2023, it simultaneously enacted a significant policy shift for youth protection: the enforcement of a minimum age of 16 for opening or maintaining social media accounts on designated platforms.
Shifting the Burden from Parents to Platforms
Burns emphasizes that Australia's move is less of an outright "ban" and more of a strategic "delay." The core innovation, he argues, is where the policy places responsibility. Rather than penalizing parents or children, Australia has legislated that the burden of age verification and enforcement falls squarely on the social media companies themselves. These are the entities that designed addictive products engineered to maximize engagement, often telling families to simply "use the settings" when problems arise.
This stands in stark contrast to the Canadian approach, which Burns characterizes as a decade-long "unregulated experiment on children." He cites social psychologist Jonathan Haidt's work in "The Anxious Generation," which links the widespread adoption of smartphones and algorithm-driven feeds around 2013 to a sharp rise in teen anxiety, depression, and self-harm. The Canadian default, Burns contends, has been to outsource the problem to families through parental controls and screen-time advice—a response he labels "abdication," not policy.
A Three-Step Framework for Canadian Action
Burns proposes that Canada should adopt a similar minimum-age framework, but insists it must carry over the policy's most vital principle: platforms must be held accountable for outcomes, rather than parents being made responsible for constant policing. He outlines three practical steps for effective legislation:
First, implement real accountability with enforceable penalties. If tech companies can target advertising to teens with precision, they can deploy serious measures—like friction in sign-up flows, behavioral detection of underage users, and targeted age verification—to prevent underage accounts. Clear compliance targets must be set and enforced.
Second, mandate forced transparency. Social media platforms should be required to publicly report data on how many underage accounts they detect and remove, the methods they use, and the effectiveness of those methods. Safety claims should be backed by verifiable data.
Third, treat regulation as a live system. Acknowledging that workarounds will emerge, policies must be designed to measure impact, allow for rapid adjustments, and keep the public informed of progress and challenges.
The Need to Go Beyond Social Media
However, Burns argues Canada should not stop at regulating social media accounts alone. The primary gateway, he notes, is the smartphone itself—the device in a child's pocket all day, every day. Even with age-restricted accounts, browser-based feeds, anonymous "burner" accounts, and chaotic group chats persist.
He calls for clear, firmly enforced policies to create phone-free zones in schools—encompassing classrooms, hallways, and lunchrooms. The rationale is foundational: attention is a critical learning tool, not an infinitely renewable resource. By addressing both the digital platforms and the physical presence of devices in learning environments, Canada could, in Burns's view, build a more comprehensive shield for young people's mental well-being and educational focus.
The Australian policy, enacted in the shadow of national grief, presents a tangible model. For Michael Burns, it's a clear signal that Canada has an opportunity—and an obligation—to go further in protecting its youngest citizens from the documented harms of the digital world.