Canada Must Ban Social Media for Under-16 to Protect Children, Not Tech Firms

An 11-year-old boy is threatened with the distribution of nude images unless he pays an international extortionist who found him on TikTok. A 12-year-old girl is relentlessly pressured by someone she believed was a friend to expose herself on camera. A 14-year-old boy is unravelling — failing classes, withdrawing from life — because his friend is being exploited on Roblox and he feels powerless to help.

These are not outliers. They are three of the more than 28,000 reports Cybertip.ca processed in 2025 alone.

Canada’s children are not stumbling into harm by accident. They are being systematically exposed to it — on platforms engineered to capture their attention, monetize their vulnerability, and retain their engagement at all costs. The scale and severity of harm now demand more than incremental reform. They demand intervention.


The Neurological Reality of Adolescence

For over 25 years, the Canadian Centre for Child Protection has documented a steep and accelerating rise in online harms against children. This trajectory is not coincidental. It reflects a digital environment that is fundamentally misaligned with the developmental realities of childhood.

Adolescence is not merely a social transition — it is a neuro-developmental one. The prefrontal cortex, responsible for judgment, impulse control, and risk assessment, remains under construction well into a person’s 20s. At the same time, the brain’s reward system is highly active and exquisitely sensitive to external stimuli. Social media platforms are designed to exploit this imbalance, delivering intermittent reinforcement — likes, comments, shares — that conditions compulsive engagement and externalizes self-worth.

This is not benign. It is behavioural conditioning.

The Exploitation of Vulnerability

At precisely the stage when young people are forming identity, learning empathy, and developing resilience, they are instead immersed in environments that flatten social cues, distort feedback, and incentivize comparison. Algorithmic amplification ensures that the most emotionally provocative content rises to the top, often trapping youth in cycles of validation-seeking, anxiety, and self-doubt. The parallels to other addictive systems — gambling, nicotine — are not incidental. They are instructive.

Layered onto this is a more acute and devastating risk: sexual exploitation. Predators use these platforms with precision, posing as peers, exploiting developmental vulnerabilities, and leveraging the very features designed to maximize engagement. Younger adolescents — still learning boundaries, trust, and self-protection — are uniquely susceptible.

We would not design a physical environment that exposed children to these risks without safeguards. Yet online, we have done precisely that.

A Call for Decisive Action

Canada now faces a choice: continue to defer to the convenience and profit structures of large technology companies, or act decisively to protect children. The evidence is clear that self-regulation has failed. Incremental measures have not stemmed the tide of harm. What is needed is a comprehensive ban on social media access for those under 16, backed by robust enforcement and age-verification mechanisms.

Such a policy would not be an infringement on rights but a recognition of developmental reality. Just as we restrict access to alcohol, tobacco, and gambling, we must restrict access to platforms that are inherently harmful to developing minds. The technology exists to implement these safeguards; what has been lacking is the political will.

It is time to prioritize the well-being of our children over the bottom lines of tech giants. The cost of inaction is measured not in dollars but in shattered lives.
