Canada's Deepfake Crackdown: Advocates Demand Faster Removal Powers
As the Canadian federal government advances legislation to criminalize the distribution of sexualized deepfakes, advocates are pushing for stronger measures that would mandate the rapid removal of such content from online platforms. The proposed changes come amid growing concerns about the proliferation of AI-generated intimate images used to exploit individuals.
Legislative Gaps in Victim Protection
Michelle Abel, vice-president at the National Council of Women of Canada and founder of a non-profit focused on combating the exploitation of women and children, criticized the current approach. "They're missing the point when it comes to victimization and removal," Abel stated regarding the government's proposed Bill C-16.
The legislation, currently under study by the parliamentary justice committee, aims to update laws against non-consensual intimate image sharing to include "visual representations" created through artificial intelligence. This language specifically targets the surge of sexual deepfakes that police, victim groups, and educational institutions have warned about for years.
U.S. Model Offers Faster Response Framework
Abel pointed to the United States' Take It Down Act, signed into law in May 2025 by President Donald Trump and championed by First Lady Melania Trump. This bipartisan legislation requires social media platforms to remove reported non-consensual sexual images, including AI-generated deepfakes, within 48 hours. "We need something more immediate like our U.S. counterpart," Abel emphasized.
Several Canadian provinces have already amended their intimate-image laws to cover AI-altered images, allowing victims to pursue civil lawsuits against perpetrators. British Columbia goes further, offering services through which victims can pursue removal options via its Civil Resolution Tribunal.
The Critical Timing Challenge
Suzie Dunn, an associate professor at Dalhousie University specializing in deepfakes and technology-facilitated gender-based violence, highlighted the urgency of rapid removal. "Timing is critical in getting images removed as swiftly as possible," Dunn explained. "Once they've proliferated, been downloaded, and shared, it becomes very difficult to contain an image."
Dunn noted that current provincial processes can take weeks or even months before removal orders are issued, during which time harmful content continues to spread. She cited a recent British Columbia case in which X challenged a tribunal's removal order concerning an altered intimate image, even though the platform had already blocked the image within Canada.
Call for Platform Accountability Measures
Dunn argued that the federal government should prioritize reintroducing online harms legislation that specifically holds platforms accountable for addressing reported deepfake content. "Some websites are more responsive to removal requests," she acknowledged, while noting the additional complication of platforms operating outside Canadian jurisdiction.
The debate centers on whether criminalization alone sufficiently protects victims, or whether additional regulatory mechanisms are needed to ensure harmful content can be effectively and quickly removed from digital spaces. As Justice Minister Sean Fraser advances the government's approach, advocates continue to press for more comprehensive solutions that address both legal consequences and practical removal processes.