Critics of Senator Julie Miville-Dechêne’s successive bills that ostensibly target pornography sites have for years warned of the privacy and equity risks that arise from mandated age verification and the dangers of overbroad legislation that would extend far beyond pornography sites by covering social media, search, and AI services. The Senate committee reviewing the latest iteration of those bills – Bill S-209 – met yesterday to conduct a clause-by-clause review of the bill. That the bill passed through committee pending some supplementary remarks was not a surprise. However, that the privacy and equity concerns barely merited a mention and that regulating social media sites was viewed as a feature, not a bug, was a wake-up call.
The hearing made it apparent that Bill S-209 is not a bill primarily focused on pornography sites. If it were, it could be drafted with those sites directly in mind. Rather, it is a Trojan horse online harms bill with two regulatory tools at its disposal: (i) mandated age verification or age estimation technologies that a government could apply to a wide range of social media, search, and AI services, and (ii) court-ordered blocking of those services for failure to comply. At its best, online harms legislation seeks to balance freedom of expression with tools to address online harms by requiring platforms to act responsibly under threat of penalty. The Bill S-209 approach is online harms at its worst. It simply seeks to stop the availability of common Internet services to anyone under 18 (a far higher threshold than any social media regulation in the world), make it harder for adults to access those services, and ensure that the government has the power to seek blocking orders against services that fail to age-gate their users.
The clause-by-clause review featured half the committee sitting in silence with no comments to offer or amendments to propose. The lone voice to raise concerns was Senator Paula Simons, who rightly noted that social media regulation was a far cry from trying to ensure that only adults access pornography sites.
The changes made to the bill were largely cosmetic, leaving the core powers and scope unchanged. The privacy concerns with age verification – mandating that millions of Canadians send government-issued IDs outside the country to a third-party provider with limited application of Canadian privacy law – were ignored altogether. The risks that age estimation technology would unduly target visible minorities did not garner a single mention. The only assurance about overbroad application to non-pornography sites was to punt to the government the decision of how and when the law would be applied. That is cold comfort, as it is difficult to trust an opaque system that will apparently be used to determine which sites must ID their users or face penalties or site-blocking orders.
Bill S-209 will be voted out of committee later this month before it faces a full Senate vote. The outcome of that vote is not in doubt. The only real question is what happens to the bill once it heads to the House of Commons, as the government decides whether it wants to use a Senate private member’s bill to supplement or replace its approach to addressing online harms.
This is a sobering analysis and a much-needed reality check.
You have fundamental rights except when your guaranteed freedoms infringe upon the guarantor’s exercise of power.
The moment that this bill passes is the moment I cut my throat. No to digital ID – my line in the sand will be marked in blood.
“Protecting children” has become the politically safest wrapper for laws that:
– mandate age verification or identity checks
– weaken anonymity online
– expand platform monitoring and data retention
– normalize site blocking or algorithmic compliance
– shift enforcement power to regulators with limited oversight
Child protection is being used as a moral shield.
And crucially: many of these mechanisms work far better for surveillance than for child protection.
Reading this analysis of Bill S-209 makes one thing really clear: when legislation expands beyond its stated purpose, everyday users end up carrying the burden. The idea of mandatory age verification, data transfers outside Canada, and potential court-ordered blocking of digital platforms feels less like online safety and more like a step toward restricted access to basic communication tools.
What struck me most is how the bill quietly shifts from targeting pornography websites to putting social media platforms, search engines, and even AI-driven services under the same regulatory umbrella. For anyone who relies on open web ecosystems—whether it’s for community updates, independent journalism, or streaming services—this kind of broad enforcement could reshape how we engage with the digital world.
Instead of fostering a healthier online environment, measures like these risk creating more fragmentation, potential censorship, and a chilling effect on digital expression. It’s unfortunate that voices raising concerns, like Senator Paula Simons, were so isolated during the hearing. With so many implications for privacy, equity, and access, Canadians deserve far more transparent debate before tools like age-gating, ID verification systems, and platform blocking orders become the norm.