The Canadian government has responded to three digital policy reports from the Standing Committee on Canadian Heritage, shedding new light on potential future policies and priorities. The three reports – on tech giants, local media, and the harms caused by illegal sexually explicit material posted online – recommended a wide range of measures, including new laws, regulations, and government programs. In its responses, signed by Heritage Minister Marc Miller, the government sidesteps some of the recommended legislative reforms, suggesting limited interest in committing to broad-based platform liability rules.
The response to a study on “tech giants’ intimidation and subversion tactics to evade regulation in Canada and globally” is a case in point (I appeared before the committee on the study and yes, that was its real title). The committee’s lead recommendation was:
That digital content platforms put mechanisms in place to detect undesirable or questionable content that may be the product of disinformation or foreign interference, and that these platforms be required to promptly identify such content and report it to users; failure to do so should result in penalties.
The government chose only to “acknowledge” the recommendation, a signal that it is not accepted. The government similarly declines a recommendation to change the Income Tax Act so that advertising on foreign digital platforms would no longer be fully tax deductible. The recommendation stated:
That the Government of Canada make changes to the Income Tax Act, specifically to rules that allow advertising purchased by businesses on foreign websites to be counted as a fully deductible expense, while restrictions remain for deducting the cost of advertising with Canadian media.
That recommendation is also merely “acknowledged.” The government does, however, “agree in principle” with three other recommendations: developing a disinformation awareness campaign, requiring platforms to work with academic researchers, and ensuring that content moderation practices “do not result in adverse differential treatment of any individual or group of individuals based on one or more prohibited ground of discrimination.” Agreement in principle stops short of full acceptance or support of a recommendation, and the likelihood of significant action may be limited: the best the government can do on discriminatory content moderation is point to its work on AI guidelines and a renewed AI strategy.
The response also stops short of committing to a re-introduction of Bill C-63, the online harms bill. Instead, the government states:
While the proposed Bill C-63 (Online Harms Act) died on the Order Paper, the Government’s commitment to addressing online harms remains, and consideration is being given to how the objectives in this bill could be accomplished.
The response also points to two other bills: the controversial Bill S-209 and Bill C-216, a private member’s bill introduced by Conservative MP Michelle Rempel Garner. The government has not previously supported Bill S-209, and it would be surprising if its online harms approach were based on an opposition private member’s bill.
The study on holding a national forum on the media was conducted in the aftermath of the Online News Act mess. The committee’s lead recommendation was merely that the government support a national forum initiated by the private sector. The government’s response agrees in principle, emphasizing that “a national forum must be led independently of government.” The remaining recommendations focused on government support for journalism and media outlets. The government supports these in principle, devoting pages to highlighting the myriad funding programs it has launched over the years. The Online News Act notably rates only a couple of paragraphs, pointing to the Google deal and omitting any reference to the news links blocked on Meta (which reports today indicate are the subject of discussion).
The response to the committee’s study on harms caused by illegal sexually explicit material features the government’s most explicit support for legislative action. For example, the committee recommended:
That digital platforms implement processes for detecting and reporting illegal, sexually explicit content, such as child sexual abuse material and the non-consensual distribution of intimate images (including deepfakes), and that such content be removed immediately once it has been identified, under threat of penalty.
The response supports the recommendation with no qualifiers:
The Government supports recommendation 4 that digital platforms implement processes to detect, report, and promptly remove illegal sexually explicit content, including child sexual abuse material and non-consensual intimate images, under penalty. The safety of Canadians online is of utmost importance to the Government.
The government supports in principle a recommendation to study liability arising from the role of private messaging services, acknowledging some of the challenges:
Consultations have made it clear that many Canadians, including human rights organizations and privacy advocates, believe that the Government has no place in regulating private conversations online. There are also potential privacy implications that would need to be further examined, as well as technological limitations relating to encryption. Further analysis is required to determine how legislative measures with respect to private messaging could be implemented in a sustainable way including how privacy challenges could be dealt with.
The committee also recommended “that section 162.1(2) of the Criminal Code, which defines ‘intimate image,’ be amended to include the concept of sexually explicit deepfakes.” The government supports the recommendation without qualification, pointing to Bill C-16.
What to make of the various responses? The committee proposed several measures designed to increase platform liability under threat of penalty. The government is supportive of new rules covering liability for failing to act in the context of illegal sexually explicit material, but broader-based rules may not be on the immediate horizon.