The Senate Standing Committee on Transport and Communications resumes its hearings into Bill C-11 this week with plans for four sessions that will hear from a wide range of witnesses. Given the shortcomings of the House committee hearings – numerous important stakeholders were not given the opportunity to appear – the Senate review this fall provides a critical opportunity to re-examine the bill and to address some of its obvious flaws. With that in mind, this post is the first of a series that highlights some of Bill C-11’s major risks and concerns.
The series unsurprisingly starts with the issue that has been top of mind from the very start: the regulation of user content. Canadian Heritage Minister Pablo Rodriguez adopted the mantra that “platforms are in, users are out” of the bill and sought to assure concerned Canadians that the government “listened, especially to the concerns around social media, and we’ve fixed it.” Yet the overwhelming evidence is that the issue is not fixed and that user content is covered by the bill. Indeed, look no further than Ian Scott, the chair of the CRTC, who told the House committee:
[Section] 4.2 allows the CRTC to prescribe by regulation user uploaded content subject to very explicit criteria. That is also in the Act.
How is user content captured by Bill C-11?
The bill contains an exception for user content – the much-discussed Section 4.1(1), which purports to exclude programs (broadly defined as any audiovisual content) from the ambit of regulation – but Section 4.1(2) immediately creates an exception to the exception:
Despite subsection (1), this Act applies in respect of a program that is uploaded as described in that subsection if the program
(a) is uploaded to the social media service by the provider of the service or the provider’s affiliate, or by the agent or mandatary of either of them; or
(b) is prescribed by regulations made under section 4.2.
The regulatory power in Section 4.2 referenced by Scott states that the CRTC should consider three factors in applying Broadcasting Act regulations:
- whether the program that is uploaded to a social media service directly or indirectly generates revenue
- if the program has been broadcast by a broadcast undertaking that is either licensed or registered with the CRTC
- if the program has been assigned a unique identifier under an international standards system
As many have noted, these criteria are very expansive. For example, TikTok has concluded that the bill covers all videos on its platform that include music. In fact, the CRTC can theoretically consider whatever it wants in establishing regulations, since it only needs to “consider” the three factors in the bill. That’s all. The bill doesn’t say the CRTC can’t consider other factors or simply ignore those factors after having considered them. Much like the lip service the Commission has given at times to policy directions, the CRTC is free under the bill to confirm that it “considered” the factors in setting the regulations and then adopt a different approach.
What could it mean to have the CRTC exercise its regulatory power over user content? The bill states the Commission may impose conditions respecting:
the presentation of programs and programming services for selection by the public, including the showcasing and the discoverability of Canadian programs and programming services, such as original French language programs;
This provision is the source of debate on discoverability and the potential harms to online creators that will be the subject of an upcoming post. But note that the condition is not limited to discoverability, which is used only as an example of the power. The actual power is to set conditions on “the presentation of programs and programming services for selection by the public.” Applied to user content, those conditions could include mandating outcomes that demote content or apply warning labels to content the CRTC considers contrary to Broadcasting Act objectives, which are so broad as to cover a wide range of lawful content. While Bill C-11 supporters have insisted that the CRTC does not engage in that form of content regulation, in its recent decision involving CBC/Radio-Canada it engaged in content regulation without regard for the Charter of Rights and Freedoms or freedom of expression.
To be clear, the risk with these rules is not that the government will restrict the ability of Canadians to speak, but rather that the bill could affect their ability to be heard. In other words, the CRTC will not be positioned to stop Canadians from posting content, but it will have the power to establish regulations that could prioritize or de-prioritize certain content, mandate warning labels, or establish other conditions on the presentation of the content (including algorithmic outcomes). The government has insisted that this isn’t the goal of the bill. If so, the solution is obvious. No other country in the world seeks to regulate user content in this way, and it should be removed from the bill because it does not belong in the Broadcasting Act. In the alternative, the government should remove all regulatory powers associated with user content, but leave in the potential for contributions by the user content platforms. That would bring companies such as YouTube within the scope of the bill for contributions purposes, but remove the government and CRTC power to exert regulatory control over user content.