The House of Commons debate over the Online Streaming Act (Bill C-11) is likely to continue this week, with the government anxious to get the bill out of the House and into committee for further study and approval. The recent discussion in the House featured Liberal MP Mark Gerretsen insisting that the bill does not cover user generated content:
I can assure this member and all Conservatives that nobody is more interested in preserving the content they create in this House than I am: the content that they give me to put out on social media. If I thought for one second that user-generated content would be impacted by this bill, I certainly would not be in favour of it.
I would like to point out to the member that there are several sections in this piece of legislation that explicitly preserve user-generated content: sections 2.1, 2.2, 2.3, 3(a), 4.1, 4.2 and 4.3(3).
I am curious. This is a simple question. Has the member read the bill, and he has read those sections in particular?
I’m not a Member of Parliament but I have read the bill and those specific sections. The indisputable reality is that the net result of those provisions is that user generated content is covered by the bill. Indeed, the government is gaslighting the public with claims that it does not.
Let’s start with the Section 2 provisions. Canadians will recall that Bill C-10 originally excluded regulating individual users as broadcasters (section 2.1). That provision remains in place:
(2.1) A person who uses a social media service to upload programs for transmission over the Internet and reception by other users of the service – and who is not the provider of the service or the provider’s affiliate, or the agent or mandatary of either of them – does not, by the fact of that use, carry on a broadcasting undertaking for the purposes of this Act.
This section is the source of the government claims that users are not directly regulated by the Act. Section 2.2 provides an additional exclusion for social media services and programming control and Section 2.3 excludes certain Internet transmissions, such as schools, libraries, and museums. These provisions address who is regulated (or more accurately who is not regulated), but do not address what content is regulated.
Gerretsen also points to Section 3(a), which requires that the Act be construed in a manner “consistent with freedom of expression and journalistic, creative and programming independence.” The provision is fine, but doesn’t move the needle given that failure to construe the law in a manner consistent with freedom of expression would make it vulnerable to a constitutional challenge.
Notwithstanding those provisions, the real concern involves regulating user content. Section 4.1(1) addresses the regulation of programs on social media services with the following exclusion:
This Act does not apply in respect of a program that is uploaded to an online undertaking that provides a social media service by a user of the service for transmission over the Internet and reception by other users of the service.
Section 4.1(1) was the provision the government removed from then-Bill C-10 last year, a change that opened the door to regulating user generated content. As discussed below, those regulations included discoverability requirements that would allow the CRTC to require platforms to prioritize certain content (and effectively de-prioritize other content). Bill C-11 restores the Section 4.1(1) exception, so user content is once again excluded from treatment as programs subject to potential regulation.
If the government had stopped there, it could plausibly claim to have excluded user generated content. But instead it added Section 4.1(2), which creates an exception to the exception. That exception to the exception – in effect a rule that does allow for regulation of content uploaded to a social media service – says that the Act applies to programs as prescribed by regulations that may be created by the CRTC. The bill continues with Section 4.2, which gives the CRTC the instructions for creating those regulations. I’ve described the result as a legislative pretzel, where the government twists itself around trying to regulate certain content. In particular, it says the CRTC can create regulations that treat content uploaded to social media services as programs by considering three factors:
- whether the program that is uploaded to a social media service directly or indirectly generates revenue
- if the program has been broadcast by a broadcast undertaking that is either licensed or registered with the CRTC
- if the program has been assigned a unique identifier under an international standards system
The law does not tell the CRTC how to weigh these factors. Moreover, these instructions are narrowed by further exclusions for content in which neither the user nor the copyright owner receives revenue (Section 4(3)(a)) as well as for content consisting only of visual images (Section 4(3)(b)).
All of this may seem complicated, but the bottom line is the CRTC is empowered to create regulations applicable to user content uploaded to social media services as programs with three criteria to consider (Rodriguez described it as a “sandbox”). Non-commercial user generated content is out, but user generated content that generates even indirect revenue is subject to potential inclusion within the regulations.
To what might this apply?
TikTok videos are uploaded to the service and may generate indirect revenue, the content is available on licensed or registered services, and the music likely has a unique identifier. The same is true for many YouTube or Instagram videos. Twitch streams may potentially meet the requirements. Podcasts could also be caught by the rules: they can generate revenue, are often available on registered platforms, and may feature an identifier.
These are the provisions that Gerretsen cites, and they leave little doubt that user generated content is still potentially subject to CRTC regulation. In fact, the government effectively admits that the bill as currently drafted covers more than it wants regulated. Video games provide a good case in point. The government insists that video games are out of the bill, though they are clearly part of it as currently drafted. Instead, the government admits a policy direction will be needed to exclude them. The same is true for digital first creators, who create user content that easily meets the CRTC’s regulatory criteria. If the government wants to exclude that form of content, another exclusion in a policy direction will be needed. The government has not released that policy direction and says it will only do so after the bill has received royal assent.
Since digital first creators and other user generated content are still in the bill, what are the regulatory consequences? The CRTC is empowered to impose “discoverability” rules on the Internet platforms with respect to this content. As I’ve noted, it is not at all clear how Canadian user generated content will be identified in order to be prioritized. The platforms do not collect the relevant information and there are no obvious standards to apply. The likely result is that legacy content that neatly ticks the right boxes will receive prioritization. The impact will be incredibly damaging to digital first creators, who may find their content effectively de-prioritized in their own country based on Canadian legislation as implemented by the CRTC.
In fact, the situation is even worse, since it likely leads to de-prioritization of the content worldwide. Simply put, the algorithmic choices on services such as YouTube and TikTok are based on numerous data points. For example, for each page featuring a dozen possible videos to click on, YouTube will record not only which video is viewed, but which are not. The more a video is displayed but not watched, the stronger the signal that the content is not interesting to YouTube users. With Bill C-11 discoverability requirements, there will be an increase in the number of Canadian videos displayed to Canadian users. Given that these recommendations are based on regulations rather than user interest, they are very likely to achieve much lower click-through rates than most other content. Those videos will continue to be displayed in Canada given the CRTC regulations, but outside Canada they will get less exposure since the algorithm will discern that the content is not of interest to most users.
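The feedback loop described above can be sketched in a toy simulation. To be clear, this is an illustrative assumption, not YouTube’s or TikTok’s actual ranking system: it simply assumes a ranker that scores videos by observed click-through rate (CTR), and shows how regulator-mandated impressions that users do not click would drag down a video’s global score.

```python
# Toy illustration of the CTR feedback loop described above.
# The scoring rule (rank by observed click-through rate) and all numbers
# are hypothetical assumptions, not any platform's real algorithm.
from dataclasses import dataclass

@dataclass
class Video:
    name: str
    true_interest: float  # probability that a shown user clicks
    impressions: int = 0
    clicks: int = 0

    @property
    def ctr(self) -> float:
        # Observed click-through rate: the signal our toy ranker uses.
        return self.clicks / self.impressions if self.impressions else 0.0

def show(video: Video, times: int) -> None:
    """Record `times` impressions; clicks track the video's true interest."""
    video.impressions += times
    video.clicks += round(times * video.true_interest)

# Two equally interesting videos.
organic = Video("organic pick", true_interest=0.10)
mandated = Video("quota pick", true_interest=0.10)

# Both are shown organically to interested audiences...
show(organic, 1000)
show(mandated, 1000)

# ...but the mandated video is also force-displayed to users who
# were never going to click, adding impressions with zero clicks.
mandated.impressions += 1000

# The forced, unclicked impressions halve the mandated video's observed
# CTR, so a CTR-based ranker would now rank it below the organic video
# everywhere, not just where the quota applies.
print(organic.ctr)   # 0.1
print(mandated.ctr)  # 0.05
```

Under these (admittedly simplified) assumptions, content with identical audience appeal ends up with a worse global ranking signal purely because of the mandated domestic impressions, which is the worldwide de-prioritization risk described above.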
The discoverability provision might even be used in a broader manner than just prioritizing or de-prioritizing content. The specific provision gives the CRTC the power to establish conditions that include:
the presentation of programs and programming services for selection by the public, including the showcasing and the discoverability of Canadian programs and programming services, such as French language original programs
A careful reading of this provision suggests that the discoverability rules are only illustrative of the conditions that the Commission could establish regarding “the presentation of programs and programming services for selection by the public”. Could this mean the CRTC creates regulations targeting videos it believes contain misinformation? Or videos that it thinks are particularly valuable and should be made more prominent? The rules don’t say, but the door is open to a more activist Commission.
And the concerns with the bill are by no means limited to user generated content regulation. The overbroad application to streaming services worldwide, the poorly defined Cancon rules that do not require support for Canadian stories, and the enormous power vested in the CRTC, which by Canadian Heritage Minister Pablo Rodriguez’s own admission does not presently have the expertise to meet its mandate, are among the additional problems with a bill in serious need of reform and a more forthright explanation from the government.