President Donald J. Trump at the G20 Summit by the White House Public Domain

The BTLR and USMCA, Part One: Why the Broadcast Panel Recommendations Conflict With Canada’s Emerging Trade Obligations

Since the release of the Broadcast and Telecommunications Legislative Review Panel report late last month, I’ve posted on several key issues including an overview of concerns, news regulation, Canadian Heritage Minister Guilbeault’s comments, net neutrality, discoverability claims, consumer costs, and a podcast debate with panel chair Janet Yale. The blog now shifts for the next two days to trade-related concerns arising from the report’s recommendations. This issue is particularly timely since the House of Commons has been debating Bill C-4, the implementation bill for the US-Canada-Mexico (USMCA) trade agreement, and the government has made treaty implementation one of its top legislative priorities.

The digital trade chapter includes at least two provisions that are implicated by the BTLR report. First, the USMCA includes a legal safe harbour for Internet intermediaries and platforms for content posted by their users. Article 19.17 incorporates a U.S. Communications Decency Act Section 230(c)-style provision into the trade agreement:

no Party shall adopt or maintain measures that treat a supplier or user of an interactive computer service as an information content provider in determining liability for harms related to information stored, processed, transmitted, distributed, or made available by the service, except to the extent the supplier or user has, in whole or in part, created, or developed the information.

It adds:

No Party shall impose liability on a supplier or user of an interactive computer service on account of:

(a) any action voluntarily taken in good faith by the supplier or user to restrict access to or availability of material that is accessible or available through its supply or use of the interactive computer services and that the supplier or user considers to be harmful or objectionable; or
(b) any action taken to enable or make available the technical means that enable an information content provider or other persons to restrict access to material that it considers to be harmful or objectionable

The provision is controversial – there was pressure to remove it late last year – but it remains in place. The rule is designed to provide Internet platforms with immunity from liability both for the removal of content and for the failure to remove content. Contrary to some claims, the rule does not mean that “everything goes”. Sites and services remain subject to court orders and the enforcement of criminal law (intellectual property rights enforcement is also exempted). However, some argue that the responsibility of Internet intermediaries should go further, with potential liability for failure to act even in cases of harmful, albeit legal, content. That position raises important freedom of expression concerns and questions about how to balance free speech safeguards and protection from harm.

Notwithstanding the USMCA commitment, the Broadcast panel recommends legislative action for harmful content:

Recommendation 94: We recommend that the federal government introduce legislation with respect to liability of digital providers for harmful content and conduct using digital technologies, separate and apart from any responsibilities that may be imposed by communications legislation. Given that the challenges in this area are global in nature, we also encourage the federal government to continue to participate actively in international fora and activities to develop international cooperative regulatory practices on harmful content.

While Janet Yale told me on the Lawbytes podcast that “all we’ve done is recommend that government consult internationally on the best approach”, the reality is that the recommendation goes much further and will be very difficult to square with Canada’s trade commitments under the USMCA.

The same may be true for the panel’s recommendation on mandatory disclosure of algorithms, which states:

We recommend that the Broadcasting Act be amended to ensure that the CRTC can — by regulation, condition of licence, or condition of registration — impose reporting requirements, including with respect to financial information, consumption data, and technological processes such as algorithms, on all media content undertakings

Further, recommendation 63 states:

To ensure that Canadians are able to make informed choices and that Canadian content has sufficient visibility and is easy to find on the services that Canadians use, we recommend that the CRTC impose discoverability obligations on all audio or audiovisual entertainment media content undertakings, as it deems appropriate, including:

  • catalogue or exhibition requirements;
  • prominence obligations;
  • the obligation to offer Canadian media content choices; and
  • transparency requirements, notably that companies be transparent with the CRTC regarding how their algorithms operate, including audit requirements.

While the panel wants to impose a general requirement to report on algorithms to the CRTC on all media content undertakings – this would include social media companies and news services – the USMCA establishes limitations on mandated algorithmic disclosures. Article 19.16 states:

No Party shall require the transfer of, or access to, a source code of software owned by a person of another Party, or to an algorithm expressed in that source code, as a condition for the import, distribution, sale or use of that software, or of products containing that software, in its territory.

There is a notable exception to this general prohibition, however. It states:

This Article does not preclude a regulatory body or judicial authority of a Party from requiring a person of another Party to preserve and make available the source code of software, or an algorithm expressed in that source code, to the regulatory body for a specific investigation, inspection, examination, enforcement action, or judicial proceeding, subject to safeguards against unauthorized disclosure.

The issue that will arise in the trade context is whether the general algorithmic disclosure requirements recommended by the panel are consistent with the exception for a “specific investigation, inspection, examination, enforcement action or judicial proceeding.” Given its broad application to any media content undertaking and its general regulatory approach, there is reason to believe the panel’s recommendation goes well beyond the more narrowly tailored exception and is contrary to Canada’s USMCA obligations.


  1. A few questions.

    Let’s say Google states it is an American company and that, while it has millions of Canadian customers, it does not operate in Canada and is therefore not subject to CRTC regulation. The CRTC then decides to fine Google for non-compliance with its regulations.

    1. How would the CRTC be able to collect the fines? Would it need a Canadian court order that it would then have to try to enforce in the US?
    2. What happens if a US court sides with Google? Does the CRTC have any other enforcement options – like ordering Canadian ISPs to block Google?
    3. Could Google sue the CRTC/Canadian government for violating its rights under the USMCA? If so, for how much, and would Google be able to sue in the US?