Justice Minister Arif Virani yesterday finally bowed to public pressure by agreeing to split Bill C-63, the Online Harms bill. The move brings to an end the ill-conceived attempt to wedge together Internet platform responsibility with Criminal Code provisions and the potential weaponization of the Canadian Human Rights Act, which had rightly sparked concerns from a wide range of groups. I wrote about the need to drop those provisions two days after the bill was introduced last February. By the time fall rolled around, it was hard to find anyone who supported the bill in its current form.
Canadian Media Companies Target OpenAI in Copyright Lawsuit But Weak Claims Suggest Settlement the Real Goal
Canada’s largest media companies, including the Globe and Mail, Toronto Star, Postmedia, CBC, and Canadian Press, came together last week to file a copyright infringement lawsuit against OpenAI, the company behind ChatGPT. The lawsuit is the first high-profile Canadian claim lodged against the enormously popular AI service, though similar suits have been filed elsewhere, notably a New York Times lawsuit launched last year. While the lawsuit itself isn’t a huge surprise, the relatively weak, narrow scope of the claims discussed below is. Unlike comparable lawsuits, the Canadian media companies’ claim is largely limited to data scraping, which may be the weakest copyright claim. Moreover, the companies say they have no actual knowledge of when, where, or how their data was accessed, an acknowledgement that doesn’t inspire confidence when there is evidence available if you know where to look.
So why file this lawsuit? The statement of claim is sprinkled with the most obvious reason: the Canadian media companies want a settlement that involves OpenAI paying licence fees for the inclusion of their content in its large language models, and the lawsuit is designed to kickstart negotiations. The companies aren’t hiding the ball, as there are repeated references along the lines of “at all times, Open AI was and is well aware of its obligations to obtain a valid licence to use the Works. It has already entered into licensing agreements with several content creators, including other news media organizations.” The takeaway is that the Canadian media companies want to license their content too, much like the licensing agreements OpenAI has struck with global media companies such as News Corp, Financial Times, Hearst, Axel Springer, Le Monde, and the Associated Press.
When Antisemitism Isn’t Taboo: Reflecting on the Response to Nazi-Era Hate on the Streets of Montreal
Last week, as Concordia students staged a “strike” to protest the ongoing Israel-Hamas war in Gaza, video captured someone giving a Nazi salute to nearby Jewish students while repeatedly declaring that the “final solution is coming your way.” Antisemitism has become far too common, but this incident, with its unmistakable Holocaust echoes, still had the capacity to shock. Soon after, the culprit was identified as Mia Abdulhadi, the co-owner of two Second Cup cafe franchises improbably located in Montreal’s Jewish General Hospital.
The Concordia events later gave way to violent riots in Montreal, but this particular case has been hard to shake. Part of it stems from the way it affirms the campus antisemitism concerns that Jewish students and faculty have voiced for many months. Despite the denials, the reality is that the line between legitimate protest and the use of reprehensible antisemitic slurs was blurred long ago. University presidents have acknowledged as much, yet have largely failed to respond. The net effect – as evidenced last week – is that the Jewish community has faced intolerable discrimination on campus and is too often left to fear for its own safety.
Protecting Freedom of Expression: My Heritage Committee Appearance on the Chilling Effect of Antisemitism
The Standing Committee on Canadian Heritage is in the midst of a study on protecting freedom of expression that has opened the door to discussing a wide range of issues. I appeared as a witness before the committee yesterday and divided my opening remarks into two issues. First, I discussed the way digital policies (notably including Bills C-11, C-18, C-63, and S-210) all intersect with expression either directly or indirectly, arguing that we haven’t always taken the protection of expression sufficiently seriously in the digital policy debate. Second, I focused on the challenge of expression that chills others’ expression, using antisemitism as a deeply troubling example.
I will likely devote a future podcast to the full appearance and my exchanges with MPs, who wanted to learn more about both the speech implications of digital policy and some of the suggestions for addressing antisemitism. In the meantime, my opening comments are posted below in text, along with a video on the chilling effect of antisemitism. I discuss the myriad concerns and identify steps that could be taken to mitigate the harms, including clearly defined policies such as the IHRA definition of antisemitism, active enforcement of campus policies and codes, principled implementation of institutional neutrality, leadership in speaking out against conduct that creates fear and chills speech, as well as time and place restrictions and bubble zone legislation to strike a much-needed balance.
Canadian Government to Ban TikTok (the Company, not the App)
The Canadian government has just announced the outcome of its national security review of TikTok, arriving at a curious conclusion: it plans to ban the company from operating in Canada, but the app will remain available here. I wrote earlier this year about the need for better laws to counter the risks associated with TikTok, rather than banning the app altogether. That post came in response to U.S. legislation that proposed to ban the app, legislation that is now in doubt given the results of yesterday’s U.S. Presidential election. There may well be good reasons to ban the app if it poses security and privacy risks that differ from those of other platforms, but banning the company rather than the app may actually make matters worse: the risks associated with the app will remain, while the ability to hold the company accountable will be weakened.