Blog

Bill C-63 screenshot, https://www.parl.ca/DocumentViewer/en/44-1/bill/C-63/first-reading

Debating the Online Harms Act: Insights from Two Recent Panels on Bill C-63

The Online Harms Act has sparked widespread debate over the past six weeks. I’ve covered the bill in a trio of Law Bytes podcast episodes (Online Harms, Canada Human Rights Act, Criminal Code) and participated in several panels focused on the issue. Those panels are posted below. First, a panel titled The Online Harms Act: What’s Fact and What’s Fiction, sponsored by CIJA, featured Emily Laidlaw, Richard Marceau, and me. It paid particular attention to the intersection between the bill and online hate.


April 18, 2024 1 comment News
Data Center by Bob Mical https://flic.kr/p/i3NECz CC BY-NC 2.0

AI Spending is Not an AI Strategy: Why the Government’s Artificial Intelligence Plan Avoids the Hard Governance Questions

The government announced plans over the weekend to spend billions of dollars to support artificial intelligence. Billed as “securing Canada’s AI Advantage”, the plan includes promises to spend $2 billion on an AI Compute Access Fund and a Canadian AI Sovereign Compute Strategy focused on developing domestic computing infrastructure. In addition, there is $200 million for AI startups, $100 million for AI adoption, $50 million for skills training (particularly in the creative sector), $50 million for an AI Safety Institute, and $5.1 million to support the Office of the AI and Data Commissioner, which would be created by Bill C-27. While the plan received unsurprising applause from AI institutes that have been lobbying for the money, I have my doubts. There is unquestionably a need to address AI policy, but this approach appears to paper over hard questions about AI governance and regulation. The money may be useful – though given the massive private sector investment in the space right now, a better case for public money is needed – but tossing millions at each issue is not the equivalent of grappling with AI safety, copyright, or regulatory challenges.


April 9, 2024 18 comments News
IMG_7927 by Steve Eason https://flic.kr/p/2pi3HEy CC BY-NC 2.0

Tweets Are Not Enough: Why Combatting Relentless Antisemitism in Canada Requires Real Leadership and Action

The Jewish holiday of Purim over the weekend sparked the usual array of political tweets featuring some odd interpretations of the meaning of the holiday and expressing varying degrees of support for the Jewish community. But coming off one of the worst weeks in memory – cancelled Jewish events due to security concerns, antisemitism in the mainstream media, deeply troubling comments on the floor of the House of Commons, and the marginalization of some Jewish MPs in government – generic statements of support no longer cut it. The Globe and Mail has noted the “dangerous slide into antisemitism” and called for a House motion unequivocally condemning antisemitism. This post provides further context to that piece, arguing that such a motion is necessary but insufficient, since it is leadership and real action from our politicians, university presidents, and community groups that is desperately needed.


March 26, 2024 15 comments News
fedi-tiktok by David Lohner CC0 1.0 https://flic.kr/p/2pCxJA9

Better Laws, Not Bans: Why a TikTok Ban is a Bad Idea

New legislation making its way through the U.S. Congress has placed a TikTok ban back on the public agenda. The app is already prohibited on government devices in Canada, the government has quietly conducted a national security review, and there are new calls to ban it altogether from the Canadian market. While it might be tempting for some politicians to jump on the bandwagon, a ban would be a mistake. There are legitimate concerns with social media companies, but there simply hasn’t been convincing evidence that TikTok currently poses a national security threat or that it poses a greater risk than any other social media service. The furor really seems to be a case of economic nationalism – a desire to deny a popular Chinese service access to the U.S. market – rather than a genuine case that TikTok poses a unique privacy and security threat. Taken at face value, however, the case against TikTok comes down to a simple concern: its owner, ByteDance, is a Chinese company that could theoretically be required to disclose user information to the Chinese government or compelled to act on its behalf. The proposed U.S. law would therefore require that TikTok be sold within six months or face a ban.

While the concerns associated with TikTok – given its Chinese ownership and popularity with younger demographics – are well known, the privacy and security case against it is very weak.


March 15, 2024 15 comments News
Arif Virani, MP for Parkdale-High Park by Nicole Contois https://flic.kr/p/ThAyBg CC0 1.0

Government Gaslighting Again?: Unpacking the Uncomfortable Reality of the Online Harms Act

The Online Harms Act was only introduced two weeks ago, but already it appears that the government is ready to run back the same playbook of gaslighting and denials that plagued Bills C-11 and C-18. Those bills, which addressed Internet streaming and news, faced widespread criticism over potential regulation of user content and the prospect of blocked news links on major Internet platforms. Rather than engage in a policy process that took the criticism seriously, the government ignored digital creators (including disrespecting indigenous creators) and dismissed the risks of Bill C-18 as a bluff. The results of that strategy are well known: Bill C-11 required a policy direction fix and is mired in a years-long regulatory process at the CRTC, and news links have been blocked for months on Meta as the list of Canadian media bankruptcies and closures mounts.

Bill C-63, the Online Harms Act, offered the chance for a fresh start given that the government seemed to accept the sharp criticism of its first proposal, engaging in a more open consultative process in response. As I noted when the bill was first tabled, the core of the legislation addressing the responsibility of Internet platforms was indeed much improved. Yet it was immediately obvious there were red flags, particularly with respect to the Digital Safety Commission charged with enforcing the law and with the inclusion of Criminal Code and Human Rights Act provisions with overbroad penalties and the potential to weaponize speech complaints. The hope – based on the more collaborative approach used to develop the law – was that there would be a “genuine welcoming of constructive criticism rather than the discouraging, hostile processes of recent years.” Two weeks in, that hope is rapidly disappearing.


March 13, 2024 22 comments News