The Canadian government has just announced the conclusion of its national security review of TikTok, arriving at a curious outcome: it plans to ban the company from operating in Canada, but the app will remain available here. I wrote earlier this year about the need for better laws to counter the risks associated with TikTok, rather than banning the app altogether. That post came in response to U.S. legislation that proposed to ban the app, but which is now in doubt given the results of yesterday’s U.S. Presidential election. There may well be good reasons to ban the app if it poses security and privacy risks that differ from those of other platforms, but banning the company rather than the app may actually make matters worse: the risks associated with the app will remain, while the ability to hold the company accountable will be weakened.
Latest Posts
Why the Conspiratorial Responses to Canada’s Antisemitism Guide Demonstrate Its Necessity
Delegates from dozens of countries gathered nearly 25 years ago in Stockholm, Sweden, for the Stockholm International Forum, where they affirmed a global commitment to combatting racism, antisemitism, ethnic hatred, and ignorance of history. That meeting sparked what became a 16-year open process to develop much-needed anti-racism tools, including the creation of the International Holocaust Remembrance Alliance (IHRA) working definition of antisemitism.
Antisemitism is generally understood as a certain perception of Jews that veers into hatred, but specific examples can be helpful for those seeking to apply policies in the workplace, codes in educational environments, or standards for government funding programs. The IHRA definition, which is not legally binding, seeks to fill the void by including both general principles and specific examples. It has struck a chord with endorsements from 45 countries and hundreds of provincial and local governments. And Canada has been a leader in this regard: the federal government adopted it in 2019 as part of its anti-racism strategy and the majority of provinces have followed suit with their own support measures.
Combatting antisemitism should not be controversial, yet a new Canadian effort to provide governments, businesses, and schools with greater clarity on implementing the IHRA definition has sparked opposition from the NDP and even outrage in some quarters.
The Law Bytes Podcast, Episode 218: Emily Laidlaw and Taylor Owen on Saving the Online Harms Act
The Online Harms Act, or Bill C-63, was introduced last February after years of false starts, public consultations, and debates. Months later, the bill appears to be stalled in the House of Commons and has yet to make it to committee for further study. Some view that as a win, given their criticism of the bill, though others who have waited years for action against online harms are beginning to fear that the Parliamentary clock is working against them.
Emily Laidlaw, the Canada Research Chair in Cybersecurity at the University of Calgary, and Taylor Owen, the Beaverbrook Chair in Media, Ethics and Communications at the Max Bell School of Public Policy at McGill University, have both been actively engaged in this issue for years, including their participation on the government’s expert advisory group. They join the Law Bytes podcast to discuss where things stand on Bill C-63 and the steps they recommend to get the bill back on track for study and debate.
CRTC Approves Google’s $100 Million Online News Act Exemption Deal
The government’s deeply flawed attempt to force tech platforms to pay Canadian news outlets for linking to news is nearing its payout. The CRTC this week formally exempted Google from negotiating individual agreements and facing a potential mandated arbitration system in return for a lump sum $100 million annual payment. The $100 million deal was the government’s last-ditch attempt to salvage the Online News Act after its insistence that tech platforms would never walk away from news proved to be disastrously wrong. Within weeks of the former Bill C-18 receiving royal assent in June 2023, Meta blocked news links on its Facebook and Instagram platforms. The block has remained in place for more than a year, causing significant harm to news outlets and sparking a CRTC investigation into whether user attempts to evade the block bring the company within the scope of the law.