Geist INDU Appearance by Michael Geist


The Law Bytes Podcast, Episode 182: Inside the Hearings on Privacy and AI Reform – My Industry Committee Appearance on Bill C-27

After months of delays, the House of Commons Standing Committee on Industry and Technology has finally begun hearings on Bill C-27, which wraps Canadian privacy reform and AI regulation into a single legislative package. Last week, I appeared before the committee, making the case that the process is in need of fixing and the bill in need of reform. The appearance sparked a wide range of questions from MPs of all parties. This week’s Law Bytes podcast takes you inside the committee hearing room for my opening statement and exchanges with MPs.

The podcast can be downloaded here, accessed on YouTube, and is embedded below. Subscribe to the podcast via Apple Podcast, Google Play, Spotify or the RSS feed. Updates on the podcast on Twitter at @Lawbytespod.


Standing Committee on Industry and Technology, October 26, 2023

Opening Statement:

Appearance before the House of Commons Standing Committee on Industry and Technology, October 26, 2023

Good afternoon. My name is Michael Geist.  I am a law professor at the University of Ottawa, where I hold the Canada Research Chair in Internet and E-commerce Law, and I am a member of the Centre for Law, Technology, and Society. I appear in a personal capacity representing only my own views.

I’d like to start by noting that the very first time I appeared before a House of Commons committee was in March 1999 on Bill C-54, which would later become PIPEDA. I didn’t really know what I was doing. My focus was on whether the bill would provide sufficient privacy protections for those just coming online who had little background or knowledge of privacy, security or even the Internet for that matter. I highlighted some of the shortcomings in the bill – poorly defined consent standards that would lead to reliance on implied consent, broad exceptions to the use or disclosure of personal information, and doubts about enforcement. I urged the committee to strengthen the bill but did not fully appreciate that the policy choices being made back then would last for decades.

I start with this brief trip down memory lane because I feel like we find ourselves in a similar position today, this time with policy choices on artificial intelligence and emerging technologies that will similarly last for far longer than we might care to admit.

It is for that reason that I think it is more important to emphasize the need to get it right rather than to get it fast. I often hear the ISED Minister talk about being first, and I don’t understand why that appears to be a key objective. Indeed, if you leave aside the fact that the core of this bill was introduced in 2020 and languished for years, we are suddenly in a race to conduct hearings that I don’t quite get. We’ve got an AI bill facing a major overhaul with no actual text yet available, and witnesses who seemingly have to pick between privacy and AI, creating the risk of limited analysis all around. We need to do better. I’ll focus these remarks on privacy, but to be clear, the AI bill and the proposed changes raise a host of concerns, including the need for independent enforcement and the high-impact definitions that puzzlingly include search and social media algorithms.

The other lesson from the past two decades is that you can seek to create a balanced statute but the playing field will never be balanced. It is always tilted in favour of business, many of which have the resources and expertise to challenge the law, challenge complaints, and challenge the Commissioner. Most Canadians don’t stand a chance. That’s why we must craft rules that seek to balance the playing field too – broad scope of coverage, better oversight and audit mechanisms, and tough penalties to ensure incentives align with better privacy protection.

How to do that?  Very quickly given limited time, five ideas:

First, we must end the practice of “do as I say, not as I do” when it comes to privacy. It is unacceptable in 2023 for political parties to exempt themselves from the standards they expect all businesses to follow. Indeed, you cannot argue that privacy is a fundamental right, but then claim it should not apply in its most robust manner to political parties.

Second, the addition of language around a fundamental right to privacy is welcome, but I think it should also be embedded elsewhere so that it factors more directly into the application of the law. For example, as former Commissioner Therrien noted, it could be included in Section 12(2) among the factors to consider in the appropriate purposes test.

Third, the past 20 years have definitely demonstrated that penalties matter for compliance purposes and are a critical part of balance. The bill features some odd exclusions: there are penalties for elements of the appropriate purposes provision in Section 12, but not for the main provision limiting collection, use and disclosure to appropriate purposes. In the crucial Section 15 provision on consent, there are no penalties around the timing of consent or for using implied consent within the legitimate interest exception. The bill says such a practice “is not appropriate” – whatever that means – but the penalty provision doesn’t apply regardless.

Fourth, the committee has already heard debate about the appropriate standard for anonymized data, and I get the pressure to align with other statutes. But I’d note that Section 6(6) specifically excludes anonymized data from the Act. Yet I think we want the Commissioner to play a data governance role here with potential audits or other review, particularly if a lower standard is adopted.

Fifth, provided we ensure that the privacy tribunal is regarded as an expert tribunal whose rulings will be granted deference by the courts, I’m okay with creating an additional layer of privacy governance. I appreciate the concerns that this may lengthen the timeline for resolution of cases, but the metric that counts is not how fast the Privacy Commissioner can address the issue, but how fast a complainant can get a binding final outcome. Given the risks of appeals and of courts treating cases on a de novo basis, the existing timelines can extend far beyond an initial Commissioner decision, and a tribunal might actually help.

Thank you for your attention. I look forward to your questions.



  3. I’d like to know whether the ‘areas of concern’ for AI systems that can affect people’s lives are mainly restricted to fields that don’t have a regulated professional body. Investment folks might not be happy, but getting liability issues squared away in the IT sector might get ahead of many of the concerns society has (for both AI and cyber).