The House of Commons Standing Committee on Access to Information, Privacy and Ethics spent much of February conducting a study on the collection and use of mobility data by the Government of Canada. The study stems from reports that the Public Health Agency of Canada worked with Telus and BlueDot, an AI firm, to identify COVID-19 trends based on mobility data. I appeared before the committee earlier this week, making the case that this is a genuine privacy quandary where the activities were arguably legal, the notice met the low legal standard, Telus is widely viewed as seeking to go beyond the strict statutory requirements, and the project itself had the potential for public health benefits. Yet despite these factors, something does not sit right with many Canadians. I believe that something is outdated privacy laws that are no longer fit for purpose. My opening statement is posted below.
Appearance before the House of Commons Standing Committee on Access to Information, Privacy and Ethics, February 28, 2022
Good morning. My name is Michael Geist. I’m a law professor at the University of Ottawa where I hold the Canada Research Chair in Internet and E-commerce Law and I’m a member of the Centre for Law, Technology and Society. I appear in a personal capacity representing only my own views.
I’d like to thank the committee for the invitation to appear on this issue, which represents an exceptionally thorny privacy challenge. I recognize that some of your witnesses have brought differing perspectives on the legality and ethics of this collection and use of mobile data. From my perspective, I’d like to start by noting three things:
First, ensuring that the data was aggregated and de-identified was a textbook approach to how many organizations have addressed their privacy obligations, namely by de-identifying data and placing it outside the scope of the personally identifiable information that falls within the law.
Second, the potential use of the data in the midst of a global pandemic may be beneficial.
Third, it does not appear that there was a violation of the law: the data itself was aggregated and de-identified, and while the public notice may not have been seen by many, that too is not uncommon.
This creates a genuine privacy quandary: the activities were arguably legal, the notice met the low legal standard, Telus is widely viewed as seeking to go beyond the strict statutory requirements, and the project itself had the potential for public health benefits.
There could have been some improvements – the Privacy Commissioner of Canada should have been more actively engaged in the process and the public notification should have been more prominent – but I’m not entirely convinced that either step would have changed very much. The OPC would surely have pushed for the more prominent notification and some assurances on the de-identification of the data, but it seems likely that the project would have continued. Similarly, better notices would have benefited the few Canadians that paid attention, but it is a fiction to suggest that millions are actively monitoring privacy policies or similar web pages for possible amendments.
Yet despite all of these factors, something does not sit right with many Canadians. I believe that the foundational problem that this incident highlights is that our laws are no longer fit for purpose and are in dire need of reform. It is not that I think we need laws that would ban or prohibit this activity – again, most recognize the potential benefits. Rather, we need laws that provide greater assurances that our information is protected and will not be misused, that policies are transparent, and that consent is informed. That does not come from baking in broad exceptions under the law that permit the activity because the law does not apply. Instead, it means updating our laws so that they contemplate these kinds of activities and provide a legal and regulatory roadmap for how to implement them in a privacy-protective manner.
The need for reform applies to both the Privacy Act and PIPEDA.
With respect to the Privacy Act, there have been multiple studies and successive federal privacy commissioners who have sounded the alarm on the legislation that is viewed as outdated and inadequate. Canadians rightly expect that the privacy rules that govern the collection, use, and disclosure of their personal information by the federal government will meet the highest standards. For decades, we have failed to meet that standard.
The failure to engage in meaningful Privacy Act reform may be attributable in part to the lack of public awareness of the law and its importance. The Privacy Commissioner has played an important role in educating the public about PIPEDA and broader privacy concerns. The Privacy Act desperately needs to include a similar mandate for public education and research.
With respect to PIPEDA, I would need far more than five minutes to identify all the potential reforms. Simply put, the issue has inexplicably been placed on the backburner. Despite claims that it was a priority, the former Bill C-11 was introduced in November 2020 and there was seemingly no effort to even bring it to committee. The bill attracted some criticism, but this is not rocket science.
If Canada is looking for a modernized privacy law and it wishes to meet international standards, the starting point is the European Union’s GDPR. Notwithstanding some recent scare tactics from groups such as the Canadian Marketing Association, the reality is that the GDPR is a widely recognized standard, global multinationals are familiar with the obligations, there are innovative rules that seek to address the emerging digital challenges, and there are tough enforcement powers and penalties. There is room to tweak the rules for Canada, but we should not let perfect be the enemy of the good.
Modernized privacy rules are not some theoretical exercise. As this recent event demonstrates, failing to implement those rules leaves Canada in a difficult position with potential conflicting rules at the provincial level, compliance strategies that may undermine public trust, and policy implementation choices that fail to maximize the benefits that can come from better data.
I look forward to your questions.