Much of the discussion around the new lawful access bill (Bill C-22) has focused on provisions that improved upon Bill C-2, notably the decision to scrap the warrantless information demand power by requiring judicial oversight for access to subscriber information. Yet despite that improvement, there remain serious privacy concerns with the government’s latest iteration of lawful access. Buried in the second half of Bill C-22 is a provision granting the government the power to require “core providers” to retain categories of metadata, including transmission data, for up to one year. This is mandatory metadata retention that would require telecom and electronic service providers to store information about the communications of all their users, regardless of whether those users are suspected of anything. It is one of the most privacy-invasive tools a government can deploy, and the international experience suggests that there are major privacy risks.
Government Enacts Political Party Anti-Privacy Rules With Bill C-4 Royal Assent Sprint
I’ve written extensively about Bill C-4 and the government’s effort to bury political party privacy rules that largely eliminate privacy obligations for federal political parties and apply the new rules retroactively to May 2000. This past week’s Law Bytes podcast featured Senate hearings on the bill, which ultimately resulted in an amendment to require the government to establish actual privacy obligations within three years. The government yesterday rejected the amendment and the bill received royal assent in a lightning-fast process.
A Tale of Two Bills: Lawful Access Returns With Changes to Warrantless Access But Dangerous Backdoor Surveillance Risks Remain
The decades-long battle over lawful access entered a new phase yesterday with the introduction of Bill C-22, the Lawful Access Act. This bill follows the attempt last spring to bury lawful access provisions in Bill C-2, a border measures bill that was the new government’s first piece of substantive legislation. The lawful access elements of the bill faced an immediate backlash given the inclusion of unprecedented rules permitting widespread warrantless access to personal information. Those rules were on very shaky constitutional ground and the government ultimately decided to hit the reset button on lawful access by proceeding with the border measures in a different bill.
Lawful access never dies, however. Bill C-22 covers the two main aspects of lawful access: law enforcement access to personal information held by communication service providers such as ISPs and wireless providers, and the development of surveillance and monitoring capabilities within Canadian networks. In fact, the bill is separated into two parts, with the first half dealing with “timely access to data and information” and the second half establishing the Supporting Authorized Access to Information Act (SAAIA).
Why the Online Harms Act is the Wrong Way to Regulate AI Chatbots
In the wake of reports that AI Minister Evan Solomon may press AI companies such as OpenAI to more aggressively report potential safety risks identified in private chats to law enforcement, attention has quickly turned to the Online Harms Act as a potential regulatory solution. The Online Harms Act, Bill C-63, died on the order paper last year, but is expected to return in some form in the coming months. Given that the Act is tailor-made to address online harms, it isn’t surprising that some would suggest it could be expanded to cover AI chatbots.
Yet the law was deliberately designed to avoid doing what politicians want the AI companies to do, as it expressly exempted private communications and proactive monitoring from its scope. Indeed, applying the Online Harms Act to AI chatbots would not simply extend existing online safety rules to a new technology. It would require dismantling core privacy safeguards, which were added after the government’s earlier online harms proposal faced widespread criticism for encouraging platform monitoring and rapid reporting to law enforcement. In effect, proposals to use online harms legislation to regulate AI chatbots risk reviving many of the same surveillance concerns that forced the government back to the drawing board just a few years ago.
Nobody Wants This: Senate Rejects Government’s Anti-Privacy Plan for Political Parties By Sending Bill Back to the House With a Sunset Clause
Faced with a bill that would leave political parties subject to weaker privacy rules than virtually any other major organization in Canada, the Senate voted yesterday to amend the bill by including a sunset clause on the privacy provisions that gives the government three years to come up with something better. The change is designed to allow the new rules, which, as the Senate heard repeatedly from experts and privacy commissioners, are not real privacy rules at all, to apply immediately but expire in three years. This will have the effect of killing a B.C. privacy challenge that sparked the legislation in the first place. The bill heads back to the House of Commons, where the government can either accept the change and have the bill pass or reject the change and send it back again to the Senate. If it is sent back, the Senate is unlikely to oppose the privacy elements in the bill again.