Wiertz Sebastien - Privacy by Sebastien Wiertz (CC BY 2.0) https://flic.kr/p/ahk6nh
Unless you’ve been offline or focused on a distorted national anthem rendition for the past week, you know that Pokémon Go has taken the world by storm with millions of people wandering around searching for virtual Pokémon characters. The game was officially released in Canada on the weekend, after launching first in the U.S., Australia, and New Zealand, where millions are already playing it.
My weekly technology law column (Toronto Star version, homepage version) notes that Pokémon Go provides a first peek at the potential of widespread use of “augmented reality”, which combines real-world spaces such as parks or buildings with virtual characters or objects that appear on a computer or smartphone. In this case, the app uses GPS on smartphones to identify players’ physical location, with the goal of collecting and training virtual Pokémon characters located there.
Three years ago this month, Edward Snowden shocked the world with a series of disclosures that revealed a myriad of U.S. government-backed surveillance programs. The Snowden revelations sparked a global debate over how to best strike the balance between privacy and security and led to demands for greater telecom transparency.
My weekly technology law column (Toronto Star version, homepage version) notes that the initial Canadian response to the surveillance debate was muted at best. Many Canadians assumed that the Snowden disclosures were largely about U.S. activities. That raised concerns about Canadian data being caught within the U.S. surveillance dragnet, but it did not necessarily implicate the Canadian government in the activities.
Privacy Commissioner of Canada Daniel Therrien was in the news this week as he expressed concern about the evasiveness of Canada’s spy agencies and the ongoing refusal of some of Canada’s telecom companies (namely Bell) to issue transparency reports. I’ll have more to say about privacy and government agencies in my technology law column next week, but on the issue of telecom transparency reports, I believe that Therrien already has the necessary legal mandate to act now. Therrien urged all telecom companies to release transparency reports, noting:
“I think Canadians are telling us, first of all, that they would much prefer that data be shared from telcos to government only with a warrant, with a court authorization. But when that does not happen, Canadians expect that there be transparency…frankly, if there’s not more progress I will continue to call for legislation on this issue.”
I wrote about why Canada’s telecom transparency reporting still falls short late last month, emphasizing that a non-binding approach to transparency reporting has been a failure.
Canadian telecom company privacy practices were back in the spotlight this month with the release of a transparency report from Rogers Communications. The report provides new insights into how much – or how little – Canadians know about when their personal information is disclosed to government agencies.
For Rogers customers, the good news is that recent changes in the law, including court decisions that set limits on the disclosure of mass data from cellphone towers and that protect Internet subscriber information, are having a significant effect. Law enforcement agencies are still able to obtain data on hundreds of thousands of people, but warrantless access to basic subscriber information has stopped.
My weekly technology law column (Toronto Star version, homepage version) notes that the latest Rogers report is the first from the company since the release in 2015 of telecom transparency guidelines that garnered support from the federal privacy commissioner, Industry Canada, and the telecom sector. The guidelines attempt to provide a common framework for disclosure so that the public will be better able to compare privacy protections and policies among Canada’s major telecom companies.
The U.S. government’s attempt to invoke a centuries-old law to obtain a court order to require Apple to create a program that would allow it to break the security safeguards on the iPhone used by a San Bernardino terrorist has sparked an enormous outcry from the technology, privacy, and security communities.
For U.S. officials, a terrorism-related rationale for creating encryption backdoors or weakening user security represents the most compelling scenario for mandated assistance. Yet even in those circumstances, companies, courts, and legislatures should resist the urge to remove one of the last bastions of user security and privacy protection.
My weekly technology law column (Toronto Star version, homepage version) argues that this case is about far more than granting U.S. law enforcement access to whatever information remains on a single password-protected iPhone. Investigators already have a near-complete electronic record: all emails and information stored on cloud-based computers, most content on the phone from a cloud back-up completed weeks earlier, telephone records, social media activity, and data that reveals with whom the terrorist interacted. Moreover, given the availability of all of that information, it seems likely that much of the remaining bits of evidence on the phone can be gathered from companies or individuals at the other end of the conversation.