free consultation by russell davies  https://flic.kr/p/4jxLPq (CC BY-NC 2.0)


News

The (Still Secret) Online Harms Consultation: What the Government Heard, Part One

The results of this summer’s online harms consultation remain largely shrouded in secrecy, as the Canadian government still refuses to disclose the hundreds of submissions it received. Canadian Heritage Minister Pablo Rodriguez now leads the file, but he has said little about his department’s plans or explained why a public consultation should not feature public availability of the submissions. I have maintained an ongoing blog post with links to dozens of submissions that have been independently posted. While even a cursory review reveals widespread criticism, I’ve worked with University of Ottawa law student Pelle Berends to take a deeper dive into the available submissions. This first post identifies the common concerns raised in the submissions, with a chart breaking down the positions posted below. A second post will highlight frequently raised recommendations.

At a high level, a common concern is the plan to regulate five distinct types of harm, each with its own distinct legal test, under a single, uniform legislative framework. Many argued that such a “one-size-fits-all” approach is ineffective and that each harm should have its own framework.

The most frequently raised concern involved the mandated 24-hour takedown requirement, with many arguing that the deadline is too short and arbitrary, and will lead to, at best, ineffective content moderation. It would also have a significant impact on freedom of expression, given the potential removal of legitimate content.

Many submissions also focused on concerns that the government’s plans will harm those it seeks to protect. In particular, the content moderation tools could be used or abused by those with ill intentions to target marginalized groups. Moreover, the proposal encourages content moderation practices based on AI systems that may carry significant bias and could disproportionately impact marginalized groups. Further, some submissions voiced concern that AI-based systems will lead to over-removal of content due to their inability to understand the nuances of language required to distinguish hate speech from satire, again resulting in a disproportionate removal of content from marginalized communities.

Proactive monitoring also attracted attention, with many submitters noting that it is particularly harmful because it will lead to pre-publication censorship of user posts. In fact, several pointed to UN Special Rapporteurs’ conclusions that such provisions are disproportionate. It was also noted that, when combined with the mandatory reporting requirements, platforms would be turned into extensions of law enforcement, leading to private sector surveillance of private citizens’ expression. Closely linked to the reporting requirements were the privacy implications of requiring platforms to send user information to law enforcement organizations for such a broad range of harms.

Website blocking was also discussed, with the practice viewed as ineffective since it is easily circumvented with basic technical skills, leading to a game of “whack-a-mole” as authorities try to keep up with websites that reappear after being blocked. It was also noted that site blocking infringes communication rights, since an entire site could be blocked even if only a single page or post infringed the law.

Many submissions also expressed concern about the scope and definitions found in the consultation. The definitions of the harms are broad, capture legal content, and, given their broad and vague nature (namely taking criminal definitions and transplanting them into a “regulatory context”), pose challenges for those expected to apply them in content moderation decisions. Further, the definitions of OCSs and OCSPs are broad and capture many entities, including the comment sections on blogs and news websites, despite government assurances that the proposal only targets social media. There was also concern that the Governor in Council has broad powers to change the scope and definitions of the harms without democratic oversight.

Finally, many focused on the inadequacy of the consultation process itself, noting that it took place during an election period, which may have limited public participation and raised questions about a caretaker government dealing with such controversial subject matter. The consultation materials were also viewed as insufficient because they did not ask open-ended questions or provide justification for the problems addressed and the solutions proposed.

Each concern below is listed with the submitters who raised it and the total number of submitters.

1. 24-hour time-limit is problematic – leads to over-removal/freedom of expression concerns (18): Access Now, Canadian Association of Research Libraries, Canadian Civil Liberties Association, CIPPIC, Citizen Lab, Cybersecure Policy Exchange, Global Network Initiative, Google Canada, International Civil Liberties Monitoring Group, Internet Society: Canada Chapter, Joint Submission from Anti-Racist groups, LEAF, OpenMedia, Ranking Digital Rights, Carmichael & Laidlaw, Geist, TekSavvy, Tucows

2. The proposal will harm those it seeks to help / marginalized communities (16): Access Now, Canadian Association of Research Libraries, CIPPIC, Citizen Lab, Cybersecure Policy Exchange, Global Network Initiative, Independent Press Gallery of Canada, International Civil Liberties Monitoring Group, Joint Submission from Anti-Racist groups, LEAF, OpenMedia, Ranking Digital Rights, Carmichael & Laidlaw, Geist, Webber & MacDonald, TekSavvy

3. Concerns about the scope/definition of “online harms” (the five categories) (14): Access Now, Canadian Civil Liberties Association, Citizen Lab, Cybersecure Policy Exchange, Global Network Initiative, Google Canada, Independent Press Gallery of Canada, International Civil Liberties Monitoring Group, Internet Society: Canada Chapter, Carmichael & Laidlaw, Wilson, Geist, TechNation, Tucows

4. Promoting AI/algorithmic monitoring (13): Access Now, Canadian Association of Research Libraries, Canadian Civil Liberties Association, Citizen Lab, Global Network Initiative, Google Canada, International Civil Liberties Monitoring Group, LEAF, OpenMedia, Carmichael & Laidlaw, Geist, TekSavvy

5. Definitions (e.g. OCSPs, OCSs, hatred) are problematic – too broad/vague, potentially infringe freedom of speech or capture too many services (12): Access Now, CIPPIC, Citizen Lab, Cybersecure Policy Exchange, Global Network Initiative, International Civil Liberties Monitoring Group, Internet Society: Canada Chapter, Carmichael & Laidlaw, Wilson, Geist, TechNation, TekSavvy

6. Criticism of consultation process overall (12): Canadian Civil Liberties Association, Citizen Lab, International Civil Liberties Monitoring Group, Internet Archive Canada, Internet Society: Canada Chapter, Joint Submission from Anti-Racist groups, LEAF, OpenMedia, Haggart and Tusikov, Wilson, Geist, TechNation

7. Privacy concerns regarding the regulator/reporting/preservation of information requirements (12): Canadian Association of Research Libraries, Canadian Civil Liberties Association, CIPPIC, Citizen Lab, Global Network Initiative, Google Canada, Independent Press Gallery of Canada, International Civil Liberties Monitoring Group, Internet Society: Canada Chapter, Canadian Association of Civil Liberties, Carmichael & Laidlaw, Tucows

8. Problematic “one size fits all” approach (11): Canadian Civil Liberties Association, CIPPIC, Citizen Lab, Cybersecure Policy Exchange, International Civil Liberties Monitoring Group, Internet Society: Canada Chapter, Joint Submission from Anti-Racist groups, LEAF, OpenMedia, McKelvey, Geist

9. Over-removal of content (11): Access Now, Canadian Association of Research Libraries, Citizen Lab, Google Canada, Independent Press Gallery of Canada, International Civil Liberties Monitoring Group, Internet Society: Canada Chapter, Joint Submission from Anti-Racist groups, Tucows, Carmichael & Laidlaw, Geist

10. Proactive monitoring infringes freedom of expression/privacy (11): Access Now, Canadian Internet Policy and Public Interest Clinic, Citizen Lab, Cybersecure Policy Exchange, Global Network Initiative, Google Canada, OpenMedia, Ranking Digital Rights, Carmichael & Laidlaw, Geist

11. Website blocking violates freedom of expression/is ineffective/is unnecessary (10): Access Now, Canadian Civil Liberties Association, Cybersecure Policy Exchange, Google Canada, Internet Society: Canada Chapter, Public Interest Advocacy Centre, OpenMedia, TekSavvy, Geist, Tucows

12. The lack of penalties/incentives against removal of legal content, over-removal of content, or adequate appeal mechanisms for removed content (5): Access Now, Canadian Association of Research Libraries, International Civil Liberties Monitoring Group, Joint Submission from Anti-Racist groups, OpenMedia

13. Concern about CSIS’s new powers (4): Citizen Lab, Global Network Initiative, Independent Press Gallery of Canada, International Civil Liberties Monitoring Group

14. Penalties are disproportionate/onerous (4): Access Now, Canadian Association of Research Libraries, Google Canada, Wilson

15. Concerns about the efficacy of the Digital Recourse Council (3): International Civil Liberties Monitoring Group, Internet Society: Canada Chapter, OpenMedia

16. Concern about the failure to address the business model that drives these companies’ behaviour (3): News Media Canada, Ranking Digital Rights, Haggart and Tusikov

17. Net neutrality concerns (2): Canadian Association of Research Libraries, Public Interest Advocacy Centre

18. Look to international law/best practices for guidance (2): Canadian Coalition for the Rights of Children, Cybersecure Policy Exchange

19. Concern about lack of timeline/budget (2): Chinese Canadian National Council for Social Justice, McKelvey

20. Provides too much power to Cabinet (2): Independent Press Gallery of Canada, Internet Society: Canada Chapter

Comments

  1. this and the new digital services tax as cases a covid force many to either stay home and ….yeaaaaa

    we need to dump stupid from govt before all of us are living in a igloo


  2. I can admit that I feel a little guilty not catching the copyright consultation. Still, I am very proud to have participated in this consultation and played a part (however small) in voicing opposition to the online harms proposal. I know my comments probably went straight into the trash because I didn’t blow the trumpets and hail this idea as the most amazing idea ever, but I knew that wouldn’t be the point. The point was that I (along with MANY many others) spoke out when the time came to speak and voiced opposition directly to the government.

    I can admit that when I wrote my submission, I could have taken more time to find more angles that this was a bad proposal, but being able to offer a representative voice for someone looking to be a small online business seemed like an angle that wouldn’t get covered all that well. So, covering that angle, I hope, added a bit of a more unique angle to all of this.

    Happy others were able to cover numerous points I missed as I found myself saying to myself, “That is also a really good point” when reading these submissions. One of those moments where I can say, “I’m proud of you, people of the Internet!”
