Last week, I posted on the results of this summer’s online harms consultation, which remains shrouded in secrecy as the Canadian government still refuses to disclose the hundreds of submissions it received. That post focused on the common concerns raised in the submissions, drawn from my ongoing blog post featuring links to dozens of submissions that have been independently posted. This second post highlights frequently cited recommendations. These recommendations are particularly important given that the mandate letter for Canadian Heritage Minister Pablo Rodriguez indicates that any online harms legislation “should be reflective of the feedback received during the recent consultations.”
That language is notable since the most common recommendation was to call on the government to scrap the consultation and hold a new one. Many argued that the consultation was deeply flawed (it did not actually ask questions, was conducted during an election, and offered a complete policy rather than asking about the best approach), advocating instead for meaningful stakeholder contributions to determine the shape of the policy, backed by a more evidence-based approach from government.
Several submissions maintained that the government should explicitly acknowledge human rights protection as the guiding principle, both in the legislation itself and in how content moderation is conducted. Because the potential harm to rights under this regime is significant, they argue that such explicit recognition is needed both to create a safe bill and to ensure it is implemented according to human rights principles.
There was also widespread support for removing the 24-hour takedown requirement, though there was less agreement about what should replace it. Some did not recommend any measures beyond removal. A few recommended alternate time frames within which content should be removed, such as requiring it be done “expeditiously”, implying more of a duty of care framework that allows for greater flexibility when making determinations. Others suggested a “trusted flagger” system that would grant certain content flaggers special status indicating that their reports are trustworthy, thereby allowing for more expeditious takedowns.
There were also considerable calls to refine the definitions and scope of the law. For example, many called for narrowing the scope of the reporting requirements to law enforcement, or for their removal altogether. Those in favour of keeping the requirements suggested narrowing them to situations where harm to a person is imminent, or to certain specific kinds of online harm such as child sexual exploitation material and terrorism content.
Many recommended that website blocking be removed or, if the government insists on implementing it, narrowly defined to protect rights and only used as an extraordinary remedy. Another common recommendation was the removal of proactive monitoring obligations altogether.
| Recommendations | Who Recommended It? | Number of Submitters |
| --- | --- | --- |
| 1. Stop this proposal and undertake a more robust consultation process with explanations/justifications for the policy | Canadian Civil Liberties Association; CitizenLab; International Civil Liberties Monitoring Group; Internet Archive of Canada; Internet Society Canada Chapter; Joint Submission from Civil Liberties and Anti-Racism Groups; LEAF; OpenMedia; TekSavvy; Tucows; Haggart and Tusikov | 11 |
| 2. Remove 24-hour timeline | AccessNow; Canadian Civil Liberties Association; CIPPIC; Cybersecure Policy Exchange; Global Network Initiative; Google Canada; OpenMedia; Ranking Digital Rights; Carmichael and Laidlaw | 9 |
| 2(a). Mandate the review of content “expeditiously”, which provides flexibility for determinations | AccessNow; CIPPIC; Cybersecure Policy Exchange; Google Canada; TekSavvy; Carmichael and Laidlaw | 7 |
| 2(b). Have expedient takedowns for child sexual abuse material and non-consensually distributed intimate images, but not the other harms | LEAF | 1 |
| 2(c). Have clear guidelines for what characteristics merit prioritization in moderation | Global Network Initiative | 1 |
| 2(d). Separate the content removal obligations from the voluntary “flagging” systems that the platforms have for legal, but harmful, content | Google Canada | 1 |
| “Trusted flagger” system | Google Canada; Carmichael and Laidlaw | 2 |
| 3. Definitions need to be more precise/narrow (what services are covered, the precise definitions of the harms) | AccessNow; CIPPIC; Cybersecure Policy Exchange; Global Network Initiative; Google Canada; International Civil Liberties Monitoring Group; Technation; Carmichael and Laidlaw; McKelvey | 9 |
| 3(a). Policy should target only what is illegal | Technation; Google Canada | 2 |
| 3(b). The Governor in Council (GiC) should not be delegated such broad powers to change the scope of the proposal | Independent Press Gallery | 1 |
| 4. Site-blocking should be removed or narrowly defined with clear criteria | AccessNow; Cybersecure Policy Exchange; Internet Society Canada Chapter; OpenMedia; Public Interest Advocacy Centre; TekSavvy; Tucows; Carmichael and Laidlaw | 8 |
| 4(a). Stricter safeguards around site-blocking to ensure it does not infringe rights | Public Interest Advocacy Centre; TekSavvy; Carmichael and Laidlaw | 3 |
| 4(b). CRTC should be the decision-maker for site-blocking, if site-blocking is mandated | Public Interest Advocacy Centre | 1 |
| 5. Reporting obligations to law enforcement should be narrow (such as for imminent harm or child sexual exploitation/terrorism) or should not exist, and cannot be combined with proactive monitoring | CIPPIC; Cybersecure Policy Exchange; Chinese Canadian National Council for Social Justice; Google Canada; International Civil Liberties Monitoring Group; LEAF; OpenMedia; Carmichael and Laidlaw | 8 |
| 5(a). Due process protections should be included in the reporting requirements | Google Canada | 1 |
| 6. Remove proactive monitoring | AccessNow; CIPPIC; Cybersecure Policy Exchange; Google Canada; OpenMedia; Technation; Carmichael and Laidlaw | 7 |
| 6(a). Indicate that using automated systems is not mandatory | Google Canada | 1 |
| 7. Each harm should be dealt with independently | CitizenLab; International Civil Liberties Monitoring Group; Internet Society Canada Chapter; LEAF; OpenMedia; TekSavvy | 6 |
| 8. Have a human rights-based approach, i.e. as explicit guiding principles | Ranking Digital Rights; LEAF; Technation; Carmichael and Laidlaw | 4 |
| There must be democratic oversight of the regulators | Global Network Initiative; Ranking Digital Rights; Technation | 3 |
| Transparency requirements should be publicly accessible in a way that protects privacy | Cybersecure Policy Exchange; International Civil Liberties Monitoring Group; LEAF | 3 |
| Target the ad-centric business model for regulation | OpenMedia; Ranking Digital Rights; Haggart and Tusikov | 3 |
| Independent/non-governmental/public auditing and transparency around content moderation tools, algorithms, and human rights impacts | Chinese Canadian National Council for Social Justice; OpenMedia; Ranking Digital Rights | 3 |
| Recognize new forms of harms (identity fraud, harms based on technology-facilitated gender-based violence (TFGBV) such as rape and death threats, online threats to journalists) | Cybersecure Policy Exchange; LEAF; News Media Canada | 3 |
| Regulate the broader digital economy, dealing with issues like monopolistic behaviour | Canadian Association of Research Libraries; Haggart and Tusikov | 2 |
| Proposal raises concerns for innovation and competition | Google Canada; Wilson | 2 |
| CSIS’ new powers should be dealt with in a separate bill | International Civil Liberties Monitoring Group | 1 |
| Increase funding for libraries | Canadian Association of Research Libraries | 1 |
| Provide more immediate and direct support to victims experiencing TFGBV | LEAF | 1 |
| Provide alternative remedies to those proposed through law enforcement/criminal justice | LEAF | 1 |
| Add research and education as one of the central mandates of the regulator | LEAF | 1 |
| Regulate online communication service providers (OCSPs) through mandating criteria for meaningful transparency, due process requirements, etc. | AccessNow | 1 |
| Implement provisions that highlight the principle of protecting children | Canadian Coalition for the Rights of Children | 1 |
| A bill must immediately be tabled with a fixed budget and timeline to ensure it is created soon | Chinese Canadian National Council for Social Justice | 1 |
| The advisory board must be transparent and have its individuals elected | Chinese Canadian National Council for Social Justice | 1 |
| The burden in the appeals process should be on the content producer to show it is legal | Chinese Canadian National Council for Social Justice | 1 |
| There should be a uniform reporting function | Chinese Canadian National Council for Social Justice | 1 |
| Enforcement measures should be more stringent | Chinese Canadian National Council for Social Justice | 1 |
| The police role should be cautiously implemented | Chinese Canadian National Council for Social Justice | 1 |
| There should be staggered obligations for different-size organizations | Carmichael and Laidlaw | 1 |
| Expand the due diligence defence | Google Canada | 1 |
| Ensure that monetary penalties are imposed in a reasonable and proportionate manner | Google Canada | 1 |