The question of what responsibility Internet platforms should bear for the user-posted content they host has been the subject of debate around the world, as politicians, regulators, and the broader public seek to navigate policy choices to combat harmful speech, choices that carry implications for freedom of expression, online harms, competition, and innovation. To help sort through the policy options, Daphne Keller, the Director of Intermediary Liability at Stanford's Center for Internet and Society, joins the podcast this week. She recently posted an excellent article on the Balkinization blog that provided a helpful guide to intermediary liability lawmaking, and she agreed to chat about how policymakers can adjust the dials on new rules to best reflect national goals.
