The Queen’s Bench Division of the UK High Court issued an important decision late last week on the liability of Internet service providers. Unlike the U.S., which has established statutory immunity for intermediaries that simply provide a forum for publication, Commonwealth countries such as the UK, Canada, and Australia still rely on common law principles, leaving open questions about the standard of liability for intermediaries hosting allegedly defamatory content posted on their sites.
Bunt v. Tilley involved an attempt to hold AOL, Tiscali, and British Telecom liable for allegedly defamatory postings. The claimant relied on Godfrey v. Demon Internet to argue that the court could hold the ISPs liable. That case has generated concern among ISPs in Canada since it holds out the prospect of liability. The court was clearly uncomfortable with that decision, however, and issued a ruling generally sympathetic to the ISPs.
In particular, the court concluded that "an ISP which performs no more than a passive role in facilitating postings on the internet cannot be deemed to be a publisher at common law." That is the good news, as it provides some comfort to ISPs, which can rely on this case to argue that they are not liable for doing nothing more than hosting content.
The bad news for ISPs is that they still face liability where they are put on notice of allegedly defamatory content. The court acknowledged that "if a person knowingly permits another to communicate information which is defamatory, when there would be an opportunity to prevent the publication, there would seem to be no reason in principle why liability should not accrue."
While the decision is helpful, Canada and other Commonwealth countries should still consider establishing statutory protections for ISPs, since those intermediaries retain an incentive to remove what may be legitimate content based solely on receiving "notice" that they are hosting allegedly defamatory material. When presented with a notice, most ISPs will remove the content without considering whether the claim is legitimate, because they run the risk of liability for failing to do so. It should take more than a mere notice to compel removal. As with child pornography and copyright, it should take a court order.