Trolls lurk in many dark recesses of the Internet. They make online browsing hurtful, defamatory, and sometimes outright dangerous. These trolls are rarely slain forever, and often raise their heads again when given enough time.
The Ontario Court of Appeal recently reviewed an injunction, granted in 2014, that prohibited a couple operating a website from publishing “in any manner” statements found to be defamatory of an Ottawa lawyer, Richard Warman.
Among other grounds, the defendants sought review of the permanent nature of the injunction as overly broad. Because the website in question was a discussion board with anonymous posting features, the injunction effectively made it inoperable; after the order was granted, the website owners shut it down while awaiting the court’s decision.
The court reviewed the evidence presented at trial, which included the following:
- the appellants had refused, despite request therefor, to remove the defamatory statements from their website until approximately nine months after being served with the respondent’s first notice of libel and seven months after the delivery of his statement of claim;
- the appellants continued to post the respondent’s pleading and his second notice of libel on their website up to and after the commencement of trial;
- even after the jury verdict, the appellants posted a link on their website to a copy of the respondent’s statement of claim; and
- the jury found that the appellants acted with malice. This finding supported the conclusion that similar malicious postings on the appellants’ website could continue in the future.
Given the foregoing, the court found it reasonable to believe that, absent a permanent injunction, this content would eventually be posted on the defendants’ site again.
But the court clarified that the defendants’ potential liability here, and the limits of the injunction, turned on whether they would be liable for third parties posting libelous statements on their forum,
 …Liability in that circumstance turns on whether the statements at issue have been deleted by the host after reasonable notice to delete has been given.
An imperfect remedy, as it would still require the plaintiff to monitor the defendants’ website. But a rather powerful remedy nonetheless.
The level of control a website operator has over anonymous comments comes up frequently in online defamation cases. The Supreme Court of Canada explored this in part in Crookes v. Newton, emphasizing at para. 85 the deliberate and knowing components of the legal definition of publication, a bilateral act in which the publisher makes content available to a reader.
Although silence itself is not libel, failing to act, and thereby allowing libelous statements to remain, may in fact constitute libel, especially where a defendant was aware, or had reason to be aware, of defamatory information over which they had sufficient control.
These principles have already been applied in the U.K. defamation case of Godfrey v. Demon Internet Ltd., [1999] 4 All E.R. 342 (Q.B.), where a failure to remove defamatory material once aware of it was itself treated as a form of defamation, because it could be perceived as a deliberate act of approval, adoption, promotion or ratification.
This principle has been adopted by Canadian courts at the appellate level, but at least one summary judgment decision indicates that officers and directors of corporations may not always exercise sufficient control over these websites to be held personally liable, especially where they lack editorial control.
Yet this type of solution still relies on a successful defamation action, usually with some pretty ugly facts. It does little to deter the worst kinds of behaviour, especially the cyberbullying we see directed towards minors. Attempts to address this problem following the Rehtaeh Parsons incident through Nova Scotia’s Cyber-Safety Act have proven unsuccessful, for now.
Other jurisdictions have had similarly little success, and the Internet is the one place where inter-jurisdictional cooperation matters most. New Zealand’s Harmful Digital Communications (HDC) Act underwent three years of extensive parliamentary debate before being passed earlier this year, but it appears to have borrowed heavily from some of the least effective aspects of the American Digital Millennium Copyright Act (DMCA).
Danny O’Brien at the Electronic Frontier Foundation states,
The HDC Act magnifies the well-documented flaws of the DMCA into the perfect heckler’s veto. This is an ideal tool for a coordinated harassing mob—or simply a large crowd that disapproves of a particular piece of unpopular, but perfectly legitimate speech. Moreover, if the original user misses the 48-hour window to respond to a takedown order, then they will have no legal avenue to restore their deleted work.
The HDC also failed to include some of the safeguards from the DMCA, such as a penalty for misrepresentation or limits on the number of complaints.
Other measures adopted abroad include the mandatory identification numbers once required for all commenting in South Korea, until hackers stole 35 million of these numbers. Similar attempts have been made in Brazil, and there is a resurgence of interest in these types of registration in France.
Canada will also have to balance strategic protections against SLAPPs (Strategic Lawsuits Against Public Participation). Bill 52 in Ontario, the Protection of Public Participation Act, 2015, was introduced following the 2010 recommendations of the Anti-SLAPP Advisory Panel to protect freedom of speech on matters of public interest.
Although the Bill was intended to provide protection against defamation suits when concerns are reported through the Internet, it will invariably have the effect of bolstering some trolling behaviour under the guise of public interest. The amendments to the Libel and Slander Act specifically provide very broad protections to persons beyond just the media,
Communications on Public Interest Matters
Application of qualified privilege
25. Any qualified privilege that applies in respect of an oral or written communication on a matter of public interest between two or more persons who have a direct interest in the matter applies regardless of whether the communication is witnessed or reported on by media representatives or other persons.
This does bolster the ability of citizen journalists to comment with impunity, but it also raises a significant question about abuse. A citizen blogger attending the Warman trial and commenting on it on their own site, including the defamatory statements, would arguably be protected under the section. The defendants, doing the same thing on their site, likely would not.
The real issue in dealing with online trolls, then, may not be the permanence of any order against them, but rather preventing the trolls from multiplying elsewhere, over the hills on the horizon. That is a challenge which our legal system, and the principles behind online defamation, have yet to equip themselves for.