Thursday Thinkpiece: Cheung on Search Engine Liability in the Autocomplete Era

Each Thursday we present a significant excerpt, usually from a recently published book or journal article. In every case the proper permissions have been obtained. If you are a publisher who would like to participate in this feature, please let us know via the site’s contact form.

Defaming by Suggestion: Searching for Search Engine Liability in the Autocomplete Era

By Anne S.Y. Cheung, Associate Professor, The University of Hong Kong – Faculty of Law, in “Comparative Perspectives on the Fundamentals of Freedom of Expression” (Andras Koltay, ed.), forthcoming.

Excerpt: pp 1-14

[Footnotes omitted. They can be found in the original via the link above]

Whilst different jurisdictions have yet to reach consensus on search engines’ liability for defamation, Internet giant Google is confronting judges and academics with another challenge: the basis of liability for defamation arising from its Autocomplete function. With Autocomplete, Google no longer merely presents us with snippets (excerpts of relevant web pages originating from third-party websites) after we type in our search queries. Rather, it suggests associated search words and terms to us before we even complete typing the words as originally planned, and before we even press ‘Enter’. By constantly altering the query based on each additional keystroke in the search bar, Autocomplete changes the way search queries are generated. In other words, Google anticipates, predicts or even feeds us ideas, and may redirect our interests in the process of our search attempts. For instance, if one searches for Bettina Wulff, the wife of former German President Christian Wulff, terms such as ‘escort’ and ‘prostitute’ are automatically paired with her name in the Google search box. One can only imagine the surprise of the unsuspecting reader who had no idea of the rumour that Wulff had once been an escort, let alone the distress of Wulff herself. Although most jurisdictions are reluctant to hold search engines liable for defamation, judges seem to hold different views when it comes to such liability in the case of Autocomplete.

In 2014, for example, the Hong Kong Court of First Instance held that a claimant whose name was often paired with ‘triad member’ in Autocomplete had a good arguable case of defamation to proceed with, and dismissed a summary dismissal application made by Google in Dr Yeung Sau Shing Albert v Google Inc (hereinafter referred to as Yeung v Google). Earlier, in 2013, the Federal Court of Germany held Google to be liable for violating a plaintiff’s personality rights and reputation for associating his name with ‘fraud’ and ‘Scientology’ in an Autocomplete search (this case is hereinafter referred to as RS v Google). Are these decisions justified?

Most of us are likely to be hesitant in holding Google liable for defamation based on its search engine results. After all, nearly all of us are indebted to search engines, and our lives would be considerably more difficult without them. Search engines are powerful intermediaries that enable Internet users to identify and locate information from the gigantic volume of data that has flooded cyberspace. The World Wide Web is made up of some 60 trillion individual pages, with over three billion Internet users, every one of whom is a potential contributor. In the face of such daunting amounts of information, search engines play an indispensable role in identifying the best and most useful information for us. Yet, when search engines not only deliver potentially defamatory search results to us upon request, but actually suggest defamatory ideas to us, a different framework of legal analysis may be called for.

The legal debate over the liability arising from the Autocomplete function captures the empowering and forbidding power of search engines. In examining and comparing the legal reasoning behind the Hong Kong case of Yeung v Google and the German case of RS v Google, this chapter argues that the orthodox approach to fixing responsibility for defamation, based either on the established English common law notion of publisher or innocent disseminator or the existing categories of passive host, conduit and caching in the relevant European Union Directive, is far from adequate to address the challenges brought about by search engines and their Autocomplete function. Whilst orthodox common law is strict in imposing liability in the case of a person’s participation in publication, and is fixated on identifying his or her state of knowledge and extent of control in the defamation action, the European Union approach is preoccupied with the over-simplified binary of seeing an intermediary as either an active or a passive entity. The legal challenge posed by search engines, however, stems from the fact that they run on artificial intelligence (AI). Autocomplete predictions are automatically generated by an algorithm that effectively uses more than 200 signals to extrapolate information from the Internet and then generate likely predictions from each variant of a word. The process takes place automatically, although the design of the algorithm is frequently updated and modified by engineers. Throughout the entire process, Google retains control in generating its search results. The legal issue should therefore be redirected towards examining the possible role played by the algorithm creators in the content or result generated. Thus, this chapter argues that, in its Autocomplete function, Google indeed plays a unique role in contributing to defamatory content. Although the Hong Kong Court, ruling on a summary application, has not delivered any definitive answer on the role and liability of Google Inc., the German Court has rightly recognised the novel legal challenge that search engine prediction technology presents and treated search engines as a special intermediary processor. As explained earlier, an Autocomplete suggestion responds to a search query in a unique way with the mere input of each additional keystroke and without the user completing his or her query. In this ‘search-in-progress’, Google is neither entirely active nor entirely passive, but rather interactive. Thus, imposing liability on Google in a defamation action based on its Autocomplete function is justified in a notice-and-takedown regime once a substantive complaint has been made.
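
To make the mechanics described above concrete, the sketch below shows, in deliberately simplified form, how a purely statistical prefix lookup over logged queries can surface a damaging word pairing without any human editorial decision at the moment of suggestion. It is an illustration only, not Google’s actual system (which, as noted, draws on more than 200 signals); the QUERY_LOG data and the suggest() function are hypothetical names invented for this example.

```python
# Illustrative sketch only: a toy prefix-based autocomplete, NOT Google's
# actual algorithm. QUERY_LOG and suggest() are hypothetical names.
from collections import Counter

# Hypothetical log of past user queries with observed frequencies.
QUERY_LOG = Counter({
    "bettina wulff biography": 40,
    "bettina wulff escort": 310,      # a rumour-driven query dominates
    "bettina wulff prostitute": 250,
    "bettina wulff charity work": 15,
})

def suggest(prefix: str, limit: int = 3) -> list[str]:
    """Return the most frequent logged queries that start with the prefix.

    The ranking is purely statistical: the system 'suggests' whatever past
    users typed most often, with no human review of the resulting pairing.
    """
    matches = [(q, n) for q, n in QUERY_LOG.items() if q.startswith(prefix.lower())]
    matches.sort(key=lambda pair: pair[1], reverse=True)
    return [q for q, _ in matches[:limit]]

# Each additional keystroke re-runs the lookup, so suggestions appear
# before the user ever presses 'Enter'.
print(suggest("bettina w"))
# ['bettina wulff escort', 'bettina wulff prostitute', 'bettina wulff biography']
```

The point of the sketch is that the defamatory pairing emerges from the data and from the ranking rule chosen by the algorithm’s designers, which is precisely where the chapter locates the relevant question of control.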

Search Engine and Autocomplete as Publisher: Yeung v Google

In Yeung v Google, the plaintiff sued Google Inc. for providing defamatory predictive suggestions through its search engine’s Autocomplete and Related Searches features. Whenever users typed Yeung’s name (Albert Yeung Sau Shing) into Google in English or Chinese, Google Autocomplete instantaneously and automatically generated a list of search suggestions in a drop-down menu before they clicked on the search button, some of which linked Yeung to the names of specific triad gangs and serious criminal offences. Likewise, when users typed his name into Google’s search box, characters or words related to triad societies were generated as results under a list of Related Searches. The plaintiff is a well-known businessman in Hong Kong and the founder of a company that engages in various business sectors, including entertainment and films, and manages a number of Hong Kong celebrities. Yeung was understandably upset by the Google search results and suggestions, and accordingly made a defamation claim against Google Inc. and sought an injunction to restrain it from publishing and/or participating in the publication of the alleged libellous material. More specifically, he demanded that Google remove or prevent defamatory words from appearing or reappearing in any current or future Google searches. As Google Inc. is a US-based company, the plaintiff had to apply for leave to serve the writ of summons out of the jurisdiction in the US. It was therefore necessary for Yeung to demonstrate that he had a good arguable case involving a substantive question of fact or law to be tried on the merits of his claim.

Although Justice Marlene Ng of the Hong Kong High Court delivered only a summary judgment, her reasoning was detailed (filling 100 pages) and centred largely on whether Google Inc. should be considered the publisher of the suggestions or predictions that appear in Autocomplete and Related Searches. The defence counsel’s major argument was that Google Inc. is not a publisher, as no human input or operation is required in the search process, but is rather a mere passive medium of communication. However, Justice Ng was not convinced, and subsequently ruled that there was a good arguable case for considering Google Inc. as a publisher.

Search Engine Liability

In defamation cases under common law, publication takes place when a defendant communicates a defamatory statement to a third party, and liability in defamation arises from participation in the publication of defamatory material. Under this strict publication rule, a person would be held liable for publishing a libel ‘if by an act of any description, he could be said to have intentionally assisted in the process of conveying the words bearing the defamatory meaning to a third party, regardless of whether he knew that the article in question contained those words’. Prima facie, the author, editor, publisher, printer, distributor or vendor of a newspaper is liable for the material therein.

Having said that, common law allows the defence of innocent dissemination for an individual who is not the first or main publisher of a libellous work but who ‘in the ordinary course of business plays a subordinate role in the process of disseminating the impugned article’. Well-known examples of those who can make such a defence are the proprietors of libraries (Vizetelly v Mudie’s Select Library Limited) and newsvendors (Emmens v Pottle). To rely on this defence and to be seen as a secondary publisher or innocent disseminator, the defendant must show (1) that he or she was unaware or innocent of any knowledge of the libel contained in the work disseminated by him or her; (2) that there was nothing in the work or the circumstances under which it came to or was disseminated by him or her that should have led him or her to suppose that it contained a libel; and (3) that such want of knowledge was not due to any negligence on his or her part. The onus of proof is on the defendant. In comparison, a primary publisher is one who knows about or can easily acquire knowledge of the content of the article in question and has a realistic ability to control its publication. The Hong Kong Court refers to these two criteria as the ‘knowledge criterion’ and the ‘control criterion’. The former refers to the fact that a publisher must know or be taken to know ‘the gist or substantive content of what is being published’, although there may be no realisation that the content is actually defamatory in law. The latter points to the publisher’s realistic ability and opportunity to prevent and control the publication of defamatory content. The liability of the primary publisher is strict, with no defence available.

Armed with the common law principle of strict publication liability, Justice Ng concluded in Yeung v Google that Google Inc. is definitely a publisher. Applying the law to the given facts, it was obvious to Justice Ng that Google Inc. is in the business of disseminating information and had, in this case, participated in the publication and dissemination of the alleged defamatory statement. The company has created and operates automated systems that generate materials in a manner it intends, thereby providing a platform for dissemination, encouragement, facilitation or active participation in publication. If Google Inc. is indeed the publisher of its Autocomplete and Related Searches results, the next legal question is whether the company should be considered the primary or secondary publisher. It is precisely this separate issue that exposes the limitation of common law in the face of contemporary technological challenges.

‘The’ Common Law

What Justice Ng did in the aforementioned case is apply the strict publication rule under an orthodox understanding of common law to an Internet service provider’s (ISP) liability. This approach was first propounded in Oriental Press Group Ltd v Fevaworks Solutions Ltd (hereinafter referred to as Fevaworks) by the Hong Kong Court of Final Appeal. The highest court in Hong Kong ruled that a provider of an online discussion forum is a secondary publisher and must bear legal liability for defamatory remarks posted by third parties, with its responsibilities being imposed from the outset, but that it does have recourse to the defence of innocent dissemination. The actual effect of the judgment is that online discussion forum providers now have to remove any alleged defamatory remarks within a reasonable time frame upon receiving notification from the complainant. The judgment has been cited as a faithful application of orthodox common law principles of publication. Yet, it should be noted that Hong Kong’s position constitutes a departure from a leading case in the area of search engine and Internet intermediary liability in England.

The English case that is of direct relevance to the present debate is Metropolitan Schools Ltd v Designtechnica Corp. (hereinafter referred to as Metropolitan Schools Ltd), the judgment on which was delivered by the English High Court in 2009. The claimant, Metropolitan Schools Ltd, was a provider of adult distance learning courses on the development and design of computer games. The first defendant, Designtechnica Corp., hosted web forums that included threads which the claimant said defamed it by accusing it of running a fraudulent practice. Google UK Ltd was another defendant because it published or caused to be published in its search engine a ‘snippet’ of information linking Metropolitan Schools to the word ‘scam’. The claimant demanded removal of the defamatory statements from the web forums and the Google search engine. The fundamental issue before the English High Court was whether Google as a search engine should be held liable for publication under common law.

Justice David Eady ruled that Google was not a publisher at all. Instead of asking whether there had been any participation in publication by Google, as generally seen in orthodox common law analysis, Justice Eady considered that the starting point should be examination of the ‘mental element’, that is, the defendant’s degree of awareness or at least assumption of general responsibility. It was clear to him that Google does not have any mental capacity or the required knowledge because there is neither human input nor intervention when a search is performed automatically in accordance with a computer programme. Search results are generated, he said, by web-crawling robots designed by Google, which then report text matches in response to a search term. Furthermore, in the case in question, Google could not have effectively prevented the defamatory snippet from appearing in response to a user’s request.

Justice Eady’s position in Metropolitan Schools Ltd was consistent with his earlier ruling in Bunt v Tilley, in which he did not treat ISPs as publishers of defamatory statements in an online discussion forum. Although one may criticise Justice Eady’s ruling as an unwarranted departure from orthodox common law principles on publication and defamation, his interpretation of common law is a response to the technological reality of the Internet age. In fact, he examined the rationale behind common law precedents, and further developed the law in light of contemporary challenges and the legislative developments in other European countries. First, he referred to Emmens v Pottle, a case dating back to 1885 that established the defence of innocent dissemination for newsvendors. Justice Eady examined the rationale behind the distinction between a publisher and a disseminator. He then commented that analogies are not always helpful, particularly when the law has to be applied to new and unfamiliar concepts. It was plain to him that, as a search engine is not the author of a defamatory statement, and is thus hardly comparable to a printer or newspaper proprietor, it cannot be considered a primary publisher. Equally, it is not a library. At best, in Justice Eady’s view, it can be compared to a compiler of a conventional library catalogue, where conscious effort is involved. However, none of these analogies is entirely suited to the modern search engine. Thus, Justice Eady concluded that a search engine does not fit exactly into the category of disseminator. To a certain extent, it is even more innocent than a disseminator in passing on a defamatory statement. In 2009, there were approximately 39 billion web pages and 1.59 billion Internet users. At the time, Google compiled an index of pages from the web, and its Googlebot’s automated and pre-programmed algorithmic search processes then extracted information from that index and found matching webpages to return results that contained or were relevant to the search terms. Justice Eady preferred to characterise a search engine as a ‘facilitator’ based on its provision of a search service.
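
As a rough illustration of the automated matching Justice Eady described, the sketch below (again a toy example: the page data and the build_index() and search() names are hypothetical, and Google’s real infrastructure is vastly more elaborate) shows how a pre-programmed index can return a page that pairs a claimant’s name with the word ‘scam’ without any person reading or approving the match.

```python
# Illustrative sketch only: a toy inverted index, not Googlebot itself.
# CRAWLED_PAGES, build_index() and search() are hypothetical names.
from collections import defaultdict

# Hypothetical crawled pages (URL -> extracted text).
CRAWLED_PAGES = {
    "forum.example/thread1": "metropolitan schools distance learning scam claims",
    "example.edu/courses":   "metropolitan schools game design courses",
}

def build_index(pages: dict[str, str]) -> dict[str, set[str]]:
    """Map each word to the set of URLs whose text contains it."""
    index: dict[str, set[str]] = defaultdict(set)
    for url, text in pages.items():
        for word in text.split():
            index[word].add(url)
    return index

def search(index: dict[str, set[str]], query: str) -> set[str]:
    """Return URLs containing every query term; no one reviews the matches."""
    terms = query.lower().split()
    results = [index.get(t, set()) for t in terms]
    return set.intersection(*results) if results else set()

INDEX = build_index(CRAWLED_PAGES)
print(search(INDEX, "metropolitan schools scam"))
# {'forum.example/thread1'}  -- the damaging pairing emerges mechanically
```

Seen this way, whether the operator ‘participates in publication’ turns on the design of the indexing and matching rules rather than on any review of individual results, which is consistent with the chapter’s focus on the role of the algorithm creators.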

Second, Justice Eady perused the positions of various national courts on rulings concerning Google’s role in defamation as a search engine under European Union Directive 2000/31/EC (better known as the Electronic Commerce [EC] Directive). He found that none of the countries he considered, including France, Spain, the Netherlands, Portugal and Switzerland, had held Google to be liable for defamation as a result of its search engine results, with some (Portugal, Hungary and Romania) ruling that Google is only a ‘host’. In the end, Justice Eady fixed no responsibility on Google, and did not consider it to be a publisher either before or after notification of the defamatory statement in question. Google was not held liable for the publication of search results, as there was a lack of knowing involvement in publication and the company had no control over those results. With the benefit of hindsight (which will be explained further in the following section), we now know that the extent of Google’s control is much more extensive than Justice Eady envisioned. If this newfound awareness of the technical ability of the Google search engine had been factored in, different legal reasoning may have been applied and a different legal conclusion reached. At the very least, it is unlikely that Google would have been considered a totally passive medium of communication.

Nevertheless, Justice Eady’s practical approach in examining the role of an Internet intermediary and its relation to publication is laudable. His position that the state of knowledge of such an intermediary should be at the forefront of any debate on publication and defamation was endorsed by the Court of Appeal in Tamiz v Google in 2012. Following England’s legislative reform in 2013, the English courts need no longer worry about the common law conundrum on the vexing issue of publication for an Internet intermediary. Unfortunately, the Hong Kong courts and those of other common law jurisdictions, which have been provided with no legislative guidelines in this respect, continue to grapple with this unresolved legal issue.

Adhering to the orthodox common law view of a strict publication rule, the Hong Kong Court of Final Appeal departed from the English approach in Fevaworks. It thus follows that Justice Ng of the Hong Kong High Court was bound in Yeung v Google by local precedent. Importantly, in addition to relying on Hong Kong authority, Justice Ng also relied on the Australian authority of Trkulja v Google Inc. (No. 5) (hereinafter referred to as Trkulja), in which Justice David Beach of the Supreme Court of Victoria held that there was sufficient evidence upon which a reasonable jury, if properly directed, could return a verdict for the plaintiff and hold Google to be liable for defamation for its search results under orthodox common law principles. In Trkulja, the plaintiff was a music promoter who sued Google Inc. over search engine results that had turned up images and an article concerning his involvement with serious crime in Melbourne and alleging that rivals had hired a hit man to murder him. The jury’s verdict was that Google Inc. was a publisher of the defamatory material but was entitled to the defence of innocent dissemination for the period prior to receiving notification from the plaintiff. The company contended that the trial judge had a mistaken view of the law on publication and had wrongly directed the jury.

Justice Beach did not accept Google Inc.’s argument, and further declared that Metropolitan Schools Ltd and Tamiz v Google Inc do not represent the common law in Australia. He ruled that Google Inc. could be held liable as a publisher because it operates an Internet search engine, an automated system, precisely as intended and has the ability to block identified web pages. For this judge, a search engine is like a newsagent or a library, which might not have the specific intention to publish but does have the relevant intention for the purpose of the law of defamation.

Convincing as that reasoning may have been to Justice Ng of the Hong Kong court, she did not have the benefit of the more recent decision in Bleyer v Google Inc (hereinafter referred to as Bleyer) delivered by the Supreme Court of New South Wales in Australia. In Bleyer, the plaintiff sued Google Inc. for its search engine results turning up seven items defamatory of him and delivering them to three people. Google Inc. sought an order to permanently stay or summarily dismiss the proceedings as an abuse of process. In addition to the issue of disproportionality between the cost of bringing an action and the interest at stake, Justice Lucy McCallum had to determine the applicable law on defamation for search engines. After reviewing the English authorities in Metropolitan Schools Ltd and Tamiz v Google, and the Australian authority in Trkulja, Justice McCallum decided to follow the former and to distinguish the case before her from Trkulja. She ruled that there was no human input in the application of Google’s search engine apart from the creation of the algorithm, and thus that Google could not be held liable as a publisher for results that appeared prior to notification of a complaint. Like Justice Eady, she relied on the landmark authority of Emmens v Pottle (the first challenge to the role of a newsvendor in defamation cases) and reiterated that ‘for a person to be held responsible there must be knowing involvement in the process of publication of the relevant words. It is not enough that a person merely plays a passive instrumental role in the process.’ Further, she expressed reservations about viewing Google Inc. as playing the role of secondary publisher, facilitating publication in a manner analogous to a distributor. In Justice McCallum’s view, an Internet intermediary does no more than fulfil the role of a passive medium of communication and should not be characterised as a publisher. She distinguished the decision in Trkulja from hers in Bleyer on the grounds that the former was based on Urbanchich v Drummoyne Municipal Council, which concerned the liability of the Urban Transit Authority in failing to remove defamatory posters placed on its bus shelters after receiving notice of the plaintiff’s complaint. As a result, Justice McCallum concluded that it was clear that Google Inc. was not liable as a publisher, and ordered the proceedings to be permanently stayed.

Regardless of whether one agrees with Justice Eady’s or Justice McCallum’s analysis and conclusions, both have applied common sense and fairness to examining the role of an Internet intermediary and its automated search engine system in the context of the debate over publication and defamation. As one critic comments, ‘not every act of dissemination can or should lead to liability for publishing defamatory matter’. The fundamental concept of publication in the Internet era must be carefully explored.