Can a Search Suggestion Be Defamatory?

Here is a summary article from Outlaw.com, reviewing the law in the UK and elsewhere on whether Google’s ‘autocomplete’ function could be defamatory where one or more of the suggested completions for the search term entered carries a nasty meaning. Courts in a number of countries, including France and Japan, have held Google liable. The brief linked to here concludes that there would probably not be liability in the UK.

The Australian courts have followed the French approach – not the UK view mentioned above – and have found Google liable in defamation for the suggestions its algorithm makes when someone starts a search. In that case, typing the person’s name produced suggestions associating him with criminals.

Does this strike you as a fair result? Can Google (or another search engine that wants to help people complete their search queries based on what others have searched) prevent this result? Do they have to deal with complaints on an individual basis, so that when X complains, they simply block autocomplete for any search of X? Can that be done in practice, weighing the expense of doing so against the potential expense of damages and costs …
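As a very rough, hypothetical sketch of what ‘block autocomplete for any search of X’ could look like, here is a short Python snippet. Everything in it (the blocked_names set, the suggestion_index, the suggest function) is invented for illustration and is not how Google’s system actually works; the point is only that the cheap approach suppresses every suggestion mentioning the complainant’s name, harmless ones included, which hints at why handling complaints one by one could become expensive at scale.

```python
# A deliberately naive, hypothetical sketch of per-complaint autocomplete blocking.
# None of this reflects Google's real infrastructure; names and data are invented.

blocked_names = {"jane doe"}  # names whose suggestions are suppressed after a complaint

# Toy suggestion index: first word of the query -> completions ranked by popularity.
suggestion_index = {
    "jane": ["jane doe criminal", "jane doe melbourne", "jane austen"],
    "john": ["john smith plumber", "john smith sydney"],
}

def suggest(prefix: str, limit: int = 5) -> list[str]:
    """Return ranked completions, dropping any that mention a blocked name."""
    first_word = prefix.lower().split()[0] if prefix.strip() else ""
    results = []
    for completion in suggestion_index.get(first_word, []):
        if any(name in completion.lower() for name in blocked_names):
            continue  # crude: every suggestion mentioning the name is dropped, harmless or not
        results.append(completion)
        if len(results) >= limit:
            break
    return results

print(suggest("jane doe"))  # ['jane austen'] - the complainant's name never autocompletes
print(suggest("john smi"))  # ['john smith plumber', 'john smith sydney'] - other queries unaffected
```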

Would this kind of decision make it harder for those who “Google bomb” people, i.e. seed the web with disparaging links and comments so that search engines turn up those results when a person’s name (or a chosen phrase) is searched? (Or would the perpetrators be liable for defamation anyway?) I recall that being done to President Bush (the younger) – searching for an unlikely but pungent phrase (‘miserable failure’) produced the president as the first result. Google shut this one (and others) down: http://searchengineland.com/google-kills-bushs-miserable-failure-search-other-google-bombs-10363. (That article, which is a few years old, notes that while Google ‘fixed’ the Google bomb, other search engines were still open to being manipulated this way.)