Avoiding defamation liability for third-party web content

A recent UK case dealt with the republication of defamatory material by a search engine. It bolsters a New Zealand ruling that website operators can avoid liability for defamation where they are unaware of defamatory material and exercise no control over its publication.

In the case, MIS Limited v Google and others [2009] EWHC 1765, the plaintiff alleged that it had been defamed by various comments on a website forum. It was accepted that the comments were defamatory, and the plaintiff had a cause of action against the author of the comments and, possibly, the website operator.

However, at issue was whether Google was also liable for defamation, because it displayed the defamatory comments as preview "snippets" in its search results.

In order for a person to be liable for defamation, they must be responsible for the publication of the defamatory material.

Where a defendant has actually produced and distributed the defamatory material themselves, it is usually easy to establish that the defendant is responsible for the publication. However, a defendant may be able to establish that they were not responsible for the publication where they have been "manipulated" by a third party.

For example, an organisation is unlikely to be liable if a person pins a defamatory note on its notice board, without the organisation's knowledge. But if the organisation fails to remove the material after becoming aware of it, then it may be said to have endorsed or adopted the defamation.

In the present case, the Court found that Google was not responsible for the publication of the defamatory material in its search results. Google's indexes are compiled automatically and are not reviewed by anyone before publication; the snippets appear automatically when users search on certain words. The Court stated:

"It is fundamentally important to have in mind that [Google] has no role to play in formulating the search terms. Accordingly, it could not prevent the snippet appearing in response to the user's request unless it has taken some positive step in advance. There being no input from [Google], therefore ... it cannot be characterised as a publisher at common law. It has not authorised or caused the snippet to appear on the user's screen in any meaningful sense. It has merely, by the provision of its search service, played the role of a facilitator".


The key factor is the absence of any input from Google. If Google had personnel who actively reviewed its listings and compiled the snippets, the situation would likely have been different.

It is important to note that the Court said that Google "could not prevent the snippet appearing". Strictly speaking, Google could have prevented the snippet from appearing, for example by disabling that functionality altogether. However, that would have been a highly intrusive and harmful standard to impose. Instead, the Court implicitly accepted that there was no reasonable way for Google to prevent the snippet from appearing without significant inconvenience and cost. This, together with the express acknowledgement of Google's status as a "[mere] facilitator", could provide important defences in future cases.

Does notice matter?

The next issue considered by the Court was whether the fact that Google had been put on notice about the defamatory snippet (and possible further defamatory content) meant it had "assumed responsibility" for any further publication of such material.

The Court drew a distinction between website hosts and search engines such as Google, whose results are generated dynamically:

"A search engine ... is a different kind of Internet intermediary. It is not possible to draw a complete analogy with a website host. One cannot merely press a button to ensure that the offending words will never reappear on a Google search snippet: there is no control over the search terms typed in by future users. If the words are thrown up in response to a future search, it would by no means follow that [Google] has authorised or acquiesced in that process."


This line of reasoning is equally applicable to sites that operate forums, user ratings and the like: it is not (reasonably) possible to ensure that defamatory words will never appear (or reappear) on such sites through the actions of users.

Effect of a "notice and take down" procedure

The Court also considered the relevance of Google's "notice and take down" procedure, whereby users can request that certain links and text be removed from its index.

The plaintiff had formally requested that the defamatory material be "taken down". Google subsequently removed some material and blocked some URLs. However, there was some delay between the request to remove the material and the material actually being taken down. It was argued that, during this delay, Google had authorised or acquiesced in the publication of the defamatory material of which it had notice.

The Court said:

"It may well be that [Google's] ‘notice and take down' procedure has not operated as rapidly as [the plaintiff] would wish, but it does not follow as a matter of law that between notification and ‘take down' [Google] becomes or remains liable as a publisher of the offending material. While efforts are being made to achieve a ‘take down' in relation to a particular URL, it is hardly possible to fix [Google] with liability on the basis of authorisation, approval or acquiescence".


The result was that Google was successful in striking out the plaintiff's claim.

The New Zealand position

The Google case is consistent with an earlier New Zealand decision, Sadiq v Baycorp (31 March 2008, High Court, Auckland). That case involved a similar scenario, in which incorrect and allegedly defamatory credit information was automatically published on a website. The Court said that before the website operator could be held responsible for publishing the defamatory material:

"... there must be some action that amounts to a promotion of, or ratification of, the continuing presence of the defamatory material on the website".


An additional relevant point is the defence of "innocent dissemination" available in New Zealand under section 21 of the Defamation Act 1992. If a defendant is found to have been responsible for publishing defamatory material (unlike in the two cases discussed), then this section can provide a defence if the defendant did not know the material was defamatory (and was not negligent in failing to notice). This defence could be used, for example, by an email marketer who sends out a letter prepared by a client without realising that the letter contained defamatory material.

Because the Courts in both the Google and Sadiq cases found that the defendant was not responsible for publication (and had not "authorised" it), it was not necessary to consider the defence of innocent dissemination.

The Google decision is useful because it confirms the general approach taken by the New Zealand Court in Sadiq, but applies it to a broader set of facts. Because the common law and statutory law of defamation in the two countries are similar, the Google decision is likely to be taken into account when a similar case arises in New Zealand.

Key lessons

The Google case (and to a lesser degree the Sadiq case) confirms that website operators can gain a robust level of protection against defamation liability by following some key precautions:

  1. If your website publishes third-party content (e.g. forums, search results of other sites, user ratings, etc), ensure that you are not seen as "assuming responsibility" for that content. In practice, this can include not exercising editorial control over articles and comments, although that will not always be possible or appropriate.
  2. If editorial control is exercised, ensure that the risks of defamation are understood and considered when doing so.
  3. Ensure there is a process for properly handling complaints and requests to remove material. It is not essential to have a formal "notice and take down" process (any form of effective communication is likely to put you on notice), but having a formal process may improve the handling of complaints. Removing offending material within a reasonable time of being put on notice reduces the prospect of a defamation claim succeeding.
  4. If possible, require your users to indemnify you for third-party claims arising from their use of your site. This is appropriate because it is not possible to use your website terms and conditions to exclude liability to third parties for defamation.

Both of these cases, and the Google case in particular, provide pragmatic and useful clarification of defamation law as it relates to websites.

Both cases also reconfirm that where a website operator is, or should be, aware of defamatory material being posted on its site (or "shuts its eyes"), it may be deemed to have "authorised or acquiesced" in making the defamatory statements.


James Carnie
Principal
james.carnie@clendons.co.nz

Guy Burgess
Senior Associate
guy.burgess@clendons.co.nz

Clendons
PO Box 1305
Auckland
New Zealand
Phone: +64 9 306 8000

This article by its nature cannot be comprehensive and cannot be relied on by clients as advice. It is provided to assist clients to identify legal issues on which they should seek legal advice. Please consult the professional staff of Clendons for advice specific to your situation.