Does intermediary liability stifle online free speech?
There is generally more free speech now than ever before. This is largely due to the ease with which the public can have a voice on the Internet. Social media, for example, encourages people to share content with their friends and (depending on their privacy settings) everyone else online.
If a person uploads false and defamatory material, they are potentially liable. However, authors are not the only parties who may be liable. Intermediaries (e.g. website owners and ISPs) may also be held responsible for user-generated content ('UGC') appearing on their sites.
In the US, the law (under section 230 of the Communications Decency Act 1996) generally gives immunity to website operators where the defamatory material was entirely provided by a third party, unless the website is ‘responsible for the creation or development of information provided through the Internet’ – i.e. unless the website had participated in creating the defamatory content.
EU hosting exemption
The European hosting defence (under Article 14 of the E-Commerce Directive) provides that a host is generally not liable for users' information which it is merely storing. Although the point is somewhat untested, a 'host' may include a website operator or an ISP. To benefit, the host:
- must not have actual knowledge of the unlawful activity or be aware of facts or circumstances from which the unlawful activity is apparent; and
- if the host obtains such knowledge or awareness (e.g. if it is notified by a claimant), must act expeditiously to remove or disable access to the material.
Thus, if the website has no knowledge of the defamatory statement, or if, once it is notified of the statement, it promptly takes it down, the defence should protect the host from a damages claim (but not an injunction, which is not covered by the exemption).
In July 2011, the EU Court of Justice gave guidance (in L’Oréal v eBay, a trade mark infringement case) about when a website operator can benefit from the exemption. It held that the defence is only available to an operator which acts as a neutral service, merely processing users' data in a technical and automatic way. The defence will fail where the operator plays an active role which gives it knowledge of, or control over, the data (for example, if eBay helps its users by optimising and promoting their listings). The defence will also fail where the operator is aware of facts or circumstances which would have led a ‘diligent economic operator’ to identify the presence of unlawful activity on its site, but does not take the material down.
UK Defamation Act
In England, a website operator/ISP may also benefit from a defence under section 1 of the Defamation Act 1996 if it proves that:
- it was not the author, editor or publisher of the statement complained of,
- it took reasonable care in relation to its publication, and
- it did not know, and had no reason to believe, that what it did caused or contributed to the publication of a defamatory statement.
In Godfrey v Demon Internet (decided in 1999), the ISP Demon failed to remove defamatory comments from a newsgroup after they had been notified to it, and was therefore unable to rely on the defence. The scope of the section 1 defence is fairly untested, but it is likely to apply to unmoderated UGC where the website has taken reasonable care, has not encouraged the allegedly defamatory content, and has taken it down promptly when notified.
Free speech alert!
The hosting exemption and the section 1 defence encourage website operators and ISPs to take down material as soon as they spot a potential legal issue or receive a complaint. This is because they will normally not want the expense and hassle of proving a substantive defence (e.g. by proving the truth of the allegations). This is usually a commercially sensible strategy, but it may sometimes unjustifiably restrict free speech (e.g. if the defamatory allegations are true, amount to honest comment, or are protected by privilege). In contrast, the position in the US generally allows websites to continue to publish potentially defamatory material, even after receiving a complaint.
UK defamation law may be reformed: a draft Defamation Bill is before Parliament. It is hoped that the reformed law will strike the right balance between the rights of free speech and reputation, so that hosts can have clear legal protection in relation to user-generated content without having to opt to take so much content down.
If you have any questions on this article please contact us.
Tim Pinto and Damian Simpson
User generated content boosts free speech, but carries legal risk for website hosts. They may well take it down, whatever the merits, if they receive a complaint. This helps claimants, but can lead to a stifling of free speech.
"It is hoped that the law strikes the right balance between the rights of free speech and reputation, so that hosts can have clear legal protection in relation to user generated content without having to opt to take so much content down."