Platforms told by CJEU: Do delete the derogatory remarks… If you’re asked to
In its recent judgment in Eva Glawischnig-Piesczek v Facebook Ireland Limited (C-18/18), the Court of Justice of the European Union (CJEU) clarified that a hosting provider can be required to remove defamatory content, and equivalent content, on a worldwide basis. This does not mean, however, that hosting providers need to actively monitor and censor content.
What is the background to this case?
Having failed to resolve the matter in correspondence, in 2016 the Austrian MP and chair of the Green Party Eva Glawischnig-Piesczek filed for an injunction against Facebook to remove a user comment which the court agreed was harmful to the applicant’s reputation, finding that it “insulted and defamed her”. The injunction required Facebook to “cease and desist from publishing and/or disseminating photographs showing the applicant if the accompanying text contained the assertions, verbatim and/or using words having an equivalent meaning as” the initial harmful comment.
Facebook’s jurisdictional challenge, in which it claimed it was subject to the laws of California or Ireland, failed.
Facebook disabled access to the initial comment for users in Austria only. However, the Austrian first instance court went further and held that equivalent content (for example, content derived from the original defamatory remarks or that had been shared by other users) should also be deleted. This meant that Facebook would have to take steps to actively find the equivalent content on its platform. Facebook appealed.
On appeal, the court held that equivalent content should be removed only if brought to the attention of Facebook by the applicant or third parties. The appeal therefore clarified that Facebook did not have to actively find the equivalent content on its platform.
A further appeal considered whether equivalent content of which Facebook was unaware was also in scope of the injunction: that is to say, would Facebook only need to delete the defamatory content if it was brought to its attention, or does Facebook have an obligation to monitor content on its platform? As this raised matters of EU law, a reference was made to the CJEU.
According to the CJEU, does Facebook (or any other hosting provider) have an obligation to monitor content?
In its analysis, the CJEU considered two provisions of the E-commerce Directive 2000/31, which provide that:
- The hosting provider shall not be liable for content if it:
- A lack of knowledge: “does not have actual knowledge of illegal activity or information and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or information is apparent”; and
- Rapid remediation: “upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the information”.
- Hosting providers cannot be put under a general obligation to monitor information on their platform or to actively “seek facts or circumstances indicating illegal activity”.
In light of this, the CJEU noted that the objective of an injunction must be “effectively protecting a person’s reputation and honour” while not “imposing an excessive obligation on the hosting provider”. It followed that any obligation to search for content must not require the hosting provider to carry out an independent assessment of that content.
Is monitoring content banned altogether?
The CJEU did, however, note that imposing a monitoring obligation in a specific case is not prohibited. Accordingly, a court can legitimately require the hosting provider to remove any content that is identical to the offending content (for example, something that has been forwarded or shared) as part of a specific monitoring obligation.
What obligations do hosting providers have to find equivalent content?
The CJEU also considered the hosting provider’s obligations in relation to “equivalent content”, i.e. a post with a “message the content of which remains essentially unchanged” which “diverges very little from the [offending] content”. To prevent the injunction from being easily circumvented, the CJEU clarified, this must include content that is worded slightly differently, i.e. using other words or combinations of words.
It follows that the hosting provider can be required to carry out a specific monitoring obligation in respect of the elements set out in the injunction and the “defamatory content of an equivalent nature”.
The injunction will have to provide sufficient guidance on what combination of words and content should be searched and removed as equivalent content. In this regard, the court cautioned that illegality stems from defamatory statements and not from a combination of words.
Finally, the national court will also have to consider whether the hosting provider has recourse to automated search tools and technologies which would allow it to carry out the search without manual review of content.
Does this only apply in Europe?
The CJEU noted that there is no territorial limitation under EU law. This is consistent with the theme of the judgment which sought to ensure the effective protection of the reputation of those on the receiving end of defamation.
However, the CJEU reiterated the legislature’s desire to ensure compliance with international law expressed in the recitals of the E-commerce Directive. The CJEU held that for consistency in community law, due account must be taken of international rules when imposing worldwide takedown obligations.
What are the key takeaways of this case?
- Delete the defamatory data! – This case is good news for those defending reputation.
- Defining equivalent – watch this space – It will be important to provide sufficient detail in the injunction to ensure that any takedown process and subsequent monitoring effectively defeats any variations of the defamatory message and avoids the need for any parallel action. We will observe with interest where the courts draw the line between what is or is not “equivalent”. After all, the same sentiments can be captured by a large number of derogatory words.
- Risks to freedom of speech: The case could also have implications for freedom of expression. Commentators have noted that the removal of content not considered illegal in one country could be ordered just because it is considered illegal in another country. In this regard, the lower courts have described the comment in question as ‘hate speech’. This should serve as a sufficiently high threshold to prevent this case from being used to attack legitimate criticism on social media.
- No active censoring or monitoring by hosting platforms – The judgment does not mean, however, that hosting providers need to actively monitor and censor content.