Forget me not: Limits to the right of erasure
In 2015, the French data protection regulator, the CNIL, ordered Google to remove globally search result listings linking to pages containing damaging or false information about a person. The following year, Google introduced a geo-blocking feature that prevents European users from seeing delisted links, but it resisted censoring search results for people in other parts of the world.
Four years later, in judgments C-507/17 and C-136/17 GC, the Court of Justice of the European Union (CJEU) confirmed that Google’s interpretation of the scope and territorial limits of the ‘right to erasure’ is correct. Both Google and privacy groups consider the decisions a great win for freedom of speech.
The cases have clarified that there is no absolute obligation to remove a person’s information from search results. Any removal or ‘de-referencing’ request should be assessed by balancing the individual’s right to privacy against the public’s right to information. If the request is granted, the search engine has an obligation to implement measures that ‘effectively prevent’ or at least ‘seriously discourage’ access to links about the individual by users googling the individual’s name in the EU but, as the CJEU clarified, there is no obligation under EU law to carry out de-referencing that would have effect outside the EU.
So what is the legal nitty-gritty of these cases?
What is RTBF?
The right to erasure (or right to be forgotten, “RTBF”) was established in the 2014 Google Spain case C-131/12. It allows individuals to make a de-referencing request on the basis of their fundamental rights to privacy and to the protection of personal data.
The balancing exercise
The controller will need to carry out a balancing act; for the RTBF to apply, the individual’s right to privacy must outweigh any other applicable rights or interests.
As a general rule, an individual’s right to privacy will prevail over any economic interest of the search engine or the right to information of the general public, except where the interference with the individual’s fundamental rights is justified by the general public’s ‘preponderant interest’ in having access to that information for particular reasons (for example, because of the person’s role in public life).
However, the CJEU noted that the result of this balancing exercise may vary from member state to member state and may be subject to derogations under the GDPR.
How can a search engine determine if a “substantial public interest” outweighs an individual’s RTBF in relation to special categories of personal data?
In case C-136/17 GC, the CJEU clarified the application of the de-referencing obligation to special categories of personal data. This was a joined case where various public figures complained about the news coverage of criminal and religious allegations made about them.
The CJEU clarified that search engines should apply the following criteria to determine whether a substantial public interest outweighs an individual’s RTBF:
- Separate legal basis for processing: The CJEU reminded us that a search engine is a controller in relation to compiling search results, or the ‘verification’ it carries out on the data subject’s request. Although the extent of its responsibilities and obligations is more limited than that of a publisher, the search engine has separate controller obligations because its processing ‘is liable significantly to affect the data subject’s fundamental rights to privacy and to the protection of the personal data’. This means that the search engine’s obligations do not depend on the publisher’s lawful basis of processing, on any exemption applicable to the publisher, or on the publisher’s compliance (or failure to comply) with a de-referencing request. Instead, the search engine can rely on the ‘manifestly made public by the data subject’ and ‘substantial public interest’ grounds for its processing of special categories of personal data, presumably in conjunction with its legitimate interests.
- Consider the sensitivity of the information: Further, the CJEU held that information relating to judicial investigations, trials or ensuing convictions is data relating to “criminal convictions and offences” regardless of the outcome of the proceedings, and similar rules will apply. Even though such data is not special category data, the CJEU considered that its processing is ‘liable to be particularly serious because of the sensitivity of those data’, so the search engine must consider ‘substantial public interest’ when faced with a de-referencing request.
- Seriousness of interference: On receiving a de-referencing request concerning special categories of personal data, the search engine must consider the ‘seriousness of the interference’ having regard to the reasons of ‘substantial public interest’ and to whether the references to the person are “strictly necessary” for protecting the freedom of information of internet users who may be interested in past as well as present events.
- Consider the relevance of the information and the impact on the individual’s private life: The result of this exercise will be dictated by the nature of the information, its sensitivity for the individual’s private life and the interest of the public in having that information. The same applies to data relating to criminal convictions and offences. The search engine must consider whether the information may be out of date but also the public life of the individual, their past conduct, the public’s interest at the time of the request, the content and form of the publication and the consequences of publication.
- Prioritize current information: In any event, the search engine must adjust the list of results to give priority to links that refer to the current state of the investigations and proceedings.
Territorial scope and the principle of proportionality
In C-507/17, the CNIL considered Google’s implementation of geo-blocking insufficient. The feature switches the user to his or her local search engine domain according to IP address, even if the user types in, for example, www.google.co.jp. The CNIL argued, although unsuccessfully, that there are still ways for individuals to use other domains and that, in any event, whatever domain is used, the search results are drawn from common databases and indexing.
The CJEU disagreed. It balanced the GDPR’s objective to afford a high level of data protection to individuals against the principle of proportionality, a general principle of EU law. In this regard, the CJEU considered that the individual should be afforded protection in the EU which is his or her “centre of interests” where any interference with fundamental rights would have “immediate and substantial effects” on the individual. However, it would be excessive to implement de-referencing in non-EU countries which may not recognise such right or which may have a different approach to freedom of information of internet users.
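The mechanics at issue can be illustrated with a minimal sketch. This is purely hypothetical code (the names `EU_COUNTRIES`, `DELISTED`, `country_from_ip` and `filter_results` are invented for illustration, not Google’s actual implementation): delisted links are suppressed only for users whose IP address resolves to an EU country, regardless of which country domain they typed in.

```python
# Hypothetical sketch of IP-based geo-blocking for de-referenced links.
# All names and data below are illustrative assumptions, not a real system.

EU_COUNTRIES = {"FR", "DE", "IE", "NL"}  # abbreviated list for illustration

# Granted de-referencing requests: name searched -> URLs to suppress
DELISTED = {
    "jane doe": {"https://example.com/old-article"},
}

def country_from_ip(ip: str) -> str:
    """Stand-in for a real GeoIP lookup service or database."""
    demo_map = {"203.0.113.5": "FR", "198.51.100.7": "JP"}
    return demo_map.get(ip, "US")

def filter_results(query: str, results: list, client_ip: str) -> list:
    """Suppress delisted links only for users located in the EU,
    whichever search domain they used to reach the service."""
    if country_from_ip(client_ip) not in EU_COUNTRIES:
        return results  # no de-referencing obligation outside the EU
    suppressed = DELISTED.get(query.lower(), set())
    return [url for url in results if url not in suppressed]
```

The key design point, and the nub of the dispute, is that the filter keys off the user’s location rather than the domain: an EU user typing www.google.co.jp would still see the filtered list, while a user outside the EU would see the unfiltered one.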
Paradoxically, the CJEU noted that whilst the law did not require a search engine to implement de-referencing outside the EU, the law did not prohibit such practice and a regulator or court “remains competent” in balancing the rights to conclude that the search engine operator should carry out de-referencing on all domains.
These cases remind us of the objectives of the GDPR and the overarching principle of proportionality which will rein in any excessive interpretation of the GDPR.
They also highlight the need for parties to balance competing fundamental rights, particularly in relation to any excessive requests made by individuals exercising their data protection rights. In the case of negative publicity, the natural starting point is the individual’s overriding right to privacy, unless the controller can show that the individual’s public life gives rise to a public interest in receiving information about him or her.
Conversely, when balancing competing legitimate interests, businesses often start from the premise that the controller’s processing is unlikely to affect the individual’s right to privacy and that, absent any excessive processing, it should be allowed. These cases suggest it may be safer to start from the premise of the individual’s overriding right to privacy and build the legitimate interests case from there. Documenting that analysis and the reasoning behind the conclusions will be essential for accountability.