INTRODUCTION
On May 13, 2014, the European Court of Justice (ECJ) rendered its now well-known Google Spain decision, in which it deduced from European data protection law a right for individuals to obtain the removal of search results displaying personal information that has become out-dated or irrelevant in such a way as to harm their privacy rights.1 Though this right has come to be known as the “Right to be Forgotten”, it is in actuality closer to a “Right to be Delisted”, being limited to search results appearing when searching for a person’s name.2
Today, the EU Right to be Delisted (RTBD) stands at a crossroads in two ways. First, the French Conseil d’Etat, following a years-long dispute between Google and the French data protection authority, the Commission Nationale de l’Informatique et des Libertés, has referred questions to the ECJ for a preliminary ruling, seeking much-needed guidance on the implementation of the right and, in particular, on whether removal of search results should be performed locally in the EU or globally. Second, the recent entry into force of the General Data Protection Regulation (GDPR) raises the issue of whether and how its Article 17, which contains a “Right to Erasure / Right to be Forgotten”, should affect current delisting practices.
The present contribution seeks to detail the impact of both of these developments upon the future of the RTBD in the European Union. In particular, it will be highlighted that, due to the lack of guidance on both procedural and substantive matters, the current implementation model of the RTBD, based on private ordering, lacks the means to scale. The scope expansions contemplated by these two recent developments therefore run the risk of leading the RTBD towards a practical breakdown. With this finding in mind, the contribution will propose an alternative implementation model inspired by existing ADR systems.
The paper will begin with a short presentation of the legal concept known as the “Right to be Forgotten”, and will subsequently set out the details of the Right to be Delisted as formulated by the ECJ in its Google Spain decision. Each of the two aforementioned developments will then be analysed in turn, leading to a broader assessment of their possible impact upon the future of the Right to be Forgotten in Europe. The author’s submission will be detailed in the conclusion.
THE CONCEPT OF THE RIGHT TO BE FORGOTTEN
The term “Right to be Forgotten” refers to a legal concept resting on the premise that one should not be indefinitely held to past mistakes, misdeeds, or embarrassing situations. Its scope of application covers many different kinds of situations: a young adult starting her professional career wanting pictures depicting her teenage party-going years to be removed from social media; a start-up CEO trying to prevent the press from discussing her past failed ventures so as to attract investors; an ex-convict wanting to start her life anew without being constantly reminded of her past misconduct.3 Rather than being seen as a right to be forgotten in the literal sense, the right should instead be understood as an acknowledgement of society’s capacity for forgiveness and empathy regarding past mistakes. In the specific context of online communications, the Right to be Forgotten has been depicted as a right to “digital redemption” or at least as a much-needed protection against the Internet’s capacity to act as an always available archive of human memory.4
The exercise of the Right to be Forgotten is traditionally balanced against the fundamental rights of free speech and of the public’s right to information;5 for instance, in the case of the ex-convict mentioned above, her right to start a new life may, depending on the severity of her past offence, be limited by the need for those within her new social circle to be informed and kept safe.
Expressions of the Right to be Forgotten have existed in different shapes and forms in the domestic legal systems of the Member States of the European Union, in both the online and the offline context and through various means such as personality rights (e.g., defamation law) and data protection laws.6 Given, however, that privacy laws have yet to be harmonised across the Member States of the European Union7 aside from a handful of general guidelines laid down by the jurisprudence of the European Court of Human Rights,8 there is currently no unified response to the issues raised by the Right to be Forgotten in Europe.9
GOOGLE SPAIN AND THE EU RIGHT TO BE DELISTED
It is in this context that the ECJ deduced from European data protection law a specific expression of the Right to be Forgotten in its landmark Google Spain decision of May 13, 2014.
The case concerned Mario Costeja González, a Spanish national, who, in 1998, was forced to sell off assets in a public auction so as to recover social security debts. Ten years later, while searching for his own name on Google, he found that the top search result led to a notice advertising that auction, which the local newspaper La Vanguardia had published at the time and which remained available in its online archive.10 Finding that the display of such out-dated and embarrassing information as a top result harmed his personality rights, especially now that he was in better financial shape, he applied to have both the notice removed by La Vanguardia and the search result removed by Google.
To ground his claim, Mr Costeja González relied on the principles contained in Articles 12 (b) and 14 § 1 (a) of the 1995 Data Protection Directive.11 These provisions grant a so-called right to erasure, which allows data subjects, i.e., persons whose data is processed, to object to and to obtain the rectification, erasure or blocking of instances of processing of personal data deemed unlawful under the Directive. This is particularly the case where the data controller, i.e., the person responsible for the processing, no longer needs the data for the purpose of its processing, for instance when the client of a given business requests that his or her customer file be erased from the business’ records.
Mr Costeja González’ request was rejected by the Spanish Data Protection Authority, the Agencia Española de Protección de Datos, because the publication of the auction notice by that newspaper was mandated by Spanish law and was thus privileged under Article 7 (c) of the Directive. However, that authority, considering that Google enjoyed no such privilege as regards the display of its search results, ordered the search engine to take down the offending search result.12
The latter conclusion was controversial insofar as it was not clear whether the right to erasure granted by Articles 12 (b) and 14 § 1 (a) of the Directive included the right to request the takedown of initially lawful, yet out-dated, data; in other words, whether the right to erasure of data included a Right to be Forgotten.13 Thus, Google appealed the decision before the Audiencia Nacional, which then made a request to the ECJ for a preliminary ruling, asking the Court inter alia whether the Directive granted such an extended right to erasure.14
In its decision, the ECJ considered that the right to erasure also covered out-dated data given that “even initially lawful processing of accurate data may, in the course of time, become incompatible with the directive where those data are no longer necessary in the light of the purposes for which they were collected or processed. That is so in particular where they appear to be inadequate, irrelevant or no longer relevant, or excessive in relation to those purposes and in the light of the time that has elapsed”.15 Furthermore, the Court recognised that search engines played a “decisive role” in the public’s perception of individuals as they render the dissemination of personal information “accessible to any internet user making a search on the basis of [a] data subject’s name, including to internet users who otherwise would not have found the web page on which those data are published”.16
The ECJ thus ruled that search engines were, upon request, bound to remove any search results displayed upon searching a name and leading to “inadequate, irrelevant or no longer relevant” personal data. It further ruled that such delisting requests were to be individually collected, and examined by the search engines themselves, in particular so as to balance the issuer’s right to privacy against the public’s right to information in each individual case.17 Finally, the Court conferred upon the Data Protection Authorities of each Member State (DPAs) a supervisory role over the search engines’ delisting practices, notably providing them with the authority to receive appeals where a search engine refuses to honour a delisting request.18
On November 26, 2014, the Article 29 Data Protection Working Party, the advisory body established under the Data Protection Directive,19 published a series of non-binding guidelines on the RTBD.20 Among several clarifications on the personal and material scope of the right, the Working Party set out a basic list of delisting criteria.21
Though the RTBD has since been implemented by all major search engines, including Bing,22 Google,23 and Yahoo!,24 much of the current discussion surrounding the topic has focused on the standards and practices of Google, a fact which is unsurprising given the latter’s dominance in the search engine market. As reported by Patrick Teffer, the vast majority of link deletion requests in 2015 were filed with Google and not with competing search engines.25 And though Google regularly publishes a transparency report containing statistics and general information about its practices,26 it does not report its individual decisions. Furthermore, only around 2% of Google’s refusals have been met with an appeal to a national DPA, and case law has indeed been scarce.27 In this sense, it might not be an exaggeration to say that the main judge and architect of the RTBD is, as of today, Google itself.28 Though this de facto centralisation has been criticised as providing the company with too much power,29 it has had the benefit of stabilising the practice surrounding the RTBD, a much-needed development given the lack of guidance on the part of both the ECJ and the Working Party.30
However, that stability is now disturbed by two challenges, which look to expand the scope of the RTBD further.
TWO DEVELOPMENTS
The Google Inc. case and the geographical scope of the RTBD
The first challenge awaiting the RTBD concerns a pending request for a preliminary ruling filed before the ECJ by the French Conseil d’Etat, made in the context of an on-going litigation process involving the French DPA, the Commission Nationale de l’Informatique et des Libertés (CNIL), and concerning the controversial —and as yet unresolved— issue of the scope of the delisting obligation imposed by the RTBD.
The background of the dispute is as follows: in its Google Spain decision, the ECJ, as mentioned above, ruled that search engines were “obliged to remove” search results displaying “inadequate, irrelevant or excessive” information. However, the Court failed to specify whether offending search results had to be removed locally —i.e., only for internet users located in the European Union— or globally.
Lacking guidance on this point, Google, when it launched its search removal form back in May 2014, adopted a distribution scheme based on top-level domains (TLDs), ensuring that only search results displayed by the European versions of its service, such as google.de or google.fr, complied with the RTBD. This implementation model, however, did not lock users to their location of access through geolocation, meaning that European users only needed to type a foreign version of Google, such as google.com or google.ca, into their address bar in order to access a full list of results and thus circumvent the RTBD.31
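In purely illustrative terms, the difference turns on what triggers the filter. The following minimal sketch is the author’s own simplification, not Google’s actual implementation; all names, domains and data structures are hypothetical. It shows a TLD-based scheme in which delisted results are removed only when an EU-facing version of the service is queried, leaving the .com version unfiltered:

# Illustrative sketch only: a simplified model of a TLD-based delisting scheme;
# all names, domains and data are hypothetical and do not reflect any search engine's actual code.

EU_DOMAINS = {"search.example.fr", "search.example.de", "search.example.es"}

def backend_search(query: str) -> list[str]:
    # Placeholder standing in for the underlying search index.
    return ["https://news.example/auction-notice", "https://other.example/profile"]

def search(query: str, serving_domain: str, delisted_urls: set[str]) -> list[str]:
    """Remove delisted URLs only when an EU-facing version of the service is queried."""
    results = backend_search(query)
    if serving_domain in EU_DOMAINS:
        return [url for url in results if url not in delisted_urls]
    # A European user who simply types the .com address bypasses the filter entirely,
    # which is the circumvention problem described above.
    return results

# Example: the same query, delisted on the French domain but not on the .com domain.
delisted = {"https://news.example/auction-notice"}
print(search("mario costeja gonzalez", "search.example.fr", delisted))   # filtered
print(search("mario costeja gonzalez", "search.example.com", delisted))  # unfiltered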
The Art. 29 Working Party Guidelines, published a few months later, characterised Google’s practice as insufficient and further interpreted the Google Spain ruling as requiring a global delisting scheme:
The ruling sets […] an obligation of results which affects the whole processing operation carried out by the search engine. The adequate implementation of the ruling must be made in such a way that data subjects are effectively protected against the impact of the universal dissemination and accessibility of personal information offered by search engines when searches are made on the basis of the name of individuals.
Although concrete solutions may vary depending on the internal organisation and structure of search engines, de-listing decisions must be implemented in a way that guarantees the effective and complete protection of these rights and that EU law cannot be easily circumvented. In that sense, limiting de-listing to EU domains on the grounds that users tend to access search engines via their national domains cannot be considered a sufficient means to satisfactorily guarantee the rights of data subjects according to the judgment. In practice, this means that in any case de-listing should also be effective on all relevant domains, including .com.32
Google, however, doubled down on its practice. Indeed, its chosen implementation model for the RTBD had in the meantime been approved by the Advisory Council on the Right to be Forgotten, an independent group that Google appointed for the purpose of shaping its delisting practices and which was composed of high-profile scholars, entrepreneurs and stakeholders, including United Nations Special Rapporteur Frank LaRue, Le Monde’s editorial director Sylvie Kauffmann and German Parliament representative Sabine Leutheusser-Schnarrenberger.33
In its final report issued on February 6, 2015, the Advisory Council concluded that the interests of users located outside the European Union in accessing lawful content in their own jurisdiction, along with those of European users in having foreign search engine results available, outweigh the need for “absolute protection of a data subject’s rights”.34 Furthermore, the Advisory Council noted that Google.com already automatically redirected users to their local search results and that “over 95% of all queries originating in Europe are on local versions of the search engine”. With the approval of its independent expert panel, Google thus soldiered on with its domain name-based scheme.
Things started to stir again in May 2015, when the CNIL initiated formal proceedings against Google, requesting that search results be delisted globally. This prompted the company to change its ways by locking users into their ‘home’ version of Google through the use of geolocation technologies. This effort, however, was deemed unsatisfactory by the CNIL, which, on March 24, 2016, imposed a €100,000 fine on the search engine.35
Google appealed against the decision before the Conseil d’Etat. It also publicly criticised the CNIL’s position through an op-ed written by its Senior Vice President Kent Walker and published in the French newspaper Le Monde.36 The piece, stating that Google had already complied with the Google Spain decision to the best of its abilities, argued that the CNIL in essence ordered that “its interpretation of French law protecting the right to be forgotten should apply not just in France, but in every country in the world” (emphasis original) and that, in view of the principle of comity between nations, such order “could lead to a global race to the bottom, harming access to information that is perfectly lawful to view in one’s own country. For example, this could prevent French citizens from seeing content that is perfectly legal in France”.37
By decision of July 19, 2017, the Conseil d’Etat found that the case raised significant issues of interpretation of the Google Spain ruling, and referred three preliminary questions to the ECJ.38 It first asked whether the RTBD required search engines to delist offending search results globally, or whether it sufficed that the search results be delisted locally, i.e., only when generated for users located in the European Union. If the ECJ were to opt for a local, EU-specific delisting scheme, the Conseil d’Etat alternatively asked whether such a scheme was to be enforced by locking users into their home jurisdiction by means of geolocation technologies, or whether a softer TLD-based approach, such as Google’s initial implementation model, was acceptable.39
Given the CNIL and the Working Party’s clear signals in favour of a global delisting scheme, it would not be too surprising to see the ECJ declare such a scheme mandatory. However, Google’s criticism of this approach rings true insofar as the adoption of that option would have the practical effect of extending the reach of the RTBD, and thus of EU data protection law, to situations in which it is geographically inapplicable. By doing so, it would offend the principle of comity between states.40
The second option submitted to the ECJ, i.e., a regional delisting obligation enforced by means of geolocation technologies, should be preferred. This option would respect the fact that content targeted by the RTBD may be held to be lawful in other jurisdictions. As regards effectiveness, circumvention of such a scheme would only be possible through the use of dedicated technical means, such as a VPN connection or a proxy service. This would be sufficient to safeguard the stated goal of the RTBD, which is not the takedown of the offending publication but the partial delisting of that content so as to prevent its wide dissemination to the public.41 In this sense, a recent ruling of the Spanish Audiencia Nacional seemed to opt in favour of this option.42
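For comparison with the TLD-based sketch above, the following, equally hypothetical, sketch illustrates a geolocation-based scheme in which the filter is triggered by the apparent location of the user rather than by the domain queried. The ip_to_country() lookup is a placeholder of the author’s own, and a VPN or proxy would alter the result it returns, which is the residual circumvention route mentioned above:

# Illustrative sketch only: a simplified model of geolocation-based regional delisting;
# ip_to_country() is a hypothetical placeholder, not a real library call.

EU_COUNTRIES = {"FR", "DE", "ES", "IT"}  # abbreviated list, for illustration only

def ip_to_country(ip_address: str) -> str:
    # Stand-in for a GeoIP lookup; a VPN or proxy changes the apparent IP address
    # and therefore the country returned here.
    return "FR" if ip_address.startswith("192.0.2.") else "US"

def search(query: str, client_ip: str, delisted_urls: set[str]) -> list[str]:
    """Remove delisted URLs for any request originating from the EU, whichever domain is used."""
    results = ["https://news.example/auction-notice", "https://other.example/profile"]
    if ip_to_country(client_ip) in EU_COUNTRIES:
        return [url for url in results if url not in delisted_urls]
    return results

# Example: the same query filtered for an EU address, unfiltered for a non-EU one.
delisted = {"https://news.example/auction-notice"}
print(search("mario costeja gonzalez", "192.0.2.10", delisted))   # filtered (FR)
print(search("mario costeja gonzalez", "203.0.113.7", delisted))  # unfiltered (US)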
That said, it should also be recognised that this discussion is, for all intents and purposes, nothing new. The question of the geographical scope of removal of unlawful content online has been a thorny issue in cyber law going all the way back to the seminal 2000 Yahoo! case, in which two French organisations sought the removal of auctions of Nazi memorabilia, which are prohibited under French law, from the website of the US-based company. In that case, the French Tribunal de Grande Instance de Paris, aware of the potential negative extra-territorial effects of a global injunction for the removal of that content, instead obliged Yahoo! to implement a regional, geolocation-based filtering scheme.43 Thus, it is surprising to see the French CNIL deviate from this precedent by requiring that Google delist its search results globally.
The entry into force of the GDPR
On May 25, 2018, the General Data Protection Regulation entered into force, creating a unified legal framework for data protection in the European Union and thus displacing the Data Protection Directive. Among many innovations, the GDPR includes a “Right to Erasure / Right to be Forgotten” (RTBF), which is contained in Articles 17 and 19.44
In contrast to the limited nature of the RTBD as instituted by the ECJ in Google Spain, Article 17 of the GDPR foresees a general right of erasure of personal data, enforceable against data controllers at large, in particular when “the personal data are no longer necessary in relation to the purposes for which they were collected or otherwise processed” and when “the data subject withdraws consent on which the processing is based […] and where there is no other legal ground for the processing”.45 The material scope of Article 17 of the GDPR may thus be broader than that of the RTBD, as Article 17 purports to apply to any instance of online personal data, thus going well beyond search results. Its personal scope may consequently be broader as well, as the burden of erasure would be placed upon a wider range of online intermediaries, including social media, news websites, content providers and hosting providers.46 Furthermore, paragraph 2 of Article 17, along with Article 19, notably provides for an enhanced obligation to notify the erasure to downstream data controllers, i.e., third parties to which the data targeted for erasure has been disclosed by the data controller, or which may reasonably be foreseen as processing said data for their own purposes.
According to this reading, the Right to be Forgotten in the GDPR should thus replace and/or expand the existing regime instituted by the Data Protection Directive and Google Spain. However, whether this will actually be the case in the coming months remains unclear.47
On the one hand, the discussions conducted around the time of the GDPR’s drafting suggest that its Right to be Forgotten was meant to apply broadly to internet actors rather than remain confined to search engines. While the privacy issues posed by search engines were most certainly mentioned during the discussions surrounding the drafting of the Regulation, not least because the Google Spain case was pending at the time, they were only mentioned as one example amongst others, such as Facebook.48
On the other hand, the legislative intent behind the RTBF of the GDPR may be unhelpful precisely because the provision was drafted before the ECJ’s decision in Google Spain and did not account for the subsequent establishment of the RTBD. It may thus be argued that Google Spain, as it is currently applied by search engines, is an iteration of the GDPR’s initial proposal, and that the entry into force of the Regulation does not significantly affect its implementation model. Indeed, given that the RTBD is still subject to refinement, as evidenced by the pending case mentioned above regarding its geographical scope, it seems unlikely that the slate will be wiped clean altogether by the entry into force of the GDPR.
Yet, there is a distinct lack of guidance regarding which changes, if any, are actually brought about by the transition between the two regimes. In this context, it is telling that while the entry into force of the GDPR has seen online intermediaries and businesses at large scramble to update their privacy policies and consent mechanisms,49 major social media players such as Facebook, Flickr and LinkedIn have yet to implement specific procedures for RTBF claims. In contrast, the RTBD continues to be applied by search engines as if no regime change had occurred.
RISK OF A PRACTICAL BREAKDOWN
As shown by the two challenges described above, the future of the RTBD remains uncertain. While its current formulation may yet remain in place, that model may just as easily be displaced by a broader Right to be Forgotten with an expanded personal and material scope. In other words, the practical implications of the RTBD could shift from the regional delisting of search results by search engines to the global takedown of content by all kinds of online services.
In the latter case, however, it is submitted that the current implementation model of the RTBD, which is based upon the individual compliance of search engines and which has taken shape mostly through Google’s ongoing practice, is not ready for such an expansion, as it lacks the means to scale.
For instance, regarding procedural matters, there are no mechanisms in place to ensure the consistency of outcomes across multiple search engines. It is unclear whether such consistency is even desirable in the first place.50 Furthermore, search engines currently decide upon delisting requests ex parte and based solely on the content of each submitted request. While such a procedure is already arguably deficient in terms of due process,51 it would clearly be insufficient in the context of requests for takedown of actual content rather than mere search engine delisting, as such practice would run afoul of established principles regarding content takedown such as the Manila Principles on Intermediary Liability.52
Regarding substantive matters, the criteria for delisting, as established by the Working Party’s guidelines, leave much room for discretion to individual search engines.53 In addition, there is no unified methodology for ascertaining which substantive Member State law applies to the privacy and free speech elements of any given case.54 As such, predicting whether any link is subject to removal is already a difficult exercise under the current regime, a situation that does not bode well for a possible expansion of the RTBD.
As detailed by Daphne Keller, the entry into force of the GDPR does not solve these issues. The erasure procedure foreseen by its Articles 17 and 19 relies heavily on individual and independent private ordering; it provides data controllers with extensive discretion when deciding on each individual erasure request and offers no safeguards to promote consistency of decisions or compliance with the principle of due process.55
There is thus a tangible risk that any expansion of the scope of application of the RTBD would result in a practical breakdown of its current implementation model.
PROPOSAL FOR A CENTRALISED ADR-STYLED MODEL
The above conclusion raises the question of how the RTBD can now be implemented in view of the new challenges it faces. The remainder of this piece will set out a number of potential solutions.
A first path would be to retain the current implementation model based on private decision-making and to correct its deficiencies. In this sense, Daphne Keller recommends that the procedural trappings of delisting and erasure be aligned with the existing rules of European law regarding intermediary liability, contained in Articles 12 to 15 of the eCommerce Directive, which provide for a notice and takedown procedure to curtail such liability.56 Failing that, she suggests that intermediaries take inspiration from these provisions along with other established standards for notice and takedown procedures, such as the Manila Principles on Intermediary Liability.57
Regarding standard setting and effective oversight, Jacques De Werra, coining the term ‘Massive Online Micro-Justice’, or ‘MOMJ’, suggests that a possible model for inspiration lies in the Uniform Domain Name Dispute Resolution Policy (UDRP) system, which, since 1999, has served as a centralised dispute resolution organ for domain name disputes. As applied to the RTBD, he envisages a two-step decision-making process in which intermediaries would be placed at the first step, with an independent alternative dispute resolution body having jurisdiction on appeal. That body would be composed of industry representatives and DPA officers. It would offer proper due process, and would have the additional tasks of overseeing the practice of intermediaries, punishing bad actors, and developing commonly agreed-upon criteria for adjudication.58
A second path would be to reconsider the RTBD’s implementation model as a whole. Among those advocating this approach, Eldar Haber is of the opinion that placing the burden of the RTBD on the metaphorical shoulders of internet intermediaries is a mistake in the first place, as such practice has the effect of providing private entities with the capacity to weigh in on issues that should only be decided by courts or designated official authorities offering sufficient guarantees of due process. His main proposal would see intermediaries entirely removed from the overall erasure process, with the task of adjudicating RTBD disputes entrusted either to DPAs and courts or to specially appointed administrative agencies.59
One solution worth exploring would be an expansion of the UDRP-style proposal put forward by De Werra. Instead of acting as an appeals mechanism, the alternative dispute resolution organ could be upgraded to the status of main centralised decision maker; in essence, the proposal would institute a truly international body offering requisite procedural safeguards. Requests for delisting and/or erasure would be made directly to that body, which would then decide on the issue and forward, if necessary, an order to all concerned intermediaries for the purpose of implementation. This model would handily solve the issues of scalability discussed above.
Such a model would also have the effect of freeing up resources on the intermediaries’ side, since they would no longer need to create or maintain an ad hoc dispute resolution system. That benefit could then be put towards the funding of the ADR body by imposing a payment obligation on intermediaries, on a proportional “polluter pays” basis.
It could still be argued that the proposed model does not solve all of these issues. Indeed, an independent EU-created body acting as sole adjudicator of delisting requests may be subjected to an enormous administrative burden. Additionally, such a model would not diminish the uncertainty surrounding the privacy aspects of the RTBD, which are still formally contained in Member State law. To address these objections, two responses come to mind. First, not all RTBD requests are complex cases involving the exercise of freedom of speech; it is thus possible to imagine the hypothetical body being tasked with adjudicating, and thus filtering out, all of the “easy” requests: for instance, those that present no prima facie case, those that address pure data protection issues, or those where the delisting criteria clearly point towards a solution. Second, the remaining handful of “hard” requests, those where the balancing exercise becomes tricky, could then be forwarded to a competent national DPA or court for proper examination.
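To make the proposed triage step more concrete, the following sketch is purely the author’s illustration and not an existing procedure; the categories, flags and routing labels are hypothetical simplifications of the delisting criteria discussed above:

# Purely hypothetical sketch of the triage step proposed above; the categories,
# flags and routing labels are illustrative simplifications, not an existing procedure.

from dataclasses import dataclass

@dataclass
class Request:
    has_prima_facie_case: bool      # e.g. the requester is identified and the link actually appears for their name
    raises_free_speech_issue: bool  # e.g. journalistic content or a public figure is involved
    criteria_clearly_met: bool      # the agreed delisting criteria point unambiguously one way

def triage(request: Request) -> str:
    """Route a delisting/erasure request within the hypothetical centralised body."""
    if not request.has_prima_facie_case:
        return "reject"                               # "easy" refusal, handled by the body itself
    if not request.raises_free_speech_issue or request.criteria_clearly_met:
        return "decide_directly"                      # "easy" case decided on the common criteria
    return "forward_to_national_dpa_or_court"         # "hard" case requiring a full balancing exercise

# Example: a pure data protection request is decided directly; a contested press article is forwarded.
print(triage(Request(True, False, False)))  # decide_directly
print(triage(Request(True, True, False)))   # forward_to_national_dpa_or_court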
CONCLUSION
Though the Right to be Delisted has been part of EU data protection law since the ECJ’s May 2014 Google Spain decision, the operational details of the exercise of that right have remained, at best, unsettled. It is striking indeed that it is only now, four years later, that such fundamental issues as the geographical scope of the delisting obligation are beginning to yield concrete answers.
In this sense, it does seem that much of the current arrangements surrounding the RTBD have been the result of a de facto reaction by industry actors to the Google Spain ruling rather than of a cohesive and principled effort to implement that ruling. The issues outlined throughout this contribution could thus be the symptoms of a more fundamental problem, namely that there is no unifying vision for the future of the RTBD. Is it about delisting? Is it about erasure? Which online actors are concerned? What procedural safeguards should be implemented?
Now that the GDPR is in force, these remaining questions will need to be settled. Furthermore, to avoid a foreseeable breakdown, the implementation model of the RTBD will need to stop relying on the convenience of Google as a de facto centralised authority and be developed around a consistent and stable model, such as the institution of an ADR-style body.