Right to be forgotten

The ‘right to be forgotten’ has been a hotly debated issue in recent years, and lies at the centre of discussions involving freedom of expression and privacy in the digital age. This note will attempt to outline what this right entails and what issues could arise in its practical application. Currently, the question of the application of this right is largely restricted to European Union (EU) member states, and this discussion will be limited to that context.

 

Background

The ‘right to be forgotten’ gained prominence with a 2014 ruling of the Court of Justice of the European Union (CJEU) on a matter referred to it by a Spanish court. In that case, Mario Costeja González objected to Google searches of his name continuing to return results linking to an auction notice concerning his repossessed home. González claimed that Google’s continuing to make available in its search results an event from his past, which had long since been resolved, was a breach of his privacy. He filed a complaint with the Spanish Data Protection Agency (AEPD, in its Spanish acronym) to have the online newspaper reports about him, as well as the related search results appearing on Google, deleted or altered. While the AEPD did not agree to his demand to have the newspaper reports altered, it ordered Google Spain and Google, Inc. to remove the links in question from their search results. The case was brought on appeal before the Spanish High Court, which referred the matter to the CJEU. In a judgment with far-reaching implications, the CJEU held that where information is ‘inaccurate, inadequate, irrelevant or excessive,’ individuals have the right to ask search engines to remove links containing personal information about them. The court also ruled that these rules apply even where the physical servers of the search engine provider are located outside the jurisdiction of the relevant EU Member State, so long as the provider has a branch or subsidiary in that Member State.

 

Rationale

The ‘right to be forgotten’ is something of a misnomer: when we speak of it in the context of the proposed EU laws, we are essentially referring to the right of individuals to seek the erasure of certain data that concerns them. The basis of what has now evolved into this right is contained in the 1995 EU Data Protection Directive, Article 12 of which allows a person to seek deletion of personal data once it is no longer required.

Critical to understanding the rationale for how the ‘right to be forgotten’ is being framed in the EU is an appreciation of how European laws perceive the privacy of individuals. Unlike in the United States (US), where privacy may be seen as a corollary of personal liberty protecting against unreasonable state intrusions, European laws view privacy as an aspect of personal dignity, and are more concerned with protection from third parties, particularly the media. The most important way in which this manifests itself is in where the burden to protect privacy rights lies. In Europe, privacy policy often dictates intervention from the state, whereas in the US it is in many cases up to individuals to protect their own privacy.

Since the advent of the Internet, both the nature and the quantity of information existing about individuals have changed dramatically. This personal information is no longer limited to newspaper reports and official or government records. Our use of social media, micro-discussions on Twitter, photographs and videos uploaded by us or by others tagging us, every page or event we like, favourite or share: all contribute to our digital footprint. Once we add the information created not by us but about us, by public and private bodies storing data about individuals in their databases, our digital shadows begin to far exceed the data we create ourselves. It is abundantly clear that we exist in a world of Big Data, which relies on algorithms tracking the repeated behaviour of our digital selves. It is in this context that a mechanism enabling the purging of some of this digital shadow makes sense.

Further, it is not only the nature and quantity of information that have changed, but also the means through which that information can be accessed. In the pre-Internet era, access to records was often made difficult by procedural hurdles: permissions or valid justifications were required to access certain kinds of data, and even for information in the public domain the process of gaining access was often far too cumbersome. Now, digital information not only continues to exist indefinitely, but can also be readily accessed through search engines. It is in this context that, in a 2007 paper, Viktor Mayer-Schönberger pioneered the idea of memory and forgetting for the digital age. He proposed that all forms of personal data should carry an additional piece of metadata, an expiration date, switching the default from information existing endlessly to information having a temporal limit after which it is deleted. While this may be a radical suggestion, we have since seen proposals to allow individuals some control over information about them.
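By way of illustration only, the sketch below shows what such an expiration-date default might look like in code. The record structure, field names and purge routine are assumptions made for this example, not part of Mayer-Schönberger’s proposal or of any actual system.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical record type: every item of personal data carries an
# expiration date as metadata, so deletion becomes the default outcome.
@dataclass
class PersonalRecord:
    subject: str
    content: str
    expires_at: datetime

def purge_expired(store, now=None):
    """Keep only the records whose expiration date has not yet passed."""
    now = now or datetime.utcnow()
    return [record for record in store if record.expires_at > now]

store = [
    PersonalRecord("data subject A", "old auction notice", datetime.utcnow() - timedelta(days=1)),
    PersonalRecord("data subject B", "current profile page", datetime.utcnow() + timedelta(days=365)),
]
store = purge_expired(store)
print([record.content for record in store])  # only the unexpired record survives
```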

In 2012, the EU released a proposal to unify data protection across Europe under a single regime, the General Data Protection Regulation. The regulation, which is still under consideration, provides for a right to erasure under Article 17, which would enable a data subject to seek the deletion of data. Notably, except in the heading of the provision, Article 17 makes no reference to the word ‘forgetting.’ Rather, the right made available in the regulation takes the form of making possible ‘erasure’ and ‘abstention from further dissemination.’ This is significant because what the proposed regulation provides for is not an overarching framework to enable or allow ‘forgetting,’ but a limited right which may be used to delete certain data or search results. Providing a true right to be forgotten would pose issues of interpretation as to what ‘forgetting’ might mean in different contexts and the extent of the measures data controllers would have to employ to ensure it. The proposed regulation attempts to provide a specific remedy which can be exercised in defined circumstances without having to engage with the question of ‘forgetting.’

The primary arguments against the ‘right to be forgotten’ stem from its conflict with the right to freedom of speech. Jonathan Zittrain has argued against the rationale that, because the right to be forgotten merely alters results on search engines without deleting the actual source, it does not curtail freedom of expression. He has compared this altering of search results to letting a book remain in the library while making its catalogue entry unavailable. According to Zittrain, a better approach would be to allow data subjects to provide their side of the story and more context to the information about them, rather than allowing any kind of erasure. Unlike in the US, the European approach is to balance free speech against other concerns. So while one of the exceptions in sub-clause (3) of Article 17 provides that information may not be deleted where it is necessary for the exercise of the right to free speech, free speech does not completely trump privacy as the value that must be protected. US constitutional law, on the other hand, would tend to give more weight to First Amendment rights and allow them to be compromised only in very limited circumstances. As per the position of the US Supreme Court in Florida Star v. B.J.F., the publication of lawfully obtained information may be restricted only in cases involving a ‘state interest of the highest order.’ This position would allow any potential right to be forgotten to be exercised only in the most limited of circumstances, and privacy and reputational harm would not satisfy that standard. For these reasons, the right to be forgotten as it exists in Article 17 may be unworkable in the US.

 

Issues in application

Significant practical challenges remain in the effective and consistent application of Article 17 of the proposed regulation. One key issue concerns how ‘personal data’ is defined and understood, and how its interpretation will affect this right in different contexts. Under the proposed regulation, the term ‘personal data’ includes any information relating to an individual. Some ambiguity remains about whether information which does not uniquely identify a person, but identifies them as part of a small group, falls within the scope of personal data. This becomes relevant, for instance, where one seeks the erasure of information which, without referring to an individual, points towards a family. At the same time, the piece of information a person seeks to erase may often contain personal information about more than one individual. There is no clarity over whether a consensus among all the individuals concerned should be required and, if not, on what parameters the wishes of one individual should prevail over those of the others. Another important question, as yet unanswered, is whether the same standards for removal of content should apply to private individuals and to those in public life.

The question of what constitutes personal data, and can therefore be erased, becomes further complicated in the case of derived data about individuals used in statistics and other forms of aggregated content. While it would be difficult to argue that the right to be forgotten needs to extend to such forms of information, not erasing such derived content poses the risk of the primary information being inferred from it. In addition, Article 17(1)(a) provides for deletion where the data is no longer necessary for the purposes for which it was collected or used. The circumstances which satisfy this criterion are, as yet, unclear and may only be fully understood through consistent application of the law.

Finally, even once there are reasonable grounds to seek the erasure of information, it is not clear how that erasure will be enforced in practice. It may not be prudent to require that all copies of the impugned data be deleted beyond recovery, even where this is technologically possible. A more reasonable solution might be to permit the data to remain available in encrypted form, much as certain records are sealed and made subject to the strictest confidentiality obligations. In most cases, it may be sufficient to ensure that records of the impugned data are removed from search results and database reports without actually tampering with the information as it exists at the source. These are some of the challenges that the practical application of this right will face, and it is necessary to take them into account in enforcing the proposed regulations.
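To make the distinction between erasure at the source and removal from search results concrete, the following toy sketch de-indexes a document while leaving the underlying record intact. The index structure and function names are assumptions made purely for illustration; no search engine’s actual machinery is being described.

```python
# Illustrative only: a toy inverted index in which 'erasure' means removing
# a document from search results while leaving the underlying source intact.
documents = {
    "doc1": "auction notice concerning a repossessed home",
    "doc2": "unrelated news report",
}

# Build a simple inverted index: term -> set of document ids.
index = {}
for doc_id, text in documents.items():
    for term in text.lower().split():
        index.setdefault(term, set()).add(doc_id)

def deindex(doc_id):
    """Remove a document from the index; the source document itself is untouched."""
    for doc_ids in index.values():
        doc_ids.discard(doc_id)

def search(term):
    return index.get(term.lower(), set())

deindex("doc1")
print(search("auction"))    # empty set: no longer discoverable through search
print("doc1" in documents)  # True: the source record still exists
```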
