The GDPR and the right to be forgotten: myth or reality?

The new General Data Protection Regulation (“GDPR”) was published in the Official Journal of the European Union on May 4th, 2016 and will come into force in 2018.

It formalizes the right to oblivion (“right to be forgotten”), or right to erasure of personal data, in a full article devoted to it (Article 17). The GDPR reinforces this right, which has received extensive media coverage since the judgment of the European Court of Justice (“ECJ”) of May 13th, 2014 regarding Google Spain (1). In this judgment, the ECJ established, among other things, that search engines are data controllers within the meaning of the 1995 Directive 95/46/EC on personal data protection and are therefore subject to its requirements, including the right to erasure of data. This right can be implemented after weighing the protection of individual privacy on the one hand against the public’s right to access and share the information on the other hand.

The right to be forgotten is indeed fundamental to the respect of privacy, because information on the web is difficult to erase and can have adverse consequences for an individual. It is not about allowing the systematic deletion of all data, or falsifying records and changing the past (e.g., photo retouching under Stalin’s regime): the new Regulation of course reserves cases where data are needed for historical, statistical or scientific research, for public health reasons, or for exercising freedom of speech. Nor does the right to oblivion apply when holding personal data is necessary for the conclusion of a contract or when required by law. Apart from these cases, the aim is to avoid living under the weight of a constant reminder of the past, a phenomenon amplified by digital environments.

However, critics are already dismissing the right to be forgotten as a purely cosmetic measure that brings nothing new compared to the provisions of the 1995 Directive, and regretting a missed opportunity to go further. Yet the GDPR explicitly states several provisions strengthening the right to be forgotten, as we will see in this article (I). We will then examine whether the doubts raised about the interpretation of those provisions are justified (II).

I The new provisions of the GDPR on the right to be forgotten

Not only is there a dedicated article for the right to be forgotten, but there are also several other provisions to reinforce its effects.

A / A new article on the right to be forgotten articulated with a strengthened right of opposition

Article 17 lists six cases in which any person may obtain from a data controller, without undue delay, the erasure of personal data. Recall that Directive 95/46/EC mentioned data erasure only as a corollary of the right of access, in Article 12§b: “Member States shall guarantee every data subject the right to obtain from the controller: as appropriate the rectification, erasure or blocking of data the processing of which does not comply with the provisions of this Directive, in particular because of the incomplete or inaccurate nature of the data.”

It is now specified that this right can be exercised in cases where:

  1. The data are no longer necessary in view of the purposes for which they were collected.
  2. The consent on which the data collection was based is withdrawn (consent being required, for instance, for sensitive data).
  3. The person exercises the right to object “for reasons relating to his or her particular situation.” The controller must then demonstrate the existence of compelling legitimate grounds in order to refuse. In cases of commercial prospecting and the profiling related to it, the right to object can be exercised at any time.
  4. The data has been unlawfully processed.
  5. The data involve a minor.
  6. The erasure is under a legal obligation (e.g., new law or court decision).

This list of cases is intended to clarify the data erasure policy that the European Parliament wants to promote. For some authors, the statement of an exhaustive list of cases of the right to erasure is a regression compared to Article 12 of Directive 95/46/EC, which simply invoked the opportunity to request deletion of data in case of non-compliance with the Directive. Beyond its value in raising awareness of the practical applications of this right in everyday life, it does not appear that the perimeter is reduced: firstly because the list is quite comprehensive in itself, and secondly because it allows erasure whenever the right to object is granted, a right which has been strengthened by the new Regulation.

The right to object reinforces the right to be forgotten

Indeed, the right to be forgotten can be invoked as a consequence of the right to object (Article 21), which has been revised and extended. This right states that “The data subject shall have the right to object, on grounds relating to his or her particular situation, at any time to processing of personal data concerning him or her […], including profiling […]. The controller shall no longer process the personal data unless the controller demonstrates compelling legitimate grounds for the processing which override the interests, rights and freedoms of the data subject […].” We can see here an inversion of the burden of proof: in Directive 95/46/EC, Article 14 stated that the person concerned has the right “to object at any time on compelling legitimate grounds relating to his particular situation to the processing of data relating to him […]. Where there is a justified objection, the processing instigated by the controller may no longer involve those data.” Previously, the person concerned had to provide a legitimate justification to exercise the right to object; now the data controller has to provide a legitimate justification to turn down the request. By default, the objection is valid, which leads to the validation of the erasure request, unless the data controller can find a valid reason to oppose it. We know that the typical overriding legitimate ground is respect for the public’s right to access information. Admittedly, this text will not solve the abuse of this argument by search engines, but it seems illusory, in any case, to hope for a legal formulation that would replace the case-by-case analytical work done by the courts.

Improvement in processing times

Recital 59 of the GDPR further states, for all claims relating to the rights of access, rectification, erasure or blocking, that “The controller should be obliged to respond to requests from the data subject without undue delay and at the latest within one month and to give reasons where the controller does not intend to comply with any such requests”. Currently, website data controllers are allowed a two-month delay to process such requests. As for search engines, their procedures mention no time commitment.

B / The case of erasing profiling data is taken into account by the GDPR

The case of profiling activities and personal data collected for such purposes is also handled by the GDPR. This is extremely important, since most of our personal data available on the web is not the result of conscious and voluntary data recording but of the complete tracing of web browsing activities by Internet intermediaries. Those data volumes are huge and represent the new Eldorado of the 21st century: the Eldoradata.

Recital 71 of the GDPR gives the definition of profiling: “any form of automated processing of personal data evaluating the personal aspects relating to a natural person, in particular to analyse or predict aspects concerning the data subject’s performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements, where it produces legal effects concerning him or her or similarly significantly affects him or her.”

Two scenarios are detailed in the new Regulation: profiling for marketing purposes on the one hand and profiling for other purposes on the other hand.

Cases of profiling for marketing purposes

In those cases, the protection is drastic: anyone can request data deletion without having to justify compelling legitimate grounds (Article 17§c, which refers to the right to object of Article 21§2). In doing so, the new Regulation will avoid the legal contortions in which the CNIL (the French Data Protection Authority) currently engages to fight the uncontrolled use of data by social networks or data brokers. Thus, in its latest formal notice sent to Facebook on February 8th, 2016 (whose developments we await with interest, the company’s response having been sent to the CNIL on August 10th), the CNIL indicates that while the combination of personal data for advertising purposes is mentioned in the social network’s conditions of use, this is insufficient in its view. The combination of different data is not strictly required by the “contract” between the user and the social network, and therefore requires separate approval from the user. We note that the CNIL rejects the argument of legitimate interest: according to the Authority, Facebook cannot bypass this express consent by arguing that displaying advertising serves the user’s interest, because this interest is too weak and the data collection too intrusive.

Cases of profiling for other purposes

The person may request deletion (Article 17§c, which refers to the right to object of Article 21§1), but the data controller may refuse on the ground of a compelling legitimate interest. This notion of legitimate interest of the data controller is defined in recital 47 and could serve as a legal basis for processing: “provided that the interests or the fundamental rights and freedoms of the data subject are not overriding, taking into consideration the reasonable expectations of data subjects based on their relationship with the controller.” This legitimate interest can be invoked in particular “such as where the data subject is a client or in the service of the controller.” But “the interests and fundamental rights of the data subject could in particular override the interest of the data controller where personal data are processed in circumstances where data subjects do not reasonably expect further processing.” A social network wanting to rely on its legitimate interest should thus be obliged to be more transparent about its many uses of personal data, which are difficult for the non-informed user to guess. Beyond the terms and conditions of service that no one reads, a social network should clearly inform the user of the many ways it intends to use his or her data, and these uses should not be enabled by default.

C / The “resale right” of the right of erasure

Article 17§2 details the responsibilities of the first data controller in case of dissemination of data to other data controllers. It has the obligation to transmit the deletion request it had to implement to all data controllers that use the data. It must do so by taking “reasonable steps”, “taking account of available technology and the cost of implementation.” The secondary data controllers must then erase any link to, or copy or reproduction of, such data. This requirement improves on the one stated in the 1995 Directive. Firstly, because there is now an obligation to propagate the erasure request by default, and data controllers will have to demonstrate, where appropriate, that doing so is technically unreasonable; Directive 95/46/EC put the emphasis on the technical difficulties (Article 12§c): “unless this proves impossible or involves a disproportionate effort.” Secondly, because the transmission of the erasure request is explicitly stated: “the controller […] shall take reasonable steps […] to inform controllers which are processing the personal data that the data subject has requested the erasure by such controllers of any links to, or copy or replication of, those personal data.” Previously, under Directive 95/46/EC, it was not clear whether the “secondary” data controllers were simply informed of the request or had to implement it themselves.

But how can this propagation of the erasure request be implemented effectively? If the publisher’s website at the origin of the information is obliged to delete the offending content, it must transmit the request to all web players having linked to its content or reproduced it on their sites. It can either remove or update the web page hosting the offending content, block access to the content by protecting it with a password, or use special entries in the site’s robots.txt file (a file at the root of a website which lists the site resources that search engine indexing robots are not supposed to index). Thus, the next time links are updated, the relevant content will not be picked up. This seems an effective solution for search engines. But as the robots.txt file is readable by anyone, it should not list overly explicit URLs (some contain the name of the person and the title of the scandal), to avoid the erasure request becoming public, which could produce the opposite of the desired effect (the “Streisand effect”). One idea might be to anonymize URLs using a hash function, but it would require an evolution of search engines.
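To make the idea concrete, here is a minimal Python sketch (the URL and the `Disallow-Hash` directive are hypothetical) contrasting a conventional robots.txt exclusion line with the hashed variant suggested above. The standard Robots Exclusion Protocol has no hashed-entry mechanism, which is precisely why search engines would need to evolve for this to work.

```python
import hashlib

def robots_disallow_line(url: str) -> str:
    """Build a conventional robots.txt Disallow line for the URL's path."""
    path = url.split("/", 3)[-1]  # drop the scheme and host
    return f"Disallow: /{path}"

def hashed_disallow_line(url: str) -> str:
    """Hypothetical variant: hide the sensitive URL behind a SHA-256 digest.
    Standard robots.txt has no such directive; crawlers would have to hash
    each candidate URL and compare digests, hence the needed evolution."""
    digest = hashlib.sha256(url.encode("utf-8")).hexdigest()
    return f"Disallow-Hash: {digest}"

url = "https://example.com/2014/scandal-john-doe.html"  # hypothetical URL
print(robots_disallow_line(url))  # Disallow: /2014/scandal-john-doe.html
print(hashed_disallow_line(url))  # Disallow-Hash: <64 hex characters>
```

The plain line exposes the sensitive path to anyone reading the file; the hashed line reveals nothing, but only a crawler that hashes its own URLs could honor it.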

Beyond search engines, any website relaying the offending content should be updated. Publishers could be required to periodically check whether or not the links they use have been “blacklisted”. In that case, not only should the link be deleted, but any content copied by the website should also be deleted or updated. Again, changes are necessary because this kind of monitoring is rare.
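As a sketch of what such monitoring could look like (the blacklist, URLs and function names are all hypothetical), a publisher could periodically compare its outbound links against a list of delisted URLs, stored as hashes so that publishing the list does not itself reveal the sensitive addresses:

```python
import hashlib

def url_digest(url: str) -> str:
    """SHA-256 digest of a URL, so a shared blacklist does not
    itself expose the sensitive addresses."""
    return hashlib.sha256(url.encode("utf-8")).hexdigest()

# Hypothetical blacklist of delisted URLs, stored as digests only.
BLACKLIST = {url_digest("https://example.com/2014/scandal-john-doe.html")}

def audit_outbound_links(links):
    """Return the outbound links a publisher should delete or update
    because they point to delisted content; a periodic job could run this."""
    return [link for link in links if url_digest(link) in BLACKLIST]

links = [
    "https://example.com/2014/scandal-john-doe.html",
    "https://example.com/2016/harmless-article.html",
]
print(audit_outbound_links(links))  # only the first link is flagged
```

A scheduled job running such a check would automate exactly the monitoring the paragraph above notes is rare today.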

D / The emergence of a new complementary right: the right to restriction of processing

A new right to restriction of processing is defined in the GDPR (Article 18). Article 4 defines it as “the marking of stored personal data with the aim of limiting their processing in the future.” As recital 67 explains, this clause allows the use of data by the data controller to be suspended. On the web, this means temporarily turning off access to the data. The deactivation lasts for the time necessary to verify the person’s request, in the following cases:

  • inaccurate data,
  • unlawful processing,
  • the data controller no longer needs the data, but they are still necessary to the person concerned for the establishment, exercise or defense of legal claims,
  • the right to object has been exercised, during the time the data controller checks whether its own compelling legitimate grounds prevail over those of the individual concerned.

In practice, this clause can be interpreted as an opportunity to obtain temporary suspension of access to one’s data as soon as a request for erasure has been made and for the duration of the application process, even if the request is ultimately denied. Recital 67 gives several indications of technical measures that could be taken for this purpose. This arrangement is very effective for a person using the right of erasure wisely: she will not have to bear a waiting period that may seem very long in certain situations, even though the GDPR has limited it to one month.

This new clause, as interesting as it is for the person exercising it wisely, could however lead to a number of abuses. Indeed, while today a majority of delisting requests are declined by search engines, they will now have to first accept the restriction of processing for each request. It is easy to imagine the misuse of this right, and the violation of freedom of speech, when the request concerns the site editor at the origin of the information. But we can also consider that this clause will have a positive effect and push controllers to reduce processing delays in order to rule out abusive claims and restore access to information. At what cost, though?

E / Data shelf life

The GDPR states a new obligation concerning data retention periods: data controllers will have to indicate the expected duration of data retention when they first collect the data (Article 13). This also contributes to raising the various parties’ awareness of the notion of digital oblivion.

Detractors of the right to be forgotten use precisely the data shelf life argument to demonstrate that the new Regulation brings nothing new. Certainly, this concept has existed in France since 1978 and was codified in Directive 95/46/EC, imposing de facto a right to have data disappear beyond a certain time limit. But that provision only appeared as a corollary of the definition of the shelf life, in the same way as the right of erasure was only stated as a corollary of the right of access.

F / Very dissuasive financial penalties

Finally, whatever the interpretation of the new right to be forgotten, an important element weighs in its favor: the maximum amount of the fines defined in the new Regulation. Article 83§5 punishes breaches of the rights stated under Articles 12 to 22 with an administrative fine of up to 20 million euros or, for a business, up to 4% of the total worldwide annual turnover of the preceding financial year, whichever is higher. In case of doubt, a data controller will likely hesitate to risk such sums when simply removing the offending content carries no risk and requires no energy spent on legal defense. Critics of the right to oblivion thus worry about freedom of speech and the role of censor that Internet intermediaries may play under such financial pressure. The day such fines are actually applied may, however, still seem distant, when we see that the French authority, the CNIL, has always used its own power of financial penalties sparingly, a power which is admittedly somewhat limited: fines are capped at €150,000, doubled in case of repeat offence. But given the current economic war between Europe and the United States, no doubt that threat could be very useful.

As we can see, the new European Regulation on personal data protection states a series of specific and detailed provisions to implement the right to be forgotten. However, for some observers, they are not explicit enough regarding the Internet intermediaries and therefore miss their main goal, which is to control those intermediaries’ use of personal data.

II How relevant are the doubts about the applicability of the right to be forgotten?

We remember that the 2014 ECJ judgment triggered panic among the supporters of freedom of speech on the net. We have seen, however, that the provisions of the new Regulation strengthen the scope of the right to be forgotten. Yet voices here and there denounce the mediocrity of its provisions and the triumph of the Internet intermediaries’ lobbies. Let us examine them.

A / What about the responsibility of search engines confronted with the right to be forgotten?

Some authors regret that search engines are not explicitly designated as data controllers in the GDPR. But the purpose of the Regulation is to remain relevant for as long as possible, and it seems inappropriate to list the categories of data controllers, which could prove restrictive in the future and prevent new players from being included (e.g., suppliers of Internet-connected objects). Moreover, search engines are far from the only players to link to content, since all websites may contain links to other sites.

Moreover, it seems difficult to imagine that the case law of the ECJ in the Google Spain case of May 2014 could be challenged on this point: since a search engine was clearly considered a data controller under Directive 95/46/EC, nothing in the GDPR should allow that decision to be reversed. The definitions that supported the ECJ’s reasoning on this subject have not changed in the GDPR and still apply to search engines, whose data processing activities are separate from and additional to those of website editors, and thus capable of affecting the fundamental rights of every person.

The only change in these definitions (of the terms “processing” and “controller”) is the addition of “structured data” (Article 4§2) to the definition of data processing, which concerns search engines among others.

Moreover, contrary to popular opinion, it is not proven that before the judgment of the ECJ in May 2014, it was impossible to consider search engines as data controllers within the meaning of Directive 95/46/EC. Indeed, the reference interpretation document of the “Article 29” Working Party is Opinion 1/2008 on data protection issues related to search engines, adopted on April 4th, 2008. Its conclusion is the following: “This balance can be found by distinguishing between the two different primary roles of search engine providers. In their first role, as controllers of user data (such as the IP addresses they collect from users and their individual search history), they are to be held fully responsible under the Data Protection Directive. In their second role, as providers of content data (such as the data in the index), generally they are not to be held as primarily responsible under European data protection law for the personal data they process. Exceptions are the availability of a long-term ‘cache’ and value added operations on personal data (such as search engines aimed at building profiles of natural persons). When providing such services, search engines are to be held fully responsible under the Data Protection Directive and must comply with all relevant provisions.” Rather than a restriction of the possibility to consider search engines as data controllers, this opinion shows that most of the world’s leading search engine providers are concerned, since they collect user search history and perform value-added operations on personal data, operations which are at the heart of their business model. Besides, the opinion also stated that in some EU member states, data protection authorities had already specifically regulated the obligation of search engine providers to remove content, based on the right to object under Article 14 of the Directive.

It therefore appears that delisting search results is an established way of implementing the right to be forgotten, one that fully applies to search engines.

B / Right to delist versus right to be forgotten?

Another criticism is the absence of a formal mention of the term “right to delist” in the new Regulation. Yet the words “erasure of any links” appear verbatim in Article 17. And delisting search results, which consists of deleting the links pointing to a primary source of information, is obviously one of the means of implementing the right to be forgotten. It is a technical tool, and as such it is not really relevant to mention it in a European Regulation. In short, to obtain the deletion of personal data, you can either:

  • Contact the website that originally used and published them: the primary data controller. If the request is valid, the website editor has the responsibility to remove or update the offending content (Article 17§1). In addition, it will have to transmit the request to erase the data (or the links referring to them) to the other data controllers (Article 17§2). This is of course the most effective approach for the individual, since she does not have to contact all the data controllers herself. An automatic erasure chain should thus spread across the web.
  • Or contact the secondary data controllers (search engines or other website publishers who have copied the information) to ask them to erase the copy of the data, or links to the primary source. They are indeed full data controllers, who are contacted under Article 17§1 and not as a result of the “resale right” of the right of erasure (Article 17§2).
  • Or do both requests in parallel.

In the first case, the data will be permanently deleted if the request is granted, but they will remain visible for several months through the search engines, for the time it takes them to update their links and caches.

In the second case, the data will remain accessible on the original site, and thus reachable from search engines through queries using keywords other than the person’s name. They may also remain accessible through other search engines which have not received the erasure request, or through other websites displaying the information. Still, when the erasure request is sent to the major web search engines, it produces a very effective result.

This is possible because even if the primary data controller has validly refused the request, nothing prevents the person from directing her request to the secondary data controllers. They may not be able to raise the same objections and may be forced to remove the data concerned, and therefore, in the case of search engines, to remove the links to the primary source. As the 2014 ECJ case law on the right to delist has shown, it is indeed valid to request delisting even if the erasure request to the primary data controller has been refused as contrary to the right to information. Even if the initial publication is lawful and must be preserved, the legitimate interest in processing by the search engine is not necessarily the same. The accessibility and dissemination of information by search engines may constitute a far greater interference with a person’s private life than the initial publication, which will quickly be relegated to the depths of the site’s archives.

The new Regulation now has extraterritorial reach (Article 3). It applies to the personal data of European citizens regardless of the location of the data controller or its subcontractors. It can be inferred that the data must be protected uniformly regardless of the modalities of access, including the geographical location from which they are accessed. This is the meaning of the battle the CNIL has waged against Google since 2015: the CNIL asks Google to enforce the right to delist regardless of the geographical extension of the domain through which users access the search engine. Indeed, the universality of the data processing performed by a search engine requires universal action to be truly effective. The new Regulation even seems to go further than the 2014 decision: Article 17§2 states that “the controller […] shall take reasonable steps, including technical measures, to inform controllers which are processing the personal data that the data subject has requested the erasure by such controllers of any links to, or copy or replication of, those personal data”. Searching for a person’s identity should clearly not return any links to the erased content, but what about other search queries? This remains to be clarified. Of course, searching by identity is the common way, but once the right to delist becomes common, why not go further? For example, wouldn’t the GDPR allow people to obtain the erasure of links returned by searching with keywords specific to the content? If the name of the person is associated with a scandal X that occurred on date Y in place Z, shouldn’t the links be removed for these three keywords as well? One can imagine the difficulty of the exercise, but the wording of the Regulation is broad enough not to exclude it.
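To illustrate the difference between the narrow reading (delisting only for name queries) and the broad reading (delisting for content-related keyword queries as well), here is a toy sketch; the in-memory index, the queries and all URLs are hypothetical, not any real search engine interface:

```python
# Toy "index": query string -> list of result URLs (hypothetical data).
index = {
    "john doe": ["https://example.com/scandal-x.html",
                 "https://example.com/john-doe-bio.html"],
    "scandal x 2014": ["https://example.com/scandal-x.html"],
}

DELISTED_URL = "https://example.com/scandal-x.html"

def delist(index, url, queries):
    """Remove a URL from the result lists of the given queries only."""
    for q in queries:
        if q in index:
            index[q] = [u for u in index[q] if u != url]

# Narrow reading: delist only for searches on the person's name.
delist(index, DELISTED_URL, ["john doe"])

# Broad reading: also delist for keyword searches tied to the content
# itself (the scandal's name and date). Left commented out here.
# delist(index, DELISTED_URL, ["scandal x 2014"])

print(index["john doe"])        # the delisted link is gone for name queries
print(index["scandal x 2014"])  # but still returned for keyword queries
```

The sketch makes the gap visible: under the narrow reading, the content remains one keyword query away, which is exactly the difficulty the broad reading would address.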

C / What about the responsibility of the social network providers on the right to be forgotten?

The real question concerns Internet intermediaries such as social networks. As there has not yet been an ECJ judgment similar to the one concerning search engines, interpretations of their submission to the GDPR are still many and varied. This may seem surprising since, obviously, a social network like Facebook already makes the effort to try to comply with the current Directive 95/46/EC. Even if the tools it deploys are criticized for their incompleteness or implementation difficulties, they already allow, for example, deleting an entire account or only some data from it. And because social networks are obviously subject to the requirements of Directive 95/46/EC, the CNIL was able to issue Facebook a formal notice for breaches of some of those requirements in January 2016. Moreover, if search engines are considered data controllers because they store data, organize them as part of their indexing programs, back them up on their servers, and make them available to users in the form of lists of search results, it is difficult to imagine how social networks could escape this definition. They also record, organize, and present data in different formats, and they too handle both personal and non-personal data.

The distinctive features of social networks

However, it is true that due to their very essence, social networks pose a real problem by allowing the manipulation of data by multiple actors. There are in fact several categories of personal data concerning a person A in a social network:

  • Data controlled and published by person A: some must be public because of the network configuration (e.g., the profile picture) and others are public by default, though the user can restrict them to a small group (this is however not always easy to set up, especially the default visibility of our publications to “friends” of our “friends”).
  • Data published by others (e.g., when person B publishes a photo of person A).
  • Data generated by the social network from A’s activity on the network or on other sites (tracing navigation through the use of cookies, allowing targeted advertising). The person is often unaware of this accumulation of her data, except when the ad targeting is particularly obvious (e.g., I have just searched for a vacation destination, and the social network floods me with hotel or flight advertisements for that destination in the following minutes). Note: one way to limit this is to use your browser’s private browsing mode, which reduces such tracing.

Deleting personal data, yes, but who is in control?

Having the right to erase data you recorded yourself on your social network account is both legitimate and already feasible; but propagating those erasures to other users’ accounts and controlling all the posts related to you is problematic, not to mention the lack of control over the massive collection of your browsing data.

While removing a publication from one’s account automatically propagates to friends’ accounts even if they have liked, shared or commented on the post, it is not the same if the “friend” has taken over the publication, saving the photo and republishing it without reference to the original publication. The same applies when a “friend” makes a publication of his own about another person. What, then, of the responsibility of the social network? The option of deleting content posted by a person B at the request of a person A seems to be excluded, as it involves freedom of speech, except in cases of criminal offences. In the logic of Directive 95/46/EC, one could imagine that any user of a social network becomes a controller of the data she collects, records and uses (by publishing it on the network), whether her own data or that of others, and that she is as such subject to personal data protection requirements. While recital 12 of the Directive already excluded data processing carried out in the context of personal activities, it cited correspondence and the holding of address books, activities quite unlike the use of a social network. This was the thesis developed by Dr. Rebecca Wong in an excellent 2008 article titled “Social Networking: Anybody is a Data Controller”. It seems, however, that this idea has not prospered and that complainants have preferred defamation actions to actions for non-compliance with the requirements of Directive 95/46/EC.

In the GDPR, this idea seems definitively removed since recital 18 states that “this Regulation does not apply to the processing of personal data by a natural person in the course of a purely personal or household activity and thus with no connection to a professional or commercial activity. Personal or household activities could include correspondence and the holding of addresses, or social networking and online activity undertaken within the context of such activities.” Social network users cannot be considered as data controllers within the meaning of the GDPR. Article 2 of the Regulation also states that: “This Regulation does not apply to the processing of personal data by a natural person in the course of a purely personal or household activity”.

Social networks are therefore data controllers in their relationship with each user, but not in the relationships between users.

Recital 18 goes on to state: "However, this Regulation applies to controllers or processors which provide the means for processing personal data for such personal or household activities." In other words, it applies to social network providers.

In doing so, this recital revisits the old recital 12 of Directive 95/46/EC and significantly strengthens it: the old recital did not mention social networks (which did not exist in 1995) and did not foresee their responsibility as data controllers.

This formally settles the question of whether social network providers are subject to the GDPR. The new Regulation must be interpreted in the light of its recitals, which explicitly mention the case of social networks. As regards the right to be forgotten, providers therefore have the responsibility to implement an adequate mechanism to meet the request of a person A to delete personal data she has provided, whether voluntarily or not (e.g. browsing traceability data).

However, the problem remains for personal data of a person A published by another person B on a social network. As B is a natural person, she will not be subject to the GDPR (other third parties, such as the corporate partners of the social network, are of course subject to the rules on the right to erasure). This user B is, however, bound by image rights and respect for privacy, just as she is today. Social network providers accordingly already offer means to process requests for the removal of personal data misused by third parties: reporting content that is abusive, hateful, defamatory or contrary to community standards; reporting breaches of privacy and image rights, harassment, or information concerning minors. Any user also has the possibility, on Facebook, to remove a tag identifying her in a picture.

Thus, outside of the cases detailed in the terms of service, it seems difficult to justify having personal data that a third party uses in a reasonable manner erased simply because they bother us.

The GDPR aims to regulate the use of personal data by companies and public organizations. It is not designed to govern relationships between individuals. The social network should therefore be considered a data controller subject to the GDPR in its individual relationship with each person using its services, but not in the links between the people who use it. Individuals using a social network as part of a personal activity cannot be subject to the requirements of the GDPR.

So how can we go further in protecting personal data from misuse by other social network users?

Ideally, social networks should play an awareness-raising role in the protection of personal data. For example, they could provide a formal process for users to request that other users delete their personal data, backed by the fear of possible legal action. They could also put in place sanctions against the most-reported users who do not respond to removal requests, for instance by temporarily banning them from the network. But this may not be in line with the legitimate interests of social networks and could, in any case, itself be subject to abuse.

Deep down, though, we may wonder whether the real problem lies in the information published by third parties. Apart from abusive cases, which can often already be reported, and from data collected for profiling purposes, which are now supervised by the GDPR, shouldn't everyone start by cleaning their own account of the data they have themselves recorded and published, which in many cases is the most harmful? The tools for this already exist: they provide a log of historical posts and allow removing all or part of the data. If the confidentiality settings of the account are well configured, the user can restrict visibility to her friends. With a little more customization, she can prevent "friends of friends" from having access to her publications. She can also prevent her account from being found through a search engine.

This analysis shows that the new EU regulation on the protection of personal data is an important step for the right to be forgotten, consolidating the contributions of the ECJ case law of May 2014 and offering European citizens new rights over their personal data. However, since legislation will always lag behind technological innovation, it is time to adapt the education of citizens to the new digital challenges. Learning to use the tools for managing personal data should become part of the school curriculum as soon as possible, so as to address the problem at its source and give children survival reflexes in the digital environment. In this area as elsewhere, prevention is better than cure, and the right to be forgotten should probably start with forgetting oneself a little.

(1) ECJ, May 13th, 2014, Google Spain SL and Google Inc. v. Agencia Española de Protección de Datos (AEPD) and Mario Costeja González, case C-131/12

All articles in this blog are the exclusive property of the author. Any reproduction (except for brief quotation specifying the source and author) without the express permission of the author is prohibited.
