
Your Facebook Data Just Got a Lot More Secure – Case Analysis C-362/14 Maximillian Schrems v Data Protection Commissioner

Dissecting the Safe Harbor Decision of the ECJ

[Note: For a broader overview on the topic look at our recently published Saar Blueprint by Oskar Josef Gstrein – Regulation of Technology in the European Union and beyond (10/15) which also covers the Schrems Case]

Kanad Bagchi[1]

“Privacy is not something that I’m merely entitled to, it’s an absolute prerequisite.” These words of Hollywood legend Marlon Brando, to the mind of the author, most aptly epitomize the Opinion of the Court in its Schrems decision (“Opinion”), delivered on 6 October 2015. Long-standing concessions regarding data processing and transfer between the European Union (“EU”) and the United States (“US”) were summarily dismissed in the face of competing claims to the right to privacy and data protection. The Court declared that Commission Decision 2000/520 (“Decision”), recognizing the equivalence of US data protection mechanisms, fails to ensure ‘an adequate level of protection’ for EU citizens, as mandated under Directive 95/46/EC (“Directive”), the EU’s principal data protection law. Further, the Court preserved the powers of a Member State’s national supervisory authority to admit and examine claims against the processing and transfer of data to third countries, irrespective of a finding by the European Commission (“Commission”) that a particular third country ensures an adequate level of protection. The Opinion is likely to disrupt data-intensive businesses in the EU and US, compelling authorities on both sides of the Atlantic to rework existing transfer arrangements. In other words, the Opinion is arguably the strongest response to Edward Snowden’s revelations of the extensive surveillance and monitoring activities undertaken by US authorities in the recent past, and has already been greeted with much fanfare amongst privacy activists and the like.

In the present post, the author dissects different aspects of the Opinion in an attempt to bring more clarity and coherence to EU data protection rules and the Commission Decision on ‘Safe Harbor’, so as to underline the obligations of EU and member state authorities arising out of the same. The post also speculates on the immediate implications of the decision for US and EU tech firms and considers the momentous task ahead of the respective authorities.

Maximillian Schrems’s tryst with Privacy

In the backdrop of Edward Snowden’s revelations concerning mass-scale Internet and phone surveillance conducted by the US National Security Agency, Mr. Schrems, an Austrian national, approached the Data Protection Commissioner in Ireland, insisting that Facebook Ireland be prohibited from transferring his personal data to the US. Schrems’s claim was rejected by the Commissioner on the grounds, inter alia, that he was precluded from advancing a plea of ‘inadequacy of protection’, as the EU Commission through its Decision had concluded otherwise. On appeal, however, the High Court reasoned that neither the Directive nor the Decision, when read in the light of both the Irish Constitution and the Charter of Fundamental Rights of the European Union (“Charter”), prevents national supervisory authorities from examining, in limine, a claim contesting the adequacy of protection afforded to personal data in a third country. Finding that the above enquiry involved questions relating to the interpretation of EU law, the High Court thought fit to refer the questions to the ECJ for a preliminary ruling.

EU Safe Harbor rules and its context

Directive 95/46/EC has a twin set of objectives underpinning data protection within the EU and beyond. First, it provides a framework for the processing of personal data by member states of the EU and lays down certain safeguards pertaining to the same. Second, in the interest of international trade and business, it acknowledges and prescribes a mechanism to ensure the free cross-border flow of personal data between EU member states and third countries. For the purposes of its second objective, with which the author is most acutely concerned, the Directive prescribes certain core principles (“safe harbor principles”) that ought to govern member state discretion in the transfer of personal data beyond EU borders. Article 25 of the Directive, inter alia, provides that a member state, in approving such transfer of personal data, is to satisfy itself that “…the third country in question ensures an adequate level of protection…” after considering all “…the circumstances surrounding a data transfer…” In this regard, if the Commission finds that a third country falls short of providing an ‘adequate level of protection’, member states ought to implement measures “…necessary to prevent any transfer of data of the same type to the third country in question…” Likewise, if the Commission finds that a third country ensures an ‘adequate level of protection’, member states are to take measures in pursuance of the same.

To ensure the proper implementation of the above-mentioned principles, the Directive calls for the establishment of independent national supervisory authorities (“supervisory authorities”) within each member state, endowed with an extensive set of powers. For instance, member states are to consult their respective supervisory authorities while formulating internal measures to give effect to the Directive. Further, such authorities have the power to investigate and access data pertaining to processing and transfer, to deliver opinions with respect to processing operations, and also the power, if not the obligation, to challenge through legal means before national courts the incorrect or improper implementation of the Directive by member state authorities. EU citizens may approach supervisory authorities and lodge claims “…concerning the protection of his rights and freedoms in regard to the processing of personal data…”, and have the right to be informed of the outcome of their claim. In essence, a whole gamut of responsibilities relating to supervising and monitoring the implementation of the Directive has been conferred on national supervisory authorities.

In pursuance of its powers under the Directive, the Commission adopted Decision 2000/520, certifying that the processes and mechanisms established by the US authorities ‘ensure’ an adequate level of protection of personal data transferred from the EU. In this regard, the Commission relied on a system of self-certification and public disclosure by organizations within the US of their intent and willingness to abide by the safe harbor principles. The framework for the above-mentioned process was implemented in accordance with the guidance provided by the US Department of Commerce through frequently asked questions. By way of derogation, however, the applicability of the safe harbor principles to US organizations could be circumscribed so far as is “…necessary to meet national security, public interest, or law enforcement requirements…”. It is important to note that the Decision was adopted in the year 2000, representing a state of affairs dating back fifteen years, and has remained unchanged since.

Ruling of the Court

The Court decided two sets of questions, namely, first, whether the powers of National Supervisory Authorities were constrained as a result of the Commission Decision on adequacy levels in the US and second, whether the Commission Decision was valid under extant rules of EU law.

At the outset, the Court observed that the Directive and its provisions ought to be interpreted in the light of the Charter, especially Articles 7 (privacy) and 8 (data protection), inasmuch as the processing and transfer of data is liable to intrude upon the Charter rights. Article 28(1) of the Directive therefore requires member states to establish independent supervisory authorities tasked with the mandate to monitor member states’ compliance with EU law. Towards that end, the Court noted, supervisory authorities derive their power and competence directly from the “…primary law of the European Union…” and operate independently of the Commission Decision. In the same breath, the Court determined that a Commission Decision adopted in pursuance of the Directive does not foreclose the power of a supervisory authority to examine claims relating to the processing of personal data. If, upon such examination, it appears that claims relating to the violation of Articles 7 and 8 of the Charter or the principles stated in the Directive are plausible, the supervisory authority ought to be in a position to challenge the same in the courts of the member state, which in turn ought to refer the question to the ECJ through the preliminary reference procedure. Thus, in effect, the Court ruled that a determination by the Commission of the adequacy or inadequacy of a third country regime in protecting the rights of the individual does not prevent supervisory authorities from entertaining claims pleading to the contrary.

Although the High Court did not specifically raise the question of the validity of the Commission Decision, the ECJ, after perusing the scheme of the safe harbor regime, concluded that “…until such time as the Commission decision is declared invalid by the Court, the Member States and their organs, which include their independent supervisory authorities, admittedly cannot adopt measures contrary to that decision…” Hence, it became imperative for the Court to examine the validity of the Commission Decision against both the requirements of the Directive and the Charter.

While the Directive allows the Commission to conclude that “…a third country ensures an adequate level of protection within the meaning of paragraph 2 of this Article, by reason of its domestic law or of the international commitments…” it admittedly does not define either the content or the standard in determining the adequacy of protection afforded by the third country’s regime. Under such circumstances, the Court reasoned that ‘adequate protection’ ought to mean ‘essentially equivalent’ if not ‘identical’ to the protection afforded to citizens in the EU.

In this regard, the Court found that the system of self-certification could constitute a reliable measure of adequacy only if it was backed by mechanisms to identify and punish errant US organizations. The Commission Decision, however, to the mind of the Court, did not contain “sufficient findings” with respect to any such mechanism employed by the US to ensure an adequate level of protection. Moreover, a turning point for the Court was its finding that the safe harbor principles governed, albeit voluntarily, the conduct of US organizations only, without any consequent binding effect on US public authorities. The Decision therefore admitted of the possibility of the safe harbor principles and their applicability being limited by state authorities in the interest of national security or the public interest. The Court further found that the Decision was silent with respect to specifying either any limits to such state interference or the existence of effective legal protection against the same. While EU law, interpreted in the light of the Charter and the Court’s prior rulings, limits state interference to what is “strictly necessary”, the Decision allows US authorities to store all personal data on a “generalized basis”. Such general collection and processing of data, without the possibility of an effective remedy, the Court declared, constitutes an infraction of the rights guaranteed under the Charter, including Articles 7, 8 and 47 (effective judicial protection), thereby affecting the validity of the Commission Decision.

In addition, the Court declared invalid Article 3 of the Decision in so far as it restricted the powers of the national supervisory authorities to entertain claims relating to the adequacy of protection enshrined under third country rules on data protection.

Further Comments

The Schrems decision mirrors Advocate General Bot’s opinion in most parts, barring some minor deviations (for a fuller enquiry, see here). In essence, the ECJ ruled that the present standard of protection afforded by the US does not match up to that of the EU, and hence US companies cannot be trusted with personal data belonging to EU citizens. Whether good wisdom prevailed on the Court depends on which side of the debate one finds oneself on. Indeed, the Court conveniently assumed certain regulatory and administrative features of the US system without providing US officials with the opportunity to be heard on the matter. Moreover, as pointed out, considerable reliance was placed by the Court on outdated Commission reports suggesting a less rigorous approach towards the preservation of individual liberties in the US. Extreme criticism has therefore been leveled against the Court for condemning an entire system of rules and regulations on the basis of presumptive evidence. Also, the Court’s insistence on a standard of ‘adequacy’ resting on “essentially equivalent” rather than “identical” protection is neither precise nor helpful, leaving much to judicial oversight and little to bureaucratic discretion. While the Court did not find the process of ‘self-certification’ to be inherently repugnant to the idea of equivalent protection, it nonetheless emphasized that such certification alone was inadequate in the absence of consequent enforcement. It begs the question whether the US authorities will now have to commission independent bodies, much like the national supervisory authorities in the EU, to constantly monitor the implementation of the safe harbor principles.

Where has the decision left State authorities and Private corporations?

The EU and the US were already negotiating a review of the Decision in the aftermath of the Snowden revelations, and it is reasonable to suggest that the Commission will have to seek more far-reaching commitments from the US authorities than previously estimated. Considering the differing standards of protection afforded to privacy rights in the EU and the US, a new agreement on the subject is likely to be long-drawn-out and tiresome. In the meanwhile, personal data transfer for US companies is definitely going to get more cumbersome and costly, as the process of transfer would be largely governed by 28 different sets of national rules on the subject, each presenting bureaucratic hurdles of varying degree. Although there are reports suggesting that certain companies, in anticipation of the decision, had already started reviewing their transfer policies, including considering options like model contract clauses and binding corporate rules, the situation is a far cry from ‘business as usual’, especially for small and medium enterprises.

That apart, the Opinion has received a favorable response from privacy activists and human rights groups, especially in the light of the Court’s insistence that mass surveillance and indiscriminate sourcing of personal data constitute a violation of the Charter rights. Further, as a result of the Opinion, supervisory authorities are likely to play a more active role in assessing the cross-border transfer of data, which adds yet another layer of protection to personal data. Privacy advocates are already anticipating that, in the long run, European citizens may be able to demand that their data be stored and processed only within the borders of the EU, much like the recent Russian agenda. Nonetheless, as things stand today, the clock has been turned back several years and much has been left to chance and uncertainty.

[1] Kanad Bagchi (kanad.bagchi@gmail.com) is an MSc Candidate in Law and Finance at the Faculty of Law, University of Oxford, UK. Formerly, he was a research assistant at Europa-Institut, Universität des Saarlandes, Germany.

Suggested Citation: Bagchi, Kanad, Your Facebook Data Just Got a Lot More Secure: Case Analysis C-362/14 Maximillian Schrems v Data Protection Commissioner, jean-monnet-saar 2015, DOI: 10.17176/20220308-174158-0

Case analysis of the ECtHR judgment in Delfi AS v. Estonia (app. No. 64569/09)

The difficulties of information management for intermediaries

By Oskar Josef Gstrein[1]

 A. Introduction

“The medium is the message”.[2] This phrase, coined by the Canadian philosopher Marshall McLuhan in the 1960s, seems to be nowhere as true as when it comes to the processing and distribution of information on the internet. The philosophy of media has boomed since the start of the new millennium, and other, less “speculative” disciplines such as law increasingly have to deal with the aspects of information processing.

Since the collection of personal data has become a lucrative business model,[3] there is a need for more and better regulation. However, it is not only the sheer content of data that is important. Aspects of accessibility and possibilities for contextualization also define the “value” of data.

Recently, it is not only private actors that have been trying to design the future of the internal market of the European Union in that regard.[4] Public authorities also seem to be becoming more and more proactive in the field. The European Data Protection Supervisor Giovanni Buttarelli speaks of “a defining moment for digital rights in Europe and beyond.”[5] The European Commission has declared the “Digital Single Market” one of its top priorities for the coming years.[6] National politicians like Angela Merkel warn their countries and the entire continent of falling behind in the technological arms race,[7] and hence of not being able to shape the future of the world. And ultimately, the regional courts continue to deliver judgments which aim at redefining law and its application in the digital landscape.

It could very well be argued that especially the last-mentioned actors have a constantly underestimated impact when it comes to shaping the future of cyberspace and the concept of privacy in the digital age. By now, the Court of Justice of the European Union (CJEU) has delivered numerous judgments of ground-breaking character.

In 2014 it not only struck down the EU’s Data Retention Directive 2006/24/EC on 8 April.[8] On 13 May it also established the right to delist information from the index of a search engine via its controversial “Google Spain” decision.[9] And with cases such as that of Max Schrems and his Europe v. Facebook campaign[10] pending before the court,[11] it looks like the list will not come to an end any time soon.

What all of these judgments have in common is that their main legal problems are not connected with the content of the information that is being processed. What is crucial is the question of how accessibility and transferability of data is organized and evaluated from a legal perspective.

This can also be seen in the SABAM v. Netlog judgment[12] and the UPC Telekabel Wien case.[13] Like the decisions already mentioned, these cases clearly point to the fact that modern information management and its regulation is not only a matter of the content of information, but especially of the role of the so-called “intermediaries”. The regulation of intermediaries becomes an ever more important aspect when considering the future development of the digital space.[14] Their business practices and conduct are crucial for the accessibility, presentation and contextualization of information. When it comes to understanding the conditions of liability of such service providers, Articles 12 to 14 of Directive 2000/31/EC can be helpful.[15]

However, the regulation of the activities of intermediaries does not take place only within the EU. With the Grand Chamber decision in the case Delfi AS v. Estonia[16] of 16 June 2015, the European Court of Human Rights in Strasbourg (ECtHR) has delivered another important judgment which tries to strike the right balance between the fundamental rights to privacy and the freedom of expression and information.

B. The decision in Delfi AS v. Estonia

I. The background of the case

Delfi AS runs an online news portal of national importance in Estonia.[17] On 24 January 2006 an article with the title “SLK Destroyed Planned Ice Road”[18] was published. The report suggested that AS Saaremaa Laevakompanii (Saaremaa Shipping Company, a public limited liability company) had made it impossible to use several ice roads. The latter temporarily connect the Estonian mainland to several islands in the region, which SLK normally serves by offering ferry services.

The content of the report was not challenged as such. But Delfi also offered the possibility to comment on the article, which had attracted 185 comments by 25 January 2006.[19] Some of them related directly to “L.”, who was a member of the supervisory board of SLK and the most visible public figure of SLK at the material time.[20] The ECtHR gave some examples of the comments under the article in its judgment:[21]

  1.  “(1) there are currents in [V]äinameri
    (2) open water is closer to the places you referred to, and the ice is thinner.
    Proposal – let’s do as in 1905, let’s go to [K]uressaare with sticks and put [L.] and [Le.] in a bag”
  2. “bloody shitheads…they bathe in money anyway thanks to that monopoly and State subsidies and have now started to fear that cars may drive to the islands for a couple of days without anything filling their purses. burn in your own ship, sick Jew!”
  3. “good that [La.’s] initiative has not broken down the lines of the web flamers. go ahead, guys, [L.] into the oven!”
  4. “[little L.] go and drown yourself […]”

L. not only wanted these comments to be removed from the website, but also asked for compensation for non-pecuniary damage. Delfi removed the comments six weeks after publication.[22] However, when it came to compensation, the publisher denied any responsibility for the content of the comments and claimed that it was acting only as an intermediary service provider in that regard. Several proceedings were conducted in the national courts. Finally, the court of last instance in Estonia (the Supreme Court) came to the conclusion that Delfi had a responsibility to protect L. from the consequences of the unlawful comments and therefore should have prevented their publication in the first place.[23] Subsequently, in October 2009, Delfi set up a more sophisticated monitoring system for the comments, involving a review procedure by a set of moderators who look at every comment before it is published.[24]

II. The proceedings in Strasbourg

After all national remedies had been exhausted, Delfi lodged an application with the ECtHR on 4 December 2009. The much-discussed judgment[25] of the First Section of the ECtHR of 10 October 2013 turned down Delfi’s complaint that Estonia, whose courts had required the company to manage the comments under the article more actively, had violated its freedom of expression. However, on 17 February 2014 the case was accepted for review by the Grand Chamber of the Strasbourg court.

In the proceedings before the Grand Chamber, the argumentation of the parties stayed basically the same. Delfi claimed that any responsibility of the company to prevent damage to the reputation of L. infringed its freedom of expression under Article 10 of the European Convention on Human Rights.[26] The court thus summarizes the position of Delfi in the following words:

“The applicant company called on the Grand Chamber to look at the case as a whole, including the question whether the applicant company was to be characterised as a traditional publisher or an intermediary. A publisher was liable for all content published by it regardless of the authorship of the particular content. However, the applicant company insisted that it should be regarded as an intermediary and it had as such been entitled to follow the specific and foreseeable law limiting the obligation to monitor third-party comments. It argued that intermediaries were not best suited to decide upon the legality of user-generated content. This was especially so in respect of defamatory content since the victim alone could assess what caused damage to his reputation.”[27]

Nevertheless, the Grand Chamber basically confirmed the judgments of the Chamber and the national courts, coming to the conclusion:

“In connection with the question whether the liability of the actual authors of the comments could serve as a sensible alternative to the liability of the Internet news portal in a case like the present one, the Court is mindful of the interest of Internet users in not disclosing their identity. Anonymity has long been a means of avoiding reprisals or unwanted attention. As such, it is capable of promoting the free flow of ideas and information in an important manner, including, notably, on the Internet. At the same time, the Court does not lose sight of the ease, scope and speed of the dissemination of information on the Internet, and the persistence of the information once disclosed, which may considerably aggravate the effects of unlawful speech on the Internet compared to traditional media. It also refers in this connection to a recent judgment of the Court of Justice of the European Union in the case of Google Spain and Google, in which that court, albeit in a different context, dealt with the problem of the availability on the Internet of information seriously interfering with a person’s private life over an extended period of time, and found that the individual’s fundamental rights, as a rule, overrode the economic interests of the search engine operator and the interests of other Internet users […].”[28]

And

“[f]inally, turning to the question of what consequences resulted from the domestic proceedings for the applicant company, the Court notes that the company was obliged to pay the injured person the equivalent of EUR 320 in compensation for non-pecuniary damage. It agrees with the finding of the Chamber that this sum, also taking into account the fact that the applicant company was a professional operator of one of the largest Internet news portals in Estonia, can by no means be considered disproportionate to the breach established by the domestic courts […].”[29]

C. Interpretation and Context

From an institutional perspective, the numerous and close references of the Grand Chamber of the ECtHR to EU law and CJEU jurisprudence indicate that, at least in the digital space, there exists a single space of human rights protection in Europe. Keeping in mind the cumbersome negotiation process concerning the EU’s accession to the ECHR, this leaves some hope for a more integration-friendly future, one oriented more strongly towards practical necessities than institutional battles.[30]

Materially, the Delfi case raises the question of “self-censorship” and asks whether we have to fear a future where “chilling effects” become part of the everyday experience in the online world.[31] The fact that modern information processing makes surveillance much easier than in the past results in new challenges. The concepts of liberty and freedom have to be emphasized more strongly and updated in the modern context in order to remain intact. Our societies have to create new spaces where it can be expected that no one interferes in the private sphere, and where one does not have to conform to an expected status quo at every point in time. Put simply: there needs to be a part of everybody’s life where polarizing (though not illegal) behavior is possible and accepted.

However, it is also important to emphasize that information networks are now strongly integrated into the lives of their users. With more power comes more responsibility. The fact that a news portal of national importance can be run through the internet also means that it has to live up to the same standards of accountability as traditional media. This is probably the strongest argument why the judgment of the ECtHR was essentially right. The professionalism of Delfi, combined with the moderate and proportionate penalty, leaves the impression of a sound overall evaluation of the situation.

Ultimately, the question remains what Delfi v. Estonia will, or should, be remembered for. Considering the special circumstances of the case, which involved far more resources and professionalism than the ordinary exchange of views via the Internet, it seems unlikely that it will set a precedent outside the world of professional journalism. It would be surprising if the ECtHR, or even national courts, had decided in the same way had the unlawful comments not been posted on a medium of national importance. If Delfi had, for example, been a small private weblog, a social community or a forum, things would have been different: the impact of the comments would not have been as serious.

The actual lesson to be learned from this case is that we live in an age where it is not only important whether (sensitive) data is accessible or not. The question is, more and more, how easily and through which means it is accessible. This aspect is largely determined by how intermediaries are positioned to process the relevant piece of data, and under which regulatory circumstances they are required to intervene. In which scenarios will their social responsibility to protect privacy and the dignity of a person be more important than their duty to enable the free movement of data, thoughts and speech? In order to find the right answer to this question, a complex balancing process is needed, which can only be successfully concluded by looking at potential scenarios and concrete cases. There is a strong need for differentiation between the different contexts of data processing.

————————————————-

[1] Mag. Dr. Oskar Josef Gstrein, LL.M. is an Assistant Professor at the Department of Governance and Innovation of Campus Fryslân, where he is also member of the Data Research Centre. His PhD-Thesis is on the topic “The Right to be Forgotten as a Human Right”.

[2] McLuhan, Understanding Media: The Extensions of Man, Mentor, 1964, New York, Chapter 1, p. 1: “In a culture like ours, long accustomed to splitting and dividing all things as a means of control, it is sometimes a bit of a shock to be reminded that, in operational and practical fact, the medium is the message.”

[3] Rashid, Surveillance is the Business Model of the Internet: Bruce Schneier, via: https://www.schneier.com/news/archives/2014/04/surveillance_is_the.html – accessed 24.07.2015.

[4] Cf. Microsoft Digital Single Market Communication Response, via: http://mscorp.blob.core.windows.net/mscorpmedia/2015/07/Microsoft-Digital-Single-Market-Communication-Response-FINAL-17-July-2015.pdf – accessed 24.07.2015.

[5] EDPS, Opinion 3/2015, Europe’s big opportunity, p. 9, via: http://bit.ly/1SJrGSq – accessed 28.07.2015.

[6] http://ec.europa.eu/priorities/digital-single-market/ – accessed 24.07.2015.

[7] Merkel said (translated from the German): “Europe (and here I speak for all of Europe, which at the moment has neither Google, Apple, Facebook nor any other such companies) must not concentrate solely on its industrial value creation, but must also take care to create suitable framework conditions for processing large amounts of data in such a way that individuality is protected. This is currently being discussed in Europe. We should therefore not merely reject, but should also consider how we can get more European companies of our own in the consumer sector and promote start-ups. For at the moment, in global comparison, we are not at the forefront here.” Rede von Bundeskanzlerin Merkel zum Deutschen Evangelischen Kirchentag am 5. Juni 2015, via: http://www.bundeskanzlerin.de/Content/DE/Rede/2015/06/2015-06-05-rede-merkel-kirchentag.html – accessed 24.07.2015.

[8] CJEU, C‑293/12 and C‑594/12, Digital Rights Ireland, Kärntner Landesregierung, ECLI:EU:C:2014:238, cf. my Blog Post in German via: https://jean-monnet-saar.eu/?p=314 – accessed 24.07.2015.

[9] Media often wrongly refer to this as the “right to be forgotten” judgment. However, the right to delist information is not about the deletion or erasure of information; it only limits access. Cf. CJEU, C-131/12, Google Spain SL and Google Inc. v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González, ECLI:EU:C:2014:317. Also compare the Article 29 Working Party guidelines on the implementation, accessible via: http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp225_en.pdf – accessed 24.07.2015. And finally a report on the success of the right to delist information from 18.06.2015 via: http://ec.europa.eu/justice/data-protection/article-29/press-material/press-release/art29_press_material/20150618_wp29_press_release_on_delisting.pdf – accessed 24.07.2015.

[10] http://www.europe-v-facebook.org/ – accessed 24.07.2015.

[11] CJEU, C-362/14, Reference for a preliminary ruling from High Court of Ireland (Ireland) made on 25.06.2014 – Maximillian Schrems v Data Protection Commissioner.

[12] CJEU, C-360/10, Belgische Vereniging van Auteurs, Componisten en Uitgevers CVBA (SABAM) v Netlog NV, ECLI:EU:C:2012:85.

[13] CJEU, C-314/12, UPC Telekabel Wien GmbH, ECLI:EU:C:2014:192.

[14] Cf. Gasser, Schulz (editors), Governance of Online Intermediaries: Observations from a Series of National Case Studies, Berkman Center Research Publication No. 2015-5, via: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2566364 – accessed 24.07.2015.

[15] Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’), Official Journal L 178, 17/07/2000, p. 1–16, via: http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32000L0031:en:HTML – accessed 28.07.2015. Cf. Woods, Delfi v Estonia: Curtailing online freedom of expression?, via: http://eulawanalysis.blogspot.de/2015/06/delfi-v-estonia-curtailing-online.html – accessed 28.07.2015.

[16] ECtHR, Delfi AS v Estonia, App. No. 64569/09, 16.06.2015.

[17] http://www.delfi.ee/ – accessed 28.07.2015.

[18] ECtHR, Delfi AS v Estonia, Mn 16.

[19] Ibid., Mn 17.

[20] Ibid., Mn 16.

[21] Ibid., Mn 18.

[22] Ibid., Mn 19.

[23] Ibid., Mn 31.

[24] Ibid., Mn 32.

[25] ECtHR, Delfi AS v Estonia, App. No. 64569/09, 10.10.2013. via: http://hudoc.echr.coe.int/eng?i=001-126635 – accessed 28.07.2015; Cf. Synodinou, Intermediaries‘ liability for online copyright infringement in the EU: evolutions and confusions, Computer Law & Security Review, 2015, 31(1), p. 57 – 67; McCarthy, Is the writing on the wall for online service providers? Liability for hosting defamatory user-generated content under European and Irish law, Hibernian Law Journal, 2015, 14, p. 16 – 55.

[26] ECtHR, Delfi AS v Estonia, App. No. 64569/09, 16.06.2015, Mn 68.

[27] Ibid., Mn 66.

[28] Ibid., Mn 147.

[29] Ibid., Mn 160.

[30] Cf. the Blog posts on the topic: https://jean-monnet-saar.eu/?p=690 – accessed 28.07.2015; https://jean-monnet-saar.eu/?p=745 – accessed 28.07.2015.

[31] Cf. Cox, Delfi v. Estonia: Privacy Protection and Chilling Effect, via: http://www.verfassungsblog.de/en/delfi-v-estonia-privacy-protection-and-chilling-effect/ – accessed 28.07.2015.

Suggested Citation: Gstrein, Oskar Josef, Case analysis of the ECtHR judgment in Delfi AS v. Estonia (app. No. 64569/09), jean-monnet-saar 2015, DOI: 10.17176/20220308-165318-0