
Two years after Digital Rights Ireland: general data retention obligations might still be compatible with EU law

A review of the Advocate General’s opinion in Joined Cases C-203/15 and C-698/15

An article by Pieter Gryffroy*

A. Introduction

In its judgment of 8 April 2014 in the Digital Rights Ireland case,[1] the Court of Justice declared the 2006 Data Retention Directive (hereinafter: the Data Retention Directive)[2] invalid. The grounds for doing so were threefold. Firstly, the Directive had an insufficiently specific scope, covering “in a generalised manner, all persons and all means of electronic communication as well as all traffic data without any differentiation, limitation or exception being made in the light of the objective of fighting against serious crime”.[3] Secondly, the Directive failed to lay down objective criteria to determine the limits of access to the retained data for the purposes of prevention, detection and prosecution of serious criminal offences.[4] Thirdly, the Directive did not make any distinction between the different categories of data in relation to the duration of the retention, nor did it require that the determination of the period of retention be based on objective criteria and be limited to what is strictly necessary.[5] Therefore, while the Court recognised that the retention of data genuinely satisfies an objective of general interest,[6] it found that the interferences with the rights laid down in Articles 7 (respect for private life) and 8 (protection of personal data) of the EU Charter created by the Data Retention Directive did not meet the conditions of proportionality.[7]

The decision in the Digital Rights Ireland case created a legal vacuum in the EU. Moreover, in March 2015, the European Commission announced that it would not propose a new legal initiative concerning the subject matter of the Directive that was struck down by the Court.[8] As such, it was left to the Member States to create their own legal framework for data retention following the annulment of the Directive, either by introducing new rules or by adapting, in so far as necessary, the existing national legislation transposing the Directive. After all, the invalidation of the Directive as such does not mean that the national laws transposing it are necessarily incompatible with EU law, given that Member States retain some discretion in formulating the implementing provisions. The opinion of Advocate General Saugmandsgaard Øe,[9] which is under review in this article, addresses the interpretation of Digital Rights Ireland in the national context. More specifically, it assesses the general data retention obligations imposed on telecommunications service providers by national laws in Sweden and the UK, as in force after the judgment in Digital Rights Ireland.[10]

B. Facts and Questions

I. Case C-203/15

On 9 April 2014, the day after the Court of Justice had rendered its judgment in Digital Rights Ireland, the telecom operator Tele2 Sverige (hereinafter: Tele2) notified the Swedish Post and Telecommunications Authority (hereinafter: PTS) that it would cease to retain data as required by the Swedish national rules implementing the Data Retention Directive. Tele2 also proposed to delete the data it had retained in application of those rules. The telecom operator did so because it had concluded that the Swedish rules in force, i.e. those transposing the Directive, were incompatible with the EU Charter of Fundamental Rights.[11] Notably, the Swedish rules not only suffered from largely the same defects as the Directive, but even included a general retention obligation with a larger scope than the obligation under the Directive,[12] combined with lenient rules of access.[13]

In reaction to Tele2’s decision, the National Police Board complained to the PTS, stating that Tele2’s refusal to continue cooperating endangered the police’s law enforcement activities.[14] On 27 June 2014, the PTS ordered Tele2 to resume its obligations under Swedish law within one month.[15] Tele2 contested this decision before the Stockholm administrative court, which dismissed the action. Thereupon Tele2 appealed that judgment before the Stockholm Administrative Court of Appeal, which referred the matter to the Court of Justice.[16]

In essence, the referring Court asks the Court of Justice whether the Swedish rules encompassing restrictions of articles 7 and 8 of the EU Charter are compatible with EU law, specifically article 52(1) of the Charter and article 15 of Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications) (hereinafter: the E-Privacy Directive).[17] Both provisions specify conditions under which national law may restrict the scope of the right to privacy and data protection, which they aim to protect.

II. Case C-698/15

In the wake of the Digital Rights Ireland case, the UK Parliament pushed through a new act, allowing security services continued retention of, and access to, the data that was previously retained and accessed under the legislation implementing the Data Retention Directive. The so-called Data Retention and Investigatory Powers Act 2014 faced substantial resistance from Members of Parliament, civil rights organisations and legal scholars for being passed too speedily and for not striking an appropriate balance between security, privacy and freedom of enterprise.[18] In the end, three applications were filed for judicial review of the lawfulness of the Act before the High Court of Justice (England and Wales), specifically against the broad power of the Home Secretary “to require public telecommunications operators to retain [all] communications data for a maximum period of 12 months”.[19]

The High Court declared that the Act was inconsistent with EU law, specifically with the requirements laid down in Digital Rights Ireland, which the court considered to apply to national law on this issue.[20] However, on appeal, brought by the Home Secretary, the Court of Appeal (England and Wales) (Civil Division) provisionally stated that the Court of Justice did not lay down requirements for national law in Digital Rights Ireland, but simply identified and described protections and safeguards which were absent from the harmonised EU regime.[21] Because clarification was still needed, the Court of Appeal referred the matter to the Court of Justice.

In essence, the referring Court asks whether Digital Rights Ireland creates mandatory requirements of EU law “applicable to a Member State’s domestic regime governing access to data retained in accordance with national legislation” which are to be applied by a Members State “in order to comply with Articles 7 and 8 of the [Charter]”.[22]

By decision of 10 March 2016, the Court joined both cases for the purposes of the oral hearing and the rendering of the judgment.[23] The common question in both of the cases under review is whether Digital Rights Ireland has created specific requirements under EU law to be adhered to by Member States that enact national provisions on data retention in the absence of any EU Directive requiring them to do so. In other words, does Digital Rights Ireland prevent Sweden and the UK from maintaining a general retention obligation under national law? After all, the Directive was invalidated, amongst other reasons, because its overly broad scope allowed the general blanket retention of all metadata[24] relating to electronic communications. Nonetheless, it remains unclear whether this has the effect of prohibiting a Member State from autonomously introducing or maintaining a general obligation to retain communications data. The referring court in Case C-698/15 additionally wonders whether the Court’s rejection of the Directive’s access regime, because of the lack of objective criteria delineating the limits of such access, has the effect of imposing certain requirements on UK legislation relating to subsequent access to the retained data.[25]

C. The Advocate General’s Answer

Despite the UK court’s formulation of its question, the AG only addresses the retention portion of the issue, namely the question whether “in the light of Digital Rights Ireland, Article 15(1) of Directive 2002/58 and Articles 7, 8 and 52(1) of the Charter are to be interpreted as precluding Member States from imposing on service providers a general obligation to retain data such as that at issue in [the cases under review], regardless of any safeguards that might accompany such an obligation”.[26] Such an approach is understandable, given that only the retention of data is still regulated at EU level after the Court’s annulment of the Data Retention Directive. One could argue that the AG should have given reasons for seemingly ignoring the access issue raised by the referring court in C-698/15. However, as will become clear further on, the AG’s opinion provides an answer to this part of the question as well.

In his answer, the AG first considers that general data retention obligations fall within the scope of the E-Privacy Directive[27] and are allowed under it, provided the conditions of Art. 15 of that Directive have been fulfilled. The AG specifies that although Art. 11 of the Data Retention Directive inserted an Art. 15(1a) in the E-Privacy Directive, preventing the data retention obligations under the Data Retention Directive from being derogated from under national law, this does not indicate that general data retention obligations are as such incompatible with the regime of the E-Privacy Directive, but rather attests to the EU legislator’s will at the time to attain exhaustive harmonisation on the topic.[28] Art. 15(1a) of the E-Privacy Directive and Art. 11 of the Data Retention Directive, inserting said article in the E-Privacy Directive, have retroactively been nullified by the Court in Digital Rights Ireland, together with the rest of the Data Retention Directive. Nonetheless, the AG makes a valid point. Art. 15(1a) of the E-Privacy Directive prohibiting derogation through national law from the rules of the Data Retention Directive, including the Data Retention Directive’s broad general retention obligation, indeed does not hint at such an obligation being irreconcilable with the E-Privacy Directive, but rather points to the opposite conclusion.

Next, the AG addresses the question whether the provisions of the Charter apply to the matter.[29] In so far as retention is concerned, the AG states that since Member States are still bound to implement the provisions of the E-Privacy Directive, the Charter binds them as well, following Art. 51(1), first sentence, of the Charter.[30] In other words, in the AG’s view, Member States introducing a general data retention obligation such as those at issue are implementing EU law. While it might seem questionable whether Member States introducing or maintaining a general data retention obligation are actually implementing EU law, the AG is correct in his assessment in so far as Art. 15 of the E-Privacy Directive allows for a general data retention obligation to be introduced by the national law of the Member States. Although implementing such an obligation would deviate from the substantive law of the E-Privacy Directive, it is the Directive, and therefore EU law itself, which allows for the deviation. As such, the Member States would still be implementing EU law when introducing or maintaining a general data retention obligation, even in the absence of specific EU law requiring such an obligation to be present in national law, as the former Data Retention Directive did.[31]

However, national provisions governing the access to retained data for the purpose of fighting serious crime fall outside the scope of the E-Privacy Directive in any case,[32] and as such the Charter does not apply to them. Nonetheless, as the AG points out, “the raison d’être of a data retention obligation is to enable law enforcement authorities to access the data retained, and so the issue of the retention of data cannot be entirely separated from the issue of access to that data”.[33] Moreover, “provisions governing access are of decisive importance when assessing the compatibility with the Charter of provisions introducing a general data retention obligation in implementation of Article 15(1) of Directive 2002/58. More precisely, provisions governing access must be taken into account in the assessment of the necessity and proportionality of such an obligation.”[34]

Lastly, the AG ponders the question of the compatibility of a general data retention obligation with EU law. First, the AG succinctly concludes that it is clear from Digital Rights Ireland that general data retention obligations constitute a serious interference with both the right to privacy and the right to data protection as guaranteed under Arts. 7 and 8 of the Charter, as well as with several of the rights contained in the E-Privacy Directive.[35] Having established this, the AG moves on to assessing whether such an interference can be justified under the cumulative conditions of Art. 15 of the E-Privacy Directive and Art. 52(1) of the Charter.

Six conditions are identified: “the retention obligation must have a legal basis, it must observe the essence of the rights enshrined in the Charter; it must pursue an objective of general interest; it must be appropriate for achieving that objective; it must be necessary in order to achieve that objective; [and] it must be proportionate, within a democratic society, to the pursuit of that same objective.”[36] Although several of those grounds were already mentioned in the Digital Rights Ireland case, the AG revisits each of them separately.

With regards to the first condition, the AG concludes that a general data retention obligation “must be established in legislative or regulatory measures possessing the characteristics of accessibility, foreseeability and adequate protection against arbitrary interference”. However, whether this is the case for Sweden and the UK is left up to the national court to determine, being in a privileged position to assess their national regimes.[37]

In assessing the second condition, the AG concludes that a general data retention obligation is capable of respecting the essence of the rights enshrined in Arts. 7 and 8 of the EU Charter, in so far as sufficient safeguards are put in place, effectively protecting personal data against abuse, specifically against unlawful access and use. It remains for the referring courts to verify the existence of such safeguards.[38]

In relation to the third condition, requiring the existence of a genuine objective of general interest, the AG finds that combating serious crime could qualify as such an objective. Combating ordinary offences or ensuring the “smooth conduct of proceedings other than criminal proceedings”, however, cannot in itself justify the adoption or maintenance of a general data retention obligation.[39]

In light of the foregoing, the next conditions considered by the AG are the appropriateness (4th condition) and the necessity (5th condition) of a general data retention obligation for supporting the fight against serious crime. The AG finds that such an obligation is liable to contribute to the fight against serious crime,[40] but in order to be limited to what is strictly necessary to attain the set objective, certain safeguards must be observed, including at least all the guarantees described by the Court of Justice in paragraphs 60-68 of Digital Rights Ireland. With this, the AG confirms the mandatory nature of the requirements set out in Digital Rights Ireland. As for the preceding conditions, it remains up to the referring courts to assess whether the national regimes in question fulfil these conditions.[41]

Finally, the AG stresses that any general data retention obligation must be proportionate within a democratic society, taking into account both “the advantages associated with giving the authorities whose task it is to fight serious crime a certain ability to examine the past” and “the serious risks which, in a democratic society, arise from the power to catalogue the private lives of individuals and to catalogue a population in its entirety.” In making this assessment, the referring courts have to take account of all relevant circumstances. Additionally, the AG emphasizes that the mandatory requirements laid down in Digital Rights Ireland are no more than minimum safeguards and “consequently, a national regime which includes all of those safeguards may nevertheless be considered disproportionate, within a democratic society, as a result of a lack of proportion between the serious risks engendered by such an obligation, in a democratic society, and the advantages it offers in the fight against serious crime.”[42]

In light of the above reasoning, the AG concludes that Member States are not precluded under EU law “from imposing on providers of electronic communications services an obligation to retain all data relating to communications effected by the users of their services”, in so far as the mandatory conditions laid down in Digital Rights Ireland and all other conditions specified in the opinion are observed.[43]

D. Conclusion

The AG’s opinion provides a pragmatic solution, allowing for a status quo of national legislation on data retention within the EU, in so far as sufficient safeguards are in place or put in place. The AG’s pragmatism can further be illustrated by his approach to the issue of access to retained data for the purpose of combating serious crime. Although the AG admits that strictly speaking, EU law does not regulate this topic, he takes the view that national provisions governing access should nonetheless be taken into account when assessing the data retention that precedes it because it would be artificial to entirely separate data retention from the subsequent use of that data and the rules regulating access.[44]

Notwithstanding this attractive pragmatism, the question remains whether the Court will follow the same reasoning. Throughout the opinion, the AG seems to assume that each of the Court’s objections in Digital Rights Ireland to the broad and unspecified scope of the Data Retention Directive is to be read only in conjunction with all its other objections. Following that reasoning, it was the cumulative effect of all the Directive’s shortcomings that convinced the Court to annul the Data Retention Directive, without any of the constitutive elements having decisive influence. Undoubtedly, the cumulative effect of a broad scope, a lack of limitation and objectivity relating to access, and a lack of differentiation in retention duration made the Data Retention Directive extremely problematic, each shortcoming reinforcing the negative impact of the others. As the AG points out, in Digital Rights Ireland the Court only came to the conclusion that the EU legislature had exceeded the limits imposed on it by the principle of proportionality after making all its objections, without explicitly stating that the broad scope of the data retention regime as such went beyond what was necessary.[45]

However, when reading paragraphs 57-59 of Digital Rights Ireland carefully, one could also conclude that the Court hints at the conclusion that the scope of the Data Retention Directive went beyond what was necessary, and at the very least had clear objections to the broad scope of the Directive in and of itself. Especially problematic are the absence of any factor linking the user, even extremely indirectly, to involvement in serious crime, and the lack of exceptions for communications falling under professional secrecy. Therefore, the assumption that a general data retention obligation is no longer possible under EU law and that only pre-identified communications can be retained, with special provisions for communications that are subject to obligations of professional secrecy, is not far-fetched.

Nonetheless, I would tend to agree with the AG. It is not the retention of all metadata in itself that is problematic, although the potential of this data to catalogue entire populations is unquestionable.[46] As the AG puts it: “a general data retention obligation need not invariably be regarded as, in itself, going beyond the bounds of what is strictly necessary for the purposes of fighting serious crime. However, such an obligation will invariably go beyond the bounds of what is strictly necessary if it is not accompanied by safeguards concerning access to the data, the retention period and the protection and security of the data.”[47] Indeed, it is arguable that the mere retention of the metadata, notwithstanding its enormous potential, is quite harmless. It is only the subsequent access and (ab)use that is potentially harmful to the data subjects. As such, in order to maximize the potential of the tools for fighting serious crime, and to minimize the risks of abuse, the focus should be on providing sufficient safeguards relating to retention duration and conditions of access and subsequent use.

The Court’s verdict will be awaited eagerly.

————————————————

*Pieter Gryffroy is a research assistant at the Jean-Monnet-Chair of Prof. Dr. Giegerich for European Law and European Integration. He studied law in Leuven (LLB and LLM at the KU Leuven) and in Saarbrücken (Europa-Institut).

[1] CJEU, Joined Cases C-293/12 and C-594/12, Digital Rights Ireland, ECLI:EU:C:2014:238.

[2] Directive 2006/24/EC of the European Parliament and of the Council of 15 March 2006 on the retention of data generated or processed in connection with the provision of publicly available electronic communications services or of public communications networks and amending Directive 2002/58/EC, OJ L 105, p. 54-63.

[3] CJEU, Digital Rights Ireland, paras. 57-59.

[4] CJEU, Digital Rights Ireland, paras. 60-62.

[5] CJEU, Digital Rights Ireland, paras. 63-65.

[6] CJEU, Digital Rights Ireland, para. 44.

[7] CJEU, Digital Rights Ireland, paras. 45-69.

[8] See e.g. http://www.reuters.com/article/us-eu-data-telecommunications-idUSKBN0M82CO20150312 (last accessed 22/08/2016).

[9] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 (Tele2 Sverige AB v Post-och telestyrelsen) and C-698/15 (Secretary of State for Home Department v Watson and others), ECLI:EU:C:2016:572; hereinafter: Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15.

[10] Notably, in the UK the rules had been enacted after Digital Rights Ireland, although not without protest. In Sweden, the applicable national law was still unaltered and thus modeled after the provision of the Directive (see Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, paras. 8-49).

[11] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, para. 50.

[12] Nonetheless relating only to metadata and not to the content of the communications.

[13] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, paras. 10-33.

[14] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, para. 51.

[15] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, para. 52.

[16] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, paras. 53-55.

[17] Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications), OJ L 201, p. 37-47.

[18] See e.g. http://www.bbc.co.uk/news/uk-28305309 (last accessed 23/08/2016); http://www.bbc.com/news/uk-politics-28237111 (last accessed 23/08/2016); http://www.dimt.it/2015/01/21/striking-a-balance-among-security-privacy-and-competition-the-data-retention-and-investigatory-powers-act-2014-drip/ (last accessed 23/08/2016).

[19] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, para. 56; again, the retention obligation relates only to metadata and does not concern the content of the communications.

[20] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, para. 58.

[21] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, para. 59.

[22] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, para. 60.

[23] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, para. 65.

[24] Metadata is used in this article to signify the information that relates to a certain electronic communication, but excluding the actual content of the message. Included are e.g. date, time and duration; source and destination; location; type of communication and type of equipment used.

[25] See the formulation “including, in particular, paragraphs 60 to 62 thereof”, referring to the Court’s view on access rules in Digital Rights Ireland; Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, para. 60, first question.

[26] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, para. 67.

[27] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, paras. 87-97.

[28] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, paras. 98-116.

[29] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, paras. 117-125.

[30] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, paras. 121-122.

[31] In the opposite case, if one were to take the view that Art. 15 of the E-Privacy Directive does not allow for a general data retention obligation in national law, the Member State would still be legislating within an area covered by EU law, albeit then contrary to its provisions (such as Arts. 6 and 9 of the E-Privacy Directive concerning traffic data and location data other than traffic data).

[32] See Art. 1(3) of Directive 2002/58; see also Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, paras. 123-124.

[33] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, para. 125.

[34] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, para. 125.

[35] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, paras. 126-127.

[36] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, para. 132.

[37] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, paras. 134-154.

[38] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, paras. 155-160.

[39] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, paras. 161-174.

[40] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, paras. 175-184.

[41] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, paras. 185-245.

[42] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, paras. 246-262.

[43] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, para. 263.

[44] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, para. 125.

[45] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, paras. 197-200.

[46] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, para. 259.

[47] Opinion of AG Saugmandsgaard Øe to CJEU, Joined Cases C-203/15 and C-698/15, para. 205.


Taking a look at two cases in the margin of the CJEU’s “Privacy Spring”, before and after the General Data Protection Regulation: Weltimmo and Bara

An article by Pieter Gryffroy*

 

I. Introduction

On 4 May 2016, the official text of the General Data Protection Regulation (Regulation (EU) 2016/679), replacing the former Data Protection Directive (Directive 95/46/EC), was published in the EU’s Official Journal,[1] together with the specific new Directive on the exchange of personal information between competent authorities for the purposes of crime prevention, investigation and prosecution.[2] Thus, the EU’s legislative data protection reform initiated in 2012 came to a successful conclusion. However, the legislator was not the only important actor creating momentum in the reform of the EU’s data protection rules. During the same timeframe, the Court of Justice (hereinafter: CJEU) adopted a particularly pro-active stance in this field, in cases such as Digital Rights Ireland,[3] Google Spain,[4] and Schrems.[5] The CJEU’s activism led to its recent case law being referred to as the CJEU’s “Privacy Spring”.[6] In the margin of the CJEU’s “Privacy Spring” and the more famous decisions, judgment was rendered in Weltimmo[7] and Bara[8] on 1 October 2015, five days before the Schrems judgment. While most commentators overlooked or ignored these cases, they provide significant clarifications on questions of EU data protection law under Directive 95/46/EC. Addressing the Weltimmo and Bara cases is long overdue, but still highly relevant. Although the General Data Protection Regulation (hereinafter: GDPR) entered into force on 24 May 2016, it shall only apply from 25 May 2018 onwards, and thus Directive 95/46/EC (hereinafter: the Directive) will remain applicable for nearly two years to come.[9] Moreover, the CJEU’s case law will remain relevant under the GDPR as well. Therefore, this article will analyse both cases and their impact, before and after the GDPR becomes applicable in 2018.


II. Weltimmo

1. Facts

Weltimmo is a company registered in Slovakia that operates a real estate website for Hungarian properties. In that capacity, it processes the personal data of the advertisers on the website. Property owners looking to sell can obtain an advertisement on the website free of charge for the duration of one month, after which a fee is charged. Many homeowners decided to use the free part of Weltimmo’s services but were reluctant to pay the fee. They requested Weltimmo to delete both the advertisements and the personal data they had provided after the first month had lapsed. Weltimmo refused to do this and instead charged the customers in question. The fees went unpaid and, consequently, Weltimmo forwarded the personal data of the customers who were in default to debt collection agencies.[10]

In reaction to this, the affected customers lodged a complaint with the Hungarian Data Protection Authority (hereinafter: HDPA), which considered itself competent on the basis of the statute transposing Directive 95/46/EC into Hungarian law and fined Weltimmo approximately EUR 32,000 for its actions.[11] Weltimmo then brought an action before the competent national court, claiming that the HDPA had no jurisdiction over it.[12] The court rejected this reasoning but overturned the HDPA’s decision because of a lack of clarity about certain facts. Weltimmo nonetheless appealed on a point of law and argued that under Art. 4(1) of the Directive Slovakian law was applicable, which the Hungarian authority was not allowed to apply pursuant to Art. 28(6) of the Directive.[13] Instead, it should have asked the Slovakian authority to intervene if it considered this necessary. In the end, the Hungarian Supreme Court stayed proceedings to refer several questions to the CJEU. In essence, the referring court asked the CJEU whether Hungarian law was applicable in the case at hand, although the company was incorporated in Slovakia, and whether the HDPA was competent to take action, applying Hungarian law. The referring court also wanted to know if the HDPA could take action, even if only limited action, in case Slovakian law was deemed to be applicable.


2. Decision

As regards the question whether Hungarian law is applicable, the CJEU considers the meaning of “establishment” in the sense of Art. 4(1)(a) of the Directive.[14] That article prescribes that each Member State (hereinafter: MS) shall apply its national rules adopted pursuant to the Directive when the processing of personal data takes place in the context of the activities of an establishment of the controller on that MS’s territory. The CJEU repeats its argumentation from the Google Spain case and states that the concept of establishment, justifying the application of EU law, “implies the effective and real exercise of activity through stable arrangements”.[15] The legal form of those arrangements, including the place of incorporation, does not matter.[16] Nor does the extent of the real activity.[17] The CJEU finds that Weltimmo did pursue an effective and real activity in Hungary, since it runs a website in Hungarian, aimed at Hungarian properties, which charges fees after the introductory period of one month has lapsed.[18] Thus, for the purposes of the Directive, it is established in Hungary.[19] The CJEU then goes on to examine whether the processing of personal data by Weltimmo was carried out in the context of that establishment. It finds, referring to its Google Spain and Lindqvist[20] cases, that there can be no doubt that Weltimmo’s activity of loading personal data on its Internet page must be considered processing of personal data in the sense of Art. 2(b) of the Directive.[21] Therefore, Hungarian law applies to Weltimmo’s processing of personal data and under Art. 4, read in conjunction with Art. 28 of the Directive, the HDPA is competent to act as an organ of the Hungarian State.

Although it was not necessary in the case at hand, the CJEU also addresses the question of what action the HDPA could have taken had Slovakian law been applicable.[22] The CJEU holds that, in any case, the HDPA may investigate any complaint it receives, even before knowing which law is applicable.[23] However, when the HDPA or another national data protection authority comes to the conclusion that the law of another MS is applicable, it cannot impose penalties or sanctions outside the territory of its own MS, because those sanctions have their legal basis in the national law of that MS.[24] In such a case, the national data protection authority in question must, under the duty of cooperation laid down in Art. 28(6) of the Directive, request the supervisory authority of the MS whose law is applicable to intervene, potentially on the basis of the information gathered by the first authority.[25]


3. Evaluation

In the Weltimmo case, the CJEU clearly honours the territoriality principle that underpins the system of the Directive. Under this approach, national laws determine the precise extent of the powers of the national data protection authorities, and the jurisdictional reach of each authority is territorially limited, not only because of the national nature of the laws in question, but also because the neighbouring Member States have their own national laws on the subject and their own supervisory data protection authorities, all based on the Directive.

The decision in Weltimmo prevents companies from escaping the stricter enforcement regime of one EU Member State by creating an alternate corporate reality linking them to another. In doing so, the CJEU aims to protect the rights to privacy and data protection of the EU citizens dealing with such corporate actors. Companies only have an interest in attaching themselves to the law and supervisory authority of a different MS because, at present, not all national data protection authorities are equally active and a certain disparity in the rules transposing the Directive into national law cannot be avoided. The GDPR aims to change that in order to guarantee legal certainty and create a level playing field for competitors.

First of all, the GDPR introduces a single set of rules, to be applied uniformly across the EU. While issues of interpretation can never be ruled out, even with regard to the same set of rules,[26] this will eliminate the incentive and the possibility for companies to artificially and strategically attach themselves to certain Member States with either more lenient rules or, more importantly, implementation deficits due to an inactive national data protection authority. Secondly, while the GDPR does not abandon the territoriality principle in relation to competence altogether,[27] it contains novel and inventive procedures for cooperation, mutual assistance, joint operations and a consistency mechanism.[28] Moreover, all national data protection authorities will have to present annual activity reports, which will be made public.[29] All of this aims at ensuring consistency in the application of the regulation by the national authorities. It also seeks to encourage the national supervisory authorities to take an active stance and to mobilize all of them as fully as possible. Nonetheless, differences in activity levels will foreseeably remain. Ultimately, however, it is the duty of the national courts, in cooperation with the CJEU (Art. 267 TFEU), to ensure the uniform interpretation of the GDPR’s provisions throughout the Union.

Thus, in conclusion, the Weltimmo case used the principle of territoriality to effectively address the problem that some companies might be inclined to artificially pick which national law to comply with and which national data protection authority to deal with. The GDPR will change that status quo by providing a single set of rules, to be applied uniformly by supervisory authorities across the EU. This should eliminate the problems present in the Weltimmo case. Nonetheless, Art. 23 GDPR permits national legislators to diverge to a considerable extent from certain of the GDPR’s provisions, including those covering the situation at issue in Weltimmo. In time, incentives to artificially attach a company to a more lenient MS law may therefore revive. At that point, the CJEU’s decision in Weltimmo might provide guidance.[30]


III. Bara

1. Facts

The applicants in the Bara case are self-employed persons who declared their personal income to the Romanian tax authority ANAF. ANAF subsequently passed this information on to the national health insurance fund (CNAS).[31] On the basis of that information CNAS asked the persons concerned for the payment of arrears of contributions to the fund.[32] The applicants challenged this transfer of information, which was based on an internal governmental protocol and happened without their consent and without them being informed beforehand.[33] While Romanian law provides for certain data transfers from public authorities to the fund, it does not allow information relating to the income received by a data subject to be passed on.[34] Under these circumstances, the referring court asked the Court of Justice four questions. The first three questions were deemed inadmissible, while the fourth had to be reformulated.[35] In essence, the CJEU addresses the question “whether Articles 10, 11 and 13 of Directive 95/46 must be interpreted as precluding national measures […] which allow a public administrative body in a Member State to transfer personal data to another public administrative body and their subsequent processing, without the data subjects being informed of that transfer and processing.”[36]


2. Decision

The CJEU first establishes that the case indeed concerns the processing of personal data, which, subject to the exceptions set out in Art. 13 of the Directive, must comply with the principles of data quality set out in Art. 6 and must be justified on one of the grounds contained in Art. 7.[37] Moreover, the data controller must also comply with Arts. 10 and 11 of the Directive. These articles provide that, subject to the exceptions in Art. 13 and Art. 11(2), the data subject must be informed of the identity of the controller of the data, the purpose of the processing and certain categories of additional information, such as the recipients of the information, in so far as the data subject does not already know this information and the information is necessary to guarantee fair processing in relation to the data subject.[38] The CJEU analyses the case under both Art. 10 (data obtained directly from the data subject) and Art. 11 (data not obtained directly from the data subject), i.e. from the point of view of both the ANAF and the CNAS, and comes to the same conclusion in each case: under the provisions of the Directive, the data subjects involved should have been informed beforehand.[39] In this context, the CJEU does not assess whether, as required by the text of the Directive, “such further information is necessary, having regard to the specific circumstances in which the data are collected/processed, to guarantee fair processing in relation to the data subject”.[40] Instead, the CJEU simply states that the information was not previously provided and moves on to the question whether the exceptions of Art. 11(2) or Art. 13 can apply.[41] With this reasoning the CJEU strongly implies that providing the relevant information beforehand is necessary to guarantee fair processing in all circumstances. The CJEU then observes that, while Arts. 11(2) and 13 of the Directive allow Romanian law to provide for legislative exceptions to Arts. 10 and 11 of the Directive, it has failed to do so in the case at hand. Although Romanian law provides for certain transfers of personal data between public authorities, it does not do so for information relating to the income of the data subject. The governmental protocol on which the transfer was based cannot be qualified as a legislative measure and accordingly, the exceptions of Arts. 11(2) and 13 of the Directive cannot be applied.[42] Consequently, the CJEU concludes that Arts. 10, 11 and 13 of the Directive preclude national measures such as the governmental protocol at issue in the Bara case.

While the CJEU only refers to the provisions of the Directive, it is important to note that the requirement of a legislative basis for transfers of personal data flows directly from the EU Charter. First, personal data must be processed fairly, on the basis of the consent of the data subject or on another ground laid down by law (Art. 8(2), first sentence, EU Charter). Second, even when personal data has initially been processed lawfully, any restriction of an EU citizen’s right to data protection must be provided for by law and meet the principle of proportionality (Art. 8 read together with Art. 52(1) EU Charter). Since transferring personal data between authorities without the data subject’s consent or knowledge constitutes such a restriction, the Charter requires MS law to expressly provide for it.


3. Evaluation

The Court of Justice again takes a data-protection-friendly view in the Bara case, requiring the data subject to be informed beforehand in all cases where his or her personal data is transferred, even between public authorities. Nonetheless, Arts. 11(2) and 13 of the Directive allow national legislators to enact rules deviating from this right to prior information. The only problem in the Bara case was that Romania had not done so with sufficient precision. The GDPR will affect the current situation in two ways. Firstly, the GDPR confirms the CJEU’s finding that certain information must always be provided to the data subject before his or her personal data is processed, such as the purpose of the processing, the legal basis, the categories of data concerned and the recipients of such data.[43] Additionally, the data controller also has to inform the data subject of any further information necessary to ensure fair and transparent processing in respect of the data subject. Such information concerns e.g. the period for which the data will be stored, the fact that the data subject has a right of access and a right to file a complaint, and the source of the data.[44] Secondly, while the GDPR still allows MS to deviate from the right of the data subject to be informed when his or her personal data is collected or transmitted, it imposes stricter conditions on the national legislative measures imposing such restrictions.[45] Thus, in conclusion, the GDPR honours the sentiment of the Bara case by imposing stricter obligations on data controllers, requiring them to provide more information to the data subject about the collection and/or transmission of his or her personal data. Additionally, any restrictions of this obligation are subject to stricter preconditions and safeguards, in line with the requirements of the EU Charter of Fundamental Rights.


IV. Conclusion

In both the Weltimmo and the Bara case, the Court of Justice took a proactive stance aimed at providing EU citizens with proper privacy and data protection under the current legal framework, much as it did in more famous cases such as Google Spain and Schrems. In Weltimmo, the CJEU ensured that companies cannot avoid the law of the MS where they pursue the real activities in the context of which the personal data is processed by artificially attaching themselves to the law and enforcement regime of another, more lenient, MS. In Bara, the CJEU protected the right of the individual data subject to be informed about the collection and/or transmission of his or her personal data, subject only to specific exceptions laid down by law and not by a mere internal and unpublished governmental protocol. Notably, although these judgments concern interpretations of the Directive, they will remain relevant even after 25 May 2018, when the GDPR becomes applicable. While the GDPR will change the legal situation in both cases, its provisions seek to ensure a level of protection of personal data at least equivalent to that envisioned by the CJEU and required by Art. 8 of the Charter of Fundamental Rights of the EU.

————————————————

*Pieter Gryffroy is a research assistant at the Jean-Monnet-Chair of Prof. Dr. Giegerich for European Law and European Integration. He studied law in Leuven (LLB and LLM at the KU Leuven) and in Saarbrücken (Europa-Institut).

[1] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), OJ L 119, p. 1–88. It replaces Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, Official Journal L 281, p. 31-50.

[2] Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA, OJ L 119, p. 89–131.

[3] CJEU, Joined Cases C-293/12 and C-594/12, Digital Rights Ireland, ECLI:EU:C:2014:238.

[4] CJEU, Case C-131/12, Google Spain SL and Google Inc. v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González, ECLI:EU:C:2014:317 (hereinafter: Google Spain).

[5] CJEU, Case C-362/14, Maximillian Schrems v Data Protection Commissioner, ECLI:EU:C:2015:650.

[6] See Zanfir, How CJEU’s “Privacy Spring” construed the human rights shield in the digital age, January 2015, p. 1, 10-11, available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2604895 (last accessed 15/06/2016).

[7] CJEU, Case C-230/14, Weltimmo s.r.o. v. Nemzeti Adatvédelmi és Információszabadság Hatóság, ECLI:EU:C:2015:639 (hereinafter: Weltimmo).

[8] CJEU, Case C-201/14, Smaranda Bara and Others v. Președintele Casei Naționale de Asigurări de Sănătate, Casa Naţională de Asigurări de Sănătate, Agenţia Naţională de Administrare Fiscală (ANAF), ECLI:EU:C:2015:638 (hereinafter: Bara).

[9] Article 99 of Regulation (EU) 2016/679.

[10] CJEU, Case C-230/14, Weltimmo, para. 9.

[11] CJEU, Case C-230/14, Weltimmo, para. 10.

[12] CJEU, Case C-230/14, Weltimmo, para. 11.

[13] CJEU, Case C-230/14, Weltimmo, para. 12.

[14] The CJEU also considers the factual circumstances of establishment in its preliminary observations, see CJEU, Case C-230/14, Weltimmo, para. 15-18.

[15] CJEU, Case C-230/14, Weltimmo, para. 28.

[16] Id.

[17] CJEU, Case C-230/14, Weltimmo, para. 31.

[18] CJEU, Case C-230/14, Weltimmo, para. 32.

[19] CJEU, Case C-230/14, Weltimmo, para. 33.

[20] CJEU, Case C-101/01, Criminal proceedings against Bodil Lindqvist, ECLI:EU:C:2003:596.

[21] CJEU, Case C-230/14, Weltimmo, para. 35-38.

[22] See especially CJEU, Case C-230/14, Weltimmo, para. 54-60.

[23] CJEU, Case C-230/14, Weltimmo, para. 57.

[24] CJEU, Case C-230/14, Weltimmo, para. 56-57, 59.

[25] CJEU, Case C-230/14, Weltimmo, para. 57, 58.

[26] Although notably, Art. 23 does still allow MS to restrict the scope of certain provisions, disturbing the level playing field.

[27] See Art. 55 of Regulation (EU) 2016/679.

[28] See Arts. 60-76 of Regulation (EU) 2016/679.

[29] See Art. 59 of Regulation (EU) 2016/679.

[30] Although it should be noted than in such a scenario, Art. 4 of the Directive will no longer exist as a legal ground, expressly providing that the law of a MS applies to the processing of personal data by a controller in the context of an establishment on that MS’s territory. The GDPR does not contain an express provision on the territorial nexus to be applied in determining the applicable diverging MS law under the GDPR.

[31] CJEU, Case C-201/14, Bara, para. 14.

[32] Id.

[33] CJEU, Case C-201/14, Bara, para. 15.

[34] CJEU, Case C-201/14, Bara, para. 16.

[35] CJEU, Case C-201/14, Bara, para. 18-28.

[36] CJEU, Case C-201/14, Bara, para. 28.

[37] CJEU, Case C-201/14, Bara, para. 28-30.

[38] Read Arts. 10 and 11 of the Directive; CJEU, Case C-201/14, Bara, para. 31 and following.

[39] CJEU, Case C-201/14, Bara, para. 32-34 and 42-43.

[40] Art. 10/11 of the Directive respectively.

[41] CJEU, Case C-201/14, Bara, para. 35 et seq. and 44 et seq.

[42] CJEU, Case C-201/14, Bara, para. 36-41 and 45.

[43] See Arts. 13(1) and 14(1) of Regulation (EU) 2016/679; subject to the exceptions in Arts. 13(4) and 14(5) of the regulation.

[44] See Arts. 13(2) and 14(2) of Regulation (EU) 2016/679; subject to the exceptions in Arts. 13(4) and 14(5) of the regulation.

[45] See Art. 23 of Regulation (EU) 2016/679; the justifying grounds however stay the same.

Copyright of the image: Defense Advanced Research Projects Agency (DARPA), https://commons.wikimedia.org/wiki/File:DARPA_Big_Data.jpg?uselang=de.

The EU-US Privacy Shield: An Effective New Framework for Transatlantic Data Flows or A Weak Compromise Doomed to Fail?

An article by Pieter Gryffroy*

 

I. INTRODUCTION

 

On 2 February 2016, the European Commission announced[1] that it had reached an agreement with US authorities on a new safe harbour regime, after the old regime had been invalidated by the Court of Justice on 6 October 2015 in its judgment in the Schrems case,[2] already reported on here. The new regime, called the EU-US Privacy Shield, was agreed upon just in time to meet the deadline set by the Article 29 working party (which represents the national data protection authorities) after the Schrems judgment, failing which the national data protection authorities within the EU would have taken coordinated enforcement actions to ensure effective protection of the personal data of EU citizens in the US.[3] The Article 29 working party will now assess the new deal on the transfer of personal data across the Atlantic.[4] Whilst the full text of the new agreement is not yet accessible to the general public, the key elements of the agreement are known and have already attracted a lot of criticism. This contribution will first touch upon the background of the new agreement, explaining why it was necessary and how it functions within the framework of EU data protection law. Secondly, the contribution will assess whether the EU-US Privacy Shield is up to the task of ensuring an effective protection of the personal data of EU citizens contained in transatlantic data flows.

 

II. BACKGROUND: THE OLD SAFE HARBOUR AGREEMENT STRUCK DOWN BY THE COURT OF JUSTICE

 

Before one can assess the impact of the successor to the safe harbour agreement, it is crucial to understand the role of the original agreement in the framework of the EU’s data protection regulation. Equally, a basic understanding of the Schrems case, which led to the agreement’s invalidation by the Court of Justice, is indispensable for a good understanding of the context of the new agreement.

Although the European legislator could never have foreseen the extent of the current data flows between the EU and third countries, notably the US, Directive 95/46/EC on the protection of individuals with regard to the processing of personal data and on the free movement of such data[5] already provided for rules on the transfer of personal data of European citizens from the EU member states to third countries in its chapter IV. Whereas the Directive is clear and relatively precise on the principles relating to data protection within the EU, the rules on the transfer of personal data to third countries merely provide that member states may only allow the transfer of personal data when the country in question “ensures an adequate level of protection”.[6] In order to allow an EU-wide approach to third-country data transfers, the Directive endows the Commission with powers to make EU-wide determinations of the adequacy of the level of protection in a third country, either positive or negative.[7] In both cases member states are, by the letter of the Directive, bound to take the measures necessary to comply with the Commission’s decision. Additionally, the Directive also mandates the Commission to enter into negotiations with certain third countries in order to guarantee an effective protection of the personal data of EU citizens through a specific agreement with the third country concerned.[8] Upon reaching a satisfactory agreement, the Commission can then render a decision finding that the country in question ensures an adequate level of protection of the personal data of EU citizens.[9]

It is within this exact context that, between 1998 and 2000, the EU and the US Department of Commerce developed the safe harbour privacy principles for the transfer of personal data between the EU member states and the US, culminating in Commission Decision 2000/520/EC on the adequacy of the protection provided by the safe harbour privacy principles and related frequently asked questions issued by the US Department of Commerce.[10] In accordance with the agreement between the EU and the US, the US Department of Commerce issued the safe harbour principles,[11] enabling US organizations to self-certify by either joining a self-regulatory privacy program that adheres to the principles or by developing their own privacy policies, provided that these are in conformity with the principles. In order to benefit from the safe harbour mark of quality, facilitating the provision of services to EU citizens, organizations had to publicly declare that they adhered to the principles, at which point further compliance became mandatory. In principle, however, adherence was voluntary and non-adherence did not prevent a US organization from receiving personal data coming from the EU. Such organizations were simply required to give notice to those using their services that their privacy policy possibly did not comply with international standards. Nonetheless, the benefit of compliance seemed to outweigh the cost and most major organizations adhered to the principles, albeit in diverging manners. Decision 2000/520/EC created a blanket allowance for these different approaches to privacy in the transatlantic flow of personal data, as long as the organizations concerned declared their adherence to the principles. Following Art. 3 of Decision 2000/520/EC, the national data protection authorities of an EU member state could only suspend the flow of personal data in connection with a self-certified organization if either the US authorities found a violation of the principles (Art. 3(1)(a)) or if extraordinary circumstances presented themselves, creating a substantial likelihood that the principles were being violated by a certified organization (Art. 3(1)(b)).

The safe harbour agreement came under heavy fire in 2013, when Edward Snowden revealed that US intelligence agencies, notably the NSA through its “PRISM” program, accessed personal data stored on US servers to conduct surveillance on people worldwide, gathering personal data en masse and virtually without restriction. This included personal data of European citizens which had been sent to the US. At that time, the Commission had already started renegotiating the safe harbour agreement. It should be pointed out, however, that the safe harbour principles never applied to government agencies such as the NSA in the first place, since these are not covered by Decision 2000/520/EC.[12] Although there is a separate discussion to be had about the protection of the personal data of European citizens by private US companies as such, the main problem with the safe harbour agreement was that it allowed for a transatlantic data flow which, so it was revealed, was then subject to mass surveillance by US authorities, contrary to the EU’s principles of data protection. The principles as such, however, were not scrutinized in the Court’s judgment.[13]

Amongst others, an Austrian law student by the name of Maximilian Schrems argued that the very fact that such operations can take place in the US shows that the US does not provide an adequate level of protection meeting the EU’s standard of privacy protection, thereby inherently invalidating the safe harbour agreement between the US and the EU. Like hundreds of millions of people around the world, Schrems was a user of Facebook’s social network, which had self-certified under the safe harbour regime. In Europe, all personal data Facebook collects passes through the servers of its European seat in Ireland before being sent anywhere else, notably to the main US servers. Although Facebook itself may have complied with the safe harbour principles, Schrems complained to the Irish Data Protection Commissioner that the safe harbour regime as such did not offer adequate protection for his personal data against surveillance by US authorities.

The Irish national data protection authority first rejected this complaint because it considered itself bound by the Commission’s decision on the adequacy of the level of data protection offered by the US. The Court, however,[14] decided that neither Directive 95/46/EC nor Decision 2000/520/EC of the Commission can, in the light of the Charter of Fundamental Rights of the EU, be interpreted as precluding a national data protection authority “from examining the claim of a person concerning the protection of his rights and freedoms in regard to the processing of personal data relating to him which has been transferred from a Member State to that third country when that person contends that the law and practices in force in the third country do not ensure an adequate level of protection.”[15] However, as long as the Commission decision stood, member states could not adopt measures contrary to it. The Court therefore analysed the decision and found that, given the lack of any limit on US state interference, allowing the storage of personal data on a generalized basis without an effective remedy, Decision 2000/520/EC violates the fundamental rights guaranteed under Arts. 7, 8 and 47 of the EU Charter.[16] Moreover, under Art. 25(6) of Directive 95/46/EC, the Commission must find in its decision, “duly stating reasons, that the third country concerned in fact ensures, by reason of its domestic law or its international commitments, a level of protection of fundamental rights essentially equivalent to that guaranteed in the EU legal order”.[17] The Court found that Decision 2000/520/EC did not state that the US did in fact ensure an adequate level of protection, nor could it have, given that the aforementioned US practices violated the EU Charter. Consequently, the Court concluded that the decision was invalid.[18]

The direct consequence of the Schrems judgment was that the main legal framework governing the data flow between the EU and the US suddenly disappeared, creating significant uncertainty and burdening the national data protection authorities of the member states with the task of assessing all remaining options for data transfer on a case-by-case basis and of taking action to stop the transatlantic flow of personal data where necessary. There is, however, nothing to suggest that the actual flow of personal data from the EU to the US has declined following the judgment. The national data protection authorities, through the Article 29 working party, adopted a practical approach under which certain data transfer tools could still be used, at the same time setting a deadline for the negotiation of a new safe harbour agreement by the end of January 2016.[19] The Commission roughly met that deadline, announcing the EU-US Privacy Shield as the successor to the safe harbour agreement on 2 February 2016. The new agreement is the product of nearly three years of negotiations, which started in 2013 after the Snowden leaks and were intensified following the Schrems judgment. While the text of the agreement is being finalized before its planned release to the public at the end of February, it has already attracted a great deal of criticism. This will be discussed in what follows.

 

III. THE NEW AGREEMENT

 

Although the full text remains to be disclosed, the Commission has already indicated the main elements of the new agreement, which are structured around three pillars.

The first pillar provides that US companies importing personal data from Europe will have to commit to “robust” obligations concerning the processing of such data. The US Department of Commerce will monitor that companies publish such commitments, making them enforceable under US law. Any company handling personal data coming from Europe has to commit to complying with decisions by European data protection authorities. In essence, this is no more than a restatement of the previously existing safe harbour regime.

The second pillar aims at limiting the surveillance mechanisms used by certain US authorities such as the NSA, which were revealed after the Snowden leaks and were an important factor in invalidating the original safe harbour agreement. The new agreement aims to establish clear safeguards and transparency obligations for organs of the US government accessing personal data. In this regard, the US has assured the EU that the access to personal data by its authorities and agencies “for reasons of law enforcement or national security will be subject to clear limitations, safeguards and [will be accompanied by] oversight mechanisms.”[20] The remaining powers will be used only to the extent they are necessary and proportionate, supposedly ruling out indiscriminate mass surveillance. The arrangement will be reviewed yearly. It seems, however, that it will remain up to the US to determine the exact interpretation of these vague terms, and with US law still explicitly allowing mass surveillance,[21] strong concerns are being voiced by the media and by privacy advocates. Given that the Court of Justice invalidated Decision 2000/520/EC precisely because it allowed the US to store personal data on a generalized basis,[22] one can expect that, if no further clarification of these terms follows, the Court might not accept these vague commitments.

The third pillar of the new agreement aims at ensuring effective protection of the right of EU citizens to the protection of their personal data, through the enactment of a Judicial Redress Act by the US Congress, providing for several redress possibilities. US companies will have strict deadlines to reply to complaints, the EU’s national data protection authorities will be able to refer complaints directly to the US Department of Commerce and the Federal Trade Commission, and there will be possibilities for alternative dispute resolution, free of charge. The Judicial Redress Act will, however, not cover matters of intelligence and public security. For complaints concerning access by US authorities to personal data, EU citizens will have access to a newly created ombudsman. While these remedies might sound promising, it has been pointed out critically that European citizens are in large part forced to seek judicial redress in the US. Moreover, it is very questionable whether the proposed ombudsman will be an effective remedy protecting European citizens from violations of their right to the protection of their personal data committed by US authorities. In its judgment in the Schrems case, the Court of Justice had put special emphasis on the right to an effective remedy.[23]

Even without knowing the final text of the agreement and the exact extent of the obligations the US has entered into, it is already clear that the new agreement will not satisfy privacy advocates, who had called for far greater commitments on the US side, especially concerning its mass surveillance practices. In the eyes of the critics, the agreement is a weak compromise, containing few or no relevant changes, desperately agreed upon to replace the main legal basis for the transatlantic flow of personal data, which is of major economic importance for multinationals with data-intensive activities, Facebook being just one of numerous examples.

In conclusion, it is clear that the legal vacuum left by the Schrems judgment has raised the economic and political pressure on both sides, given the mutual interest in the transatlantic flow of personal data and the ever-growing economic importance of the services connected to this flow of data. Whether the resulting agreement will be a workable compromise, capable of protecting the personal data of Union citizens across the Atlantic, remains to be seen.

IV. THE NEXT STEPS

While the agreement as such has been concluded, the EU-US Privacy Shield still has a way to go before it becomes operative. Currently, the Commission is drafting an adequacy decision as provided for under Art. 25(6) of Directive 95/46/EC. Before taking this decision, the Commission has to consult the article 29 working party, which has committed to delivering its opinion by the end of March, and the so-called article 31 Committee, composed of Member State representatives.[24] In the meantime, the US side will make preparations to establish the framework and monitoring mechanisms agreed upon. Notwithstanding the heavy criticism, it seems likely that the Commission will take a decision giving effect to this agreement in the coming months.

It is clear that the negotiating parties have tried to address the concerns the Court of Justice had voiced in the Schrems case. However, as the agreement stands, without further clarifications of the broad and vague commitments relating to US government access to personal data, and without addition of more effective judicial remedies, the author must admit that the decision implementing the EU-US Privacy Shield seems predetermined to make a round trip to Luxembourg and back to Brussels only to be renegotiated and renamed once more.

 

————————————————

*Pieter Gryffroy is a research assistant at the Jean-Monnet-Chair of Prof. Dr. Giegerich for European Law and European Integration. He studied law in Leuven (LLB and LLM at the KU Leuven) and in Saarbrücken (Europa-Institut).

[1] http://europa.eu/rapid/press-release_IP-16-216_en.htm, accessed 15/02/2016.

[2] CJEU, case C-362/14, Maximilian Schrems v Data Protection Commissioner, ECLI:EU:C:2015:650.

[3] http://ec.europa.eu/justice/data-protection/article-29/press-material/press-release/art29_press_material/2015/20151016_wp29_statement_on_schrems_judgement.pdf, accessed 15/02/2016.

[4] http://ec.europa.eu/justice/data-protection/article-29/press-material/press-release/art29_press_material/2016/20160203_statement_consequences_schrems_judgement_en.pdf, accessed 15/02/2016.

[5] Official Journal L 281 , 23/11/1995 P. 0031 – 0050.

[6] Art. 25 (1) Directive 95/46/EC. Art. 25 (2) provides for some broad assessment criteria.

[7] See Art. 25 (4) and 25(6) of Directive 95/46/EC.

[8] Art. 25(5) Directive 95/46/EC.

[9] Art. 25(6) Directive 95/46/EC.

[10] Commission Decision 2000/520/EC of 26 July 2000 pursuant to Directive 95/46/EC of the European Parliament and of the Council on the adequacy of the protection provided by the safe harbour privacy principles and related frequently asked questions issued by the US Department of Commerce (notified under document number C(2000) 2441), Official Journal L 215 , 25/08/2000 P. 0007 – 0047.

[11] See Annex I of Decision 2000/520/EC. The seven main principles are: notice, choice, onward transfer (requiring notice and choice), security, data integrity, access and enforcement.

[12] This follows from the limited scope of application of Directive 95/46/EC itself, in pursuance of which the Decision was taken. In its Art. 3(2) the Directive states that it does not apply to matters of public security, defence, state security and activities of the state in areas of criminal law.

[13] CJEU, case C-362/14, Maximilian Schrems v Data Protection Commissioner, ECLI:EU:C:2015:650, para. 98.

[14] The matter was taken to the high court in Ireland and reached the Court of Justice through a preliminary ruling.

[15] CJEU, case C-362/14, Maximilian Schrems v Data Protection Commissioner, ECLI:EU:C:2015:650, paras. 38-66.

[16] CJEU, case C-362/14, Maximilian Schrems v Data Protection Commissioner, ECLI:EU:C:2015:650, paras. 67-98.

[17] CJEU, case C-362/14, Maximilian Schrems v Data Protection Commissioner, ECLI:EU:C:2015:650, para. 96.

[18] CJEU, case C-362/14, Maximilian Schrems v Data Protection Commissioner, ECLI:EU:C:2015:650, para. 106.

[19] http://ec.europa.eu/justice/data-protection/article-29/press-material/press-release/art29_press_material/2015/20151016_wp29_statement_on_schrems_judgement.pdf, accessed 15/02/2016; The article 29 working party had agreed that should the Commission fail, the national data protection authorities would take all necessary actions, including coordinated enforcement actions, to ensure the effective protection of personal data of European citizens. It should be noted that this would have been a herculean task.

[20] http://europa.eu/rapid/press-release_IP-16-216_en.htm, accessed 15/02/2016.

[21] Congress is currently struggling, trying to rewrite some of the US’s laws on surveillance. However, it remains to be seen what will effectively change, see e.g. http://www.theguardian.com/technology/2015/jun/06/surveillance-privacy-snowden-usa-freedom-act-congress, accessed 15/02/2016. An example of legislation still allowing mass surveillance is the Foreign Intelligence Surveillance Act.

[22] CJEU, case C-362/14, Maximilian Schrems v Data Protection Commissioner, ECLI:EU:C:2015:650, paras. 34, 93-94.

[23] CJEU, case C-362/14, Maximilian Schrems v Data Protection Commissioner, ECLI:EU:C:2015:650, para. 95.

[24] In theory, should the Committee render a negative opinion, the matter would come before the Council, which could render an opposing decision (Art. 31 Directive 95/46/EC).

 

Copyright of the image: Defense Advanced Research Projects Agency (DARPA), https://commons.wikimedia.org/wiki/File:DARPA_Big_Data.jpg?uselang=de.

Regulation of Technology in the EU and beyond – General Data Protection Regulation, Safe Harbor (C-362/14), Data Retention and more

Saar Blueprint features analysis of Safe Harbor Decision „Maximillian Schrems v Data Protection Commissioner“ (C-362/14) and other recent developments

The most recent publication in our Saar Blueprint series has the title „Regulation of Technology in the EU and beyond – The state of play in autumn 2015.“ The analysis features the state of play in the negotiations on the General Data Protection Regulation, a comment on the Safe Harbor Decision (C-362/14) and the Umbrella agreement, an overview on recent developments in Data Retention and more.

You can download the text via this link. We wish you pleasant reading!

Case analysis of the ECtHR judgment in Delfi AS v. Estonia (app. No. 64569/09)

The difficulties of information management for intermediaries

By Oskar Josef Gstrein[1]

 A. Introduction

“The medium is the message”.[2] This phrase, coined by the Canadian philosopher Marshall McLuhan in the 1960s, seems nowhere as true as when it comes to the processing and distribution of information on the internet. The philosophy of media has boomed since the start of the new millennium, and other, less “speculative” disciplines such as law increasingly have to deal with aspects of information processing.

Since the collection of personal data has become a lucrative business model,[3] there is a need for more and better regulation. However, it is not only the sheer content of data that is important: aspects of accessibility and possibilities for contextualization also define the “value” of data.

Recently, it is not only private actors that are trying to design the future of the internal market of the European Union in this regard.[4] Regional authorities also seem to be becoming more and more proactive in the field. The European Data Protection Supervisor Giovanni Buttarelli speaks of “a defining moment for digital rights in Europe and beyond.”[5] The European Commission has declared the “Digital Single Market” one of its top priorities for the coming years.[6] National politicians like Angela Merkel warn their countries and the entire continent of falling behind in the technological arms race,[7] and hence of not being able to shape the future of the world. And ultimately, the regional courts continue to deliver judgments which aim at redefining law and its application in the digital landscape.

It could very well be argued that the last-mentioned actors in particular have a constantly underestimated impact when it comes to shaping the future of cyberspace and the concept of privacy in the digital age. By now the Court of Justice of the European Union (CJEU) has delivered numerous judgments of ground-breaking character.

In 2014 it not only struck down the EU’s Data Retention Directive 2006/24/EC on the 8th of April;[8] on the 13th of May it also established the right to delist information from the index of a search engine in its controversial “Google Spain” decision.[9] And with cases such as that of Max Schrems and his Europe v. Facebook campaign[10] pending before the court,[11] it looks like the list will not come to an end soon.

What all of these judgments have in common is that their main legal problems are not connected with the content of the information that is being processed. What is crucial is the question of how the accessibility and transferability of data are organized and evaluated from a legal perspective.

This can also be seen in the SABAM v. Netlog judgment[12] and the UPC Telekabel Wien case.[13] Like the decisions already mentioned, these cases clearly point to the fact that modern information management and its regulation are not only a matter of the content of information, but especially of the role of the so-called “intermediaries”. The regulation of intermediaries becomes an ever more important aspect when considering the future development of the digital space.[14] Their business practices and conduct are crucial for the accessibility, presentation and contextualization of information. When it comes to understanding the conditions for the liability of such service providers, Articles 12 to 14 of Directive 2000/31/EC can be helpful.[15]

However, the regulation of the activities of intermediaries does not take place only within the EU. With its Grand Chamber decision of the 16th of June 2015 in the case Delfi AS v. Estonia,[16] the European Court of Human Rights in Strasbourg (ECtHR) has delivered another important judgment which tries to strike the right balance between the fundamental rights to privacy and to freedom of expression and information.

B. The decision in Delfi AS v. Estonia

I. The background of the case

Delfi AS runs an online news portal of national importance in Estonia.[17] On the 24th of January 2006 an article with the title “SLK Destroyed Planned Ice Road”[18] was published. The report suggested that AS Saaremaa Laevakompanii (Saaremaa Shipping Company, a public limited liability company) had made it impossible to use several ice roads. The latter temporarily connect the Estonian mainland to several islands in the region, which SLK normally connects by offering ferry services.

The content of the report was not challenged as such. But Delfi also offered the possibility to comment on the article, which received 185 comments by the 25th of January 2006.[19] Some of them related directly to “L”, who was a member of the supervisory board of SLK and the company’s most visible public figure at the material time.[20] The ECtHR gave some examples of the comments under the article in its judgment:[21]

  1.  „(1) there are currents in [V]äinameri
    (2) open water is closer to the places you referred to, and the ice is thinner.
    Proposal – let’s do as in 1905, let’s go to [K]uressaare with sticks and put [L.] and [Le.] in a bag“
  2. „bloody shitheads…they bathe in money anyway thanks to that monopoly and State subsidies and have now started to fear that cars may drive to the islands for a couple of days without anything filling their purses. burn in your own ship, sick Jew!“
  3. „good that [La.’s] initiative has not broken down the lines of the web flamers. go ahead, guys, [L.] into the oven!“
  4. „[little L.] go and drown yourself […]”

L. not only wanted these comments to be removed from the website, but also asked for compensation for non-pecuniary damage. Delfi removed the comments six weeks after their publication.[22] When it came to compensation, however, the publisher denied any responsibility for the content of the comments and claimed it was only acting as an intermediary service provider in that regard. Several proceedings were conducted in the national courts. Finally, the court of last instance in Estonia (the Supreme Court) came to the conclusion that Delfi had a responsibility to protect L from the consequences of the unlawful comments and therefore should have prevented their publication in the first place.[23] Subsequently, in October 2009, Delfi set up a more sophisticated monitoring system for the comments, involving a review procedure by a set of moderators who look at every comment before it is published.[24]

II. The proceedings in Strasbourg

After all national remedies had been exhausted, Delfi made an application to the ECtHR on the 4th of December 2009. The much discussed judgment[25] of the First Section of the ECtHR of the 10th of October 2013 rejected Delfi’s complaint that its freedom of expression had been violated by Estonia, whose courts had demanded that the company manage the comments under the article more actively. However, on the 17th of February 2014 the case was accepted for review by the Grand Chamber of the Strasbourg court.

In the proceedings before the Grand Chamber the argumentation of the parties basically stayed the same. Delfi claimed that any responsibility of the company to prevent damage to the reputation of L infringed its freedom of expression under Article 10 of the European Convention on Human Rights.[26] The court summarized the position of Delfi with the words:

„The applicant company called on the Grand Chamber to look at the case as a whole, including the question whether the applicant company was to be characterised as a traditional publisher or an intermediary. A publisher was liable for all content published by it regardless of the authorship of the particular content. However, the applicant company insisted that it should be regarded as an intermediary and it had as such been entitled to follow the specific and foreseeable law limiting the obligation to monitor third-party comments. It argued that intermediaries were not best suited to decide upon the legality of user-generated content. This was especially so in respect of defamatory content since the victim alone could assess what caused damage to his reputation.“[27]

Nevertheless, the Grand Chamber basically confirmed the Chamber and national courts’ judgments by coming to the conclusion:

„In connection with the question whether the liability of the actual authors of the comments could serve as a sensible alternative to the liability of the Internet news portal in a case like the present one, the Court is mindful of the interest of Internet users in not disclosing their identity. Anonymity has long been a means of avoiding reprisals or unwanted attention. As such, it is capable of promoting the free flow of ideas and information in an important manner, including, notably, on the Internet. At the same time, the Court does not lose sight of the ease, scope and speed of the dissemination of information on the Internet, and the persistence of the information once disclosed, which may considerably aggravate the effects of unlawful speech on the Internet compared to traditional media. It also refers in this connection to a recent judgment of the Court of Justice of the European Union in the case of Google Spain and Google, in which that court, albeit in a different context, dealt with the problem of the availability on the Internet of information seriously interfering with a person’s private life over an extended period of time, and found that the individual’s fundamental rights, as a rule, overrode the economic interests of the search engine operator and the interests of other Internet users […].“[28]

And

„[f]inally, turning to the question of what consequences resulted from the domestic proceedings for the applicant company, the Court notes that the company was obliged to pay the injured person the equivalent of EUR 320 in compensation for non-pecuniary damage. It agrees with the finding of the Chamber that this sum, also taking into account the fact that the applicant company was a professional operator of one of the largest Internet news portals in Estonia, can by no means be considered disproportionate to the breach established by the domestic courts […].“[29]

C. Interpretation and Context

From an institutional perspective, the numerous and close references of the Grand Chamber of the ECtHR to EU law and CJEU jurisprudence indicate that, at least in the digital space, there exists a single space of human rights protection in Europe. Keeping in mind the cumbersome negotiation process concerning the EU’s accession to the ECHR, this leaves some hope for a more integration-friendly future, oriented more strongly towards practical necessities than institutional battles.[30]

Materially, the Delfi case raises the question of “self-censorship” and asks whether we have to fear a future where “chilling effects” become part of the everyday experience in the online world.[31] The fact that modern information processing makes surveillance much easier than in the past results in new challenges. The concepts of liberty and freedom have to be emphasized more strongly and updated in the modern context in order to remain intact. Our societies have to create new spaces where it can be expected that no one interferes in the private sphere and where one does not have to conform to an expected status quo at every point in time. Put simply: there needs to be a part of everybody’s life where polarizing – but not illegal – behavior is possible and accepted.

However, it is also important to emphasize that information networks are now strongly integrated into the lives of their users. With more power comes more responsibility. The fact that a news portal of national importance can be run through the internet also means that it has to live up to the same standards of accountability as traditional media. This is probably the strongest argument why the judgment of the ECtHR was essentially right. The professionalism of Delfi, combined with the moderate and proportionate sanction, leaves the impression of a sound overall evaluation of the situation.

Ultimately, the question remains what Delfi v. Estonia will or should be remembered for. Considering the special circumstances of the case, which involved far more resources and professionalism than the ordinary exchange of views via the internet, it seems unlikely that it will set a precedent outside the world of professional journalism. It would be surprising if the ECtHR, and even national courts, had decided in the same way if the unlawful comments had not been posted on a medium of national importance. If Delfi had, for example, been a small private weblog, a social community or a forum, things would have been different. The impact of the comments would not have been as serious.

The actual lesson to be learned from this case is that we live in an age where it is not only important whether (sensitive) data is accessible or not. The question is more and more how easily and through which means it is accessible. This aspect is largely determined by how intermediaries are positioned to process the relevant piece of data and under which regulatory circumstances they are required to intervene. In which scenarios will their social responsibility to protect the privacy and dignity of a person outweigh their duty to enable the free movement of data, thoughts and speech? In order to find the right answer to this question, a complex balancing process is needed, which can only be successfully concluded by looking at the potential scenarios and concrete cases. There is a strong need for differentiation between the different contexts of data processing.

————————————————-

[1] Oskar Josef Gstrein (gstrein@europainstitut.de) is a research assistant of Prof. Thomas Giegerich, LLM at the Jean-Monnet Chair for European Integration at the Europa-Institut of the Saarland University. He is author of several articles in the field of European institutional law, European human rights protection and privacy issues. His PhD-Thesis is on the topic “The Right to be Forgotten as a Human Right”.

[2] McLuhan, Understanding Media: The extensions of Man, Mentor, 1964, New York, Chapter 1, p. 1: “In a culture like ours, long accustomed to splitting and dividing all things as a means of control, it is sometimes a bit of a shock to be reminded that, in operational and practical fact, the medium is the message.”

[3] Rashid, Surveillance is the Business Model of the Internet: Bruce Schneier, via: https://www.schneier.com/news/archives/2014/04/surveillance_is_the.html – accessed 24.07.2015.

[4] Cf. Microsoft Digital Single Market Communication Response, via: http://mscorp.blob.core.windows.net/mscorpmedia/2015/07/Microsoft-Digital-Single-Market-Communication-Response-FINAL-17-July-2015.pdf – accessed 24.07.2015.

[5] EDPS, Opinion 3/2015, Europe’s big opportunity, p. 9, via: http://bit.ly/1SJrGSq – accessed 28.07.2015.

[6] http://ec.europa.eu/priorities/digital-single-market/ – accessed 24.07.2015.

[7] Merkel said: „Europa – hier spreche ich für ganz Europa, das im Augenblick weder Google, Apple, Facebook noch andere solche Unternehmen hat – darf sich nicht nur auf seine industrielle Wertschöpfung konzentrieren, sondern muss auch darauf achten, geeignete Rahmenbedingungen zu schaffen, um große Datenmengen so zu verarbeiten, dass die Individualität geschützt ist. Darüber wird zurzeit in Europa diskutiert. Deshalb sollten wir nicht nur ablehnen, sondern wir sollten uns auch überlegen, wie wir im Konsumentenbereich noch mehr eigene europäische Unternehmen bekommen und Start-ups fördern können. Denn wir sind hierbei im Augenblick im weltweiten Vergleich nicht vorne dran.“ Rede von Bundeskanzlerin Merkel zum Deutschen Evangelischen Kirchentag am 5. Juni 2015, via: http://www.bundeskanzlerin.de/Content/DE/Rede/2015/06/2015-06-05-rede-merkel-kirchentag.html – accessed 24.07.2015.

[8] CJEU, C‑293/12 and C‑594/12, Digital Rights Ireland, Kärntner Landesregierung, ECLI:EU:C:2014:238, cf. my Blog Post in German via: http://jean-monnet-saar.eu/?p=314 – accessed 24.07.2015.

[9] Media often wrongly refers to this as the „right to be forgotten“ judgement. However, the right to delist information is not about the deletion or erasure of information. It only limits access. Cf. CJEU, C-131/12, Google Spain SL and Google Inc. v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González, ECLI:EU:C:2014:317. Also compare the Art 29 working group guidelines on the implementation accessible via: http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp225_en.pdf – accessed 24.07.2015. And finally a report on the success of the right to delist information from the 18.06.2015 via: http://ec.europa.eu/justice/data-protection/article-29/press-material/press-release/art29_press_material/20150618_wp29_press_release_on_delisting.pdf – accessed 24.07.2015.

[10] http://www.europe-v-facebook.org/ – accessed 24.07.2015.

[11] CJEU, C-362/14, Reference for a preliminary ruling from High Court of Ireland (Ireland) made on 25.06.2014 – Maximillian Schrems v Data Protection Commissioner.

[12] CJEU, C-360/10, Belgische Vereniging van Auteurs, Componisten en Uitgevers CVBA (SABAM) v Netlog NV, ECLI:EU:C:2012:85.

[13] CJEU, C-314/12, UPC Telekabel Wien GmbH, ECLI:EU:C:2014:192.

[14] Cf. Gasser, Schulz (editors), Governance of Online Intermediaries: Observations from a Series of National Case Studies, Berkman Center Research Publication No. 2015-5, via: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2566364 – accessed 24.07.2015.

[15] Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‚Directive on electronic commerce‘), Official Journal L 178 , 17/07/2000 P. 0001 – 0016, via: http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32000L0031:en:HTML – accessed 28.07.2015. Cf. Woods, Delfi v Estonia: Curtailing online freedom of expression?, via: http://eulawanalysis.blogspot.de/2015/06/delfi-v-estonia-curtailing-online.html – accessed 28.07.2015.

[16] ECtHR, Delfi AS v Estonia, App. No. 64569/09, 16.06.2015.

[17] http://www.delfi.ee/ – accessed 28.07.2015.

[18] ECtHR, Delfi AS v Estonia, Mn 16.

[19] Ibidem, Mn 17.

[20] Ibid., Mn 16.

[21] Ibid., Mn 18.

[22] Ibid., Mn 19.

[23] Ibid., Mn 31.

[24] Ibid., Mn 32.

[25] ECtHR, Delfi AS v Estonia, App. No. 64569/09, 10.10.2013. via: http://hudoc.echr.coe.int/eng?i=001-126635 – accessed 28.07.2015; Cf. Synodinou, Intermediaries‘ liability for online copyright infringement in the EU: evolutions and confusions, Computer Law & Security Review, 2015, 31(1), p. 57 – 67; McCarthy, Is the writing on the wall for online service providers? Liability for hosting defamatory user-generated content under European and Irish law, Hibernian Law Journal, 2015, 14, p. 16 – 55.

[26] ECtHR, Delfi AS v Estonia, App. No. 64569/09, 16.06.2015, Mn 68.

[27] Ibid., Mn 66.

[28] Ibid., Mn 147.

[29] Ibid. Mn 160.

[30] Cf. the Blog posts on the topic: https://jean-monnet-saar.eu/?p=690 – accessed 28.07.2015; https://jean-monnet-saar.eu/?p=745 – accessed 28.07.2015.

[31] Cf. Cox, Delfi v. Estonia: Privacy Protection and Chilling Effect, via: http://www.verfassungsblog.de/en/delfi-v-estonia-privacy-protection-and-chilling-effect/ – accessed 28.07.2015.