
Privacy: Back to the Roots

Published online by Cambridge University Press:  06 March 2019

Extract


Phenomena such as cloud computing, ambient technology, chain informatization, and social networking sites call into question the continuing applicability and relevance of existing legal frameworks, in particular the European Data Protection Directive (henceforth the DPD or the Directive), which dates back to 1995. Its framework of assigning the roles of controller and processor no longer appears to stand up: it can be argued that it no longer helps in assigning responsibility for the processing of personal data. On a strict application of the DPD, the data subject can even be construed as playing the role of controller. Nor is it only the functioning of the role principles that needs reconsideration; other essential principles, such as purpose-binding, require it as well. For example, because data disclosed to friends in a social networking context are also used for targeted advertising, tailored services, and the like, the purpose-binding called for by the DPD becomes, at the very least, opaque. Assigning responsibility to the processor actually in charge is just as unclear. These phenomena thus make clear that the conceptual foundations of the legislative frameworks that purport to facilitate and protect privacy require reflection.

Type
Articles
Copyright
Copyright © 2012 by German Law Journal GbR 

References

1 Cloud computing has many definitions, but for the purposes of this article it is sufficient to describe it as a networked body of web-based services providing online storage capacity and applications.

2 Ambient intelligence (AmI) implies a real-time adaptive environment in which most adaptive decisions are taken by machines in a process of machine-to-machine communication. These decisions are based on what is called autonomic profiling, which severely restricts human intervention while requiring a continuous and dynamic flow of information.

3 "Chain informatization" refers to the automated sharing of information between private-sector organizations and government agencies, but also among the organizations within the sector concerned.

4 Directive 95/46/EC, of the European Parliament and of the Council of 24 Oct. 1995 on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data, 1995 O.J. (L 281) 31. Incidentally, other major frameworks in this area are also growing old. Compare, for example, the OECD Guidelines, which date back to 1980. David Wright, Paul De Hert & Serge Gutwirth, Are the OECD Guidelines at 30 Showing Their Age?, 54 Comm. ACM 119.

5 For a useful discussion of the relevance of existing frameworks for privacy protection for this new technological phenomenon, compare Ann Cavoukian, Privacy in the Clouds, 1 Identity Info. Soc'y 89 (2008), available at http://www.springerlink.com/content/e13m644537204002/fulltext.pdf.

6 "Controller" means "the natural or legal person, public authority, agency or any other body which alone or jointly with others determines the purposes and means of the processing of personal data." Council Directive 95/46/EC, supra note 4, art. 2.

7 "Processor" "shall mean a natural or legal person, public authority, agency or any other body which processes personal data on behalf of the controller … ." Id.

8 Nadezhda Purtova, Property Rights in Personal Data: A European Perspective 176 (2011). Dr. Purtova claims that the rigid structure of this framework causes confusion and thus undermines the effectiveness of its mechanisms of accountability.

9 Opinion 1/2010 of the Article 29 Data Protection Working Party on the Concepts of "Controller" and "Processor" (16 Feb. 2010), available at http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2010/wp169_en.pdf.

10 Compare Opinion 5/2009 of the Article 29 Data Protection Working Party on Online Social Networking, §3.1 (12 June 2009), available at http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2009/wp163_en.pdf (dealing with the concept of data controller in the context of online social networking).

11 Communication from the Commission to the European Parliament, the Council, the Economic and Social Committee and the Committee of the Regions: A Comprehensive Approach on Personal Data Protection in the European Union, COM (2010) 609 final (4 Nov. 2010). The Commission conducted a public consultation in 2009 on the review of the current legal framework. See the replies to this consultation archived at http://ec.europa.eu/justice/news/consulting_public/news_consulting_0003_en.htm. At the time of putting the finishing touches on this article, an unofficial version of the Proposal for a Regulation of the European Parliament and of the Council on the Protection of Individuals with Regard to the Processing of Personal Data and the Free Movement of Such Data (General Data Protection Regulation) was circulating. On the whole, the general gist of the Proposal was much in line with the objectives mentioned in the Commission document announcing the review. In summary, the Commission's main policy objectives are to: (1) modernize the EU legal system for the protection of personal data, in particular to meet the challenges resulting from globalization and the use of new technologies; (2) strengthen individuals' rights, and at the same time reduce administrative formalities to ensure a free flow of personal data within the EU and beyond; and (3) improve the clarity and coherence of the EU rules for personal data protection and achieve a consistent and effective implementation and application of the fundamental right to the protection of personal data in all areas of the Union's activities.

12 Viviane Reding, Vice-President of the European Comm'n Responsible for Justice, Fundamental Rights and Citizenship, Doing the Single Market Justice (16 Sept. 2010), available at http://www.lisboncouncil.net/component/downloads/?id=368. See also Viviane Reding, The Upcoming Data Protection Reform for the European Union, 1 Int'l Data Privacy L. 3 (2011), available at http://idpl.oxfordjournals.org/content/1/1/3.full.pdf+html.

13 "There is an inherent conflict between the protection of personal data and the free trans-border flow of personal data." Wright, De Hert & Gutwirth, supra note 4, at 123.

14 For clarity's sake it is noted here that the opinions expounded in this paper are informed by a liberal-democratic outlook.

15 The German Constitutional Court traced the right to privacy to the fundamental right to the free development of one's personality. In the Court's words, "The value and dignity of the person based on free self-determination as a member of a free society is the focal point of the order established by the Basic Law." Grundgesetz für die Bundesrepublik Deutschland [Grundgesetz] [GG] [Basic Law], 23 May 1949, BGBl. I. The general personality right laid down in Article 2(1) GG in connection with Article 1(1) GG serves to protect these values.

16 Bundesverfassungsgericht [BVerfG - Federal Constitutional Court] Case No. 1 BvR 256/08, 2 Mar. 2010, 125 BVerfGE 260 (Ger.); Press Release, Bundesverfassungsgericht, Konkrete Ausgestaltung der Vorratsdatenspeicherung nicht verfassungsgemäß [Concrete Design of Data Retention Unconstitutional], BVerfG Press Release 11/2010 (2 Mar. 2010), available at http://www.bverfg.de/pressemitteilungen/bvg10-011.html (English translation available at http://www.bundesverfassungsgericht.de/pressemitteilungen/bvg10-011en.html).

17 Christian DeSimone, Pitting Karlsruhe Against Luxembourg? German Data Protection and the Contested Implementation of the EU Data Retention Directive, 11 German L.J. 291, 316 (2010) (arguing that Germany by this ruling risked a supranational legal crisis with adverse impact on European Union integration).

18 Council Directive 95/46/EC, supra note 4. Article 8 of the Charter of Fundamental Rights of the European Union states, "Everyone has the right to the protection of personal data concerning him or her." Charter of Fundamental Rights of the European Union art. 8, Dec. 18, 2000, 2000 O.J. (C 364) 1.

19 Organization for Economic Co-operation & Development, OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (23 Sept. 1980), available at http://www.oecd.org/document/18/0,2340,en_2649_34255_1815186_1_1_1_1,0.html.

20 Council of Europe, Convention for the Protection of Individuals with Regard to Automatic Processing of Personal Data, 28 Jan. 1981, E.T.S. No. 108, available at http://conventions.coe.int/Treaty/EN/Reports/HTML/108.htm.

21 In the U.K., for example, reference can be made to the Data Protection Act (1984) and in France to the Act Regarding Informatics, Files and Liberties (1978). For an insightful table of the world-wide diffusion of data protection legislation, see Colin J. Bennett & Charles D. Raab, The Governance of Privacy: Policy Instruments in Global Perspective 127 (2006).

22 Some member states applied strict limitations whereas other states applied no limitations at all. Neil Robinson et al., RAND Europe, Review of the European Data Protection Directive 6 (2009), available at http://www.ico.gov.uk/upload/documents/library/data_protection/detailed_specialist_guides/review_of_eu_dp_directive.pdf.

23 Id. at 7.

24 Id. at 24.

25 Antoinette Rouvroy & Yves Poullet, The Right to Informational Self-Determination and the Value of Self-Development: Reassessing the Importance of Privacy for Democracy, in Reinventing Data Protection? 45, 68 (Serge Gutwirth et al. eds., 2009).

27 The notion of consent is, however, controversial. It will be discussed later on, in the section dealing with the weaknesses of the DPD.

28 Lee A. Bygrave, Data Protection Law: Approaching Its Rationale, Logic and Limits 66 (2002).

29 Bygrave observes that these criteria can be classified into five categories. The two most pertinent categories here are (1) that the processing is necessary to execute a task in the public interest and (2) that it is carried out in pursuance of legitimate interests that override the conflicting interests of the individual. Id. at 66 n.251.

30 In the Proposal a new article is introduced providing the conditions of the right to be forgotten, including the right to obtain erasure of any public Internet link to, copy of, or replication of the personal data relating to the data subject contained in any publicly available communication service.

31 Council Directive 95/46/EC, supra note 4, pmbl., recital 25.

32 The Article 29 Working Party has tried to provide a solution in a rather broad definition of the concept of personal data. It suggests that in order to find that data relate to a person, they must contain a content element, a purpose element, or a result element. Only then, says the Working Party, can the data be classified as personal. A "content" element is present when the data relate to a person in the most common understanding of the word, e.g., an RFID chip in a passport. A "purpose" element is present when the data are (likely to be) used with the purpose of treating an individual in a certain way or influencing his behavior, e.g., a call log for a telephone. A "result" element is present when the use of the data may have an impact on a person, e.g., the monitoring of taxis' positions to optimize service, which has an impact on the drivers. Opinion 4/2007 of the Article 29 Data Protection Working Party on the Concept of Personal Data, at 10–11 (20 June 2007), available at http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2007/wp136_en.pdf.

33 A Comprehensive Approach on Personal Data Protection in the European Union, supra note 11, at 5. Furthermore, in data-mining situations, decisions are taken about persons based on data that fit a profile, but these are not necessarily applicable to every individual affected by this decision.

34 Cavoukian, supra note 5, at 90.

35 In the Eurobarometer report of 2008 the survey results show that 64% of respondents are aware that organizations that collect personal information must provide individuals with information about their identity, the purpose of the collection, and the intention to share the data with other organizations. Even though this is a high percentage, it may still be doubted whether the respondents have actually read the required statements. The Gallup Org., Flash Eurobarometer 225: Data Protection in the European Union: Citizens' Perceptions 31 (2008), available at http://ec.europa.eu/public_opinion/flash/fl_225_en.pdf.

36 Robinson et al., supra note 22, at 29 ("Privacy policies are written by lawyers, for lawyers, and appear to serve little useful purpose for the data subject due to their length, complexity and extensive use of legal terminology.") Consumers themselves appear to feel that current mechanisms do not help them to understand their rights. Cf. Office for Developed & Transition Economies, Consumers Int'l, Privacy@net: An International Comparative Study of Consumer Privacy on the Internet 26–27 (2001), available at http://www.consumersinternational.org/media/304817/privacy@net-%20an%20international%20comparative%20study%20of%20consumer%20privacy%20on%20the%20internet.pdf.

37 The Commission is of the opinion that the Directive already offers the Member States the possibility to provide for wide exemptions from notification in cases where low risk is involved or when the controller has appointed a data protection official. At most, some further simplification would be useful and should be possible without amending the existing Articles. Commission of the European Communities First Report on the Implementation of the Data Protection Directive (95/46/EC), COM (2003) 265 final (15 May 2003).

38 Robinson et al., supra note 22, at 32.

39 Robinson notes that the cost to the US national economy just for reading each privacy policy was estimated to be $365bn, based on the length of time it takes to read a privacy policy and the monetary value of that time. Id. at 30. Because the current obligation to notify all data processing operations to the DPAs is considered a cumbersome obligation which does not in itself provide "any real added value for the protection of individuals' personal data," the Commission proposes to revise and simplify the current notification system. A Comprehensive Approach on Personal Data Protection in the European Union, supra note 11, at 10.

40 Robinson et al., supra note 22, at 29.

41 According to Bennett and Raab, because normative information privacy principles are not self-enforcing, public agencies play a role in the enforcement and oversight of data protection legislation. The most important are the supervisory bodies required under the EU Directive. Bennett & Raab, supra note 21, at 133.

42 A Comprehensive Approach on Personal Data Protection in the European Union, supra note 11, at 17. In the Proposal the Commission has taken note of these complaints and introduced a centralized European Data Protection Board with stronger enforcement powers.

43 Bennett & Raab, supra note 21, at 146.

44 Id. at 35. The Commission suggests strengthening the existing provision on sanctions by including criminal sanctions in cases of serious data protection violations. In addition, DPAs and other associations representing data subjects' interests should also be granted the power to bring an action before the courts when infringements of data protection rules affect more than one individual. A Comprehensive Approach on Personal Data Protection in the European Union, supra note 11, at 9. This proposal returns in the unofficial Proposal. The fines that supervisory authorities may impose under the Proposal are quite daunting: up to a maximum of €1,000,000.

45 The data protection authorities play many roles; Bennett and Raab identify seven: ombudsmen, auditors, consultants, educators, negotiators, policy advisers, and enforcers. Bennett & Raab, supra note 21, at 134.

46 Robinson et al., supra note 22, at 39.

47 Cf. Gallup Org., supra note 35.

48 Bert-Jaap Koops & Ronald Leenes, 'Code' and the Slow Erosion of Privacy, 12 Mich. Telecomm. & Tech. L. Rev. 115, 123–29 (2005) (presenting a helpful and succinct overview of the concepts of privacy and privacy laws). See also Philosophical Dimensions of Privacy: An Anthology (Ferdinand D. Schoeman ed., 1984).

49 Bennett & Raab, supra note 21, at 7. According to Bennett and Raab, the concept has "an aesthetic and humanistic affinity with individual autonomy and dignity," but it can also be justified in "philosophical, political, or utilitarian terms."

50 Daniel J. Solove, Understanding Privacy 1 (2009) ("Commentators have declared it 'essential to democratic government,' critical to 'our ability to create and maintain different sorts of social relationships with different people,' necessary for 'permitting and protecting an autonomous life,' and important for 'emotional and psychological tranquility.'"). It has been hailed as "an integral part of our humanity," "the heart of our liberty," and "the beginning of all freedom." See also James Q. Whitman, The Two Western Cultures of Privacy: Dignity Versus Liberty, 113 Yale L.J. 1151 (2004).

51 Edward J. Eberle, Human Dignity, Privacy and Personality in German and American Constitutional Law, 1997 Utah L. Rev. 963.

52 Id. at 1000.

53 Bundesverfassungsgericht [BVerfG – Federal Constitutional Court], Case No. 1 BvR 209/83, 15 Dec. 1983, 65 BVerfGE 1 (Ger.).

54 Eberle, supra note 51, at 1010.

55 Bundesverfassungsgericht [BVerfG – Federal Constitutional Court], Case No. 1 BvR 435/68, 24 Feb. 1971, 30 BVerfGE 173 (Ger.).

56 Bundesverfassungsgericht [BVerfG – Federal Constitutional Court], Case No. 1 BvR 536/72, 5 June 1973, 35 BVerfGE 202 (Ger.).

57 "The rights to the free development of one's personality and human dignity secure for everyone an autonomous sphere in which to shape one's private life by developing and protecting one's individuality." Id.

58 David W. Shoemaker, Self-Exposure and Exposure of the Self: Informational Privacy and the Presentation of Identity, 12 Ethics & Info. Tech. 3, 13 (2010).

60 Harry G. Frankfurt, Freedom of the Will and the Concept of a Person, 68 J. Phil. 5 (1971).

61 Philip E. Agre, Introduction to Technology and Privacy: The New Landscape 1, 8 (Philip E. Agre & Marc Rotenberg eds., 1998) (following Erving Goffman, The Presentation of Self in Everyday Life (1959), "People construct their identities, [Goffman] suggested, through a negotiation of boundaries in which the parties reveal personal information selectively according to a … moral code that [Goffman] called the 'right and duty of partial display.'").

62 Shoemaker, supra note 58, at 13.

63 James Rachels, Why Privacy Is Important, 4 Phil. & Pub. Aff. 323, 331 (1975) (stating that if we cannot control who has access to us, we cannot control the patterns of behavior we need to adopt).

64 Stanley I. Benn, Privacy, Freedom, and Respect for Persons, in Philosophical Dimensions of Privacy: An Anthology 223, 228 (Ferdinand D. Schoeman ed., 1984).

65 Benn puts it succinctly: "By the principle of respect for persons, then, I mean the principle that every human being, insofar as he is qualified as a person, is entitled to this minimal degree of consideration." Id. at 229.

66 Samuel D. Warren & Louis D. Brandeis, The Right to Privacy, 4 Harv. L. Rev. 193 (1890).

67 Rouvroy & Poullet, supra note 25, at 53.

68 GG, art. 1.

69 GG, art. 2.

70 Rachels, supra note 63.

71 Erving Goffman, The Presentation of Self in Everyday Life (1959).

72 Robert C. Post, The Social Foundations of Privacy: Community and Self in the Common Law Tort, 77 Calif. L. Rev. 957 (1989).

73 Id. at 963 (referring to Goffman, who says that for a man to be complete, "individuals must hold hands in a chain of ceremony, each giving deferentially with proper demeanor to the one on the right what will be received deferentially from the one on the left," Erving Goffman, The Nature of Deference and Demeanor, in Interaction Ritual: Essays on Face-to-Face Behavior 47, 84–85 (1967)).

74 Jonathan Kahn, Privacy as a Legal Principle of Identity Maintenance, 33 Seton Hall L. Rev. 371 (2003).

75 Post, supra note 72, at 973.

76 Helen Nissenbaum, Privacy as Contextual Integrity, 79 Wash. L. Rev. 119 (2004).

77 Compare, e.g., Post, supra note 72.

78 Alan F. Westin, Privacy and Freedom 7 (1967).

79 Arthur Raphael Miller, The Assault on Privacy: Computers, Data Banks, and Dossiers 25 (1971).

80 Bundesverfassungsgericht [BVerfG – Federal Constitutional Court], Case No. 1 BvR 209/83, 15 Dec. 1983, 65 BVerfGE 1 (Ger.).

81 DeSimone, supra note 17, at 294. DeSimone contends that this line of thinking stems from the theory of role playing as developed by the German legal philosopher Paul Tiedemann.

82 Bundesverfassungsgericht [BVerfG – Federal Constitutional Court], Case No. 1 BvR 209/83, 15 Dec. 1983, 65 BVerfGE 1, para. 43 (Ger.).

83 Id. at para. 44.

84 Paul Schwartz, The Computer in German and American Constitutional Law: Towards an American Right of Informational Self-Determination, 37 Am. J. Comp. L. 675, 690 (1989).

85 Bundesverfassungsgericht [BVerfG – Federal Constitutional Court], Case No. 1 BvR 209/83, 15 Dec. 1983, 65 BVerfGE 1, para. 46 (Ger.). All legislation must be checked for a valid legislative basis, clarity of norms, and observance of the principle of proportionality.

86 Robinson et al., supra note 22.

87 Viviane Reding, Vice-President of the European Comm'n, E.U. Justice Comm'r, Your Data, Your Rights: Safeguarding Your Privacy in a Connected World (16 Mar. 2011), available at http://europa.eu/rapid/pressReleasesAction.do?reference=SPEECH/11/183.

88 Alan F. Westin, Social and Political Dimensions of Privacy, 59 J. Soc. Issues 431, 431 (2003).

89 Marc Rotenberg, Fair Information Practices and the Architecture of Privacy (What Larry Doesn't Get), 2001 Stan. Tech. L. Rev. 1, ¶¶ 29–30.

90 Simitis presciently predicted the same development already in the pre-Internet era. "The process of consent is no more than a 'mystification' that ignores the long-standing experience that the value of a regulatory doctrine such as 'informed consent' depends entirely on the social and economic context of the individual activity." Spiro Simitis, Reviewing Privacy in an Information Society, 135 U. Pa. L. Rev. 707, 737 (1987).

91 Note also Rotenberg's painfully accurate attack on Lessig's reference to a single web certification association as proof that the standard response to questions of data practices is "choice." Rotenberg, supra note 89, at ¶ 33.

92 Rouvroy & Poullet, supra note 25, at 61. They refer to Burkert, who said that privacy may be considered a "fundamentally fundamental right."

93 Opinion 15/2011 of the Article 29 Data Protection Working Party on the Definition of Consent, at 5 (13 July 2011), available at http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2011/wp187_en.pdf.

94 Michael Scott Moore, Germany's New Right to Online Privacy, Spiegel Online, 28 Feb. 2008, http://www.spiegel.de/international/germany/0,1518,538378,00.html (last visited 12 Mar. 2012). The court president admitted it was an unprecedented move to introduce a new civil right in this way.

95 Bundesverfassungsgericht [BVerfG – Federal Constitutional Court], Case No. 1 BvR 209/83, 15 Dec. 1983, 65 BVerfGE 1 (Ger.).

96 In the Esra case, the Court concluded that information about the core area of private life enjoys absolute protection. It even trumps the artistic freedom of the author of the roman à clef Esra, in which the author depicted his former girlfriend and her mother very vividly and intimately, even though the book contained a traditional disclaimer that all its characters were invented. Paul M. Schwartz & Karl-Nikolaus Peifer, Prosser's Privacy and the German Right of Personality: Are Four Privacy Torts Better Than One Unitary Concept?, 98 Calif. L. Rev. 1925, 1960 (2010) (citing BVerfG 13 June 2007 (Esra case), BVerfGE 119(1), para. 88).

97 Bundesverfassungsgericht [BVerfG – Federal Constitutional Court], Case No. 1 BvR 370/07, 27 Feb. 2008, 120 BVerfGE 274 (Ger.).

98 The law gave police and state officials too much power to spy on individuals using "trojan horse" software which can be delivered by email.

99 Cf. GG, arts. 1 & 2.

100 It appears that the property discussion has definitely crossed the Atlantic. See the recent impressive doctoral thesis by Nadezhda Purtova of Tilburg University in the Netherlands. Purtova, supra note 8.

101 Paul M. Schwartz, Beyond Lessig's Code for Internet Privacy: Cyberspace Filters, Privacy-Control, and Fair Information Practices, 2000 Wis. L. Rev. 743, 764.

102 Id. at 754 (coining the "blinking twelve" problem: many Americans with VCRs never bothered to read the manual that would have instructed them how to set the recorder's clock, so the display kept showing a blinking twelve).

103 Daniel J. Solove, Conceptualizing Privacy, 90 Calif. L. Rev. 1087, 1113–14 (2002).

104 Purtova, supra note 8, at 73 (arguing that there is a logic of property in one's entitlement to defend one's own against the world). The erga omnes effect stems from the discussion in English legal literature as to when a right is admitted into a category of property rights. Id. According to some, the right must be alienable, it must die when the object perishes, or it must take effect against an indefinite number of persons until that time (erga omnes effect). Id.

105 Id. at 259.

106 John Locke, Second Treatise of Government ch. V, § 27 (C.B. Macpherson ed., Hackett Publ'g Co., Inc. 1980) (1690).

107 Purtova, supra note 8, at 265. Purtova even goes so far as to claim that her theory of propertization of personal data is consistent with the principle of informational self-determination as it occurs in Article 7 of the Data Protection Directive.

108 Koops & Leenes, supra note 48, at 186–87 (deeming the commodification solution "ineffective" and concluding that it "ultimately fails").

109 Dommering argues along much the same line when he says that the organization of a market in personal data will never replace public-law supervision but will be a welcome addition to it. E.J. Dommering, Recht op persoonsgegevens als zelfbeschikkingsrecht [The Right to Personal Data as a Right of Self-Determination], in 16 Miljoen BN'ers? Bescherming van Persoonsgegevens in het Digitale Tijdperk [16 Million Celebrities? Protection of Personal Data in the Digital Age] 83, 98 (J.E.J. Prins et al. eds., 2010).

110 Jerry Kang & Benedikt Buchner, Privacy in Atlantis, 18 Harv. J.L. & Tech. 229 (2004).

111 Id. at 263.

112 Consider reality TV shows and blogging. Apparently the individuals concerned do not exercise their control to enhance their privacy but rather transform themselves into entertainment packages.

113 Kang & Buchner, supra note 110, at 266.

114 Schwartz, supra note 101, at 761.

115 Yves Poullet, Data Protection Legislation: What Is at Stake for Our Society and Democracy?, 25 Computer L. & Security Rev. 211, 223 (2009).

116 See also the results obtained in EU research projects, such as Future IDentity Info. Soc. [FIDIS], http://www.fidis.net (last visited 12 Mar. 2012), and especially D16.3: Towards Requirements for Privacy-Friendly Identity Management in eGovernment, FIDIS (J.C. Buitelaar, M. Meints & E. Kindt eds., 14 June 2009), available at http://www.fidis.net/fileadmin/fidis/deliverables/new_deliverables3/2009_06_14_Fidis_D16.3_Reqs_PF_eGov_v1.2_final.pdf. See also PRIME - Privacy & Identity Mgmt. Eur., https://www.prime-project.eu (last visited 8 Mar. 2012); PrimeLife—Privacy & Identity Mgmt. Eur. Life, http://www.primelife.eu (last visited 12 Mar. 2012). An effort also worth mentioning here is the introduction of "a new 'species,' the Personal Data Guardian, created through a fusion of law and technology." Jerry Kang et al., Self-Surveillance Privacy, 97 Iowa L. Rev. 809 (2012).

117 Poullet, supra note 115, at 224.

118 Marco Casassa Mont et al., Towards Accountable Management of Identity and Privacy: Sticky Policies and Enforceable Tracing Services, HP Laboratories Bristol (19 Mar. 2003), available at http://www.hpl.hp.com/techreports/2003/HPL-2003-49.pdf.

119 See, for example, the summary descriptions of two PRIME tools in D 7.12: Behavioural Biometric Profiling and Transparency Enhancing Tools, FIDIS 57–59 (Mireille Hildebrandt ed., 4 Mar. 2009), available at http://www.fidis.net/fileadmin/fidis/deliverables/fidis-wp7-del7.12_behavioural-biometric_profiling_and_transparency_enhancing_tools.pdf. The PRIME Data Track Tool allows the user to exercise his rights of deletion of, correction of, or access to the data about him currently stored by a profiling server. Another PRIME tool relevant here is the Assurance Control Function, a kind of counter-profiling tool based on audits, seals, and reputation mechanisms. It gives the user information on whether the data controller appears in blacklists or disclosure lists and indicates whether the controller has been certified by privacy seals of various kinds. The user can thus reach an informed decision on whether or not to trust the data controller.

120 I cite in this respect part of the definition formulated in the FIDIS deliverable, D7.7: RFID, Profiling, and AmI, FIDIS (Mireille Hildebrandt & Martin Meints eds., 31 Aug. 2006), available at http://www.fidis.net/fileadmin/fidis/deliverables/fidis-wp7-del7.7.RFID_Profiling_AMI.pdf.

"The point would be to have some idea of the selection mechanisms (application of profiles) that may be applied, allowing a person adequate anticipation. To be able to achieve this the data subject needs access—in addition to his own personal data and a profiling/reporting tool—to additional external data sources, allowing some insight in the activities of the data controller. Based on this additional information the data subject could perform a kind of counter profiling."

121 P3P was developed by a group of private companies, known as the World Wide Web Consortium. Cf. Tim Berners-Lee & Mark Fischetti, Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web by Its Inventor (1999). See also P3P 1.0: A New Standard in Online Privacy, Platform for Privacy Preferences Initiative (12 May 2006), http://www.w3.org/P3P/brochure.html (last visited 12 Mar. 2012).

122 The E.U. Article 29 Working Party has rejected P3P because it leads to an inversion of responsibility: use of P3P in the absence of a framework of enforceable data protection rules risks shifting the onus primarily onto the individual user to protect himself, a development which would undermine the internationally established principle that it is the "data controller" who is responsible for complying with data protection principles. It is, of course, interesting that this opinion is exactly contrary to my argument of taking individual freedom of choice as a starting point. Opinion 1/98 of the Working Party on the Protection of Individuals with Regard to the Processing of Personal Data on Platform for Privacy Preferences (P3P) and the Open Profiling Standard (OPS), at 1–3 (16 June 1998), available at http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/1998/wp11_en.pdf.

123 Kang & Buchner, supra note 110, at 262.

124 Lawrence Lessig, Code and Other Laws of Cyberspace 160–61 (1999).

125 Kang & Buchner, supra note 110, at 263.

126 The principles of Privacy by Design can be summarized as follows. Technology should be designed so that no more personal data are processed than necessary; the ICT system should grant the user effective means of control; designers and users of a new ICT system must ensure that the user is informed about how the system functions; access should be limited to authorized persons; the quality of data should be supported by technical means; and where data are used for different purposes within the same system, the respective processes should be securely separated.

127 Interestingly, the Proposal introduces the data protection impact assessment, which is to be submitted to the supervisory authority before the processing begins.

128 Reding, supra note 87, at 2.

129 Koops & Leenes, supra note 48, at 187. But see J.J.F.M. Borking, Privacyrecht is code: Over het gebruik van Privacy Enhancing Technologies 453 (2010).

130 On the priority of these two data concepts, see Paul De Hert & Serge Gutwirth, Privacy, Data Protection and Law Enforcement: Opacity of the Individual and Transparency of the Power, in Privacy and the Criminal Law 61, 74 (Erik Claes et al. eds., 2006).

131 Robinson et al., supra note 22.

132 It is fair to note that the report is aware of the importance of a human rights approach. "Our research indicated broad agreement that a human rights approach is important and should be retained." Id. at 47.

133 Id. at 51.

134 Id. at 51–52.

135 See D17.4: Trust and Identification in the Light of Virtual Persons, FIDIS ch. 3, at 16–37 (David-Olivier Jaquet-Chiffelle & Hans Buitelaar eds., 25 June 2009), available at http://www.fidis.net/fileadmin/fidis/deliverables/new_deliverables/fidis-wp17-del17.4_Trust_and_Identification_in_the_Light_of_Virtual_Persons.pdf.

136 Robinson et al., supra note 22, at 60.

137 Dommering, supra note 109, at 97. Consolidated Versions of the Treaty on European Union and of the Treaty Establishing the European Community art. 6, 19 Dec. 2006, 2006 O.J. (C 321 E) 1, in conjunction with the Charter of Fundamental Rights of the European Union, supra note 18, ch. VII, art. 52(3).

138 Roger Brownsword, Rights, Regulation, and the Technological Revolution 41–43 (2008).

139 Luciano Floridi, The Ontological Interpretation of Informational Privacy, 7 Ethics & Info. Tech. 185 (2005).

140 The Proposal takes the same stance and works it out in a modernized setting, but it is still hampered by the DPD's double motive: the digital economy is to be strengthened, or at least not held back, by burdensome measures protecting citizens' privacy. Note the extensive attempt the Proposal makes to improve the consistency of measures across the E.U. so as to enable a prosperous internal market. "Personal data protection therefore plays a central role in the Digital Agenda for Europe, and more generally in the Europe 2020 Strategy."

141 Yves Poullet & Jean-Marc Dinant, Towards New Data Protection Principles in a New ICT Environment, 5 Internet, L. & Pol. E-Journal 1 (2007).

142 Floridi, supra note 139, at 195 ("[T]he right to informational privacy [is] … a right to personal immunity from unknown, undesired or unintentional changes in one's own identity as an informational entity … .").

143 Westin defines privacy as the claim of an individual to determine what information about himself or herself should be known to others. Westin, supra note 88, at 431.