Data Protection Directive

The Data Protection Directive, officially Directive 95/46/EC, enacted in October 1995, was a European Union directive which regulated the processing of personal data within the European Union (EU) and the free movement of such data. The Data Protection Directive was an important component of EU privacy and human rights law.

Directive 95/46/EC
European Union directive
Title: Directive on the protection of individuals with regard to the processing of personal data and on the free movement of such data
Made by: European Parliament and Council
Journal reference: OJ L 281, 23 November 1995, pp. 31–50
History
Date made: 24 October 1995
Came into force: 13 December 1995
Implementation date: 24 October 1998
Preparative texts
Commission proposal: OJ C 311, 27 November 1992, pp. 30–61
Other legislation
Amended by: Regulation (EC) No 1882/2003
Status: Repealed

The principles set out in the Data Protection Directive are aimed at the protection of fundamental rights and freedoms in the processing of personal data.[1] The General Data Protection Regulation, adopted in April 2016, has superseded the Data Protection Directive and became enforceable on 25 May 2018.[2]

Context

The right to privacy is a highly developed area of law in Europe. All the member states of the European Union (EU) are also signatories of the European Convention on Human Rights (ECHR). Article 8 of the ECHR provides a right to respect for one's "private and family life, his home and his correspondence", subject to certain restrictions. The European Court of Human Rights has given this article a very broad interpretation in its jurisprudence.

In 1973, the American scholar Willis Ware published Records, Computers, and the Rights of Citizens, a report that proved influential on the direction these laws would take.[3][4]

In 1980, in an effort to create a comprehensive data protection system throughout Europe, the Organisation for Economic Co-operation and Development (OECD) issued its "Recommendations of the Council Concerning Guidelines Governing the Protection of Privacy and Trans-Border Flows of Personal Data".[5] The seven principles governing the OECD’s recommendations for protection of personal data were:

  1. Notice—data subjects should be given notice when their data is being collected;
  2. Purpose—data should only be used for the purpose stated and not for any other purposes;
  3. Consent—data should not be disclosed without the data subject’s consent;
  4. Security—collected data should be kept secure from any potential abuses;
  5. Disclosure—data subjects should be informed as to who is collecting their data;
  6. Access—data subjects should be allowed to access their data and make corrections to any inaccurate data;
  7. Accountability—data subjects should have a method available to them to hold data collectors accountable for not following the above principles.[6]

The OECD Guidelines, however, were non-binding, and data privacy laws still varied widely across Europe. The United States, while endorsing the OECD's recommendations, did nothing to implement them domestically.[6] The first six principles were nonetheless incorporated into the EU Directive.[6]

In 1981, the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data was negotiated within the Council of Europe. This convention obliges the signatories to enact legislation concerning the automatic processing of personal data, which many duly did.

In 1989–1990, with the fall of the Berlin Wall and German reunification, the extent of the data the Stasi had collected in East Germany became well known, increasing the demand for privacy in Germany. West Germany had already had federal privacy legislation, the Bundesdatenschutzgesetz, since 1977. The European Commission realized that diverging data protection legislation amongst EU member states impeded the free flow of data within the EU and accordingly proposed the Data Protection Directive.

Content

The directive regulates the processing of personal data regardless of whether such processing is automated or not.

Scope

Personal data are defined as "any information relating to an identified or identifiable natural person ("data subject"); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural or social identity;" (art. 2 a).

This definition is meant to be very broad. Data are "personal data" when someone is able to link the information to a person, even if the person holding the data cannot make this link themselves. Examples of "personal data" include an address, a credit card number, bank statements and a criminal record.

The notion of "processing" covers "any operation or set of operations which is performed upon personal data, whether or not by automatic means, such as collection, recording, organization, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure or destruction;" (art. 2 b).

The responsibility for compliance rests on the shoulders of the "controller", meaning the natural or legal person, public authority, agency or any other body which alone or jointly with others determines the purposes and means of the processing of personal data (art. 2 d).

The data protection rules are applicable not only when the controller is established within the EU, but whenever the controller uses equipment situated within the EU in order to process data (art. 4). Controllers from outside the EU processing data in the EU therefore have to follow the data protection rules. In principle, any online business trading with EU residents processes some personal data and uses equipment in the EU to do so (namely the customer's computer); as a consequence, the website operator would have to comply with the European data protection rules. The directive was written before the Internet became widespread, and to date there is little jurisprudence on this subject.
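This territorial test can be pictured as a simple conditional check. The following Python sketch is purely illustrative; the class and field names are hypothetical and not taken from the directive. It reads art. 4 as: the rules apply if the controller is established in the EU, or if it makes use of equipment situated in the EU for purposes other than mere transit.

```python
from dataclasses import dataclass

@dataclass
class Controller:
    """Hypothetical model of a data controller, for illustration only."""
    established_in_eu: bool                        # has an establishment in a member state
    uses_equipment_in_eu: bool                     # uses equipment situated in the EU to process data
    equipment_used_for_transit_only: bool = False  # equipment used only for transit through the EU

def directive_applies(controller: Controller) -> bool:
    """Rough reading of art. 4: national rules apply when the controller is established
    in the EU, or uses equipment in the EU for purposes other than mere transit."""
    if controller.established_in_eu:
        return True
    return controller.uses_equipment_in_eu and not controller.equipment_used_for_transit_only

# A non-EU web shop whose EU customers' computers count as "equipment" would,
# on this reading, fall within scope.
print(directive_applies(Controller(established_in_eu=False, uses_equipment_in_eu=True)))  # True
```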

Principles

Personal data should not be processed at all, except when certain conditions are met. These conditions fall into three categories: transparency, legitimate purpose, and proportionality.

Transparency

The data subject has the right to be informed when his personal data is being processed. The controller must provide his name and address, the purpose of processing, the recipients of the data and all other information required to ensure the processing is fair. (art. 10 and 11)

Data may be processed only if at least one of the following conditions is true (art. 7); an illustrative sketch of this test follows the list:

  • the data subject has given his consent;
  • the processing is necessary for the performance of, or the entering into, a contract;
  • the processing is necessary for compliance with a legal obligation;
  • the processing is necessary in order to protect the vital interests of the data subject;
  • the processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller or in a third party to whom the data are disclosed;
  • the processing is necessary for the purposes of the legitimate interests pursued by the controller or by the third party or parties to whom the data are disclosed, except where such interests are overridden by the interests for fundamental rights and freedoms of the data subject.

The data subject has the right to access all data processed about him. The data subject also has the right to demand the rectification, deletion or blocking of data that is incomplete, inaccurate or not being processed in compliance with the data protection rules. (art. 12)
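The conditions of art. 7 are alternatives: processing is permitted as soon as any one ground applies, with the legitimate-interests ground failing when it is overridden by the data subject's interests and rights. The following Python sketch is a minimal, hypothetical formalisation of that "at least one ground" test, not an official reading of the directive.

```python
from enum import Enum, auto

class LawfulBasis(Enum):
    """Illustrative labels for the six grounds listed in art. 7."""
    CONSENT = auto()
    CONTRACT = auto()
    LEGAL_OBLIGATION = auto()
    VITAL_INTERESTS = auto()
    PUBLIC_INTEREST = auto()
    LEGITIMATE_INTERESTS = auto()

def processing_permitted(grounds: set, legitimate_interests_overridden: bool = False) -> bool:
    """Processing is lawful if at least one ground applies; legitimate interests are
    discounted when overridden by the data subject's interests, rights and freedoms."""
    effective = set(grounds)
    if legitimate_interests_overridden:
        effective.discard(LawfulBasis.LEGITIMATE_INTERESTS)
    return bool(effective)

print(processing_permitted({LawfulBasis.CONSENT}))                  # True: consent alone suffices
print(processing_permitted({LawfulBasis.LEGITIMATE_INTERESTS},
                           legitimate_interests_overridden=True))   # False: ground overridden
```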

Legitimate purpose

Personal data can only be processed for specified, explicit and legitimate purposes and may not be processed further in a way incompatible with those purposes. (art. 6 b) Personal data must be protected from misuse and handled with respect for the "certain rights of the data owners which are guaranteed by EU law."[7]

Proportionality

Personal data may be processed only insofar as they are adequate, relevant and not excessive in relation to the purposes for which they are collected and/or further processed. The data must be accurate and, where necessary, kept up to date; every reasonable step must be taken to ensure that data which are inaccurate or incomplete, having regard to the purposes for which they were collected or for which they are further processed, are erased or rectified. The data should not be kept in a form which permits identification of data subjects for longer than is necessary for the purposes for which the data were collected or for which they are further processed. Member states shall lay down appropriate safeguards for personal data stored for longer periods for historical, statistical or scientific use. (art. 6)
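The retention limit in particular can be read as a check applied to each stored record: identifiable data should not be kept once the purpose is fulfilled, unless safeguarded historical, statistical or scientific storage applies. The Python sketch below is an illustrative simplification with invented field names, not a statement of how any controller actually implemented the rule.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class StoredRecord:
    """Hypothetical record held by a controller, for illustration only."""
    identifies_subject: bool                 # still in a form that identifies the data subject
    purpose_fulfilled_on: Optional[date]     # when the original purpose was satisfied, if ever
    archival_use_with_safeguards: bool       # kept for historical/statistical/scientific use

def retention_acceptable(record: StoredRecord, today: date) -> bool:
    """Rough reading of art. 6: identifiable data should not be kept once the purpose
    is fulfilled, unless member-state safeguards for archival use apply."""
    if not record.identifies_subject:
        return True   # data no longer identify anyone, so the retention limit is not engaged
    if record.archival_use_with_safeguards:
        return True   # longer storage permitted with appropriate safeguards
    if record.purpose_fulfilled_on is None:
        return True   # the original purpose is still being pursued
    return today <= record.purpose_fulfilled_on
```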

When sensitive personal data (such as religious beliefs, political opinions, health or sex life, racial or ethnic origin, and trade union membership) are being processed, extra restrictions apply. (art. 8)

The data subject may object at any time to the processing of personal data for the purpose of direct marketing. (art. 14)

A decision which produces legal effects concerning the data subject or significantly affects him may not be based solely on automated processing of data. (art. 15) A form of appeal should be provided when automated decision-making processes are used.

Supervisory authority and the public register of processing operations

Each member state must set up a supervisory authority, an independent body that will monitor the data protection level in that member state, give advice to the government about administrative measures and regulations, and start legal proceedings when data protection regulation has been violated. (art. 28) Individuals may lodge complaints about violations to the supervisory authority or in a court of law.

The controller must notify the supervisory authority before he starts to process data. The notification contains at least the following information (art. 19); an illustrative model of such a record follows the list:

  • the name and address of the controller and of his representative, if any;
  • the purpose or purposes of the processing;
  • a description of the category or categories of data subject and of the data or categories of data relating to them;
  • the recipients or categories of recipient to whom the data might be disclosed;
  • proposed transfers of data to third countries;
  • a general description of the measures taken to ensure security of processing.

This information is kept in a public register.
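Because the notification has a fixed minimum content, it can be modelled as a simple structured record. The sketch below mirrors the bullet points of art. 19 as a Python data class; every field name and the example values are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ProcessingNotification:
    """Minimum contents of an art. 19 notification (illustrative field names only)."""
    controller_name: str
    controller_address: str
    representative: Optional[str]                 # representative, if any
    purposes: List[str]                           # purpose or purposes of the processing
    data_subject_categories: List[str]            # categories of data subject
    data_categories: List[str]                    # data or categories of data
    recipients: List[str]                         # recipients or categories of recipient
    third_country_transfers: List[str] = field(default_factory=list)
    security_measures_summary: str = ""           # general description of security measures

# Example entry of the kind that might appear in the public register.
notification = ProcessingNotification(
    controller_name="Example Retail Ltd",
    controller_address="1 High Street, Dublin",
    representative=None,
    purposes=["order fulfilment", "direct marketing"],
    data_subject_categories=["customers"],
    data_categories=["contact details", "purchase history"],
    recipients=["payment service provider"],
)
```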

Transfer of personal data to third countries

"Third countries" is the term used in the legislation to designate countries outside the European Union. Personal data may only be transferred to a third country if that country provides an adequate level of protection. Some exceptions to this rule are provided, for instance when the controller himself can guarantee that the recipient will comply with the data protection rules.
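The transfer rule therefore boils down to: transfer only if the destination ensures adequate protection, unless an exception such as controller-guaranteed safeguards applies. The following Python sketch is purely illustrative; the set of "adequate" destinations shown is an assumption for the example, since in practice adequacy was established by European Commission decisions.

```python
# Illustrative, assumed set of destinations treated as providing adequate protection;
# the real list depended on European Commission adequacy decisions.
ADEQUATE_DESTINATIONS = {"Switzerland", "Canada", "Argentina"}

def transfer_allowed(destination: str,
                     controller_guarantees_safeguards: bool = False) -> bool:
    """Rough reading of the third-country transfer rule: the destination must ensure
    adequate protection, or an exception (here, controller-provided guarantees) must apply."""
    if destination in ADEQUATE_DESTINATIONS:
        return True
    return controller_guarantees_safeguards

print(transfer_allowed("Canada"))                                                 # True
print(transfer_allowed("United States"))                                          # False
print(transfer_allowed("United States", controller_guarantees_safeguards=True))   # True
```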

The Directive's Article 29 created the "Working party on the Protection of Individuals with regard to the Processing of Personal Data", commonly known as the "Article 29 Working Party". The Working Party gives advice about the level of protection in the European Union and third countries.

The Working Party negotiated with United States representatives about the protection of personal data; the result was the Safe Harbour Principles. According to critics, the Safe Harbour Principles did not provide an adequate level of protection, because they contained fewer obligations for the controller and allowed the contractual waiver of certain rights.

In October 2015 the European Court of Justice ruled that the Safe Harbour regime was invalid, as a result of an action brought by an Austrian privacy campaigner in relation to the export of subscribers' data by Facebook's European business to Facebook in the USA.[8] The US and European authorities worked on a replacement for Safe Harbour, and an agreement was reached in February 2016, leading to the European Commission adopting the EU-US Privacy Shield framework on 12 July 2016.

In July 2007, a new and controversial[9] passenger name record (PNR) agreement between the US and the EU was signed.[10]

In February 2008, Jonathan Faull, the head of the EU's Commission of Home Affairs, complained about the United States' bilateral policy concerning PNR.[11] The US had signed in February 2008 a memorandum of understanding (MOU)[12] with the Czech Republic in exchange for a visa waiver scheme, without first consulting Brussels.[9] The tensions between Washington and Brussels were mainly caused by the lower level of data protection in the US, especially since foreigners do not benefit from the US Privacy Act of 1974. Other countries approached for bilateral memoranda of understanding included the United Kingdom, Estonia, Germany and Greece.[13]

Implementation by the member states

EU directives are addressed to the member states and are not, in principle, legally binding on individuals. The member states must transpose a directive into internal law. Directive 95/46/EC on the protection of personal data had to be transposed by the end of 1998, and all member states enacted their own data protection legislation.

Comparison with United States data protection law

As of 2003, the United States had no single data protection law comparable to the EU's Data Protection Directive.[14]

United States privacy legislation tends to be adopted on an ad hoc basis, with legislation arising when certain sectors and circumstances require it (e.g., the Video Privacy Protection Act of 1988, the Cable Television Consumer Protection and Competition Act of 1992,[15] the Fair Credit Reporting Act, and the Health Insurance Portability and Accountability Act of 1996 (HIPAA)). Therefore, while certain sectors may already satisfy parts of the EU Directive, most do not.[16] The United States prefers what it calls a 'sectoral' approach[17] to data protection legislation, which relies on a combination of legislation, regulation, and self-regulation, rather than on governmental regulation alone.[18][19] Former US President Bill Clinton and former Vice-President Al Gore explicitly recommended in their "Framework for Global Electronic Commerce" that the private sector should lead and that companies should implement self-regulation in reaction to issues brought on by Internet technology.[20]

The reasoning behind this approach has as much to do with American laissez-faire economics as with different social perspectives.[21] The First Amendment of the United States Constitution guarantees the right to free speech.[22] While free speech is an explicit right guaranteed by the United States Constitution, privacy is an implicit right guaranteed by the Constitution as interpreted by the United States Supreme Court,[23] although it is often an explicit right in many state constitutions.[24]

Europe's extensive privacy regulation is justified with reference to experiences under World War II-era fascist governments and post-war Communist regimes, where there was widespread unchecked use of personal information.[25][26][27] World War II and the post-war period was a time in Europe when disclosure of race or ethnicity led to secret denunciations and seizures that sent friends and neighbours to work camps and concentration camps.[6] In the age of computers, Europeans' wariness of secret government files has translated into a distrust of corporate databases, and governments in Europe took decided steps to protect personal information from abuses in the years following World War II.[28] Germany and France, in particular, set forth comprehensive data protection laws.[29]

Critics of Europe's data policies, however, have said that they have impeded Europe's ability to monetize the data of users on the internet and are the primary reason why there are no Big Tech companies in Europe, with most of them instead being based in the United States.[30] Furthermore, with Alibaba and Tencent joining the ranks of the world's 10 most valuable tech companies in recent years,[31] even China is moving ahead of Europe in the performance of its digital economy,[32] which was valued at $5.09 trillion (35.8 trillion yuan) in 2019.[33]

China and the US together accounted for 75% of all patents filed related to leading information technologies such as blockchain, 50% of global spending on the Internet of Things, more than 75% of the world market for cloud computing, and 90% of the market capitalization of the world's 70 largest digital platforms. The EU's share was only 4%.[32]

Meanwhile, Europe's preoccupation with the US is likely misplaced in the first place, as China and Russia are increasingly identified by European policymakers as "hybrid threat" aggressors, using a combination of propaganda on social media and hacking to intentionally undermine the functioning of European institutions.[34]

Replacement by the General Data Protection Regulation

On 25 January 2012, the European Commission (EC) announced that it would unify data protection law across the European Union via legislation called the "General Data Protection Regulation". The EC's objectives with this legislation included:[35]

  • the harmonisation of 27 national data protection regulations into one unified regulation;
  • the improvement of corporate data transfer rules outside the European Union; and
  • the improvement of user control over personal identifying data.

The original proposal also dictated that the legislation would in theory "apply for all non-EU companies without any establishment in the EU, provided that the processing of data is directed at EU residents," one of the biggest changes with the new legislation.[35] This change carried on through to the legislation's final approval on 14 April 2016, affecting entities around the world. "The Regulation applies to processing outside the EU that relates to the offering of goods or services to data subjects (individuals) in the EU or the monitoring of their behavior," according to W. Scott Blackmer of the InfoLawGroup, though he added "[i]t is questionable whether European supervisory authorities or consumers would actually try to sue US-based operators over violations of the Regulation."[2] Additional changes include stricter conditions for consent, broader definition of sensitive data, new provisions on protecting children's privacy, and the inclusion of "rights to be forgotten."[2]

The EC then set a compliance date of 25 May 2018, giving businesses around the world a chance to prepare for compliance, review data protection language in contracts, consider transition to international standards, update privacy policies, and review marketing plans.

References

  1. Kennedy, Wendy (2020). Data Privacy Law: A Practical Guide (Third ed.). G. E. Kennedy & L. S. P. Prabhu. p. 45. ISBN 978-0-9995127-4-6.
  2. Blackmer, W.S. (5 May 2016). "GDPR: Getting Ready for the New EU General Data Protection Regulation". Information Law Group. InfoLawGroup LLP. Archived from the original on 14 May 2018. Retrieved 22 June 2016.
  3. Pfleeger, Charles P.; Pfleeger, Shari Lawrence; Margulies, Jonathan (2015). Security in Computing (PDF). Pearson Education. ISBN 978-0-13-408504-3. Archived from the original (PDF) on 14 July 2015. Retrieved 19 December 2020. Few people recognize Willis [Ware]'s name today; more people are familiar with the European Union Data Protection Directive that is a direct descendant of the [1973 report] from his committee for the U.S. Department of Human Services. Willis would have wanted it that way: the emphasis on the ideas and not on his name.
  4. Ware, Willis H. (2008). RAND and the information evolution : a history in essays and vignettes (PDF). RAND Corporation. ISBN 978-0-8330-4513-3. Secretary of Health, Education, and Welfare Elliot Richardson had become concerned about the vast amount of personal data that the government held about its citizens. ... He impaneled the Secretary’s Advisory Committee on Automated Personal Data Systems to examine the issue and solicited the participation of Willis Ware (who had just completed his tenure with the DSB security activity) as an individual knowledgeable about system security... Ware became chair of the committee that he described to a colleague as “the most politically balanced group I’ve worked with. We had young v. mature people, ethnicities of all kinds, lawyers v. non-lawyers, experts v. lay persons, male v. female, politically active individuals v. politically passive ones.” [In] 1972, the committee report [Records, Computers, and the Rights of Citizens: Report, MIT Press, 1973] was delivered... [It] achieved several significant goals:
    • It conceived and defined the Code of Fair Information Practices, which has become the foundation for personal-information privacy law and privacy doctrine in the United States and worldwide (e.g., the European Union position).
    • The Code set the relationship—one might call it the rules of engagement—between (1) the organizations collecting personal information and the data systems that held it and (2) the individual citizen about whom the personal data had been assembled.
    • It provided the intellectual basis for the Privacy Act of 1974, which, in turn, set the framework for other law.; It created the Privacy Protection Study Commission (PPSC).
  5. Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, Organisation for Economic Co-operation and Development, last modified 5 January 1999.
  6. Shimanek, Anna E. (2001). "Do you Want Milk with those Cookies?: Complying with Safe Harbor Privacy Principles". Journal of Corporation Law. 26 (2): 455, 462–463.
  7. "Protection of personal data – European Commission". European Commission.
  8. "Judgement of the Court (Grand Chamber) – 6 October 2015". InfoCuria. 6 October 2015. Retrieved 22 June 2016.
  9. A divided Europe wants to protect its personal data wanted by the U.S., Rue 89, 4 March 2008
  10. "New EU-US PNR Agreement on the processing and transfer of Passenger Name Record (PNR) data". libertysecurity.org. Archived from the original on 12 January 2012.
  11. Brussels attacks new U.S. security demands, EUobserver. See also Statewatch newsletter February 2008
  12. http://www.statewatch.org/news/2008/mar/us-czech-mou-visas-etc.pdf
  13. Statewatch, March 2008
  14. See Julia M. Fromholz, The European Union Data Privacy Directive, 15 Berkeley Tech. L.J. 471, 472 (2000); Dean William Harvey & Amy White, The Impact of Computer Security Regulation on American Companies, 8 Tex. Wesleyan L. Rev. 505 (2002); Kamaal Zaidi, Harmonizing U.S.-EU Online Privacy Law: Toward a U.S. Comprehensive Regime For the Protection of Personal Data, 12 Mich.St. J. Int'l L. 169 (2003).
  15. Legislation, USA (1992). "CABLE TELEVISION CONSUMER PROTECTION AND COMPETITION ACT OF 1992" (PDF). Retrieved 18 March 2010.
  16. Fromholz, supra
  17. Lloyd, Ian J. (2011). Information technology law (6th ed.). Oxford [etc.]: Oxford University Press. p. 26. ISBN 978-0199588749.
  18. Clinton, William J.; Gore, Jr., Albert (1 July 1997). "A Framework for Global Electronic Commerce". technology.gov. Archived from the original on 21 December 2006. Retrieved 18 December 2006.
  19. Schriver, Robert R. (20 February 2018). "You Cheated, You Lied: The Safe Harbor Agreement and its Enforcement by the Federal Trade Commission". Fordham Law Review. 70 (6).
  20. Clinton & Gore, supra
  21. Fatema, K. (2016). "A Semi-Automated Methodology for Extracting Access Control Rules from the European Data Protection Directive". 25 (32).
  22. United States Const. amend. I.
  23. See, for example, Roe v. Wade, 410 US 113 (1973)
  24. See, for example, Article 1 of the California Constitution: "All people are by nature free and independent and have inalienable rights. Among these are … privacy."
  25. Ryan Moshell, ...And Then There was one: The Outlook for a Self-Regulatory United States Amidst a Global Trend Toward Comprehensive Data Protection, 37 Tex. Tech. L. Rev. 357, 358
  26. "The History Place – World War II in Europe Timeline: November 9/10 1938 – Kristallnacht, the Night of Broken Glass". historyplace.com.
  27. Kotzker, Jason A. "The Great Cookie Caper: Internet Privacy and Target Marketing at Home and Abroad Notes & Comments 15". St. Thomas Law Review. St. Thomas Law Review 2002–2003. 15: 727.
  28. Marsha Cope Huie, Stephen F. Laribee & Stephen D. Hogan, The Right to Privacy and Personal Data: The EU Prods the U.S. and Controversy Continues, 9 Tulsa J. Comp. & Int'l L. 391, 441 (2002)
  29. Id. at footnote 4.
  30. "Fuzzy Anonymity Rules Could Stymie EU's Big Data Sharing Ideas". CPO Magazine. 1 May 2020.
  31. "Beijing's battle to control its homegrown tech giants". TODAYonline.
  32. https://unctad.org/en/PublicationsLibrary/der2019_overview_en.pdf
  33. "Value-added of China's digital economy totals 5 trillion USD in 2019: white paper - Xinhua | English.news.cn". www.xinhuanet.com. Retrieved 23 October 2020.
  34. "EU vows tougher response on hybrid threats". POLITICO. 24 July 2020.
  35. "New draft European data protection regime". m law group. 2 February 2012. Retrieved 22 June 2016.