Informational Self Determination in Cloud Computing. Data Transmission and Privacy with Subcontractors


Master's Thesis, 2015

106 pages, grade: 95.5


Excerpt


Table of Contents

Abstract

Procedure of the Paper

Abbreviations

Introduction

First Chapter
1 Foundation for Privacy Protection in Cloud Computing
1.1 Business Opportunity Cloud Computing
1.2 Private Persons' Paradoxical Behaviour
1.3 Legal Framework in the EU
1.4 Legal Framework in Germany

Second Chapter
2 Application Area
2.1 Personal Scope
2.1.1 Anonymous Data
2.1.2 Pseudonymised Data
2.1.3 Encrypted Data in Cloud Computing
2.1.4 Interim Conclusion to the Personal Scope
2.2 Geographically Applicable Law
2.2.1 ECJ Decisions on the Geographical Scope
2.2.2 Interim Conclusion to the Geographical Scope

Third Chapter
3 Data Processing Legitimacy
3.1 Consents in Cloud Computing
3.2 Lawful Personal Data Processing based on Contracts
3.2.1 Distinction between Contract Data Processing and Functional Transmission...
3.2.2 Privileged Contract Data Processing
3.3 Data Processing within the EU including Subcontractors
3.4 Practical Assessment of the Legitimacy Criteria in Cloud Computing
3.5 Interim Conclusion to Chapter 3

Fourth Chapter
4 Data Transmission Outside the EU
4.1 Legal Foundation
4.2 International Agreements
4.3 Data transfer to Countries Outside the EU with an Unsatisfactory Data Protection Level
4.3.1 Standard Contractual Clauses
4.3.2 Binding Corporate Rules
4.4 Interim Conclusion to Chapter 4

Fifth Chapter
5 Personal Data Transmission to Unsecure Non-EU Countries Including Subcontractors
5.1 Non-EU Country Cloud Provider and Subcontractor
5.2 EU Cloud Providers and Non-EU Subcontractors
5.3 Non-EU Cloud Providers, EU Subcontractors
5.4 Interim Conclusion to Chapter 5

Sixth Chapter
6 Informational Self-Determination Potential for Improvements in the Cloud Computing Chain
6.1 Technical and Organisational Potential for Improvements
6.2 Technical Potential for Improvements to Support the Law
6.2.1 Consent
6.2.2 Principle of Transparency
6.2.3 Principle of Purpose Limitation
6.2.4 Principle of Necessity
6.3 Economic and Political Potential for Improvements
6.4 Potential for Improvements through Self-Security

Seventh Chapter

7 Informational Self-determination recognition in the new regulation
7.1 Application Area
7.1.1 General Provisions
7.1.2 Personal and material scope
7.1.3 Territorial Scope
7.2 Legitimation Scope of Personal Data Protection
7.2.1 Consent
7.2.2 Contract Data Processing
7.3 Cloud User and Cloud Provider Roles and Obligation in Subcontractor Chains
7.4 Data Transmission into Non-EU countries with Subcontractors
7.5 Interim Conclusion to Chapters 6 and 7

Conclusion

Bibliography

Abstract

The paper analyses the constraints of the current European directive on data protection regarding the free and active exercise of the right to informational self-determination in cloud computing with subcontractor chains.

The analysis focuses in particular on the personal and geographical scope of the protection of personal data and on the legitimation of data processing under the aspect of data transmission into secure and unsecure third countries with subcontractor involvement. It will be critically analysed whether the options under which it is possible to process personal data deliver sufficient privacy protection in cloud computing. Furthermore, the paper examines the effectiveness and the consequences of possible legitimations of processing personal data in cloud computing, as well as the legitimation options for including subcontractors in complex cloud computing landscapes in secure and unsecure third countries. The position of the data subject and the cloud user, and their chances to exercise their right of informational self-determination in distributed cloud computing landscapes, will be looked at critically.

Based on the multiple challenges that personal data faces in complex cloud computing landscapes, various improvement potentials addressed to different actors emphasise the necessity to reduce the risk to the data subject's informational self-determination in cloud computing.

Finally, the recent general data protection regulation published by the Council on 11th June 2015 will be cross-checked against the identified gaps of the currently existing data protection directive, with an emphasis on the requirements to achieve informational self-determination.

Procedure of the Paper

In the introduction, the paper starts with a description of the challenges that privacy on the Internet faces and emphasises the economic power of data. Herein, an overview is given of the major technical developments regarding the collection of citizens' private data.

The first chapter describes the economic importance of cloud computing, where mostly citizens' private data is stored and then processed. Thereby, the paper explains how cloud computing represents an economic advantage and how data is distributed. Subsequently, citizens' trustful behaviour on the Internet is considered. Following that, the chapter moves to an examination of how European data protection law has developed.

Afterwards, the basis and development of the German data protection law is contemplated as an example of a European Union member state.

In the second chapter, the current data protection directive EC/95/46 is taken as the basis to test its application against cloud computing with subcontractors. The focus then shifts towards the personal and geographical scope, to emphasise personal data and the geographical dimension in cloud computing. The difficulty of effectively applying anonymisation and pseudonymisation techniques in cloud computing is highlighted.

The third chapter scrutinizes the legitimation criteria of the directive, in particular consent and contract data processing, vis-à-vis the practical effectiveness of cloud computing. The German federal data protection law is used as an example of a European member state when it comes to the transposition of the relevant provisions. In chapter three, the first case of four different cloud scenario landscapes is discussed, where the legitimation of data transfers with subcontractors is conceived. With reference to Germany as an example of a member state, the data subject and the cloud user are assumed to be located in Germany as a starting point. Chapter 3 discusses scenario A, where the cloud provider and the subcontractor are located in the EU.

[Figure not included in this excerpt]

The fourth chapter examines the transfer of personal data outside the EU, based on the EU directive EC/95/46 and the German federal data protection law. Initially, options are considered for awarding third countries an EU-comparable security status.

Furthermore, different methods are presented for cloud computing scenarios with subcontractors to secure a lawful data transmission to unsecure third countries.

The fifth chapter shows different cloud computing landscapes, explaining the requirements of data protection when it comes to lawfully processing personal data in unsecure third countries with subcontractors.

[Figure not included in this excerpt]

The scenarios look at different constellations, where the cloud provider and the subcontractor are located either within the EU or in unsecure third countries.

The sixth chapter proposes technical, economic, political, and educational measures to support the legal effectiveness of data protection. These measures are desirable to be considered and implemented if there is an interest in protecting an informationally self-determined society.

The seventh chapter analyses the latest officially published and amended version of the data protection regulation, issued by the Council on 11th June 2015. The Council version is reflected against the protection of personal data in cloud computing with subcontractor chains. It looks in particular at the application area, the legitimation of data processing, and the data transmission into third countries including subcontractor chains. In the seventh chapter, the criticisms and proposals stipulated in chapter 6 are linked to the examined Council provisions, and an attempt is made to evaluate the provisions' effectiveness regarding the protection of informational self-determination in cloud computing with subcontractor chains.

Finally, the conclusion summarizes the difficulties and challenges of informational self-determination in cloud computing with subcontractor chains under the current directive EC/95/46 and confronts them with the provisions of the Council version approved on 11th June 2015.

Due to the complexity of the topic, the presented paper could not cover special data protection regulations for sensitive data, special data protection regulations for children, detailed aspects for corporate groups and employees, or new provisions like the right to be forgotten.

Abbreviations

[Figure not included in this excerpt]

Introduction

The Internet has become a "must use" medium. The daily usage of the Internet in private and in business life has become unavoidable. With the launch of FACEBOOK in 2004, it was not foreseeable that in March 2015 Facebook could count 1.42 billion active users per month.1 Around 500 million messages are sent every day on Twitter.2 In 2010, Eric Schmidt, former Google Chief Executive Officer (CEO),3 said, "We know where you are. We know where you've been. We can more or less know what you're thinking about."4 The frequent usage of Internet technology has considerably increased, and a high volume of data is produced daily and stored in "clouds" around the globe. The data of each smartphone user is placed in the "clouds". The intensive development towards machine-to-machine connection (M2M), which is hidden behind the keyword "Internet of Things", leads to a tremendous creation of data. Companies' effectiveness and cost deliberations have led to increased data storage in the "clouds". The distribution of private data to different locations raises concerns regarding the failure of constitutional law to protect the fundamental right to privacy. Depending on the background of the Internet users, the awareness of the hidden dangers of the unsecured usage of private data has raised concerns against cloud computing and is hindering the digitalization process in Europe. Economists fear an enormous loss if companies, industries, and governments do not trust that their private data is secured in the "clouds". US companies in particular, which are still the leading cloud providers, fear the loss of market share due to the delayed cloud adoption. Inflamed by the NSA scandals that were disclosed by Edward Snowden in 2013, and driven by the public awareness regarding the loss of privacy and the loss of trust in the protection of privacy, companies and private persons stepped back from cloud introduction and rethought an extensive Internet usage. Additionally, the Google CEO Eric Schmidt fears an economic and political separation of countries and companies from the global data transfer caused by the NSA scandals. The loss of trust in cloud security will not only lead to economic losses for US companies, but can also delay the development of European digitalization. Hence, Google now works hard to fight the headwind from Europe.5

First Chapter

1 Foundation for Privacy Protection in Cloud Computing

1.1 Business Opportunity Cloud Computing

Cloud computing is a network-connected server landscape where it is possible to store private data.6 The external data centres execute programmes and process data on behalf of the owners in external servers.7 The concept of processing private data in external servers is not new, and it is similar to the former concept of classical "outsourcing".8 In the business concept of cloud computing, the user rents the processor performance based on his needs. The main advantage of cloud computing is seen in the resource pooling of IT infrastructure in the size as needed. Cloud computing offers IT flexibility regarding infrastructure, maintenance, availability, failure tolerance, and IT management, and promises efficiency and quality assurance. Cloud computing enables the users to avoid processor usage variation and guarantees cost advantages by paying only for the server resources that are used.9

Different cloud scenarios are possible, and they depend on the underlying business cases and the chosen security level. Data can be stored in single or in multiple clouds, whereby the operation can take place centralized or decentralized, within or outside trusted boundaries. Single clouds are mostly private clouds where security standards like Service Level Agreements (SLA) and certificates are applied. The deployment case of multiple, distributed clouds can be hybrid clouds or private-public clouds, where applications cross a private-public trust boundary or even cross multiple public clouds.10

The possibility to protect privacy in the cloud is considered limited and only achievable with high technical effort. Private persons are recommended to protect themselves on the Internet and in social media by avoiding data traces and by using encryption techniques. Companies focus on the usage of their own clouds in connection with data encryption.11


Cloud computing exists in three12 different service models.13 This enables the cloud provider to compose the cloud services based on the clients' needs. The cloud services can be provided in components and arranged, managed, and coordinated in different activities.14

Cloud computing has an enormous economic relevance and is very important for the digitalization progress; however, it involves multiple legal challenges.15

Nowadays, a lot of important company information is stored only in digital form. The loss of stored data could have fatal consequences, which is why companies invest a lot of energy to safely secure and store data. Almost all companies keep data in second backups, where the backup maintenance requires physical and geographical separation from the original.16

The possibilities for big companies to secure data privacy are, e.g., to use private clouds, where the company, as cloud user, would be responsible for providing the legally required data privacy. In private clouds, the cloud user would remain responsible for acquiring and maintaining the cloud, which requires different skill sets such as IT knowledge and cooling technology. The company could thus lose the promoted cost advantages.17 Especially small and midsized companies (SMEs) would be interested in using clouds to achieve cost advantages by avoiding big IT departments.18

1.2 Private Persons' Paradoxical Behaviour

In 2014, the protection of privacy on the Internet evolved into a political topic. Generally, the media and the majority of the public believe that the human being is transparent on the Internet and that it is impossible to avoid this, which announces the end of privacy.19 Consequently, the fulfilment of data protection in the digital world can be questioned. This underlines even more the need for technical, legal, and social consideration of data protection from the very beginning.20 The intensity of the desire for privacy, however, is related to cultural considerations and varies from country to country. For example, in the UK it is common practice to surveil public places. In Sweden, every citizen's income is public.21 In contrast, in Germany the desire for privacy protection is high. In a poll of the University of Hohenheim, 95% of respondents in every age range answered that their desire for privacy was important or very important. Privacy has been considered very important and worthy of being protected. The concern of losing privacy on the Internet rose in recent years, albeit the publication of private data also rose. Nevertheless, the German public is concerned about losing privacy on the Internet.22 "Privacy" based on the concept of the "right to be left alone" should be distinguished from the privacy desire under the aspect of informational self-determination as the basis of a democratic and self-controlled communication structure. Informational self-determination follows the perception to secure, to enable, and to establish self-determined communication. The aim is not to hamper business opportunities, nor to avoid or reduce communication, but to support business development that respects data privacy and secures the data subject's right of informational self-determination.23

1.3 Legal Framework in the EU

The European Convention on Human Rights defines the legal foundation for privacy protection in the EU, enshrined in Art. 8 ECHR on the respect for private life. Art. 8 CFR grants the protection of personal data and allows data processing only if it is related to a specific purpose. Art. 6 (1) of the Treaty on European Union (TEU) puts the Charter of Fundamental Rights on an equal rank with the Treaty on European Union (TEU) and the Treaty on the Functioning of the European Union (TFEU). In Art. 6 (2) TEU, the EU expresses the intention to join the ECHR and commits itself to respecting human rights.24 In line with the ECHR, the European Commission proposed the data protection directive (DP-D) to protect private data. In 1995, the European Council and the European Parliament adopted the data protection directive EC/95/46 (DP-D) as the central data protection element in the EU.25 The DP-D pursues the aim to protect the privacy rights of natural persons (Recital 2 DP-D) as well as other fundamental rights and freedoms related to automated and non-automated data processing (Art. 3 (1) DP-D).26 In addition to that, the DP-D should secure a common minimum data protection level within the EU. The European-wide minimum level of data protection should support the creation and functioning of the internal market for data transfers by securing privacy in economic, scientific, social, or technical cooperation (Recital 3, 7, 8 DP-D).27

In January 2012, the European Commission issued a proposal for a general data protection regulation (DP-R)28 to ensure a European-wide, fully harmonized data protection. The directive EC/95/46 has been implemented by the member states with differing strictness. The DP-R aims to acknowledge the technological and media change, and to foster the digital internal market and the free movement of data.29 After an interim period of two years following its ratification, the DP-R will become the final regulatory act on personal data protection in the EU. National regulations, which currently differ between the member states, will be obsolete.30 The DP-R will not affect national provisions referring to other directives. In the case of Germany, e.g., the telecommunication provisions based on the E-Privacy directive 2002/58/EC for electronic communication remain untouched, despite the fact that the committee on civil liberties, justice and home affairs (LIBE) requested the European Commission to present an amended version in the near term.31

In January 2013, the committee on civil liberties, justice and home affairs (LIBE) presented, with reference to the European Commission's data protection proposal (DP-R), an unofficial amended version,32 to which the European Parliament (EP) agreed in October 2013.33 The unofficial LIBE version was issued as a compromise under the pressure of the US government and of lobbyists of market-leading online companies.34 Nonetheless, the final version of the DP-R has not yet been approved. Countries like Germany block the negotiations, because state institutions should keep the right to collect citizens' data.35 More than 3,300 change requests hampering the final agreement come from industry, US companies, and other lobbyists interested in skipping the current DP-R proposal.36

The DP-R would lead to several innovations relevant for cloud computing and affecting the data transfer into third countries, which might threaten informational self-determination.

The quasi-muddled situation almost came to a standstill until the Italian EU Council Presidency advanced the agreement process and issued, in January 2015, a confidential, albeit published, version37 of a new and progressed stage of the Council position38 regarding the DP-R. It still contains more than 490 open topics, but shows impressive progress in comparison to the DP-R and the LIBE version. In March 2015, an unofficial European Council comparison39 containing a revision of the three regulations was published; it was a consolidated version of the EU data protection regulation and included comments and compromise proposals.40 Under the Latvian EU Council Presidency, the agreement process continued and was followed by the Luxembourg EU Council Presidency.41 On June 11th, 2015, the Council published the general approach for the general data protection regulation,42 which was prepared to be discussed in the first trilogue on June 24th of the same year. In chapter 7, the latest officially published Council43 version of 11th June 2015 will be used to analyse the level of personal data protection and of informational self-determination that is achieved with regard to the critical aspects currently existing in the data protection directive.

1.4 Legal Framework in Germany

In Germany,44 the protection of privacy is based on Art. 2 (1) GG and Art. 1 (1) GG. The foundation worthy of protection is the free development of the personality in connection with human dignity.45

In the census of population judgement (BVerfGE 65, 1) regarding automated data processing, the German Federal Constitutional Court (BVerfG) decided that the distinction between different privacy levels is not the crucial question. The opportunity to reference private data during data collection is judged under the aspect of how and for what kind of purpose the data is collected. Even more, the data linking possibilities as well as the data usage should be considered to a greater degree.46

The reason for this decision is seen in the multiple sources from which non-intimate data can be gathered, which results in the conclusion that insignificant data does not exist.47 Referring to Art. 2 (1) GG, the BVerfG decided in the census of population judgement to manifest the right to privacy in the right of informational self-determination, focusing on the principle of purpose limitation.48 Automated data processing threatens privacy and freedom, because the possibility to store an unlimited amount of data allows data combination and the building of personal profiles. The missing control in automated data processing is seen as a danger for citizens. It could lead to the fear that the state could draw sanctions out of it; therefore, citizens' political activity might be reduced. Exercising informational self-determination means that the individual person keeps the freedom of decision and can freely decide to act on it. It can be considered a threat to the individual's freedom not to have an overview of and certainty about one's own information existing in his social environment. The individual might be inhibited to act and decide freely if there is uncertainty about the information existing on the communication partner's side.49 The uncertainty about which information exists where, and might be used to someone's disadvantage, is a challenge to personal freedom, with potential chilling effects on the free expression of opinion.

The fundamental right to privacy protection could lead to legally relevant challenges when it conflicts with other fundamental rights, like, for example, the rules of professional conduct.50 In 2001, Germany implemented the transposition of the DP-D in the amended version of the Federal Data Protection Law (BDSG). The census of population decision of 1983 was adopted with regard to the perspective of the right of informational self-determination.51

The planned introduction of the DP-R would replace the direct effect of the current BDSG. With the DP-R taking effect, the privacy rights of the German fundamental law would no longer be measured against the rights defined by the BVerfG, but against EU fundamental law, in particular Art. 8 ECHR.52

Based on Art. 12 (b) TEU and Protocol No. 2, the German Federal Council brought up a subsidiarity complaint regarding the disproportionality in relation to the subsidiarity and proportionality principles. The criticism is related to the undifferentiated regulation of the public and non-public sector, whereas the DP-R would affect national public law.53

The strict data protection law in Germany is considered in two ways. On the one side, it is considered an advantage, since it creates customer confidence and is used as a positive differentiation criterion in international competition, offering data security and privacy. On the other side, it increases costs for controls and might hamper innovations in the international competition to gain additional market shares.54 German SMEs and start-up companies request that the regulations should not be restricted only to European companies, but that data security and control measures should be applied to "companies operating in the EU region", which gives European start-ups and SMEs a chance to compete with big American social media companies.55 The ECJ Google judgement C-131/12 of 13th May 2014 already recognizes this request, albeit under a different leading question.56 In the course of the paper, especially in chapter 7, the relevant provisions in the current data protection regulation proposal will be discussed.

The modernization of the data protection law should be built on a legal framework protecting the data subjects' right of informational self-determination against breaches by private players. The BVerfG refers in the census of population judgement to direct state intervention into the privacy sphere of citizens by exorbitant data collection. Nowadays, the census of population judgement receives criticism directed against an outdated view, which considers citizens with "face and name" and evaluates the judgement as "state organized communication control".57 The arguments are emphasized with the reasoning that the BVerfG regarded citizens' privacy under the aspect of anonymity, reclusiveness, and single-sided relationships, which would not fit today's communication tools and habits.58

However, the challenges of rapid technological development, in connection with data globalization and the important status of data as a high-value economic good, require a conceptual further development of the right of informational self-determination. The judgement's essential thought will still be applicable.59

The relevant application area to secure informational self-determination is no longer purely directed vertically against the state to secure unprejudiced political activity. The protection of informational self-determination should be extended to the horizontal perspective between private persons and private companies. Informational imbalances between private and business actors might lead to economic exploitation. The data subject's concern regarding unlimited data gathering and collection is fed by uncertainty about his own data, caused by cluelessness about who might know something about him, where else and for what kind of purposes his data is processed.

The modernization of the data protection law should not lead to communication avoidance but to a self-determined communication community. In the changed environment, privacy aspects should be regarded as part of a multi-relational concept of real life.60

Second Chapter

2 Application Area

Art. 3 and 4 DP-D define the scope of data protection. Data processing is defined by Art. 2 (b) DP-D as "any set of operations performed upon personal data", like data collection, usage, processing, storage, deletion, or transmission of data.

2.1 Personal Scope

The personal scope of the DP-D is opened if Art. 3 in connection with Art. 4 DP-D finds application regarding automated or non-automated personal data processing, stored data, or data stored in a file. The DP-D needs to be applied by the member states to grant personal data protection, based on the fundamental freedoms and the constitutional rights. Art. 1 (1) and Art. 2 (a, 2nd HS.) DP-D specify personal data as information related to an identifiable natural person. The person might be identifiable directly or indirectly by a reference number, whereas one or more factors may specify the person's identity in a physical, physiological, mental, economic, cultural, or social way. Legally, only information for which a personal reference exists, or for which the personal reference can be constructed, is relevant.61 In relation to these articles, recital 26 DP-D specifies that the protection principles apply to all information that controllers or any other persons may use for the identification of persons. Recital 26 DP-D excludes sufficiently anonymized personal data from data protection.

The Article 29 Data Protection Working Party (DP-WG-29)62 interprets the personal scope widely and structures it along the elements "any information", "relating to", "an identified or identifiable", and "natural person". The DP-WG-29 also includes in the personal scope information on family and private life, in addition to working relations, economic and social behaviour, and information like biometric data.63

2.1.1 Anonymous Data

Art. 1 (1) and Art. 2 (a) DP-D in connection with recital 26 legitimize the processing of anonymous data. Anonymous data relates, with at least one criterion, to a person, but the knowledge needed to assign the reference to personal data has been removed. Anonymous data contains information related to persons, but the direct personal assignment is removed.

However, there are possibilities to re-individualise anonymous data. Technical functions can increase the risk by offering possibilities to remove the anonymisation with existing or acquirable additional knowledge. Moreover, actual or potential future technical data processing possibilities, in combination with the invested effort and time, increase the likelihood of decrypting anonymous data. All of this means that a person's chance of being identified depends on the data processor's capability. Considering this, the personal reference of data can be classified as relative.64 It is a question of probability whether personal data can be de-anonymised to receive data with a personal reference. Anonymity only exists if, under consideration of the state of technical science, it is unlikely that a personal reference can be linked to the existing data criteria.65

Due to technologies like data warehouses, data mining, and big data, the probability of de-anonymisation increases.66 The increasing variety of data gathered from different sources increases the probability that, albeit personal data might be anonymised consistently, it will be possible to de-anonymise it with techniques like big data analysis in clouds.67 When data is processed in clouds, anonymous data is very likely to be identified based on additional information about the person that exists in the clouds.68
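
To make the linkage risk described above concrete, the following minimal Python sketch (not part of the original thesis; all names and records are invented for illustration) shows how a supposedly anonymous record can be re-identified by joining it with auxiliary data over shared quasi-identifiers such as postcode, birth year, and gender, i.e. the kind of additional knowledge that may already exist in a cloud.

```python
# Minimal linkage sketch: re-identifying "anonymous" records via quasi-identifiers.
# All data below is invented for illustration.

anonymous_health_records = [
    {"postcode": "48149", "birth_year": 1975, "gender": "f", "diagnosis": "asthma"},
    {"postcode": "80331", "birth_year": 1982, "gender": "m", "diagnosis": "diabetes"},
]

# Auxiliary data that a cloud provider or third party might already hold.
public_profiles = [
    {"name": "Maria Muster", "postcode": "48149", "birth_year": 1975, "gender": "f"},
    {"name": "Max Beispiel", "postcode": "80331", "birth_year": 1982, "gender": "m"},
]

def link(records, profiles):
    """Join both datasets on the shared quasi-identifiers."""
    keys = ("postcode", "birth_year", "gender")
    for record in records:
        for profile in profiles:
            if all(record[k] == profile[k] for k in keys):
                yield profile["name"], record["diagnosis"]

for name, diagnosis in link(anonymous_health_records, public_profiles):
    print(f"{name} -> {diagnosis}")  # the "anonymous" record is re-identified
```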

2.1.2 Pseudonymised Data

The Federal Data Protection Act (BDSG) transformed Art. 1 and 2 in connection with recital 26 DP-D into §§ 3 (6, 6a) and 3a BDSG, and differentiates in detail between the two processes of pseudonymisation and anonymisation, in order to follow the principle of data economy and avoidance.

§ 3 (6a) BDSG defines pseudonyms as an alias for a name to eliminate the identification of the data subject, or at least to increase the complexity of identification. When using pseudonyms, the name and other identifying data are replaced by alias keys. Data processing is legitimated with sufficient and adequate pseudonyms guaranteeing privacy and the protection of personal data.69

Business-wise, it could be necessary to keep the possibility of identification, which would speak against the requirement to achieve anonymity. The reconstruction of personal data allows assigning responsibility for specific activities.70

Pseudonyms provide a form of anonymisation. Pseudonyms can be given by the data subject, where only the data subject can recreate the personal reference of the data, or by a third person, e.g. the cloud provider. The pseudonymisation by a third party can be done with or without the knowledge of the data subject.71 If the data processor creates pseudonyms for personal data, the data protection requirements still need to be regarded. As the data processor owns the pseudonymisation logic, he could still process the personal data without pseudonyms.72 The usage of pseudonyms can impede the identification of data subjects to a degree where the protection level could enable the personal data processing.73
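
The role of the assignment logic can be illustrated with a short Python sketch (not taken from the thesis; a keyed hash is only one possible pseudonymisation technique, and the key and names are invented): whoever holds the secret key can recompute the alias and re-link it to the person, while the pseudonym alone reveals nothing.

```python
# Minimal pseudonymisation sketch using a keyed hash (HMAC-SHA256).
# Whoever holds `secret_key` owns the assignment logic and can re-link
# pseudonyms to persons; without the key the pseudonym looks like random data.
import hashlib
import hmac

secret_key = b"keep-this-key-outside-the-cloud"  # illustrative key only

def pseudonymise(name: str) -> str:
    """Replace an identifying name by a deterministic alias key."""
    return hmac.new(secret_key, name.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

record = {"name": "Maria Muster", "order": "cloud storage, 50 GB"}
stored_in_cloud = {"pseudonym": pseudonymise(record["name"]), "order": record["order"]}
print(stored_in_cloud)

# Re-identification is only possible for the key holder, e.g. by recomputing
# the pseudonym for a known name and comparing it:
print(pseudonymise("Maria Muster") == stored_in_cloud["pseudonym"])  # True
```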

The data subject has the possibility to use a pseudonym to avoid leaving data traces, which could be transferred against his free will. Since a logical link exists between the pseudonym and the person, it will be possible to re-identify the person by using the identification logic. A distinction can be made between pseudonyms that are factually impossible to identify and pseudonyms that allow re-identification with more or less effort.74

The Federal Supreme Court (BGH) referred a question for a preliminary ruling to the ECJ based on Art. 267 TFEU, asking whether IP addresses75 are considered personal data and are allowed to be saved beyond the time of their original purpose. The decision remains to be seen; if an IP address has no personal data relation, the question is whether it may remain on the server. Only with additional knowledge of connected servers and of the data exchange between the servers could the personal reference be identified.76 The DP-WG-29 discusses whether static as well as dynamic IP addresses are to be considered personal data.77 The personal reference of a data subject's IP address is limited to the generating servers, hence to the knowledge of the system provider. In further processing, the data subject's IP address is a pseudonym. The information about the assignment logic between the personal data and the IP address is available to third parties only if cooperation between system providers exists.78 The distinction between technical and non-technical aspects to identify personal data in the BDSG is debatable. However, the DP-D includes the technical aspects of personal data and considers identification numbers in Art. 2 (a) as suitable to identify personal data. Static IP addresses in connection with cookies could be used for personal data identification.79

2.1.3 Encrypted Data in Cloud Computing

In cloud computing, data encryption is regarded as a possibility to secure data while personal data is transferred between clouds. However, cloud providers often do not provide client-side data encryption but instead use the company's own keys. Client-side data encryption protects cloud user data against the possibility of the cloud provider reading personal data. Handling client-side data encryption would require the cloud user to manage keys for the different access accounts. To achieve the advantage of accessing personal data from different devices at different locations, some disadvantages might be acceptable. Since the confidentiality of business data is very high, business users in companies use private keys to encrypt data before the transmission into the cloud takes place. Private persons could add additional tools to encrypt data before using public clouds. The usage of tools for encrypting data requires the necessary knowledge about compatibility with the cloud technology and standards, since not all tools are suitable for cloud usage. Encryption tools might interfere with the cloud provider's software and might require extra installation and key handling. Additional verification with the cloud provider software is required to ensure compatibility. Client-side data encryption is costly for the cloud provider, because the encryption hampers deduplication,80 so that the needed storage space increases.81 However, data encryption is not the ultimate solution for secured data. There are several possibilities to bypass the encryption, e.g. if the client software skips the encryption tasks and sends clear data. Additionally, during the decryption process, after receiving data from the cloud provider, the client software could send the decrypted key back to the cloud provider or to an unauthorised person.82
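
A minimal sketch of client-side encryption, assuming the third-party Python package `cryptography` and invented example data (the thesis does not prescribe any particular tool), illustrates the point that the key remains with the cloud user and the provider only ever sees ciphertext:

```python
# Minimal client-side encryption sketch: data is encrypted before it leaves
# the cloud user's machine, so the cloud provider only stores ciphertext.
# Requires the third-party `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()            # remains with the cloud user, is never uploaded
cipher = Fernet(key)

plaintext = b"customer list: Maria Muster, Max Beispiel"
ciphertext = cipher.encrypt(plaintext)  # this is what would be sent to the cloud

# The provider (or any third party) cannot read the content without the key.
print(ciphertext[:20], b"...")

# Only the key holder can restore the original data after download.
assert cipher.decrypt(ciphertext) == plaintext
```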

2.1.4 Interim Conclusion to the Personal Scope

The BDSG and the DP-D are less precise regarding the security evaluation of pseudonymised or anonymised data for the processing of personal data. A legal differentiation between absolutely anonymous and factually anonymous data does not exist. The law does not require data processing related to a specific purpose and subject for anonymous and pseudonymous data. The missing security of anonymised and pseudonymised data against unlawful access and decryption can interfere with the data subject's informational self-determination. At the time of data collection, the data subject does not know the possible usages of his data. The data subject should be aware of the risk of losing privacy and control over his data, which threatens his informational self-determination. Internet users should be informed about measures securing the anonymisation and measures preventing its removal against their will. Pseudonymised data can be decrypted by concatenation with other data, whereby the data is enriched with additional information. Data accumulations in big clouds will increase the likelihood of decrypting pseudonyms with additional knowledge.83

The current legislation is insufficient for cloud computing, where massive data volumes are distributed to worldwide server systems and analysed with different algorithms like data warehouses and data mining.84 The expectation towards a new data protection law is that it neither reduces nor unnecessarily increases the personal data protection level, in order to enable digital technology and to protect privacy. It needs to be framed in an intelligent way so that it is possible to make the best use of the technology, and not at the expense of privacy rights.85 The legislation defines requirements to protect privacy in multiple provisions. Art. 6 (c, e) DP-D enshrines the principle of purpose limitation, determining that data is collected for clearly defined purposes. Further processing should only take place considering the original purpose and should not exceed the timeframe until the purpose is fulfilled. Changes to the purpose during data processing are only allowed with agreement. The principle of purpose limitation can support the right of informational self-determination, because the initially intended data usage needs to be respected.86 § 3 (1) BDSG points in particular to the principle of data economy and avoidance. Thus, the principle of proportionality is respected, in addition to minimizing the limitations of the principle of informational self-determination.87 The principle of data economy and avoidance could be achieved with anonymisation and pseudonymisation, since less data is stored. The correct application of the existing legislation lacks visibility in cloud environments, because technical and organisational measures are missing to support law enforcement and control. Currently, a discrepancy exists between the possibilities to achieve personal data protection in clouds and the offered technical and organisational measures to secure personal data. Securing personal data only by law in cloud environments will be insufficient, as long as technical and organisational support and control are missing to secure the necessary application.88

Cryptographic encryption techniques could secure personal data, where a mathematical code replaces parts of the text. For the cloud provider, the cryptographic key turns the personal data into pseudonymised data; to third parties not holding the cryptographic key, the data appears anonymised. However, depending on the invested time and effort, sophisticated "hacking" methods might be able to break the keys. Backups are a weak point for cryptographically changed personal data. Cryptographic keys need to be changed and updated after a while. Due to that, it is either necessary to delete backups with old keys or to update the cryptographic keys in the backups as well.89 Albeit there is the risk that over the course of time secure cryptographic keys can be analysed and hacked, cryptographic keys could deliver a decent level of personal data protection in the clouds.90 In the research project "Sealed Cloud" of the federal ministry of economy and energy, an encryption technique has been developed which ensures that the data in the cloud is encrypted from the sender (cloud user) to the receiver (cloud provider). Therefore, there will be no possibility of physical access to the data between the transmission points in the clouds of the sender and the receiver. This would protect encrypted and secured metadata against external access.91
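
The backup problem described above can be illustrated with a short, hypothetical Python sketch (again using the third-party `cryptography` package; it is not taken from the thesis or the Sealed Cloud project): when a key is rotated, an old backup must either be deleted or re-encrypted under the new key, otherwise the retired key continues to grant access to the copy.

```python
# Minimal key-rotation sketch: when a cryptographic key is replaced, existing
# backups must either be deleted or re-encrypted under the new key, otherwise
# the retired key keeps protecting (and exposing) old copies.
# Requires the third-party `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

old_key = Fernet.generate_key()
backup = Fernet(old_key).encrypt(b"backup of personal data")   # old backup copy

def rotate_backup(encrypted_backup: bytes, retired_key: bytes, new_key: bytes) -> bytes:
    """Decrypt with the retired key and immediately re-encrypt with the new one."""
    plaintext = Fernet(retired_key).decrypt(encrypted_backup)
    return Fernet(new_key).encrypt(plaintext)

new_key = Fernet.generate_key()
backup = rotate_backup(backup, old_key, new_key)
del old_key                              # the retired key can now be discarded

assert Fernet(new_key).decrypt(backup) == b"backup of personal data"
```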

2.2 Geographically Applicable Law

Clouds are technically not restricted to borders and exist mostly in cross-border infrastructures. From a technical perspective, different cloud providers and subcontractor clouds can process the users' data in distributed locations. From the technical and business perspective, the geographical question is not relevant.92 Clouds can be organised in different constellations. Companies can process personal data in their own clouds. In this case, the data controller and the cloud provider become the same legal entity. In the majority of the promoted cloud business cases, clouds are offered to small and medium sized companies (SMEs), where the cloud service user and the cloud provider are different entities.93

The jurisdictional clarification is very relevant. Directives do not have direct effect on the member states and need to be transposed into national law within the defined timeframe.94 Thus, directives leave the member states freedom to interpret and implement them, and they might be transposed in different ways.

The intention of Art. 1 (2) DP-D is to enable cross-border personal data processing within the EU. In cross-border data transfer situations, the DP-D should guarantee data subjects the same data protection as in their homeland.95 Art. 4 (1) DP-D determines that the member states need to apply the national measures issued based on the directive to all personal data processing operations. In contrast to the unimportance of geographical locations for clouds, the directive differentiates the applicable law based on the data processing location. Art. 4 (1a, b) DP-D foresees that the national law is applicable where the data processing takes place.96 In Art. 4 (1a) DP-D, the domicile principle is defined, where the national law of the controller's subsidiary is applicable. Data processing needs to be carried out in connection with activities of the subsidiary. In Art. 4 (1c) DP-D, the jurisdiction moves to the territorial principle.97 The territorial principle foresees that the location where the data processing takes place is relevant.

The BDSG deviates from Art. 4 DP-D in terminology as well as in systematics. Germany transposed the provision of the territorial principle in § 1 (5) BDSG.98 The BDSG is applicable if personal data is processed in Germany.99 The location of data processing is relevant, so that foreign companies processing personal data in Germany are also bound by the BDSG.100

The domicile principle is adopted in § 1 (5 S.1 HS.1) BDSG. Pivotal is the controller as defined in § 3 (7) BDSG, and not the data processing location.101 The domicile principle applies if the cloud user is located in another EU member state but the data collection takes place in Germany.102 In this case, the national law of that member state is the applicable law.103

In the inverse situation, if the cloud user is situated in another EU member state and operates a subsidiary in Germany, the territorial principle according to § 1 (5) S.1 HS.2 BDSG is applied.104

Summarizing, the BDSG is the relevant applicable law,

- if companies using cloud services (cloud users) are located in Germany,
- if companies located in another EU member state process data in a German subsidiary (or in a contract-related manner),
- if companies located outside the EU process data in Germany.105

2.2.1 ECJ Decisions on the Geographical Scope

In the ECJ's Google decision C-131/12, the court focused on the question whether data processing is carried out in connection with tasks of the subsidiary. Since the aim of the DP-D is to protect individuals' private life, the prerequisites of the application area should be interpreted widely.106 The ECJ's interpretation focuses on "the data processing in the context of the activities of a subsidiary". In the Google case, the data processing does not need to take place exactly in the Spanish subsidiary, but in relation to the activities of the subsidiary in Spain. The ECJ confirmed Art. 4 (1a) DP-D, as Google Spain operates as a subsidiary and the data processing takes place in connection with the processing of the subsidiary. Thereby the ECJ extended the territorial principle and interpreted the geographical application

[...]


1 http://de.statista.com/statistik/daten/studie/181086/umfrage/die-weltweit-groessten-social-networks-nach-anzahl-der-user/

2 https://about.twitter.com/company

3 http://www.google.de/intl/de/about/company/facts/management/

4 http://www.gutzitiert.de/zitat_autor_eric_schmidt_thema_privatsphaere_zitat_27040.html

5 http://www.spiegel.de/netzwelt/netzpolitik/nsa-affaere-google-manager-eric-schmidt-greift-us-regierung-an-a-973891.html

6 Liu, a.o. (2011): P. 17

7 Budszus, a.o. (2014): P. 4

8 Weichert (2010), P. 679

9 Liu, a.o. (2011): P. 7

10 Hogan a.o. (2011), P. 29

11 Jendrian (2013), P. 565

12 Software-as-a-Service (SaaS): The cloud provider offers “software-on-demand” in its own infrastructure. The cloud user rents the software from the cloud provider and uses its infrastructure. Platform-as-a-Service (PaaS): The service model “Platform-as-a-Service” comprises the option of the service user developing his own software on the service provider’s software and applications. The software provider offers the software application as a basis. Infrastructure-as-a-Service (IaaS): Cloud computing could be offered as an infrastructure service model where the hardware, e.g. the data storage, the processor, is provided as a service. The users are responsible for the selection, installation, and maintenance of their software.

13 Liu, a.o. (2011): P. 18

14 Liu, a.o. (2011): P. 20

15 Niemann, Paul (2014): P. 60

16 Kunz, Wolf (2013), P. 5

17 Borgmann a.o. (2012), P. 30

18 Borgmann a.o. (2012), P. 31

19 http://www.wiwo.de/technologie/digitale-welt/die-woche-im-netz-das-ende-der-privatsphaere/8453574.html

20 http://atlas.tk.informatik.tu-darmstadt.de/Publications/2008/WeberD-ICIE08.pdf

21 http://ondemand-mp3.dradio.de/file/dradio/2014/11/07/drk_20141107_1807_28a41523.mp3

22 Trepte, Masur, Pape (2014), P. 10

23 Roßnagel (2007), P. 112

24 http://ec.europa.eu/justice/data-protection/law/index_en.htm

25 In the following chapters, references to the directive EC/95/46 are referred to as DP-D.

26 Craig, De Búra (2011), P. 382

27 Directive EC/95/46 N. 3-8

28 In the following chapters, references to the European Commission data protection regulation draft are referred to as DP-R.

29 http://ec.europa.eu/justice/newsroom/data-protection/news/120125_en.htm

30 Scheja, Haag (2013), R. 17

31 http://twomediabirds.com/2015/01/08/libe-ausschuss-des-europaischen-parlaments-einigt-sich-uber-kompromissfassung-fur-europaweite-datenschutz-grundverordnung/

32 In the following chapters, references to the amended European Commission data protection regulation draft of the LIBE committee are referred to as LIBE.

33 http://www.haufe.de/compliance/recht-politik/europaeische-datenschutzreform-nimmt-weitere-huerde_230132_230368.html

34 http://www.rdp-law.de/datenschutz/europaeische-datenschutz-grundverordnung-ds-gvo.html

35 http://www.haufe.de/recht/datenschutz/deutschland-bremst-bei-eu-datenschutzverhandlungen_224_211354.html

36 https://www.datenschutzbeauftragter-info.de/eu-datenschutz-grundverordnung-lobbyplag-ranking-enthuellt-bestrebungen-der-politiker/

37 http://www.statewatch.org/news/2014/dec/eu-council-dp-reg-15395-14.pdf

38 In the following chapters, references to the general approach of the general data protection regulation are referred to as Council.

39 http://www.cr-online.de/Konsolidierte_Fassung__v._Maerz_2015.pdf

40 http://www.cr-online.de/26378.htm

41 http://twomediabirds.com/2015/01/08/libe-ausschuss-des-europaischen-parlaments-einigt-sich-uber-kompromissfassung-fur-europaweite-datenschutz-grundverordnung/

42 http://data.consilium.europa.eu/doc/document/ST-9565-2015-INIT/en/pdf

43 In the following chapters, references to the general approach of the general data protection regulation are referred to as Council.

44 In this paper, the German Federal Data Protection Law (BDSG) is used as member state example.

45 Rossnagel, Richter, Nebel (2012), P. 284

46 BVerfGe 65, 1, census of population judgement, grounds, par. C II 2

47 BVerfGe 65, 1, census of population judgement, grounds, par. C II 2

48 BVerfGe 65, 1, census of population judgement, grounds, par. C IV 1

49 BVerfGe 65, 1, census of population judgement, grounds, par. C II 1a, subpar. 3

50 Roßnagel, Richter, Nebel (2012), P. 286

51 Deutscher Bundestag Printed Matter 17/8999, P. 15

52 Roßnagel, Kroschwald (2014), P. 496

53 Roßnagel, Kroschwald (2014), P. 496

54 Deutscher Bundestag Printed Matter 17/8999, P. 48

55 Heilmann (2015), from a conversation during the annual reception of the State of Berlin at the EU in Brussels on the topic „Welche Justizpolitik braucht der digitale Binnenmarkt?“

56 Case C-131/12,

http://curia.europa.eu/juris/document/document.jsf?text=&docid=152065&pageIndex=0&doclang=DE&mode=req&dir=&occ=first&part=1&cid=252617

57 Härting (2014), P. 47

58 Schneider (2012), P. 21

59 Rossnagel, Richter, Nebel (2012), P. 287

60 Rossnagel, Richter, Nebel (2012), P. 287

61 Art. 2 (a, 2nd HS.) DP-D. Art. 8 (1) DP-D prohibits the processing of special (sensitive) personal data. Data of this category are those from which it is possible to derive racial or ethnic origin, political opinion, religious or philosophical belief or labour union membership, as well as health data or sexual orientation. In this work, a further specification regarding sensitive data will not be considered; the presented work is based on regular data only.

62 The DP-WG-29 is an independent expert group founded on the basis of Art. 29 DP-D. It has a consultancy function; its recommendations are not legally binding. http://ec.europa.eu/justice/data-protection/article-29/index_de.htm

63 http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2007/wp136_en.pdf

64 Roßnagel, Scholz (2000), P. 723

65 Roßnagel, Scholz (2000), P. 724

66 Weichert (2010), P. 682

67 Roßnagel (2007), P. 148

68 Weichert (2010), P. 4

69 Borgmann (2012), P. 30

70 Roßnagel, Scholz (2000), P. 724

71 Gola, Klug, Körffer (2015), BDSG §3, R.45

72 Roßnagel, Scholz (2000), P. 725

73 Weichert (2010), P. 4

74 Roßnagel, Scholz (2000), P. 724

75 In 2011, in case C-70/10 Scarlet Extended SA against Belgian Entertainment Association Video, recital no. 51, the ECJ already answered the question indirectly and evaluated IP addresses as personal data. http://www.heise.de/newsticker/meldung/Speicherung-von-IP-Adressen-BGH-hat-Fragen-an-den-EuGH-2437088.html

76 http://www.lto.de/recht/nachrichten/n/bgh-beschluss-vi-zr-135-13-ip-adressen-bund-speicherung/

77 http://ec.europa.eu/justice/policq

78 Roßnagel, Scholz (2000), P. 725

79 Hoeren, Sieber, Holznagel (2014), R. 33, 34

80 Deduplication enables saving large amounts of storage space.

81 Borgmann a.o. (2012), P. 129

82 Borgmann a.o. (2012), P. 53

83 Roßnagel, Scholz (2000), P. 726

84 Roßnagel (2013), P. 562

85 Härting (2015), from a conversation during the annual reception of the State of Berlin at the EU in Brussels on the topic „Welche Justizpolitik braucht der digitale Binnenmarkt?“

86 Keil, P. 8

87 Keil, P. 7

88 Hornung (2012), P. 105

89 Kroschwald (2014), P. 78

90 Kroschwald (2014), P. 80

91 http://www.trusted-cloud.de/267.php

92 Weichert (2010), P. 4

93 Borges, Brennscheidt (2012), P. 58

94 Craig, De Búra (2011), P. 106

95 Weichert (2010), P. 4

96 Weichert (2010), P. 4

97 Peifer (2005), P. 6

98 https://www.datenschutzbeauftragter-info.de/ist-das-bundesdatenschutzgesetz-bdsg-im-internationalen-datenschutz-anwendbar/

99 Simitis (2011), BDSG §1 R. 158

100 Simitis (2011), BDSG §4b R. 9, 13

101 Bitkom (2008), 7

102 Borges, Brennscheidt (2012), P. 59

103 Gola, Schomerus (2012), BDSG §1 R.27

104 BT-Drs. 14/4329 (2001), P.13

105 Borges, Brennscheidt (2012), P. 60

106 Beyvers, Herbrich (2014), P.561

