Fairness and Control In Data-Driven Marketing - UiO - DUO


Fairness and Control

In Data-Driven Marketing

On the notion of control in data protection and the implications when digital

standardised contractual terms, containing provisions on personal data processing

for marketing purposes, are seen as unfair under interplaying legal instruments

Candidate number: 7009

Submission deadline: 16.05.2019

Number of words: 15,102


Table of Contents

1 Introduction ........................................................................................................................ 1

2 The Interplay of Rules Applicable to M3DM Processing Activities .............................. 4

2.1 The General Restriction of M3DM Processing Activities ................................................... 4

2.1.1 Preliminary Remarks ................................................................................................. 4

2.1.2 Scope of Data Protection: Relevant Observations on the Notion of ‘Personal Data’ ......... 5

2.1.3 Legal Requirements of M3DM Processing Activity................................................... 7

2.2 The Notion of Control in the EU Framework ...................................................................... 8

2.2.1 The European Rhetorical Entanglement of the Meaning of Control....................... 10

2.2.2 The Placement of Structural Measures: Reasons and Consequences ...................... 14

2.2.3 Final Remarks.......................................................................................................... 16

2.3 Concluding Remarks .......................................................................................................... 17

3 Consent and Contracts: Similarly Shared Elements of Agreement ............................ 18

3.1 Establishment of Similarity Structure Demonstrating Simplicity of Issues....................... 18

3.1.1 Information Requirements ....................................................................................... 20

3.2 Application of the Legal Interactivity to ‘The Ride of My Life’ ....................................... 21

3.2.1 Valid Consent and Contract: Capacities of Offer and Acceptance ......................... 21

3.2.2 Consent Must be Given with Affirmative Action...................................................... 22

3.2.3 Provisions Stipulating Processing Activity must Form Part of the Valid Contract 23

3.3 Information Requirements: The Outlook of the Layered Notice Praxis .............................. 25

3.3.1 Consent and Contract: Different types of Notices and Requirements ..................... 25

4 What M3DM Processing Activities Can Be Justified Based on Valid Consent.......... 26

4.1 Can Ad Hoc Privacy Notices be Freely Given?................................................................... 26

4.2 Encapsulating Consent as an Imperative Barrier of Potential Misuses.............................. 27

4.3 Finding Balance between Exploitation and Protection of Personal Data........................... 28

4.3.1 Why Buy ‘Google Assistant’ in a ‘Data Driven Market’?....................................... 28

5 The ‘Transparent’ Potential of Data Subjects Empowerment .................................... 29

5.1 Preliminary Remarks.......................................................................................................... 29

5.2 Forms of Productive and Transparent Practices................................................................. 29

5.3 Use of Data as Unfair Commercial Practice ...................................................................... 30


6 Conclusion......................................................................................................................... 32

7 Table of reference............................................................................................................. 33

7.1 A. Books and journal articles ............................................................................................. 33

7.2 B. Report and other documents .......................................................................................... 36

7.2.1 Independent EU Data Protection and Privacy Advisory Bodies............................. 36

7.2.2 European Commission............................................................................................. 38

8 Table of abbreviations ..................................................................................................... 43

9 Table of legal instruments ............................................................................................... 45

10 Table of cases .................................................................................................................... 46


1 Introduction

Digital technologies are rapidly reshaping the world, bringing great benefits as well as many challenges. Despite rigorous efforts on behalf of the EU to establish a growing digital single market (DSM), the inability of the law to evolve at a similar pace has left users of digital services and content in a situation that allows for harmful exploitation and undermines the general will of society.1

Society, stakeholders, and users (hereafter subjects or data subjects) all greatly benefit from the continuous development of the digital environment.2 The rise of the ‘Internet of Things’ (IoT) and the ensuing dramatic increase in data collection and sharing (‘Big Data’) have offered companies like Google, Amazon, and Facebook the ability to bring about new products and services as well as to configure old ones in a way that is better suited to the needs of their consumers. Ultimately, individuals’ everyday lives are greatly improved through better and less expensive, or, more often, even free, products and services, in all sectors of society.3

In particular, this is made possible by cookies and everyday smart devices, embedded with sensors, and designed to continuously collect, process, and transfer data.4 Tracking subjects’ behaviour and interests, online or offline, essentially creating a ‘profile’5 that links to a subject’s account or IP address, has enabled the above-mentioned companies6 to offer their services for free, i.e. by operating ad-networks, where ads are defined and displayed according to subjects’ profiles, using predictive or behavioural analytics concluded by these ad-networks or affiliate ad-servers. Additionally, this allows them to better predict how subjects could be influenced.7

The increasing capabilities of digital technologies to process data, e.g. by methods of advanced analytics, machine and deep learning, have allowed both private and public entities to make use of personal data on an unprecedented scale in order to pursue their activities.8 In

1 Jacquemin, Digital content and sales or service contracts, 27. 2 Ibid. 2-3. Commission, IoT: Action Plan, 3. Ezrachi and Stucke, Written evidence (OPL0043), [2.1.]. 3 Bentzen, et al., 2018:11. 4 WP29, Opinion on IoT Development, 6-7. Using http://www.youronlinechoices.com/nor/dine-valg you can see the magnitude of OSP’s presently tracking you and providing you with interest-based advertising. 5 Qv. GDPR, rec. 30. 6 “Google operates the ‘Google Display Network’ (comprising partner websites and some Google [properties, e.g.] YouTube), DoubleClick online ad serving [products, ...] ‘DoubleClick’, ‘Ad Exchange’ and ‘Invite Media’. These provide advertisers with the tools needed to analyse the market and deliver specialised ads with greater ease and efficiency”. Qv. http://www.google.co.uk/intl/en/policies/privacy. 7 Ezrachi and Stucke, Written evidence (OPL0043), [2.1.-2.5.]. Re. Commission, DSM Strategy for EU, 63-64. Qv. “[The] core themes of the strategic implications of Big Data.”, cf. Ibid. [2.3.-2.9.]. 8 Ibid. [2.2.].


return, personal data has become a central asset for companies and for the digital society.9 Such activities often include the collection and use of personal data for concerning purposes, such as to gain undue advantage over competitors, not only by predicting trends and behaviours but sometimes by manipulating their services or products to erase any trace of competition, q.v. Google’s search-result violation in 2017.10

‘Data’, as information, has always played a central role in any market economy. As the essential input for its performance, the flow of information has stimulated innovation and the economy as well as enabled consumer choice.11 However, more recent cases have had greater impacts on data subjects’ ability to act based on relevant and accurate information. Moreover, they often call into question the general fairness of these new practices in an environment where companies gain access to massive amounts of personal data, often without the knowledge of subjects, and consequently sell it to marketers or use it in pursuit of influencing public opinion through a variety of media,12 q.v. the Facebook - Cambridge Analytica scandal in 2018.13 Unsurprisingly, these recent developments have increased privacy concerns and brought new challenges for data protection,14 as subjects are not fully informed about its collection, processing, and subsequent use, and lack the capacity to control the access others have to their data.15 This results in the deterioration of data subjects’ trust.16

Providing data subjects with a sufficient level of control is a paramount concern because putting subjects in control of their data, achieving empowerment and effective protection of their privacy, improves their level of trust, which plays an important role in the uptake of new

9 Reding, “The EU data protection reform 2012”, 2. Commission, Completing a trusted DSM, 2. EESC, Opinion 345/2017 on the ePR, [3.3]. 10 Case AT.39740 Google Search (Shopping). 11 Ezrachi and Stucke, Written evidence (OPL0043), [2.1.-2.2.]. 12 Commission, Completing a trusted DSM, 3. 13 Once users consented to a personality quiz-app’s ToU (an app freely available on Facebook’s platform), it collected psychographic data about users for academic purposes, as indicated in the ToU. However, Facebook’s design enabled the app to additionally collect data from all Facebook friends of a user, without their consent. Consequently, although under 300,000 agreed, data was collected from approx. 72 million users. The data was then sold to Cambridge Analytica to use for political and commercial purposes. It was, i.a., allegedly used to influence Brexit voters as well as to advance US presidential candidates such as Donald Trump, cf. Farivar, “lawsuits filed by angry Facebook users”, https://arstechnica.com/tech-policy/2018/03.html, and subsequent cases Case no. 01725, N.D. Cal. 3:18, Case no. 01732, N.D. Cal. D. Del. 5:18, CA. The Facebook - Cambridge Analytica scandal had similar facts to Specht v. Netscape. 14 This is why EU data protection agencies are turning their focus to social media platform giants such as Facebook, stating that saying “sorry is not enough” when a multibillion-dollar platform uses personal data unlawfully, cf. Andrea Jelinek, former chairman of the WP29. 15 Hansen et al., Privacy and identity management, 316-317. The privacy definition provided by Culnan is used, cf. Culnan, M.J.: How did they get my name? an exploratory investigation of consumer attitudes toward secondary information use. MIS Q. 17(September), 341–364 (1993). 16 GDPR, rec. 7-8.


technology.17 Rebuilding data subjects’ trust is therefore essential to maintain digital development and the growth of the digital economy. Nevertheless, overregulation can often cause significant harm. Thus, neglecting to maintain the balance of interests between society, controllers and data subjects would be counterproductive to the abovementioned aims.18

The recent reform sets out to achieve this balance, allowing greater control while striking a balance between the free flow of data and the data subjects’ right to the protection of their personal data.19 Importantly, the new General Data Protection Regulation 2016/679/EU (GDPR) strengthens and extends the scope of the EU’s data protection requirements. It includes the principle of consent, as one of the lawful bases for collecting and processing personal data, which collectors obtain e.g. through subjects’ use of digital products and services (digital services hereafter refers to digital products).20 The lack of transparency that has surrounded those circumstances has recently been made painfully clear by the influx of cookie consent requests of all types following the implementation of the GDPR. Notably, social media giants such as Facebook and Google have now taken steps to increase transparency, respect the rules of democratic debate, and, overall, to portray themselves as placing privacy and data protection at the forefront, taking up data protection measures and requesting consent.21

It remains to be seen whether marketers actually apply the requirements of consent and transparency in accordance with the current reform and in a way that offers data subjects enough control over their personal data to reinstate their trust in digital services, while still maintaining balance, thus sustaining the development of the digital single market (DSM).22 More importantly, it must first be asked to what extent data-driven marketing falls under the GDPR; thereafter, what is actually meant by the ‘notion of control’ and how principles of fairness, not just within the GDPR but in interplaying rules, help achieve subject control.

17 Studies [31] have shown that [31]. See the European research project SWAMI: www.isi.fraunhofer.de/t/projekte/e-fri-swami.htm. 18 Qv. “given [the rapid development of digital technologies and] the importance of creating the trust that will allow the digital economy to develop [the GDPR calls for data subjects to] have control of their own personal data”, cf. GDPR, rec. 6-7. Or, as the Commission puts it, “Creating the right framework conditions and the right environment is essential to retain, grow and foster the emergence of new online platforms [...]”, cf. Commission, Online platforms: Opportunities and Challenges, 3. Qv. EU Committee, OP & DSM (HL Paper 129), 67 [255], regarding concerns on implementing the GDPR, re. Yahoo (OPL0042) and ITIF (OPL0076), by ensuring individuals’ rights to privacy are put in action while still sustaining development of the online environment and economy. 19 GDPR, art. 1-3, rec. 1-13. 20 EU Committee, OP & DSM (HL Paper 129), 57-59 [217-221, 223]. 21 Adapting to new digital realities, 4-5. Study on consent notification measures. 22 Another key aspect of building trust is the capability to adjust the functioning and properties of technological systems to individual preferences (within safe boundaries), cf. Commission, IoT: Action Plan, 6-7.


2 The Interplay of Rules Applicable to M3DM Processing Activities

2.1 The General Restriction of M3DM Processing Activities

2.1.1 Preliminary Remarks

The rise of the ‘Internet of Things’ (IoT) and the ensuing dramatic increase in constant data generation and flow,23 combined with ‘big data’ analytics (BDA), have massively (and to a continually greater extent) equipped entities (like Google, Amazon, and Facebook) with capabilities to ‘obtain’ personal data, e.g. by use of cookies and everyday smart devices, embedded with sensors, and designed to continuously collect, process, and transfer data.24 They track subjects’ behaviour and interests, online or offline, essentially creating a ‘profile’25 that links to a subject’s account or IP address, in a way that enables exploitation (i.e. ‘utilisation’), i.e. predicting and inferring subjects’ interests and behaviour for, and consequently ‘benefiting’ from, more effective marketing.26 Data-driven marketing27 allows offering digital services for free, bringing about new digital products and services (hereafter jointly referenced as digital services), configuring old ones in a way that is better suited to the needs of their consumers (hereafter respectively, development- and optimisation through personalisation (DtP and OtP)), as well as finding new potential customers and reaching them (hereafter ...), as ads are defined and displayed according to subjects’ profiles, using predictive or behavioural analytics concluded by an ad-network or affiliate ad-server.28

Rules of data protection are relevant as data processing forms the basis of data-driven marketing practices.29 Generally, such processing is prohibited if the data processed is considered ‘personal’, i.e. the original raw data, metadata collected by cookies,30 or extracted, aggregated or inferred information, in so far as it relates to an identified individual or is reasonably likely, without disproportionate effort, to be matched to an (i.e. identifiable) individual,31 unless the processing is both lawful and fair.32

23 24 It goes without saying that these are not the sole factors; for example, the Commission noted recently that the use of mobile data has grown 12-fold as a result of the end of roaming charges since June 2017, cf. Commission, “EU in May 2019: Top 20 Achievements”, 5. 25 WP29, Opinion on IoT Development, 6-7. Using http://www.youronlinechoices.com/nor/dine-valg you can see the magnitude of OSPs presently tracking you and providing you with interest-based advertising. 26 Ezrachi and Stucke, Written evidence (OPL0043), [2.2.]. WP29, Opinion on IoT Development, 10-11. Such as the application of sensor fusion, fingerprinting, or cross-matching. Qv. GDPR, rec. 30. 27 “Google operates the ‘Google Display Network’ (comprising partner websites and some Google [properties, e.g.] YouTube), DoubleClick online ad serving [products, ...] ‘DoubleClick’, ‘Ad Exchange’ and ‘Invite Media’. These provide advertisers with the tools needed to analyse the market and deliver specialised ads with greater ease and efficiency”. Qv. http://www.google.co.uk/intl/en/policies/privacy. 28 Ezrachi and Stucke, Written evidence (OPL0043), [2.1.-2.5.]. Re. Commission, DSM Strategy for EU, 63-64. Qv. “[The] core themes of the strategic implications of Big Data.”, cf. Ibid. [2.3.-2.9.]. 29 The definitions of ‘data subject’, i.e. any natural person who can be directly or indirectly identified by the controller or a third party using reasonably likely means, and ‘personal data’, i.e. any information relating to a data subject that can be linked to said individual, are key for determining the scope of the GDPR, cf. art. 4(1). 30 As the Commission concluded in September 2010, when it decided to refer the UK to

2.1.2 Scope of Data Protection: Relevant Observations on the Notion of ‘Personal Data’

In accordance with the above, under the GDPR it remains a prerequisite for the data to be ‘personal’ in order for subjects to enjoy their rights of protection, i.e. control is only provided in situations where the data is personal.

The majority of data used to build profiles on and market to data subjects (e.g. raw data) is, however, not ‘personal’ data strictly speaking (i.e. it is just codes and digits by itself). Nevertheless, flexibility is presumably embedded into the concept of ‘personal data’ to provide an appropriate response to the circumstances at stake.33 A high degree of flexibility is generally given in regard to the above-mentioned raw data when collected in relation to IoT devices, Big Data and cloud computing, due to their pervasive nature and the vast amount of data processed, which might be combined with other ‘unique identifiers’ and other information received by the servers to create profiles of natural persons.34 Such data thus generally fall under the GDPR. However, as data-driven marketing generally neither produces legal effects nor a comparable risk potential (e.g. a decision on whether to give subjects car insurance based on their score), the profiling does not generally constitute processing falling under the prohibition of Article 22(1) of the GDPR.35

Furthermore, data used for data-driven marketing that might be considered personal data at one stage (falling under the GDPR) can be rendered anonymous, i.e. incapable of being identifiable, at different stages of the processing activity or by the subsequent processing activity of others,

EU's Court of Justice (ECJ) for not implementing rules on confidentiality of e-communications such as e-mail or internet browsing, cf. Article 5(1) of the ePrivacy Directive 2002/58/EC (ePD) and Article 2(h) of the Data Protection Directive 95/46/EC (DPD). Said rules require Member States to ensure confidentiality of communications and related traffic data by prohibiting unlawful interception and surveillance unless the consent of the users concerned is obtained. Member States are also required to establish appropriate sanctions in case of infringements, and an independent authority must be charged with supervision, cf. (IP/09/1626) (IP/09/570) (IP/12/60). DPD, art. 24, 28. Qv. definition of personal data, GDPR, rec. 26 and art. 4(1), and art. 4(12) of the GDPR. 31 Cf. Breyer, which held that dynamic IP addresses may be PD under art. 2(a) of the DPD in so far as they are reasonably likely to be matched to an individual. The corresponding definition is taken up in art. 4(1) of the GDPR, cf. rec. 26 (sentences 1 and 3), (re. “GDPR”, 2) and the recent case Nowak, where the CJEU ruled that a candidate’s exam script is ‘personal data’, as it constitutes information that ‘relates’ to him. Qv. WP29, Opinion 4/2007 on the concept of personal data, WP 136, 21. [Recital 26.]. 32 GDPR, art. 5(1)(a). 33 WP29, Opinion on Personal Data, 4. 34 Lazaro, Christophe and Le Métayer, Control over Personal Data, 18-20. GDPR 235 35 GDPR 1832-183.


thereby falling again outside the scope of data protection law.36 Nevertheless, statements by data-driven marketing entities like Pornhub.com that inaccurately assume that personal data does not include anonymised or pseudonymised data37 are bound to be not wholly correct. The half-truth exists because pseudonymisation does not render data incapable of being used to identify subjects, i.e. it remains personal data; rather, the personal data which is the object of pseudonymisation is processed in a manner whereby it cannot by itself be attributed to specific subjects without the use of additional information belonging to the personal data which was the object of the pseudonymisation. This will be explained below with reference to the ECJ judgment in Wirtschaftsakademie.

ECJ Wirtschaftsakademie dealt with the issue of whether a Facebook fan page administrator (FB-FP adm.), such as Wirtschaftsakademie (WA), was, in accordance with the DPD, a controller jointly responsible with FB, a social media network, for the processing of personal data collected by FB via cookies placed on visitors’ devices, for statistical purposes (providing WA with the ‘Facebook insight’ function), without the visitors first being informed. The court found that such was the case because WA as adm. took part in the determination of the purposes and means of the processing: by opening the FB-FP and agreeing to predetermined terms of processing, WA made it possible for FB to collect data of the visitors, and was involved with defining parameters depending, in particular, on its target audience and the objectives of managing and promoting its activities, which ‘influenced’ the purposes and means of the processing.

The statistical information provided to the FB-FP adm., i.e. the demographical percentage of likes and visits, is the aggregated form of personal data collected and processed by FB, i.e. information on which subject accessed and liked what on the FB-FP. It is presumed anonymous as the statistics are presented in a manner whereby the adm. cannot know which particular subjects, nor in what way they, belong to given statistics. If and when the adm. collects the received information and further processes it, the processing would fall outside the scope of data protection instruments as long as the original information collected was durably anonymous.38 Whereas it would fall inside the scope when the data object of collection and subsequent processing has undergone pseudonymisation, although the risk of identification would be reduced significantly in said case.
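The mechanics relied on above, whereby pseudonymised data cannot by themselves be attributed to a person without 'additional information' kept separately (GDPR, art. 4(5)), can be illustrated with a minimal sketch. All names, records and functions here are invented for illustration only; this is not a description of any actual processing operation.

```python
import secrets

def pseudonymise(records):
    """Swap each direct identifier for a random token; the token->identity
    mapping is the 'additional information' that must be kept separately."""
    key_table = {}  # to be stored apart, under technical/organisational safeguards
    output = []
    for record in records:
        token = key_table.setdefault(record["user"], secrets.token_hex(8))
        output.append({"user": token, "liked_midday_post": record["liked_midday_post"]})
    return output, key_table

records = [
    {"user": "alice@example.com", "liked_midday_post": True},
    {"user": "bob@example.com", "liked_midday_post": False},
]
pseudonymised, key = pseudonymise(records)

# Without the key table, the tokens cannot by themselves be attributed to a
# person; with it, every record is re-identifiable -- hence still personal data.
reverse = {token: user for user, token in key.items()}
assert reverse[pseudonymised[0]["user"]] == "alice@example.com"
```

The point of the sketch is the legal one made in the text: deleting or durably separating the key table moves the output towards anonymisation, while retaining it keeps the output within the notion of personal data.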

Comparatively, it can be argued that the aggregated data an adm. is supplied with in the Facebook insight function is neither the object of anonymisation nor of pseudonymisation. Concerning the latter, the data (i.e. the key) on which the statistics are based are (at least Facebook likes, though not who accessed the FB-FP) accessible to the FB-FP adm.

36 “Key Aspects Explained: GDPR Proposal”, 2. 37 According to Article 4(5) of the GDPR, ‘pseudonymisation’ means the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person. 38 “Key Aspects Explained: GDPR Proposal”, 2.

However, the notion of anonymous data should be interpreted narrowly, and with consideration of the fact that the ability of receivers to link data together and re-identify subjects is now easier than ever, e.g. due to the rise of sophisticated re-identification tactics.39 This is especially true concerning location data, due to the extensive re-identification capabilities of data-driven marketing entities. For example, the criterion of personal data is met where, e.g., the FB-FP adm. has copied down the ‘Facebook insights’ statistics, e.g. the number of likes of midday posts, for further processing for some self-determined purpose, e.g. knowing the frequency of midday activity across services, and thereafter examines the midday posts themselves on his FB-FP to identify who the individuals are that are liking midday posts.
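The re-identification route just described, where a facially anonymous aggregate is matched back to individuals by inspecting the underlying posts, can be sketched as follows. All data are invented for illustration; the variable names are hypothetical, not drawn from any real Facebook API.

```python
# Public fan-page posts, with the (visible) sets of users who liked each.
posts = [
    {"hour": 12, "likers": {"anna", "bjorn"}},
    {"hour": 13, "likers": {"anna"}},
    {"hour": 20, "likers": {"carl"}},
]

# Step 1: the 'insight'-style aggregate the adm. receives (anonymous on its face).
midday = [p for p in posts if 11 <= p["hour"] <= 13]
aggregate_likes = sum(len(p["likers"]) for p in midday)

# Step 2: re-identification by examining the same midday posts directly --
# the statistic is linked back to named individuals.
midday_likers = set().union(*(p["likers"] for p in midday))
# aggregate_likes is 3, and midday_likers now names the people behind it.
```

The sketch shows why the narrow reading of 'anonymous' matters: the aggregate alone reveals no one, but combined with data the receiver already has access to, the criterion of identifiability is met.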

This is especially the case when collecting metadata as it is generally not encrypted, and

the vast majority is likely to remain so. This is data that needs to stay unencrypted in order for

the systems to operate: location data from cell phones and other devices, telephone calling

records, header information in e-mail, and so on. This information provides an enormous

amount of surveillance data that was unavailable before these systems became widespread.40

This situation specifically, where raw data is collected by, and/or through, digital devices or services, also triggers particular rules of the ePD, i.e. art. 5(3) and 13, cf. GDPR, rec. 30, and the EDPB opinion on the interplay between the ePD and the GDPR, 11-12.

2.1.3 Legal Requirements of M3DM Processing Activity

In accordance with the above, for a data-driven marketing processing activity to be lawful, it must not be prohibited by law, it must have legitimate purposes for, and which limit, the collection and processing, and it must rest on a legitimate basis, i.e. either [1] subjects’ valid consent (likely the only legitimate basis where the ePD applies, notably prior to the placement of cookies necessary for behavioural advertising, cf. the next chapter)41 is obtained by the controllers,42 i.e. the entities involved in the M3DM that, alone or jointly with others, determine why and how data will be processed, i.e. who

39 “Key Aspects Explained: GDPR Proposal”, 2. 40 Going Dark, 5. 41 EDPB, guidelines on GDPR art. 6(1)(b), 13 [52]. 42 Reding, “The EU Data Protection Reform 2012”, 2.


have the ‘factual influence’,43 or [2] they have other legal permissions, e.g. prevailing legitimate interests (see the chapter thereafter) or contractual necessity (this is generally not a suitable legal ground for behavioural advertising and associated tracking and profiling,44 but might be relevant, in some cases, for personalisation of content,45 cf. the next chapter).46

Building on the above, which will serve as the structure of the analysis: unless consent is specifically required, if there does not exist a contract that is considered valid (as determined by, e.g., e-commerce or contract law, where applicable), or the processing activity is not necessary either for the performance of a contract or in order to take pre-contractual steps at the request of a subject (7 [22]), controllers need to seek justification, in accordance with the principle of purpose limitation, in another appropriate legal basis,47 e.g. consent or prevailing legitimate interest.

Whether a processing activity is seen as ‘unfair’ can be considered when assessing the need for, and ability of, control in that particular situation, and will be central to deciding whether legitimate interests can be deemed more appropriate, i.e. whether they can prevail over the rights of data subjects. Yet what is considered fair is by itself difficult to define in the abstract and will be discussed later. It first needs to be established what is meant by control.

2.2 The Notion of Control in the EU Framework

What do major companies like Microsoft actually mean when they express a will to “[...] pre-

43 WP29, Opinion: “controller" and "processor", 10. The GDPR, 14, 17. GDPR, art. 4(7). 44 EDPB, guidelines on GDPR art. 6(1)(b), 13. 45 EDPB, guidelines on GDPR art. 6(1)(b), 13, 14. 46 The GDPR similar to its predesseor the DPD, provided six grounds for processing being lawful. Cf. GDPR, art. 6 (1)(a-f). NB. (d-e) only concern public authorities, therefore, as NRAs are likely not in the buisness of marketing, at least not on the scale of private entaties, they are exempt from this paper. (b) concerns processing for contractual neccessesity, this ground is irrelavant as this scope is limited to cases and data used to conclude a contract, e.g. such as shoping chart cookies, and not e.g. collection of data about buying behaviour because of a obligation stemming from a contract. Lastly, (c) deals with processing necessary for compliance with legal obligations, this ground is deemed irrelevant as it would appear exceptional if processing for marketing purposes would be necessary to comply with legal obligations, cf. Moerel and Prins, “homo digitalis - proposal: BD and IoT”,3. In principle, after the user has uninstalled the app, the app developer does not have a legal ground to continue processing of the personal data relating to that user, and therefore has to delete all data. An app developer that wishes to keep certain data, for example in order to facilitate reinstallation of the app, has to separately ask for consent in the uninstall process, asking the user to agree to a defined extra retention period. 
The only exception to this rule is the possible existence of legal obligations to retain some data for specific purposes, for example fiscal obligations relating to financial transactions.The Working Party reminds all information society services, such as apps, in opinion on apps on smart devices, 24)that the European data retention obligation (Directive 2006/24/EC) does not apply to them and therefore cannot be invoked as a legal ground to continue to process data about app users after they have deleted the app. The Working Party takes this opportunity to highlight the especially risky nature of traffic data, which deserve special precautions and safeguards per se – as highlighted in the WP29’s Report on the enforcement of the Data Retention Directive (WP172) – where all the relevant stakeholders were called upon to implement the appropriate security measures. 47 EDPB, guidelines on GDPR art. 6(1)(b), 14 [19]. WHERE THEY TAKE DIFFERENT APPROACH.


serve the ability for [subjects] to control [their] data”48? Soon after the recent reforms, ‘giving subjects control’ became as trendy as ‘we value your privacy’. Nevertheless, the essence of the former statement is not within the grasp of subjects because of the plurality of different meanings frequently ascribed to the ‘notion of control’ and the vagueness of its supposed core mechanisms of empowering them.49 This prompts the need for a greater comprehension of the notion before examining whether the EU data protection framework allows data subjects more control in a way that allows trust in the DSM.

The ambiguity surrounding the meaning of ‘control’ likely stems from ‘misleading conflation’: firstly, between scholarly literature and the materials produced by the EU regulator, DPAs and advisory bodies; secondly, between the latter materials themselves, as well as from their vagueness. It is important to realise that although the recent EU reform pushed for providing subjects with greater control over their personal data, the notion of control is not new. In fact, privacy advocates, scholars, regulators, and the tech industry have long advocated giving subjects more control as the key regulatory response to the problems raised by recent developments of digital technologies.50 In scholarly literature the notion of control has often been conceptualised in certain ways, notably as one of the foundations of privacy,51 which creates a misleading conflation with the deployment of the notion of control by EU institutions as a tool for data subjects to manage their privacy.52 In short, this means that if subjects’ control is guaranteed under EU law, their privacy (or aspects of it) is not simultaneously and automatically guaranteed, but their ‘ability to manage’ it would be.

Secondly, the ambiguity stems from the scale of diverse EU material expressing the need for control,53 often in an incoherent manner, leaving the meaning of control and its core elements vague and pervasive.54 The GDPR best illustrates this lack of decisiveness, as it only briefly references in its recitals the need for subjects to have control, given the challenges brought by technological development and the importance of creating the trust that allows for the development of the DSM.55 No explanation is, however, given of the meaning of control (including its pragmatic modalities), nor of how providing data subjects with greater control is among the crucial elements that will create the trust that will allow the digital economy to develop across the internal market.56

48 Nadella, Your data, powering your experiences, controlled by you, at https://privacy.microsoft.com/en-US 49 Lazaro, Christophe and Le Métayer, Control over Personal Data, 29. (ATH) 50 Ibid. Lazaro, Christophe and Le Métayer, Control over Personal Data, 4, 7, 15-16. Woodrow, The Case Against Idealising Control, 424. 51 Debates concerning the validity of such will be left out of this paper. For an in-depth analysis of the meaning of control as developed in privacy and data protection scholarship, see Lazaro, Christophe and Le Métayer, Control over Personal Data, 6-15. Leenes, DP and Privacy, 117-120. 52 Lazaro, Christophe and Le Métayer, Control over Personal Data, 15. Morel et al., IoT: Enhancing Transparency and Consent, 1. 53 e.g. preparatory legislative works, legislative text, experts’ opinions, and material addressed to citizens. 54 Lazaro, Christophe and Le Métayer, Control over Personal Data, 15-19, 21.

2.2.1 The European Rhetorical Entanglement of the Meaning of Control

The EU rhetoric might provide an explanation of the Commission’s promises that the EU legal framework, particularly with the addition of the GDPR, empowers data subjects to exercise effective control over their personal data.57 Through an examination of the material produced by the EU regulator, the dominant rhetoric can be said to take a uniquely entangled approach to the notion of control, in that it encompasses both subjective and structural agents.58

Respectively, this empowerment to ‘manage’ their personal data could obliquely manifest through a combination of several categories of rules (i.e. a set of ‘micro rights’) that are frequently linked with, or mentioned alongside, ‘the notion of control’.59 These micro rights mainly surround the right to object and the requirements of consent60 (and to withdraw it),61 as well as of transparency and information (including the rights of access).62 These provide

55 GDPR, rec. 6-7. Or as the Commission puts it: “Creating the right framework conditions and the right environment is essential to retain, grow and foster the emergence of new online platforms [...]”, cf. Commission, Online platforms: Opportunities and Challenges, 3. Q.v. Commission (EC), comprehensive approach on personal data protection, 3-4 [H-N]; EU Committee, OP & DSM (HL Paper 129), 67 [255], regarding concerns on implementing the GDPR, re. Yahoo (OPL0042) and ITIF (OPL0076), by ensuring individuals’ rights to privacy are put in action while still sustaining development of the online environment and economy. 56 Ibid. Morel et al., IoT: Enhancing Transparency and Consent, 1-2. 57 Voigt and Bussche, The EU GDPR, 141-147. Commission, Completing a trusted DSM, 3, 5. Commission, stronger protection, new opportunities, 3. Important rules arising from secondary legislation concern the burden of proof, obligations, safeguards, and rights deriving from data processing. Tackling the DSM unrelated to whether data is considered personal are the ECD, the Services Directive 2006/123/EC, the Directive on the re-use of public sector information (Directive 2003/98/EC, known as the ‘PSI Directive’, currently under review), and the GDPR, which is complemented by the e-Privacy Directive 2002/58/EC (ePD) as modified by Directive 2009/136/EC (currently under review). The general provisions of the TFEU (Articles 49 and 56) could be applicable to the totality of data storage and other processing services. However, addressing existing barriers through infringement procedures against the Member States concerned has been identified as cumbersome and complicated, and leaves a situation of legal uncertainty. 58 Lazaro, Christophe and Le Métayer, Control over Personal Data, 15-16. 59 Q.v. Reding, “EU DP Full Speed”, 3. 60 Ibid. Consent, as defined in Article 4(11) of the GDPR (see also rec. 43), is one of the legal grounds for processing personal data, cf. GDPR, art. 6(1)(a), q.v. rec. 32, and must be obtained pursuant to the requirements stipulated in the GDPR, see art. 7-11 and rec. 42, 33. Q.v. GDPR, art. 7(3) on consent. 61 The right to object (GDPR, art. 21, rec. 69-70) or to restrict data processing (GDPR, art. 18-19, rec. 67), and rights related to automated decision-making and profiling (GDPR, art. 22, rec. 71-73). 62 Lazaro, Christophe and Le Métayer, Control over Personal Data, 21-22. See also Bygrave, DP Law, 158-163. Q.v. concerning the requirement of transparency, as well as fairness: the controller must, before any processing operation of personal data begins, inform the data subject of its existence and of what its legal basis and purposes


data subjects with the ability to make decisions through autonomous choice (and by a deliberate and informed action) about the processing and/or subsequent use of their personal data by others.63

In comparison, when the notion of control is conceptualised as privacy, subjects would be considered to lose control over personal data as soon as they provide it to, and entrust it with, controllers,64 whereas the EU rhetoric, in the same situation, would not consider control lost as long as the subject still has some measure of influence over the processing and subsequent use,65 i.e. where subjects have the ability to be aware66 of what, where and when personal data is processed and why, how, and by whom, and the ability to act in some manner to shape the use, e.g. by withdrawing consent, rectification67 or erasure,68 of the personal data that concerns them.69

It would be impossible for subjects to exercise control over the use, by third parties, of personal data that concerns them if the latter, or another responsible party, does not make the subject sufficiently aware of its collection,70 nor of the purpose and scope of the processing practice, i.e. by ‘effectively communicating’, directly or indirectly, the ‘intelligible information71 that is necessary’, in a ‘concise and transparent manner’, to the subject (q.v. chapter 3),72 since this affects subjects’ ability to take action,73 encompassing the exercise of their ‘micro rights’, cf.

are, to name a few. The communication itself must be governed by the principle of transparency, cf. GDPR, art. 5(1)(a), whereby controllers are obligated to create suitable information measures, cf. GDPR, art. 12. Q.v. art. 2(1), whereas effective control by the data subject and by national data protection authorities requires transparent behaviour on the part of data controllers. 63 Lazaro, Christophe and Le Métayer, Control over Personal Data, 21-22. Bygrave, DP Law, 158, 160-162. Compare the similarities with the principle of ‘data subjects’ influence’ discussed in Bygrave, DP Law, 158-163. 64 Lazaro, Christophe and Le Métayer, Control over Personal Data, 29. Leenes, DP and Privacy, 122-123. 65 Compare EESC, “Opinion on ePRp”, (6.3). 66 ATH re. awareness in Bygrave, DP Law, 158-163. 67 GDPR, art. 15-16, rec. 63-65. 68 GDPR, art. 17, rec. 66. Also called the right to be forgotten. 69 See the analysis of ‘data subjects’ influence’ in Bygrave.. (ADD). Reding, DP reform.. 2. 70 Q.v. EESC, “opinion on ePRp”, (5.4), where the EESC warns against the effects on profitability resulting from the possibility of obtaining greater access to personal information by further processing of metadata, re. (5.2), and the presumed lack of knowledge thereof by data subjects when expressing consent to the storage of metadata, which will presumably have such effect, and therefore advocates the need to inform and better educate subjects. 71 In the ‘event of uncertainty’ about the ‘level of intelligibility’ and ‘transparency of information’ and the ‘effectiveness of user interfaces/notices/policies etc.’, the ‘controller’ can: determine the average member of the audience for which the privacy information is intended (the subjects that the collected personal data concerns), based on the collector’s assumed knowledge about that group of subjects; or test them (i.e. the mentioned uncertainties relating to both the information and communication), firstly, through a number of ‘mechanisms’ (e.g. user panels, readability testing, consultation with stakeholders (industry and advocacy) and regulatory bodies) and, secondly, through other means, as appropriate, to determine the suitability of the information and communication to allow understanding by an average member of the intended audience. WP29, GDPR Guidelines: Transparency, 6. Re. Morel et al., IoT: Enhancing Transparency and Consent, 4. 72 WP29, GDPR Guidelines: Transparency, 5. 73 Q.v. WP29, “opinion: apps on smart devices”, 24.


ECJ, Bara and Others, where the court held that the requirement of fair processing precludes transfers between, and further processing by, NRAs without first informing the data subject, for the purpose of the processing by, and in its capacity as, the recipient of the data.74

Control is similarly rendered unachievable when, although subjects are made sufficiently aware, they have no means of taking action: both of ‘expressing the privacy preferences they have decided’ upon, which are communicated, directly or indirectly, to, ‘received’ by, and ‘upheld’ by the ‘parties responsible for implementation of requirements to shape the processing accordingly’, and, possibly, of being ‘allowed to monitor actual compliance’ with the privacy choices they gave.75

The EU regulator appears to have recognised, or at least feels compelled to recognise, q.v. I v. Finland,76 that due to the digital design, notably its complexity,77 providing the micro rights themselves is insufficient for subjects to be able to ‘effectively exercise control’ (i.e. through use of these ‘micro rights’); rather, a set of ‘structural measures’ must be placed within the digital environment that ensures its reliability and effectiveness,78 as technical solutions facilitate more effective and sufficient means of making data subjects aware and empowered to exercise control.79 I v. Finland (2008) concerned the confidentiality of patient data in a public hospital system. Specifically, the law required hospital employees to refrain from accessing patient data without authorisation, but the state did not ensure that this requirement was implemented in practice, and the obligations under Article 8(1) of the Convention were therefore not fulfilled. The hospital had not placed measures within the health record system that comprehensively logged access to the records and stored such logs in a way that made monitoring of compliance possible, i.e. of whether the data had been accessed without proper authorisation.80 The lack of logging undermined the applicant’s ability to litigate before the Finnish courts because it deprived her of concrete evidence that her health records had been accessed unlawfully. ‘[H]ad the hospital provided a greater control over access to health records by restricting access or by maintaining a log of all persons who

74 ECJ, Bara and Others, paras 32, 33, 35. 75 Lazaro, Christophe and Le Métayer, Control over Personal Data, 30. Morel et al., IoT: Enhancing Transparency and Consent, 1. 76 Positive obligations extend to more than just putting legal rules in place. See Bygrave, privacy by design, 109-110. 77 There is ample disbelief that enabling ‘subjective controls’, e.g. requiring informed consent, can by themselves provide subjects with sufficient control, due to the digital design, notably its complexity, cf. Morel et al., IoT: Enhancing Transparency and Consent, 2. 78 Lazaro, Christophe and Le Métayer, Control over Personal Data, 18-20. Q.v. Bygrave, Privacy by design, 107, where Bygrave points out that Article 25 shows that the EU regulator embraces PbD ideals to such… 79 Reding, “The EU Data Protection Reform 2012”, 2. GDPR, rec. 59. 80 Bygrave, Privacy by design, 109-110.


had accessed the applicant’s medical file, the applicant would have been placed in a less disadvantaged position before the domestic courts’: ibid, para. 44.

As seen above, whether and to what extent structural measures need to be in place depends on the risk in the case at hand. Although the main elements of the ‘structural measures’ are not widely determined, the EU material allows a categorisation into three pillars.81 Limited discussion in this paper will be devoted to the first two pillars.82 Respectively, they are, firstly, ensuring effective enforcement83 of effective and heterogeneous rules, which encompasses the need for a comprehensive set of rules to cope with the risks of the online environment.84 These rules need to apply to the processing of personal data of EU citizens independently of the area of the world in which their data is being processed.85 Finally, the rules need to be effective in the sense that they are actually enforced, e.g. by independent national data protection authorities (NDPAs). Secondly, the responsibility and accountability of data controllers,86 which entails, e.g., the designation of a data protection officer, the performance of impact assessments and possibly the observation of ‘the principle of data protection by design’ (DPbDsgn) (see chapter 3.3).87 The final pillar involves ‘measures’ that reinforce security or enhance privacy, such as the implementation of a privacy certification scheme, the use of privacy-enhancing technologies (PETs) and privacy-friendly default settings (DPbDflt),88 which deal with minimising the processing of personal data (see chapter 3.3).89
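The idea behind the third pillar can be sketched in code: default settings that leave every optional processing purpose switched off until the subject actively opts in, and a minimisation rule that releases only the data fields necessary for a given purpose. The purposes, field names and settings below are hypothetical illustrations, not taken from the GDPR or any real service.

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Account settings where every optional processing purpose is off
    unless the subject actively turns it on (cf. GDPR art. 25(2))."""
    behavioural_advertising: bool = False   # opt-in, never pre-ticked
    share_with_partners: bool = False
    analytics: bool = False

def fields_to_process(purpose: str, settings: PrivacySettings) -> set:
    """Data minimisation: return only the fields necessary for the purpose,
    plus optional fields the subject has explicitly opted in to."""
    necessary = {"delivery": {"name", "address"},
                 "billing": {"name", "payment_token"}}
    optional_advertising = {"browsing_history", "location"}
    allowed = set(necessary.get(purpose, set()))
    if purpose == "advertising" and settings.behavioural_advertising:
        allowed |= optional_advertising
    return allowed
```

On defaults alone, an advertising purpose yields no data at all; only an explicit, affirmative change by the subject widens what may be processed, which is the behaviour DPbDflt is meant to guarantee.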

81 (ADD what material). 82 These can be said to entail mainly organisational measures, q.v. Lazaro, Christophe and Le Métayer, Control over Personal Data, 18. 83 GDPR, rec. 7, 11, 13. 84 Ibid. 2-4, 18-20. Reding, “Your data, your rights”, 2-3. Q.v. such rules can now be found in the GDPR, e.g. the ‘right to withdraw’, ‘burden of proof’ and ‘proof of need to collect and retain personal data’. 85 Lazaro, Christophe and Le Métayer, Control over Personal Data, 18-20. The GDPR applies “regardless of whether the processing takes place in the Union or not”, cf. GDPR, art. 3(1). Q.v. in a speech held in 2014, Viviane Reding, then Vice-President of the European Commission, stated that “Europe must act decisively to establish a robust data protection framework that can be the gold standard for the world. Otherwise others will move first and impose their standards on us.” 86 Ibid. GDPR, rec. 103. 87 Lazaro, Christophe and Le Métayer, Control over Personal Data, 18-20. The GDPR increases NDPAs’ powers, cf. GDPR, art. 51–59. 88 GDPR, art. 25(2). 89 Lazaro, Christophe and Le Métayer, Control over Personal Data, 18-20.


2.2.2 The Placement of Structural Measures: Reasons and Consequences

The ‘structural measures’ placed in the EU framework aim at enhancing data subjects’ ability to take deliberate action, and trust is placed on stakeholders, like Microsoft or Google,90 to set up the measures in a way that will allow their customers, the data subjects, to take effective control.91

Thus, the placement of these ‘structural measures’ is not done in pursuit of achieving ‘hard privacy’ or ‘privacy by architecture’, where the system is designed in a way that ensures non-disclosure of personal data and thereby, more or less, does away with the need for data subjects to take action. Without the ability to take some action, based on subjects’ own decisions, that in some way shapes the processing, e.g. whether or not to provide personal data in the first place, control would be delegated to the system (e.g. as the programming would dictate whether or not personal data would be provided by subjects) and trust would need to be placed on non-human actors (i.e. the programming) or at least the manufacturers.92

With reference to the GDPR, notably the requirements on implementation of technical and organisational measures (TOM), such as those relevant to ensuring DPbDsgn and DPbDflt (DPbD: hereafter used to reference both collectively), such provisions are, counter to their rationale, limited in scope and loosely formulated; combined with there being little incentive to abide by them, this strongly indicates that the EU regulator has decided not to employ the mentioned ‘hard privacy’ approach, at least not to its full extent.93 The rationale behind DPbD is that,94 having implemented measures designed to ensure compliance with DP requirements and principles,

90 Q.v. Google’s Privacy Policy, where, in the first paragraph in big letters, Google acknowledges that trust is placed on them when subjects are using their services and that they work hard to put subjects in control, cf. X. 91 Lazaro, Christophe and Le Métayer, Control over Personal Data, 26-30. 92 This notion of “privacy by architecture” differs from the usual vision of “privacy by control” in that the user does not have to take any action: the design of the system ensures that his or her personal data will not be disclosed; in that case control is entirely delegated to non-human actors. Privacy can be distinguished as hard privacy and soft privacy based on different trust assumptions. The data protection goal of hard privacy is data minimisation, based on the assumption that personal data is not divulged to third parties (utilising privacy-enhancing technologies (PETs)). The system model of hard privacy is that a data subject provides as little data as possible and tries to reduce the need to place ‘trust’ in other entities. The threat model includes the service provider, data holder, and adversarial environment, where strategic adversaries with certain resources are motivated to breach privacy, similar to security systems. Soft privacy, on the contrary, is based on the assumption that the data subject has lost control of personal data and has to trust the honesty and competence of data controllers. The data protection goal of soft privacy is to provide data security and to process data with a specific purpose and consent, by means of policies, access control, and audit. The system model is that the data subject provides personal data and the data controller is responsible for the data protection. Consequently, a weaker threat model applies, including different parties with inequality of power, such as external parties, honest insiders who make errors, and corrupt insiders within honest data holders. Deng et al., “A privacy threat analysis framework”, 4-5. Re. Lazaro, Christophe and Le Métayer, Control over Personal Data, 26-27. 93 See assessment, cf. Bygrave, “DP by design and default”, 114-119. 94 GDPR, art. 25(1). The duty extends to ensuring default application of particular data protection principles and default limits on data accessibility.


from the design stage right throughout the lifecycle (i.e. effectuating systems architectures with built-in DP compliance), greater results would be achieved than through traditional regulatory methods, codes of conduct or contractual obligations, in relation to, e.g., improving the traction of DP principles, ensuring compliance and accountability, as well as, likely, eliminating any phasing challenges.95

In reality, contrary to DPbDflt, the GDPR might have reduced the effects of DPbDsgn in certain parts of the life cycle to acting as no more than mere guidance.96 Currently, only ‘controllers’ are subject to the requirements relating to the principle of DPbDsgn under the GDPR (similarly to the principle of DPbDflt). Furthermore, controllers’ use of processors is restricted to cases where such use sufficiently guarantees DPbDsgn.97 Where the definition of controller does not extend (as is likely where entities are involved with, e.g., IoT or search engines) to include ‘producers’ of digital devices, services and products that collect, process and/or facilitate online behavioural advertising, the latter are only ‘encouraged’ under the GDPR to design their DS in accordance with DPbDsgn.98 Moreover, in such a case the producer might be subject to the same obligation under different legal instruments, as manufacturers of terminal equipment are required to implement measures ensuring DPbD in their construction according to the ePD, cf. art. 14(3), irrespective of whether the manufacturers would be designated as controllers in accordance with the GDPR.99 Similarly, provisions on stakeholders’ responsibilities to ensure sufficient capabilities for subjects to exercise their rights, e.g. to provide information, request consent or enable objection, are implemented across legal instruments.

Admittedly, not all entities involved in developing the relevant DSPs (i.e. those that facilitate 3DMPs) are covered. For example, in relation to the mass of data collected on subjects’ behaviour across the Internet, e.g. through the use of cookies, the entities involved in developing the basic internet standards, which significantly affect such processing and could in practice make the Internet DPbDsgn-compliant, would not be considered controllers; nor, generally, would the manufacturer of the digital device, such as the subject’s PC, where the cookies are stored

95 Bygrave, “DP by design and default”, 106. ICO, “Data protection by design and default”. Q.v. GDPR, rec. 78: pseudonymising personal data as soon as possible; ensuring transparency in respect of the functions and processing of personal data; enabling individuals to monitor the processing; and creating (and improving) security features. 96 The requirements relating to DPbDsgn essentially prompt the implementation of measures where they are needed to ensure that the particular processing activity sufficiently meets the requirements of the GDPR and otherwise ensures information security and protection of subjects’ rights. Which measures are appropriate in the context of the processing activity can be concluded by an impact assessment. 97 Bygrave, “DP by design and default”, 116. 98 Bygrave, “DP by design and default”, 116, 118. Compare The GDPR 64. 99 Bygrave, “DP by design and default”, 108.


(although such is more likely in relation to IoT devices, see chapter X), nor the publisher or the ad network (although such is more likely in relation to SMN websites). Comparatively, the advertiser and the manufacturer of web browsers that allow access to the Internet, such as Google Chrome, would be considered controllers.100 Moreover, it remains an issue who should be required to inform.

The complexity of the various entities involved in various ways and at various stages of 3DMPs makes it extremely difficult to delineate clear lines of responsibility and accountability, notwithstanding the designation of a controller and processor. The information needed for the successful provision of data-driven marketing is the product of mixing the various data obtained by various entities across various services and platforms.101 Notably, Google, as the manufacturer of IoT devices (e.g. the Google Pixel phone) or as a DSP, itself collects, or obtains from Android as the OS provider, from affiliated or partnered services and from third-party apps (given that the app was obtained through the Google Play Store and uses one or more Google services, e.g. Google Maps, Play, Analytics etc.), information relating to subjects’ activity online (e.g. subjects’ telephony data and use of functions or third-party apps) and offline (through device sensors), collected by services or devices, irrespective of platform or of whether collection occurs during the subject’s use or not.102

2.2.3 Final Remarks

As mentioned, in order to effectively combat the ‘contemporary challenges’103 undermining subjects’ trust,104 the relevant stakeholders (as required by the relevant legal instrument, q.v. the discussion above) should intervene by implementing structural measures.105 However, NDPAs must ensure their accountability, as it is known that DSPs are prone to neglect their information duties, to provide coercive and misleading consent request notifications, or to avoid data subjects’ involvement altogether.106

This placement of trust, achieved through the specific configuration of requirements (i.e. the implementation of measures entailing the structural agent of control) in the EU framework, is likely the result of attempting to achieve a balance between stakeholders and data subjects, rather than being the chosen regulatory approach because it is the most suited for data subjects.107

100 Bygrave, “DP by design and default”, 118. 101 Google admits it may conduct such practices, see Google Privacy Policy: Protect our Users, and the Public. 102 Google Privacy Policy: Your Activity. 103 GDPR, rec. 58 AND. 104 WP29, Opinion: “online behavioural advertising”, 6. 105 WP29, “opinion: IoT”, 3. 106 Lazaro, Christophe and Le Métayer, Control over Personal Data, 23-25.

In reality, determining whether the tools and measures provided are, in practice, realistically effective in enabling control, and what normative consequences follow from operating such control, hinges on ‘the scope of the instruments that give rise to data subjects’ control’.108 For if the legal provisions that enable control, as determined above, are not ‘available to data subjects’ to deal with situations where they do not fully trust that their rights will be respected, such provisions would seemingly be ill-equipped to foster trust.

Simply put, data subjects cannot trust what they do not know or control. Thus mistrust starts appearing as subjects become aware of the connection between their digital behaviour and the targeted content presented to them, without having oversight of what sort of data was being collected, or how, and which companies received and utilised it to provide targeted advertising.109

2.3 Concluding Remarks

Consent to collecting and processing personal data is an imperative barrier against potential misuse by private and public entities in today’s data-driven society, where marketers have access to vast amounts of data and analytical techniques to build user profiles containing information about subjects, who they are and how they behave, and can use such personal profiles in ways that might negatively affect subjects or society as a whole. It is therefore essential to place importance on the data subject’s informed consent as one of the legal grounds allowing the processing of personal data, and on the fact that its limits constrain the possibilities to process the data, thus serving as a mechanism for subjects to stay in control of the purposes for which their personal data are used.110

Although consent can be a fundamental tool for individual users to guard their sovereignty over their individuality, lifting some of the regulatory burdens of controllers is not necessarily bad, that is, if it is done to maintain a balance of interests, similar to providing data subjects with greater protection. Importantly, data-driven business models can be pro-competitive, yielding innovations that benefit both consumers and the company. Collecting and analysing

107 Compare Commission, “Guidelines: Social Media”, 32. 108 Ibid. Reding, “The EU Data Protection Reform 2012”, 2-3. [make more ref.] 109 https://www.datatilsynet.no/globalassets/global/english/privacy-trends-2016.pdf 110 With digitalisation came big data, whereby, utilising the cloud infrastructure, private corporations have gained the capacity to search, aggregate, and cross-reference large datasets for analysis in order to identify previously undetectable patterns, as well as the power to monitor and profile individuals, calculate risks and even predict behaviour, cf. Chen and Cheung, The Transparent Self Under Big Data Profiling, 356.


data can provide the company with insights on how to use resources more efficiently and to outmanoeuvre dominant incumbents. The European Commission noted in 2015 that “the use of big data by the top 100 EU manufacturers could lead to savings worth €425 billion,” and that, “by 2020, big data analytics could boost EU economic growth by an additional 1.9%, equalling a GDP increase of €206 billion.”

In reality, marketers are given some leeway when it comes to transparency of said use, because regulators have deemed it necessary to maintain a balance between the former’s and society’s economic interests and data subjects’ privacy-related interests. Therefore, the current, and ever more urgent, issue lies in finding a balance between the exploitation of personal information and the protection of individual privacy.

3 Consent and Contracts: Similarly Shared Elements of Agreement

3.1 Establishment of Similarity Structure Demonstrating Simplicity of Issues

The various ways subjects can interact with PornHub.com, a self-acclaimed “best adult porn website on the net!”,111 to enter into a valid contract will be used to illustrate not just the practical application of the rules governing the formation and validity of online contracts but, more importantly, how and to what extent they interplay.

As stated in Pornhub’s T&C, subjects agree via access or use of the website, whether they click accept or not, to be bound and to abide by Pornhub’s ToS and Privacy Policy, which include, e.g., the processing activities by Pornhub as the collector concerning uses (i.e. tracking, profiling, analysing, predicting) of personal data (i.e. raw, survey and usage data, as well as sensitive data) for the purposes of, and to the extent necessary to, provide their service, or for other purposes, e.g. development and optimisation based on personalisation or behavioural advertising, based e.g. on their or third parties’ legitimate interests, or on subjects’ consent.112

In relation to data-driven marketing, applicable and overlapping rules can mainly be found in two distinct fields: in data protection and e-commerce (and e-consumer) law, as regards determining the existence of a valid contract pursuant to contractual necessity, and in e-privacy and data protection, as regards valid consent for particular purposes. The validity of these agreements depends on which rules apply as well as on what particular purposes can be justified based on those legitimate grounds. Logically, this will therefore form the second legal issue assessed in this chapter.

111 www.pornhub.com 112 Ibid.


The formation of such an agreement, along with other online contracts, is governed by the ECD,113 as it applies to providers of ‘information society services’ (ISS providers), i.e. services normally provided for remuneration,114 at a distance, by electronic means and at the individual request of a recipient of services.115 Accordingly, an offer or acceptance is deemed effective when it is communicated, irrespective of whether this occurs by electronic means.116 Where the latter follows the former,117 their mutual assent gains equal validity to contracts concluded by more ‘traditional’ means,118 as long as the terms are accessible, e.g. on the provider's webpage,119 in a way that allows the recipient of the offer to store and reproduce them,120 or, in relation to consumer contracts under the CRD, cf. art. 8(7), when subjects have been provided with confirmation of receipt.121

Moreover, specific rules mandate greater requirements for a contract to be valid if it triggers ‘consumer protection law’, i.e. rules on aggressive commercial practices and misleading or unfair conduct, e.g. where relevant the use of pre-ticked boxes, the misleading omission of needed information or the use of unfair terms is restricted and certain information requirements must be fulfilled. These rules will hereafter only be referenced insofar as relevant, e.g. where they provide

113 Importantly, ECD, art. 9(1) allows electronic contracts to be concluded and forbids member states from creating interferences based solely on the account of a contract being an online contract. A similar definition can be found in articles 5 and 5bis of the UNCITRAL Model Law on Electronic Commerce. COM (1998) 586 final, p. 26. 114 Qv. Mc Fadden v Sony Music C-484/14: “The remuneration of a service supplied by a service provider within the course of its economic activity does not require the service to be paid for by those for whom it is performed” (para 41). “That is the case, inter alia, where the performance of a service free of charge is provided by a service provider for the purposes of advertising the goods sold and services provided by that service provider, since the cost of that activity is incorporated into the price of those goods or services” (para 42). 115 Cf. ECD, art 2(a), rec. 18, and Directive 98/34/EC (as amended), art. 1(2)(b), rec. 18. 116 The fact that the contract or the communication was made through electronic means does not solely affect its validity or enforceability, cf. UN Convention on the Use of Electronic Communications, art. 8(1), art. 9; ECD, art 9(1). 117 Different theories exist on non-instantaneous communication, q.v. the time of a contract's conclusion in regard to data messages under the UNCITRAL Model Law, cf. art. 15(1), stipulating at dispatch, and cf. art. 15(2), stipulating both when the communicated is acces 118 COM (2003) 702 final, p. 11. Digital contracts for Europe – Unleashing the potential of e-commerce, 9.12.2015, COM(2015) 633 final; see also ECJ Haute-Garonne, which concluded that Article 8 precluded national legislation banning advertising by dentists as it prohibited any recourse to online advertising methods that promote specific dentists or their company [cf. paras 17-19, 22]. Qv. Member States have the freedom not to apply Article 9(1) to certain types of contracts, cf. Article 9(2), such as rental contracts.
119 In El Majdoub, the CJEU ruled that the ‘click-wrapping’ technique fulfils the requirements in Article 23 of the Brussels I Regulation, as the purpose was to treat electronic communications in the same way as written communications, provided that they offered the same sort of guarantees in terms of evidence of the contract. Thus, it was sufficient that it was ‘possible’ to save and print the ToU information (i.e. to provide a durable record of the agreement) before conclusion of the contract. 120 ECD, art. 10(3). 121 See footnote 119.


greater protection than the GDPR and e-Privacy rules or where information obligations complement each other. Finally, it is necessary to emphasise that DP rules apply in all these cases where personal data is processed; they form the legal floor of protection and cannot be contracted away. Nevertheless, the validity of an online contract in terms of contractual necessity and valid consent, as well as the notion of effective acceptance in regard to the latter legitimate basis, are dividing points between e-commerce and data protection, as the legal assessment of Pornhub.com below will illustrate.

3.1.1 Information Requirements

Trust is hard to give to something you do not control or know; thus, in accordance with the GDPR, the collection and use of personal data must be fair and transparent to allow data subjects to exercise their rights.122 Controllers must inform the subject of the existence of collection and processing; of the controller's legal basis and purposes for said processing; of the third-party recipients, if any, of the collected or processed data,123 of who can access it and of how it may be used by said third parties; as well as of the subject's rights and any further information necessary to ensure fair and transparent processing, taking into account the context and the specific circumstances of the processing.124 Data subjects maintain the right to be informed at the time of collection, where personal data has been obtained directly from the subjects,125 or, when obtained from third parties, within a reasonable amount of time or upon the data being used to directly contact (directly market to) the data subjects.126

Foremost, the communication itself is governed by the principle of transparency,127 i.e. it must be accessible, easy to understand and use clear and plain language.128 Additionally, it is governed by the principle of data protection by design, whereby controllers are obligated to create suitable information measures.129 Thus, and in line with the GDPR,130 controllers must implement such measures,131 for example ‘layered’ information notices (see chapter 3.1.2.),132 that are

122 GDPR, art. 12, rec. 39, 58. 123 GDPR, art. 13(1)(e). 124 GDPR, rec. 60, qv. rec. 62. 125 GDPR, art. 13. 126 GDPR, art. 14, rec. 61. Qv. exceptions apply, cf. GDPR, rec. 73, 62. See opinion on IoT, 16: “The fairness principle specifically requires that personal data should never be collected and processed without the individual being actually aware of it.” 127 GDPR, art. 5(1)(a), rec. 59. 128 GDPR, rec. 39. Qv. WP29, Guidelines on Transparency under GDPR. 129 Voigt and Bussche, The EU GDPR, 141-147. Cf. GDPR, art. 12. 130 GDPR, art. 12. 131 Voigt and Bussche, The EU GDPR, 141-147. 132 As WP29 has advocated.


governed by the principle of transparency,133 providing greater clarity in the communication given, e.g. when signing up to social networks or entering sites. The notice should contain a set of information (such as that described above) in a clear and comprehensible manner, including the functionality, interoperability and main characteristics of digital content.134
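Purely as an illustrative aid, and outside the legal analysis itself, the information items such a notice must convey, as summarised above, can be sketched as a simple checklist structure. The interface, field names and completeness check below are hypothetical modelling devices, not statutory wording:

```typescript
// Hypothetical sketch of the notice content summarised above (cf. arts. 13-14 GDPR).
// Field names are illustrative labels chosen by analogy, not legal terms of art.
interface PrivacyNotice {
  controllerIdentity: string;
  purposes: string[];             // purposes of the collection and processing
  legalBasis: string;             // e.g. "consent", "contract", "legitimate interest"
  thirdPartyRecipients: string[]; // recipients, if any, of the collected data
  dataSubjectRights: string[];    // e.g. access, rectification, erasure, objection
  furtherContextualInfo?: string; // anything else needed for fair, transparent processing
}

// A notice can only be formally complete when every mandatory item is present;
// whether it is ALSO intelligible and easily accessible is a separate question.
function formallyComplete(n: PrivacyNotice): boolean {
  return n.controllerIdentity.length > 0 &&
    n.purposes.length > 0 &&
    n.legalBasis.length > 0 &&
    n.dataSubjectRights.length > 0;
}
```

On this sketch, a notice naming a controller, a purpose, a legal basis and the subject's rights passes the formal check, while a notice with an empty purposes list fails; the point is only that completeness of content and clarity of presentation are distinct requirements.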

3.2 Application of the Legal Interactivity to ‘The Ride of My Life’

3.2.1 Valid Consent and Contract: Capacities of Offer and Acceptance

As subjects enter the “wonderful world of Pornhub”,135 they are not greeted with the ability to take affirmative action, e.g. by clicking a button and thereby communicating their agreement, i.e. a ‘valid’ click-wrap agreement under EU law, cf. CJEU El Majdoub. Nevertheless, a valid contract might still have been formed, i.e. a browse-wrap agreement, whereby an actual or constructive notice of the SCC could be deemed to demonstrate the subjects' assent.136 This would ultimately depend on a case-by-case assessment, with the applicable rules being a determining factor, cf. Barnes & Noble (B&N), where the court decided that B&N's placement of a hyperlink at the bottom of every page gave insufficient notice of the ToU to hold the customers to an arbitration (i.e. browse-wrap) agreement, because B&N neither provided other notices to users nor prompted them to take any affirmative action to demonstrate assent; moreover, even if the links had been placed close to the relevant buttons that users must click on, this, without more, would be insufficient to give rise to constructive notice.

Similar to B&N, Pornhub displays some information at the bottom of every page within pornhub.com. Although such information, either solely therein or in combination with an adequate reference to further information, could adequately give notice, and could thus be enough to conclude the existence of a contract without any affirmative action, it would still remain

133 GDPR, art. 5(1)(a). 134 CRD, art. 6(2) re. 6(1)(a), (r) and (s). 135 www.pornhub.com 136 Cf. Barnes & Noble (B&N), where the court decided B&N's placement of a hyperlink at the bottom of every page gave insufficient notice of the ToU to hold the customers to an arbitration agreement (a browse-wrap agreement), because B&N neither provided other notices to users nor prompted them to take any affirmative action to demonstrate assent; even if the links had been placed close to the relevant buttons that users must click on, this, without more, is insufficient to give rise to constructive notice. See also Specht v. Netscape, where users argued that a freely downloaded ‘plug-in’ transmitted users' personal data after having been downloaded and while facilitating the use of a ‘browser’. The producer, Netscape, argued that the respective privacy dispute was subject to a binding arbitration clause, contained in the browser's ToU, which users had assented to via a click-wrap agreement. It was concluded that, even when defined broadly, the arbitration clause did not include ‘use of the plug-in’, as it did not expressly mention it, nor had users assented to the separate arbitration clause, contained in the plug-in's ToU (i.e. a browse-wrap agreement), as they could not have learned of its existence before downloading the plug-in.


insufficient where consent is needed in the fields of data protection and e-privacy, as stricter requirements exist to protect data subjects, entailing that consent is only considered given with an affirmative action.137

3.2.2 Consent Must be Given with Affirmative Action

Although there must be an affirmative action, the GDPR does not specifically clarify whether a subject's conduct that demonstrates actual or constructive notice can be considered an indication of acceptance.138

Setting aside websites that collect personal data without a notice,139 cookie consent forms can, for the sake of simplicity, be presented either as overlays that block usage until acceptance has presumably been provided, i.e. a cookie wall (as mentioned in chapter 2.2.1.2.),140 or as a notice, e.g. a standard banner, that does not prevent use while displayed. The WP29, the EDPS and EDRi have been vocal in their opinions on cookie walls, and all underline the limits of the current implementation of the cookie consent mechanism (based on “cookie walls”) under the ePD,141 namely that it should only be allowed in certain exceptional circumstances (q.v. chapter 4.3.2.).142 A ‘non-cookie-wall notice’ can be placed on a three-point spectrum, i.e. without the option to accept, with the option to accept on a take-it-or-leave-it basis, or with the ability to accept or decline some or all processing activities.

Considering notices without an option to accept, as is the particular case concerning Pornhub: as part of the requirement of a clear affirmative action, cf. GDPR, art. 4(11), inactivity should not constitute consent, cf. GDPR, rec. 32. Where there is no option for a user to affirmatively provide an indication of assent, the website either auto-accepts, i.e. automatically assumes consent is given and the notice disappears when scrolling, or the notice disappears when the user reacts by closing the banner. However, this is not inactivity as prescribed in the GDPR; nor are the other cases, where the website only informs subjects the first time they use it, or whenever they enter it, and stops displaying the notice after assuming that implied consent is given by continued use.

137 Cf. GDPR, art. 4(11). 138 GDPR, rec. 32. 139 Interestingly, these are not solely hyperlink-, streaming- and other illegal websites; there are many other legal sites, some even owned and operated by well-known companies. 140 On the internet, many companies offer people take-it-or-leave-it choices regarding privacy. For instance, some websites install tracking walls (or ‘cookie walls’), barriers that visitors can only pass if they agree to being tracked. If people encounter such take-it-or-leave-it choices, they are likely to consent, even if they do not want to disclose data in exchange for using a website. An assessment of the ePR, 9. 141 See chapter 2.2.1.2. on page 10. 142 WP29, Opinion on evaluation and review of ePD, 16. Opinion on ePR proposal, p. 16; EDPS, cited above, p. 14; EDRI, e-Privacy Directive Revision, https://edri.org/files/epd-revision/EDRi_ePrivacyDir-final.pdf. See also SMART 2013/0071; Acquisti-Taylor-Wagman, cited above, p. 41; DLA Piper, cited above, p. 29.


In the latter case, the consent would, among other things, likely not be valid, as users would not be able to withdraw their consent as easily, i.e. by browsing;143 the same likely applies to scrolling, but not necessarily when a user closes the notice by clicking an icon. However, noticing the banner and choosing not to click accept, but rather to close the banner, would signal a decline rather than an acceptance.144

On the other hand, in situations where this type of notification blocks usage of the website until the user has disabled ad-blockers, there would be both an affirmative action and an indication of acceptance when a user disables the ad-blocker and allows the site to place cookies. Furthermore, by turning the ad-blocker back on, the user withdraws his acceptance.145 Nevertheless, this would likely be considered a form of ‘cookie wall’, thereby rendering the supposed consent invalid, as it would not be freely given (q.v. chapter 4.3.2.).

Furthermore, in these situations, websites (or digital services) often only instruct users to disable ad-blockers in order to continue or to receive services, without providing further information, such as the purpose of the disabling (the purpose of collection); such a consent would not be valid without sufficient information. Ergo, not stating the purpose will generally eliminate the possibility of a valid consent (q.v. chapter 4.3.2.). However, this could be deemed valid if the collected data is stored in and related to the end-user's (data subject's) terminal equipment (according to the new ePR).146
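The chain of reasoning in this and the preceding subchapter (no notice means no consent; missing purpose information invalidates consent; inactivity is not an affirmative action; a cookie wall is acceptable only in exceptional circumstances) can be sketched as a decision procedure. The type names and rule ordering below are hypothetical modelling assumptions for illustration, not a statement of the law:

```typescript
// Hypothetical decision sketch of the consent-validity reasoning above.
// Names and rule ordering are illustrative assumptions, not GDPR text.

type NoticeKind =
  | "none"            // no notice at all
  | "cookieWall"      // overlay blocking use until acceptance (incl. ad-blocker walls)
  | "noOption"        // banner with no way to affirmatively accept or decline
  | "takeItOrLeaveIt" // single accept option covering all processing
  | "granular";       // accept or decline some or all processing activities

interface ConsentContext {
  notice: NoticeKind;
  purposeStated: boolean;            // purpose disclosed before collection
  affirmativeAction: boolean;        // e.g. clicking accept, not mere scrolling
  exceptionalCircumstances: boolean; // narrow cases where a cookie wall may be allowed
}

function consentLikelyValid(ctx: ConsentContext): boolean {
  if (ctx.notice === "none") return false;  // uninformed: no legal basis at all
  if (!ctx.purposeStated) return false;     // not 'informed' (cf. art. 4(11), rec. 42)
  if (!ctx.affirmativeAction) return false; // inactivity is not consent (rec. 32)
  if (ctx.notice === "cookieWall" || ctx.notice === "noOption") {
    return ctx.exceptionalCircumstances;    // otherwise not 'freely given'
  }
  return true; // remaining questions (specificity etc.) still need case-by-case review
}
```

For instance, an ad-blocker wall that states no purpose fails on the ‘informed’ limb before the ‘freely given’ limb is even reached, mirroring the order of the argument above.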

3.2.3 Provisions Stipulating Processing Activity Must Form Part of the Valid Contract

In comparison with the requirements of consent, as argued above, the presumption is not without merit that, by browsing Pornhub.com, subjects enter into a valid contract even though they take no affirmative action, as their conduct147 would likely demonstrate the subjects' actual or constructive notice of the offer (which is, in this case, stipulated, e.g., in Pornhub's privacy policy, ToC, etc.), and their continued use would thus likely indicate their acceptance. Nevertheless, terms stipulating the processing activity in question must have formed a part of this notice in order to exist within the ‘valid contract’, cf. Specht v. Netscape; this is not guaranteed.

Knowledge of the SCC is of relevance in determining whether certain SCC, or certain provisions therein, form a part of a valid contract. This is a two-part assessment, i.e., first, how certain

143 Cf. art. 7(3). 144 Use cookie consent form research study. 145 WP29, Guidelines on consent under GDPR 259/2018, 17. 146 Cf. PECR, art. 8(1)(b), re. art. 9(2). 147 GDPR, rec. 32.


SCC, or provisions therein, are assumed to have formed a part of a contract: provisions can be among the text which has specifically been agreed to, be referred to therein or in other places, or be neither referred to nor a part of the agreed text. Secondly, what requirements are imposed, in that particular situation, on the subjects' knowledge. Such requirements can be derived from the content of the SCC (i.e. being unfair or unexpected), the field of use, the standing of the contractual parties and their prior contractual relations.148

Specht v. Netscape: users argued that a freely downloaded ‘plug-in’ transmitted users' personal data after having been downloaded and while facilitating the use of a ‘browser’. The producer, Netscape, argued that the respective privacy dispute was subject to a binding arbitration clause, contained in the browser's ToU, which users had assented to via a click-wrap agreement. It was concluded that, even when defined broadly, the arbitration clause did not include ‘use of the plug-in’, as it did not expressly mention it, nor had users assented to the separate arbitration clause, contained in the plug-in's ToU (i.e. a browse-wrap agreement), as they could not have learned of its existence before downloading the plug-in.

The content of the footer wrapper is divided into three separate sections, the last being irrelevant and containing only logos. Although the first section contains text which could otherwise convey relevant contractual information, it is clear from the information provided in this case that it relates mostly to the features of the site: it does not expressly mention that they collect data, nor the type, means, use or purpose of their processing activities; nor does it state that, by using Pornhub, subjects accept to abide by its ToC, ToS and privacy policy; nor does it reference these policies. The only reference to the SCC is made in the second section, via a listing of menu links in a column titled ‘information’. The SCC are thus merely referred to, i.e. a browse-wrap.

The SCC are generally a part of the contract if subjects are aware of their existence. If a subject knows that certain SCC apply but did not check them, although they were accessible, the subject would still demonstrate assent; thus, subjects would enter into a contract whether or not they had actual knowledge of its content.

It is, however, unlikely that there is a binding contract about the data processing, for example in e-commerce: if the offer corresponds to a consumer contract as defined in X, it will not become binding unless certain information is provided in a clear and comprehensive manner, cf. art. 6(1)(a)-(t) CRD, which contains a list of 20 pre-contractual requirements, complemented by requirements under Article 5 of the ECD.

148


3.3 Information Requirements: The Outlook of the Layered Notice Praxis

3.3.1 Consent and Contract: Different types of Notices and Requirements

As stated, the design of non-cookie-wall notices can be placed on a three-point spectrum. In all cases, i.e. regardless of where a notice is placed on the spectrum, the information requirements must be met for consent to be valid. Examining the different typically used notices along the latter two points of the spectrum, pursuant to the ‘layered’ information notice praxis advocated by the WP29, can extensively illustrate how controllers can successfully provide the relevant information to the subject in an intelligible and easily accessible form.149

Notably, subjects are not aware of their data being collected most of the time. This stems from digital services often neglecting to ask permission, or even to inform subjects, when collecting their data. Such lack of information constitutes a significant barrier to demonstrating valid consent under EU law, as the subject must be informed ‘prior’ to the obtaining of consent or the commencement of collection of their personal data,150 and in such a way that the average data subject is able to understand, without much effort, what they are essentially allowing.151 Without this knowledge, the individual has no control and consent becomes invalid as a legal basis in accordance with the GDPR.152 Where no information is given, consent cannot be relied upon as a legal basis for the corresponding data processing under EU law. Furthermore, other legal bases would not be available either where no notification has been given.

As concluded above, the validity of a consent obtained through a notice with no option to affirmatively act hinges both on falling within allowed ‘cookie wall circumstances’ and on the notice containing suitable information as required by, e.g., the GDPR and the ePR (see chapter x), unless otherwise exempted (see chapter x). This type of notice is only considered in the following assessment in relation to the information requirements.

It is essential to examine the information requirements of the notices that provide the data subject with the ability to affirmatively indicate their assent by action, depending on the particular context of a processing activity. Notably, the subjects' reasonable expectations and the extent of the parties' common understanding can be useful in determining what kind of information needs to be

149 Google thesis 2, 75. 150 151 WP29, Guidelines on Consent under Regulation 2016/679, 14. Cf. GDPR, rec. 42. 152 GDPR, art. 6(1)(a).


provided, to what extent and in what manner, i.e. whether consent can be on a take-it-or-leave-it basis or whether the notice needs to give subjects options to explicitly agree to or decline some or all processing activities, in order to be a valid informed consent.

4 What M3DM Processing Activities Can Be Justified Based on Valid Consent

4.1 Can Ad Hoc Privacy Notices Be Freely Given?

For an effective consent to be valid, it must be freely given, specific, informed, and unambiguous.153 Yet meeting these four elements does not give way for unfair and unlawful processing: “If the purpose of the data processing is excessive and/or disproportionate, even if the user has consented, the app developer will not have a valid legal ground and would likely be in violation of the (GDPR)”.154

Silence or inactivity cannot be taken as a valid consent.155 There must be an affirmative action to demonstrate the data subject's assent; thus an actual or constructive notice of the SCC would not suffice.156 Importantly, merely (affirmatively) clicking on a download button does not show assent to SCC if they are not conspicuous157 and if it is not explicit to the subject that clicking clearly indicates that he or she is consenting (agreeing) to a specific processing for a specific purpose.158

Actual practices of ‘requesting’ and ‘obtaining’ consent within the DSM pose challenges to the concept of consent as a ‘freely given’, specific and informed indication of the wishes of the individual, by which the data subject signals his agreement to personal data relating to him being processed.159

If a data subject is presented with a request for collection and processing of personal data on a take-it-or-leave-it basis, then there is no real contracting freedom. For consent to be considered freely given, the data subject must have real choice and control.

153 GDPR, art. 7, 4(11), rec. 32, 42. Qv. the GDPR added ‘unambiguous’; the former EU DPD specified that consent must be ‘freely given, specific and informed’. 154 WP29, Opinion on apps on smart devices, 16. 155 Commission, Completing a trusted DSM, 3. 156 GDPR, art. 4(11). Cf. Barnes & Noble (B&N), where the court decided B&N's placement of a hyperlink at the bottom of every page gave insufficient notice of the ToU to hold the customers to an arbitration agreement (a browse-wrap agreement), because B&N neither provided other notices to users nor prompted them to take any affirmative action to demonstrate assent; even if the links had been placed close to the relevant buttons that users must click on, this, without more, is insufficient to give rise to constructive notice. 157 Requests for consent should be clear and concise, GDPR, rec. 34. 158 GDPR, rec. 32. Qv. Barnes & Noble. 159 Google thesis 2, 71, 76.


The consent of the data subject to the processing of his personal data, as the main means of empowerment of the data subject, is the genuine expression of the right to informational self-determination and should be considered in the light of the individual's autonomy and free will. It is therefore interesting to examine whether the processing of the personal data should really rely on the consent of the data subject in the first place.

4.2 Encapsulating Consent as an Imperative Barrier of Potential Misuses

It is essential that data subject's informed consent is one of the legal grounds that allows the

processing of personal data and that the limits of the consent given constrain the possibilities

to process the data, thus serving as a mechanism for the data subject to stay in control of the

purposes for which her personal data are used.160

As seen above, the problem is not the collection of data as such, but that, when companies have access to vast amounts of data, including personal data, they gain the ability to mix various data together and analyse them in numerous ways to make predictions about data subjects. The value of an individual's personal data lies in its usability, i.e., in this case, the ability to make predictions that allow for better, more accurate and more effective marketing practices, but which also have the ability to negatively affect individuals.161

This is potentially why companies like Google and Facebook allow data subjects to see their data, e.g. what pages they visited or what they have liked, but not the predictions about the data subjects which they have derived by comparing all available data related to a particular individual with the great amount of data that they have on everyone else.162 As will be argued in the next subchapter, it is questionable whether an opt-out approach would fulfil transparency requirements.

Even if an opt-out approach were to satisfy transparency requirements and be justified in relation to the aims of the GDPR, it is certain that consent as a legal basis for processing personal data is an imperative barrier against any potential misuse. Since data, including personal data, forms the basis of these predictions, if an entity is not able to collect or process an individual's personal data without his or her consent, companies would be unable to conduct those analyses about the data subject and target him or her accordingly, or to provide other marketers with said ability.163

160 N. Forgó et al, The Principle of Purpose Limitation and Big Data, 27-28. 161 See for example https://www.facebook.com/ads/preferences, and https://myactivity.google.com. 162 See for example https://www.facebook.com/ads/preferences, and https://myactivity.google.com. 163 N. Forgó et al, The Principle of Purpose Limitation and Big Data, 27-28.


4.3 Finding Balance between Exploitation and Protection of Personal Data

4.3.1 Why Buy ‘Google Assistant’ in a ‘Data Driven Market’?

The DSM is a cooperative environment in which its participants (e.g. data subjects, controllers and society) receive the highest pay-off where they cooperate, i.e. take into consideration the behaviour, decisions and interests of each other to obtain equilibrium.

The ‘potential use’ of the collected data is the problematic factor: data subjects, controllers and society perceive the ‘potential use’ idiosyncratically, as positive or harmful depending on the situation. Normally, data subjects' interests are not the same as society's and consequently they occasionally conflict. In such cases, subjects' rights might need to be ignored in urgent situations, for example to maintain the growth of the digital economy.164

It is important to keep in mind that the big data revolution is, in itself, not bad for the individual or society. On the contrary, it is of great value to society to have the ability to examine big data to uncover hidden patterns, unknown correlations, customer preferences, market, social and health trends, and other useful information that can provide deeper insight into our society and help organizations make more informed decisions.

Deb Roy's 2013 study of his child's language development provides good insight into the potential revelations and positive outcomes of personal data processing. Deb used overhead cameras to record and map all sounds from his family's domestic environment during the first years of their child's development. The purpose was to understand how we learn a language, in context, through the words we hear. The study allowed for the precise tracking and identification of every utterance spoken, in both space and time, and revealed the spatial clustering of certain words, such as the word “water” in the architectural context of the kitchen. Learning a word is not achieved by repetition but rather through the context within which the word is used when it is heard by the child, e.g. how and where. Words heard in more varied contexts will be learned first. In this way, predicting first words was possible.165

Imagine in today's society when individuals are using Google Home, ….

The negative effects that users and society can suffer from contractual clauses giving online service providers the right to process users' data for marketing purposes have recently become apparent in the Facebook–Cambridge Analytica scandal, where a researcher, through his private company (GSR), developed and made available a personality quiz app,

164 Marketers' later use of personal data can, similarly, be viewed from opposing poles. For example, when the personal data is used to personalize digital services, {vs. filter bubble}. 165 A Plus. “What One Dad Learned When He Recorded 230,000 Hours of His Son Learning to Talk.”


through Facebook's platform, as an online research survey. As indicated in the ‘fine print’,166 upon the participants installing the app on their Facebook account, the app would collect psychographic information about them for academic purposes.

Although fewer than 300,000 agreed, Facebook's design allowed the app to collect information from all users in every participant's Facebook social network, without their consent, consequently enabling GSR to acquire data from approx. 72 million Facebook users. The psychographic data,167 obtained with or without consent for academic purposes, was then sold to Cambridge Analytica to use for political and commercial purposes. The data was used to advance US presidential candidates and is claimed to have been used thereafter in, e.g., Trump's campaign and the Brexit referendum.168

5 The ‘Transparent’ Potential of Data Subjects' Empowerment

5.1 Preliminary Remarks

Transparency, when applied in practice by marketers, empowers data subjects to hold marketers accountable and to exercise control over their personal data, e.g. by providing or withdrawing informed consent, as well as to exercise any of the data subjects' rights mentioned in chapter two.169 Regrettably, however, the digital environment is not characterized by fairness or transparency; it has traditionally been ignorant of data subjects' privacy-related interests and rights.170

5.2 Forms of Productive and Transparent Practices

In particular, children should be fully aware of the possible consequences when they first sign

up to social networks. All information on the protection of personal data must be given in a

clear and intelligible way – easy to understand and easy to find.

These rights are enhanced when transparency is not use-dependent. Transparency irrespective of use allows for access to, and control over, data subjects' personal data no matter in

166 Certain preferences and users' Facebook social networks. 167 Where conventional political advertising uses crude demographic factors like age and ZIP code to target advertising, Cambridge supposedly used a technique called psychographics, which involves building a detailed psychological profile of a user that will allow a campaign to predict exactly what kind of appeal will be most likely to convince any particular voter. https://arstechnica.com/tech-policy/2018/03/facebooks-cambridge-analytica-scandal-explained/ 168 Case no. 01725, N.D. Cal. 3:18; Case no. 01732, N.D. Cal.; D. Del. 5:18, CA. 169 WP29, Guidelines on Transparency under GDPR, 5. See, for example, the Opinion of Advocate General Cruz Villalón (9 July 2015) in the Bara case (Case C-201/14) at paragraph 74. (possibly summarise or quote) 170 Online platforms and the digital single market, p. 63.

which way a digital product or service is used. Google enhances data subjects' rights when it allows data subjects access to all the personal data collected by Google's Gmail service during their use of the service, irrespective of whether it was through the Gmail website or the Gmail mobile app. Similarly, data subjects should have access to, as well as control over, their personal data collected by Google irrespective of the data being collected during use of any other particular Google service.171

It is easier for companies to personalise privacy notices when data subjects are allowed to manually adjust their privacy settings. When Facebook allows data subjects to choose, via a privacy dashboard, not to see ads outside of Facebook's platform that are based on data subjects' activities on Facebook's Company Products, Facebook can exclude information reflecting that use from its privacy notice, thereby personalising notices to reflect only the types of processing occurring for that particular data subject.172

5.3 Use of Data as Unfair Commercial Practice

The UCPD prohibits unfair commercial practices,173 either under all circumstances174 or where they belong to one of three categories of practices that lead to a consumer making a “transactional decision”175 and are subject to a “fairness” test: aggressive practices,176 practices that mislead consumers by action,177 and practices that mislead by omission.178 Lastly, any commercial practice is deemed unfair if it is contrary to professional diligence and likely to result in a consumer decision that would not otherwise have been taken.179 Aggressive practices are unfair if they restrict the consumer's freedom of choice through coercion, harassment, or undue influence, and are likely to result in a consumer making a transactional decision he would not have otherwise taken.180 Therefore, the commercial practices discussed, i.e. advertising, performed within a digital platform and

171 WP29, Guidelines on Transparency under GDPR, 22. 172 Ibid. 173 ‘Commercial practices’ is defined as “any act, omission, course of conduct or representation, commercial communication including advertising and marketing, by a trader, directly connected with the promotion, sale or supply of a product to consumers,” cf. UCPD, art. 2(d). 174 Thirty-one commercial practices (contained in the Annex) are deemed unfair and prohibited under all circumstances, cf. UCPD, art. 5(5), Annex I. 175 A transactional decision is, amongst others, a decision to purchase or make payment, cf. UCPD, art. 2(k). 176 UCPD, art. 5(4)(b). 177 UCPD, art. 6 (false or inaccurate information). 178 UCPD, art. 7 (non-provision of information that the consumer needs to make a decision). 179 UCPD, art. 5(2). Issues of fairness are linked with (i) the use of a product or service as an advertising medium for third-party advertising, (ii) its use as a medium for the trader's own products or services as the subject matter of advertising, and (iii) the type of targeted advertising technique or personalisation of search results, the device used and the data subject. 180 UCPD, art. 8.

aimed at convincing consumers to purchase an in-platform product, can lead to a transactional decision, thereby falling within the UCPD. The same applies where the commercial practice aims at ‘pushing’ consumers to use a free product or service, which does not initially involve any purchase or payment.181

Google could be said to employ these commercial practices. Firstly, by serving ‘interest-based ads’ on apps and software belonging to the Google Display Network, aimed at buying in-network products; these include remarketed ads (see graph).182 These ‘interest-based ads’ are selected on the basis of information such as what the user browsed, the way the user interacts with sites and software, location, demographics and more.

Secondly, by personalising search results on its Search Network, as well as through search engine optimisation and prominent placements of paid or interest-based material or favouring of its own products and services. Although Google's search result algorithm is a closely kept secret, a recent study by Hannák et al. concluded that, on average, 11.7% of Google Web Search results were personalised, with the highest personalisation occurring for queries related to political issues, news, and local businesses.183 In performing these practices, and essentially by creating a filter bubble, Google could be exploiting its position of power in a way which significantly limits the consumer's ability to make an informed decision.184 In fact, Google has been penalised for breach of EU antitrust rules for abusing its market dominance as a search engine by giving an illegal advantage to another Google product, thereby denying other companies the chance to compete on the merits and to innovate. Furthermore, it denied consumers a genuine choice of services and the full benefits of innovation.185

Another situation which can give rise to unfair commercial practices is where a collector exploits personal data revealing a specific misfortune or circumstance of such gravity as to impair the consumer's judgement, in order to influence the consumer's decision with regard to a service or product.186

181 The European Commission and the UK Office of Fair Trading favour a broad interpretation and therefore “[a] commercial practice may be considered unfair not only if it is likely to cause the average consumer to purchase or not to purchase a product but also if it is likely to cause the consumer to enter a shop”; see also Office of Fair Trading, Online targeting of advertising and prices, 6. 182 https://services.google.com/fh/files/misc/analytics_360_product_overview.pdf 183 Hannák et al., “Measuring Personalization of Web Search,” 2. 184 Cf. UCPD, art. 2(j). 185 http://europa.eu/rapid/press-release_IP-17-1784_en.htm. 186 Cf. UCPD, art. 9(c); Markou, Christiana and Christine Riefa, “App-solutely Protected? The Protection of Consumers Using Mobile Apps in the European Union”.

6 Conclusion

The deployment of the notion of control by EU institutions as a tool for data subjects to manage their privacy differs from common scholarly literature, where the notion of control has often been conceptualised in certain ways, notably as one of the foundations of privacy. Through an examination of the material produced by the EU regulator, the dominant rhetoric can be said to take a uniquely entangled approach to the notion of control, in that it encompasses both subjective and structural agents.

The EU regulator appears to have recognised, or at least feels compelled to recognise, that due to the design of the digital environment, notably its complexity, providing the micro rights themselves is insufficient for data subjects to be able to ‘effectively exercise control’ (i.e. through use of these ‘micro rights’); rather, a set of ‘structural measures’ must be placed within the digital environment to ensure its reliability and effectiveness, as technical solutions facilitate more effective and sufficient means of making data subjects aware and empowered to exercise control.

Control can be assisted by the principle of fairness, both in the GDPR and through different interplaying legal instruments.

7 Table of references

7.1 A. Books and journal articles

A Plus. “What One Dad Learned When He Recorded 230,000 Hours Of His Son Learning

To Talk.” Last modified 12 April 2018. https://aplus.com/a/deb-roy-recorded-son-learning-

talk?no_monetization=true.

Bygrave, Lee A. Data Privacy Law: An International Perspective. First edition, Vol. 63.

Oxford: Oxford University Press, 2014. ISBN 978-0-19-967555-5.

Bygrave, Lee A. "Data Protection by Design and by Default: Deciphering the EU’s Legislative Requirements" (DP by design and default). Oslo Law Review 4, no. 02 (2017): 105-120. ISSN online: 2387-3299.

Deng, Mina, Kim Wuyts, Riccardo Scandariato, Bart Preneel, and Wouter Joosen. "A pri-

vacy threat analysis framework: supporting the elicitation and fulfillment of privacy require-

ments." Requirements Engineering 16, no. 1 (2011): 3-32.

Chen, Yongxi and Cheung, Anne S. Y., The Transparent Self Under Big Data Profiling:

Privacy and Chinese Legislation on the Social Credit System (June 26, 2017). Vol. 12, No. 2,

The Journal of Comparative Law (2017) 356-378; University of Hong Kong Faculty of Law

Research Paper No. 2017/011. Available at SSRN: https://ssrn.com/abstract=2992537 or

http://dx.doi.org/10.2139/ssrn.2992537.

Corrales M., Fenwick M., Forgó N. (2017) Disruptive Technologies Shaping the Law of

the Future. In: Corrales M., Fenwick M., Forgó N. (eds) New Technology, Big Data and the

Law. Perspectives in Law, Business and Innovation. Springer, Singapore

https://doi.org/10.1007/978-981-10-5038-1_1.

CXL. “How Data-Driven Marketers Are Using Psychographics.” Last modified 11 Au-

gust, 2017. https://conversionxl.com/blog/psychographics/.

D’Agostino, Elena. Contracts of Adhesion between Law and Economics: Rethinking the

Unconscionability Doctrine. SpringerBriefs in Law. Springer International Publishing, 2015.

DOI 10.1007/978-3-319-13114-6. (Contracts of adhesion).

Degeling, Martin, Christine Utz, Christopher Lentzsch, Henry Hosseini, Florian Schaub,

and Thorsten Holz. “We Value Your Privacy ... Now Take Some Cookies: Measuring the

GDPR’s Impact on Web Privacy” (Cookies: Measuring GDPRs Impact). Proceedings 2019

Network and Distributed System Security Symposium, 2019.

https://doi.org/10.14722/ndss.2019.23378.

Malbon, Justin, Online Cross-Border Consumer Transactions: A Proposal for Developing

Fair Standard Form Contract Terms (November 22, 2013).

http://dx.doi.org/10.2139/ssrn.2413289.

Forbes. “How Target Figured Out A Teen Girl Was Pregnant Before Her Father Did.” Last

modified 16 February 2012. https://www.forbes.com/sites/kashmirhill/2012/02/16/how-

target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/#60bb8ac26668.

Hannák, Anikó, Piotr Sapieżyński, Arash Molavi Khaki, David Lazer, Alan Mislove, and

Christo Wilson. “Measuring Personalization of Web Search”. ArXiv e-prints 1706 (1 June 2017): arXiv:1706.05011.

Hansen, Marit, Eleni Kosta, Igor Nai Fovino, and Simone Fischer-Hübner, eds. Privacy

and Identity Management. The Smart Revolution - 12th IFIP WG 9.2, 9.5, 9.6/11.7, 11.6/SIG

9.2.2 International Summer School, Ispra, Italy, September 4-8, 2017, Revised Selected Pa-

pers. Vol. 526. IFIP Advances in Information and Communication Technology. Springer,

2018. doi:10.1007/978-3-319-92925-5.

Hervé Jacquemin, Digital Content and Sales or Service contracts under EU Law and Bel-

gian/French Law, 8 (2017) JIPITEC 27 para 1. page 27.

Leenes, Ronald, Rosamunde van Brakel, Serge Gutwirth, and Paul De Hert, eds. Data

Protection and Privacy: (In)Visibilities and Infrastructures. Vol. 36. Law, Governance and

Technology Series. Cham: Springer International Publishing, 2017.

https://doi.org/10.1007/978-3-319-50796-5.

Kosta, Eleni. Consent in European Data Protection Law (Consent: EU DP). Vol. 3. 3

vols. Nijhoff Studies in EU Law. Brill, 2013. https://doi.org/10.1163/9789004232365.

Lazaro, Christophe and Le Métayer, Daniel, Control over Personal Data: True Remedy or

Fairy Tale? (June 1, 2015). SCRIPT-ed, Vol. 12, No. 1, June 2015. Available at SSRN:

https://ssrn.com/abstract=2689223

Forgó, Nikolaus, Stefanie Hänold and Benjamin Schütze. “The Principle of Purpose Limitation and Big Data,” in New Technology, Big Data and the Law, edited by Marcelo Corrales, Mark Fenwick, Nikolaus Forgó, 17-42. Perspectives in Law, Business and Innovation. Singapore: Springer, 2017. DOI 10.1007/978-981-10-5038-1_2.

Voigt, P., and A. von dem Bussche. The EU General Data Protection Regulation (GDPR):

A Practical Guide. Springer International Publishing, 2017.

https://books.google.no/books?id=cWAwDwAAQBAJ.

Victor Morel, Daniel Le Métayer, Mathieu Cunche, Claude Castelluccia. Enhancing

Transparency and Consent in the IoT. IWPE 2018 - International Workshop on Privacy Engi-

neering, Apr 2018, London, United Kingdom. IEEE, pp.116-119, Proceedings of the Interna-

tional Workshop on Privacy Engineering (IWPE 2018). <10.1109/EuroSPW.2018.00023>.

<hal-01709255v2> (Morel et al., IoT: Enhancing Transparency and Consent).

Wolters, P. T. J. “The Control by and Rights of the Data Subject Under the GDPR.” Jour-

nal of Internet Law, 22, no. 1 (2018): 7-18, ISSN: 1094-2904

https://repository.ubn.ru.nl/handle/2066/194516.

Information Commissioner’s Office (ICO), Data protection by design and default (2018). https://ico.org.uk/for-organisations/guide-to-the-general-data-protection-regulation-gdpr/accountability-and-governance/data-protection-by-design-and-default/. Accessed: 2018-08-30.

‘The REAL Difference Between Marketing & Advertising’. MarketingProfs. Accessed 30

November 2018. http://www.marketingprofs.com/2/mccall5.asp.

Moerel, Lokke and Prins, Corien, Privacy for the Homo Digitalis: Proposal for a New

Regulatory Framework for Data Protection in the Light of Big Data and the Internet of Things

(May 25, 2016). Available at SSRN: https://ssrn.com/abstract=2784123 or

http://dx.doi.org/10.2139/ssrn.2784123

Hartzog, Woodrow. The Case Against Idealising Control. European Data Protection Law

Review. Volume 4, Issue 4 (2018), pp. 423 – 432. DOI:

https://doi.org/10.21552/edpl/2018/4/5

Datatilsynet, Personal data in exchange for free services: an unhappy partnership? (PII in exchange for VAS), (28 January 2016) https://www.datatilsynet.no/globalassets/global/english/privacy-trends-2016.pdf.

7.2 B. Reports and other documents

7.2.1 Independent EU Data Protection and Privacy Advisory Bodies

Article 29 Data Protection Working Party, Guidelines on Consent under Regulation

2016/679, (2017) WP 259.

Article 29 Data Protection Working Party (2015), Cookie Sweep Combined Analysis –

Report, WP 229.

Article 29 Data Protection Working Party, Opinion 1/2010 on the concepts of "controller"

and "processor", (16 February 2010) WP 169, 00264/10/EN. (WP29, Opinion: “controller"

and "processor").

Article 29 Data Protection Working Party, Opinion 8/2014 on the Recent Developments on

the Internet of Things, (16 September 2014) WP 223. (WP29, Opinion on IoT Development).

Article 29 Data Protection Working Party (2004) Opinion 4/2004 on the processing of

personal data by means of video surveillance, 11750/02/EN, WP 89, Brussels, 11 Feb 2004

Article 29 Data Protection Working Party, Opinion 4/2017 on the Proposal for a Directive on certain aspects concerning contracts for the supply of digital content.

Article 29 Data Protection Working Party, Opinion 01/2017 on the Proposed Regulation for the ePrivacy Regulation (WP 247, 4.4.2017). (WP29, Opinion on ePR proposal).

Article 29 Data Protection Working Party, Opinion 4/2007 on the concept of personal data,

20 June 2007 (WP136) 01248/07/EN. (WP29, Opinion on Personal Data).

Article 29 Data Protection Working Party, Opinion 03/2016 on the evaluation and review of

the ePrivacy Directive (2002/58/EC), (19 July 2016) WP 240. (WP29, Opinion on evaluation

and review of ePD).

Article 29 Data Protection Working Party, Opinion 5/2005 on the use of location data with a

view to providing value-added services (2130/05/EN), Peter Schaar, (25 November 2005) WP

115. (WP29, Opinion on location data: value-added services).

Article 29 Data Protection Working Party, Guidelines on Transparency under Regulation

2016/679, 11 April 2018 (WP260 rev.01, 29 November 2017) 17/EN. (WP29, GDPR Guide-

lines: Transparency).

European Data Protection Supervisor, Opinion 5/2016 Preliminary EDPS Opinion on the

review of the ePrivacy Directive (2002/58/EC) (Opinion on review of ePD), (22 July 2016)

Digital Agenda: Commission refers UK to Court over privacy and personal data protec-

tion, (IP/10/1215).

Deloitte, Evaluation and Review of Directive 2002/58 on Privacy and the Electronic

Communication Sector (Evaluation of ePD), SMART 2016/0080 (European Commission

2017).

7.2.2 European Commission

COMM/PRESS/01, ‘Digital Agenda: Commission Closes Infringement Case after UK

Correctly Implements EU Rules on Privacy in Electronic Communications’ (europa, 26 Janu-

ary 2012) <http://europa.eu/rapid/press-release_IP-12-60_en.htm> accessed 14 November

2018

European Commission (2016) European Commission launches EU-U.S. Privacy Shield:

stronger protection for transatlantic data flows. 12 July 2016, IP/16/2461.

http://europa.eu/rapid/press-release_IP-16-2461_en.htm. Accessed 10 September 2018.

Summary of Commission decision of 27 June 2017 relating to a proceeding under Article

102 of the Treaty on the Functioning of the European Union and Article 54 of the EEA

Agreement (Case AT.39740 — Google Search (Shopping)) (notified under document number

C(2017) 4444)

COMMISSION DECISION of 27.6.2017 relating to proceedings under Article 102 of the

Treaty on the Functioning of the European Union and Article 54 of the Agreement on the

European Economic Area (AT.39740 - Google Search (Shopping))

Commission (EC), Guidance on Unfair Commercial Practices, https://webgate.ec.europa.eu/ucp/public/index.cfm?event=public.guidance.browse&article=article-14&elemID=74#article-14; see also Office of Fair Trading, Online targeting of advertising and prices, 6, 1.11 (2010), available at http://webarchive.nationalarchives.gov.uk/20140402142426/http:/www.oft.gov.uk/shared_oft/business_leaflets/659703/OFT1231.pdf.

Commission Case AT.39740 Google Search (Shopping), decision of 27 June 2017 avail-

able at http://ec.europa.eu/competition/antitrust/cases/dec_docs/39740/39740_14996_3.pdf.

For further information see IP/17/1784 of 27 June 2017 available at

http://europa.eu/rapid/press-release_IP-17-1784_en.htm and MEMO/17/1785 of 27 June 2017

available at http://europa.eu/rapid/press-release_MEMO-17-1785_en.htm.

Commission (EC), Stronger protection, new opportunities - Commission guidance on the

direct application of the General Data Protection Regulation as of 25 May 2018, COM (2018)

43 final.

Communication from the Commission to the European Parliament, the Council, the Eco-

nomic and Social Committee and the Committees of the Regions, A comprehensive approach

on personal data protection in the European Union, COM(2010) 609 final, Brussels, 4 Nov

2010, available at http://ec.europa.eu/justice/news/consulting public/0006/com 2010 609

en.pdf. (Commission (EC), comprehensive approach on personal data protection)

Communication from the Commission to the European Parliament, the Council, the Eco-

nomic and Social Committee and the Committees of the Regions, Safeguarding Privacy in a

Connected World: A European Data Protection Framework for the 21st Century, COM

(2012) 9 final, Brussels, 25 Jan 2012, available at http://eur-lex.europa.eu/legalcontent/.

(Commission (EC), Safeguarding Privacy in a Connected World)

Commission, Online Platforms and the Digital Single Market: Opportunities and Chal-

lenges for Europe (Online platforms: Opportunities and Challenges), (Communication) COM

(2016) 288 final. 25.5.2016

Commission, Internet of Things: An action plan for Europe (IoT: Action plan), (Com-

munication) COM (2009) 278 final.

Commission, A Digital Single Market Strategy for Europe - Analysis and Evidence: Ac-

companying the document Communication from the Commission to the European Parliament,

the Council, the European Economic and Social Committee and the Committee of the Regions

on a Digital Single Market Strategy for Europe (DSM Strategy for EU), (internal procedure)

SWD (2015) 100 final.

Commission, Completing a trusted Digital Single Market for all: The European Commis-

sion's contribution to the Informal EU Leaders' meeting on data protection and the Digital

Single Market in Sofia on 16 May 2018 (Completing a trusted DSM), (Communication)

COM (2018) 320 final.

Commission, “The Top 20 EU achievements 2014-2019 (ANNEX III) Europe in May 2019:

Preparing for a more united, stronger and more democratic Union in an increasingly uncertain

world” (EU in May 2019: Top 20 Achievements).

Weber, R.H., “How does Privacy change in the Age of the Internet?”, in: Fuchs, C., Boersma, K., Albrechtslund, A. and Sandoval, M., eds., Internet and Surveillance, Routledge, 2011, 274.

Regan, P.M., Legislating Privacy: Technology, Social Values and Public Policy, University of North Carolina Press, Chapel Hill and London, 2009, 225.

“European Commission - PRESS RELEASES - Press Release - Digital Agenda: Commission Refers UK to Court over Privacy and Personal Data Protection.” Case briefing. Accessed November 14, 2018. http://europa.eu/rapid/press-release_IP-10-1215_en.htm.

Reding, Viviane. Data Protection Day 2014: "Full Speed on EU Data Protection Reform"

(EU DP Full Speed), European Commission Brussels, (27 January 2014) MEMO/14/60

Reding, Viviane. “The EU Data Protection Reform 2012: Making Europe the Standard Setter for Modern Data Protection Rules in the Digital Age.” Innovation Conference: Digital, Life, Design, (22 January 2012) SPEECH/12/26, available at http://ec.europa.eu/commission_2010-2014/reding/pdf/speeches/s1226_en.pdf.

Reding, Viviane. “Your data, your rights: Safeguarding your privacy in a connected world”, SPEECH/11/183, Europa (16 March 2011), available at europa.eu/rapid/press-release_SPEECH-11-183_en.htm (accessed 10 September 2014).

Privacy Platform "The Review of the EU Data Protection Framework"

Executive Summary of the preliminary Opinion of the European Data Protection Supervi-

sor on the review of the ePrivacy Directive (2002/58/EC).

Opinion of the European Economic and Social Committee on the ‘Proposal for a Regula-

tion of the European Parliament and of the Council concerning the respect for private life and

the protection of personal data in electronic communications and repealing Directive

2002/58/EC (Regulation on Privacy and Electronic Communications)’ (COM(2017) 10 final

— 2017/0003 (COD)) OJ C 345, 13.10.2017, p. 138–144

Mańko, Rafal, “Contracts for the supply of digital content and digital services”, EU legis-

lation in progress (Briefing), 3 ed. EPRS, 19 February 2018, PE 614.707, available at:

http://www.europarl.europa.eu.

Office of Fair Trading, Online targeting of advertising and prices: A market study. (May,

2010). OFT 1231, available at

https://webarchive.nationalarchives.gov.uk/20140402175024/http://www.oft.gov.uk/OFTwor

k/markets-work/online-targeting

Ariel Ezrachi and Maurice Stucke, Online Platforms and the EU Digital Single Market.

(Written Evidence), (data.parliament.uk, 16 October 2015) OPL0043, available at

http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/eu-

internal-market-subcommittee/online-platforms-and-the-eu-digital-single-

market/written/23223.html

Farivar, Cyrus, “Cambridge Analytica breach results in lawsuits filed by angry Facebook

users: ‘Facebook lies within the penumbra of blame’, Maryland woman claims” (lawsuits

filed by angry Facebook users), published 23 March 2018, https://arstechnica.com/tech-

policy/2018/03/.

House of Lords Select Committee on European Union, Online Platforms and the Digital

Single Market (10th report, session 2015-2016, HL paper 129) Volume II: Oral and written

evidence <https://publications.parliament.uk/pa/ld201516/ldselect/ldeucom/129/129.pdf>

8 Table of abbreviations

Article 29 Working Party .................................................................................................. WP29

European Union...................................................................................................................... EU

Digital Single Market .......................................................................................................... DSM

European Commission ........................................................................................................... EC

the Charter of Fundamental Rights .......................................................................................CFR

the European Convention on Human Rights..................................................................... ECHR

the Treaty on the Functioning of the European Union ......................................................TFEU

Proposal for a regulation concerning e-communications and data protection (Proposal for a Regulation on Electronic Communication) ............................................................................ ePR

European General Data Protection Regulation (EU) 2016/679 ....................................... GDPR

the e-Privacy Directive 2002/58/EC ...................................................................................e-PD

Proposal for a Directive of concerning contracts for the supply of digital content ....... pSDCD

the e-commerce Directive 2000/31/EC ................................................................................ ECD

the European Court of Human Rights ............................................................................. ECtHR

The European Court of Justice ........................................................................................... CJEU

Over the Top Services .......................................................................................................... OTT

Internet of Things ................................................................................................................... IoT

Terms of Use ........................................................................................................................ ToU

Information Society Service .................................................................................................. ISS

public e-communication service provider .................................................................................. 5

Information and Communication Technology ...................................................................... ICT

Terms and Conditions (standard contract terms) .............................................................. T&Cs

Unfair Commercial Practices Directive 2005/29/EC ....................................................... UCPD

Standard Contractual Clause .................................................................................................SCC

Data Protection Supervisor....................................................................................................DPS

Privacy-Enhancing Technologies........................................................................................ PETs

Staff working document ..................................................................................................... SWD

9 Table of legal instruments

Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), OJ L 119, 4.5.2016, pp. 1–88.

Treaty on the Functioning of the European Union, opened for signature 7 February 1992, [2009] OJ C 115/199 (entered into force 1 November 1993) (‘TFEU’)

Consumer Rights Directive

Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council, OJ L304/64

Data Protection Directive

Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, OJ L281/31

European Electronic Communications Code

Final text of the proposal for a Directive establishing the European Electronic Communications Code, as adopted by the European Parliament on 14 November 2018

ePrivacy Directive

Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector, OJ L201/37

ePrivacy Regulation

Proposal for a Regulation of the European Parliament and of the Council concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC, COM/2017/010

Framework Directive

Directive 2002/21/EC of the European Parliament and of the Council of 7 March 2002 on a common regulatory framework for electronic communications networks and services, as amended by Regulation (EC) No 717/2007 of the European Parliament and of the Council of 27 June 2007 on roaming on public mobile telephone networks within the Community, Regulation (EC) No 544/2009 of the European Parliament and of the Council of 18 June 2009 amending Regulation (EC) No 717/2007 and Directive 2002/21/EC, and Directive 2009/140/EC of the European Parliament and of the Council of 25 November 2009 amending Directives 2002/21/EC, 2002/19/EC and 2002/20/EC

GDPR

Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data and repealing Directive 95/46/EC (General Data Protection Regulation), OJ L119/1

Regulation

Regulation (EU) 2015/2120 of the European Parliament and of the Council of 25 November 2015 laying down measures concerning open internet access and amending Directive 2002/22/EC and Regulation (EU) No 531/2012, OJ L310/1

10 Table of cases

Barnes & Noble

CJEU British Gas Trading case

CJEU, El Majdoub

Register.com v. Verio

The Netscape case

ECJ, judgment of 13 May 2014, Google Spain and Google (C-131/12, EU:C:2014:317).

ECJ, judgment of 1 October 2015, Bara and Others (C‑201/14, ECLI:EU:C:2015:638)

Order of the Court (Eighth Chamber) of 23 October 2018, Conseil départemental de l’ordre des chirurgiens-dentistes de la Haute-Garonne (Case C-296/18)


Glossary of Terminology, Abbreviations, and Acronyms

Data Subjects’ Participation and Control i, 5

a form of marketing that has a high risk of negatively affecting individuals and society. For example, Google search results are tailored using the personal data the company has collected on an individual, such as recent searches, location, country and language. Merely collecting basic information like location and using it as a factor in an individual’s search results can have major consequences 8, 9

Indirect and Direct marketing 8

Internet of Things 1

Over the Top Services 10

Providers of Information Society Services 2, 22

publicly available Electronic Communication Services providers 12

Terms of Use 3, 5, 19, 20, 22

European Courts 4, 19, 22

Internationally known Abbreviations and Acronyms

central news network 9

Guest ID number 10

Opinions and Executive Summaries of relevant European Authorities or Similar DPAs

Executive Summary of the preliminary Opinion of the European Data Protection Supervisor on the review of the ePrivacy Directive (2002/58/EC) 13

Opinion 4/2017 on the Proposal for a Directive on certain aspects concerning contracts for the supply of digital content 20

the Article 29 Working Party, Opinion 01/2017 on the proposal for a regulation (WP247, 4.4.2017) 13