At a turning point: old versus new society?

Spyros A. Pappas

One of the deepest philosophical foundations of the EU is the absolute value of the individual. Democracy is based on this value, and the EU, in its turn, on democracy. Although absolute in character, the value is not unlimited: it reaches its borders at the line of respect for the other.

One aspect of the value of the individual is privacy. Directive 95/46/EC has so far generally been hailed as a success and a pioneer in data protection all over the world, in particular thanks to its technology-neutral character and its underlying, flexible principles1. It encompasses two general objectives of the EU: the protection of the fundamental rights and freedoms of individuals, in particular the right to protection of personal data (Art. 8 of the Charter of Fundamental Rights of the European Union, now legally binding), and the free flow of the said data in the internal market (attainment of a common market). The Directive, though, has come to face the challenges of the EU legal system and of our time. Social networking sites, cloud computing, e-government, e-commerce and globalization are some of the factors that have transformed the technological landscape beyond recognition and “imposed” their own reality.

By way of example, it suffices to look at the case of Google, the leader in the field of search engines. In 2007, the Art. 29 Working Party (the EU independent advisory body set up by Directive 95/46/EC) urged Google to store server logs (users’ web history data, which is identifiable “personal data” within the meaning of the Directive) generated by Google users for a reduced period of time, as two years was too long2. Google failed to specify why it needed to keep this data for so long, as required by Art. 6(1)(e) of the Directive, and also why the “Google cookie” had a lifetime of thirty years, manifestly going far beyond what is “strictly necessary” under Art. 5(3) of Directive 2002/58/EC.
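The lifetime objection is easy to make concrete: a cookie’s lifetime is fixed by the Max-Age (or Expires) attribute of the Set-Cookie header, so the gap between thirty years and a proportionate retention period comes down to a single header value. A minimal sketch in Python; the function name and the chosen periods are illustrative, not Google’s actual practice:

```python
from datetime import timedelta

def build_set_cookie(name: str, value: str, max_age_days: int) -> str:
    """Build a Set-Cookie header whose lifetime is bounded in days."""
    max_age = int(timedelta(days=max_age_days).total_seconds())
    return f"{name}={value}; Max-Age={max_age}; Path=/; Secure; HttpOnly"

# A thirty-year cookie versus a six-month one: only Max-Age differs.
thirty_years = build_set_cookie("PREF", "abc123", 30 * 365)
six_months = build_set_cookie("PREF", "abc123", 180)
```

The point of the Working Party’s proportionality test is precisely that this single parameter must be justified by the stated purpose of the processing.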

1 Notice – Individuals must be informed that their data is being collected and about how it will be used.

Choice – Individuals must have the ability to opt out of the collection and forward transfer of the data to third parties.

Onward Transfer – Transfers of data to third parties may only occur to other organizations that follow adequate data protection principles.

Security – Reasonable efforts must be made to prevent loss of collected information.

Data Integrity – Data must be relevant and reliable for the purpose it was collected for.

Access – Individuals must be able to access information held about them, and correct or delete it if it is inaccurate.

Enforcement – There must be effective means of enforcing these rules.

2 It should be noted that two years is the period of time Google decided to adopt after a change of policy to reduce this amount of time; before this incident it stored personal data indefinitely!

From 2007 to 2010, Google, via its Street View service, had access to and was amassing information available on public Wi-Fi networks in most EU countries (and all over the world, e.g. New Zealand), including all Internet activity of users, the content of their e-mails and potentially even hard drive content, without any notice or consent! This unprecedented breach of security and privacy came to light only after the Hamburg Data Protection Authority insisted on reviewing the data accessed by Google Street View. Google has neither accepted responsibility nor explained why this happened.

In 2009, it was found that the Google Docs cloud computing service disclosed user-generated documents to users of the service without permission to view them.

In 2010, Google’s social networking site Buzz was launched, immediately adding to the company’s record of violations. Buzz, now inoperative, had a huge privacy flaw: its default setting allowed public viewing of the people the user mailed and chatted with the most (through Gmail), and it outraged the users who fell victim to it. Google finally settled claims with the Federal Trade Commission (“FTC”) in the USA, accepting that it “used deceptive practices and violated its own privacy policies”.

In 2011, “Safarigate” or “Cookiegate”3 was revealed, which truly appalled everyone watching.

Apparently, Google had been circumventing Safari’s default privacy settings to serve personalized advertisements to users by spying on their browsing habits. Although the practice is hardly unusual in this business, the intentional circumvention not only contradicts Google’s own instructions to users on how to avoid tracking, but also evinces how unilaterally Google can act and manipulate users’ identity elements, with virtually no consequence whatsoever for its actions.

Finally, the Art. 29 Working Party warned Google to pause the implementation of its new privacy policy (uniting all its services’ policies into one) so that the new privacy terms could be checked.

However, Google went on with its plans and implemented the new policy on 1 March 2012, as originally envisaged. The most contentious issue in this new policy is cross-sharing, i.e. the combination of user-generated data collected across all of Google’s numerous services. By implementing this new policy, Google also breaches the FTC agreement mentioned above, which provides that Google should allow users to opt in if its privacy policy changes.

The disregard of fundamental rights is obvious. Fragmented and outdated legislation offered an attractive vacuum to pioneering companies. Moreover, it became clear that absolute values cannot but enjoy relative protection in a world driven by new market and societal features. As the EU Justice Commissioner recently put it: “Let us build a new gold standard of data protection based on clear and strong laws allowing our businesses and citizens to fully benefit from the digital economy.”

In this context, next to the objective of a single set of rules on data protection valid across the EU, new concepts are on the agenda, such as the right of informational self-determination. It comprises:

– the right of easy access to one’s own data and the right of data portability, i.e. easier transfer of personal data from one service provider to another;

– the principle of prior and explicit (not just assumed) consent for data to be processed;

– the ‘right to be forgotten’, allowing people to have their data deleted when they no longer want it to be processed and there are no legitimate grounds for retaining it;

– the right to limited data retention.

The challenge remains enormous. It looks like a struggle between an outgoing world based on ideas, at the centre of which lies Man, and an incoming world based on market and technology.

No doubt the balance is not easy, yet it is of paramount importance for the quality of life. Besides, while legal certainty is at present required to create confidence, flexibility, or adaptability, is on the other hand indispensable. In this regard, it is relevant to compare the rise of digitally carried information, from 1% of all telecommunicated information in 1993 to more than 97% in 2012, with a decision-making process that was once reasonably quick and has become unreasonably slow. In between, people and data mobility will keep increasing, as will the number of commercial and financial transactions conducted via the Internet, resulting in the need for more cooperation throughout the EU on criminal matters (identity of criminals, child pornography, terrorism), all made possible through the evolution of technology. Will users manage to adapt themselves to new technological achievements? Will new legislation, when adopted, always be up to date? How will it be feasible to couple certainty with flexibility, both required? These are questions with no obvious answers once the basics are put under review.


The Right to Informational Self-Determination: A Privacy Concept fit for the Future?

Stephan Dreyer

Senior Researcher at the Hans Bredow Institute for Media Research


The “right to informational self-determination” has been an important approach in the field of data protection and surveillance since the 1970s. Since a well-known decision of the German Constitutional Court in 1983, it has been recognized as a specific extension of the basic right of personality, shaping the layout of data protection laws. Back then, it was elaborated as a reaction to new forms of risk posed by electronic information processing. The following article describes the legal and sociological theories behind the right to informational self-determination, its current challenges, and the possibilities that a concept of informational self-determination offers for today’s information and communication regulation and practice.


Informational self-determination: Protection of freedom and democracy, not property in data

It is already apparent in the term itself: self-determination. To determine, and not to be determined, is the objective of a right that aims at securing the individual’s authority “to decide himself when and within what limits information about his private life should be communicated to others”. And this is the premise of the right to informational self-determination that the German Constitutional Court introduced in its famous public census decision (BVerfGE 65, pp. 1). This constitutionally backed right was not new, as some claim; it was rather a cogent deduction from the existing case law. It was the court’s reaction to the new risks for personal and/or personality rights that accompany modern developments: the purpose of the right’s birth was to put existing basic rights back on the map in view of new threats to everyday life posed by new technologies and procedures (Kunig 1993: 569).

And how life-altering these new technologies for data processing were in 1983! From paper-based card indexes to electronic files and databases – storable, accessible and combinable virtually without limit, leading to a situation where people might not know who knew what about them in what contexts. Nowadays, almost 30 years later, we live in an age that is strikingly reminiscent of this deliberation.

The court based its decision on two lines of reasoning. Firstly, informational self-determination must exist to ensure individuals can develop their personality freely and without undue external interference. Living out these “autonomic capabilities” (Rouvroy and Poullet 2009: 46) – or, shorter: enjoying this freedom – is what a right to informational self-determination has to aim at. The approach builds upon concepts from systems theory: according to Luhmann, a concept of privacy serves to protect the consistency of the personality (Luhmann 1995). This consistency relies on the separation of societal sub-systems. Only as long as these (personal) sub-systems are basically shielded from each other and no information from one system diffuses into another (e.g. from medical treatment to the work environment) can the self-determined development of the individual be upheld (cf. Hornung and Schnabel 2009: 85). The court refers to (broader) constitutional guarantees, especially the general right to personal liberty (Art. 2 sec. (1) of the German Constitution) and, combined with the guarantee of human dignity (Art. 1 sec. (1)), the concept of a general right of personality, giving each citizen the possibility to develop his or her own personality freely and autonomously. This is where the court saw the potential for new risks from technological developments: if a person cannot predict with sufficient certainty what information about him is known to his counterpart or social milieu, the individual might feel pressured to act “low-profile”. If a person is uncertain whether deviant behaviour is being monitored, noted and stored as permanent information, he will try not to attract attention through such activities. Such a consequence, however, is exactly what a general right to develop one’s personality autonomously ought to prevent.

A new right? No, just a name for a specific interpretation of existing ones

So, in view of electronic data processing, the German Constitutional Court sculpted the right to informational self-determination as a special sub-area of the general right of personality (there are more such sub-areas, e.g. the right to one’s own picture, the right to be left alone, or the right to the confidentiality and integrity of IT systems). Data protection laws, in this view, are rather indirect tools to ensure, or at least foster, informational self-determination (cf. Rouvroy and Poullet 2009: 53).

Moreover, a right to informational self-determination, with its consequences for personal development, more or less indirectly fosters a free and vibrant democracy. Self-determination in this view, enabling the individual to act freely, is also a necessary precondition for a free democratic society. Hence, not guaranteeing the right to informational self-determination also has an impact on the common welfare.

Both lines of reasoning refer to a phenomenon that does not make the theoretical approach any easier: of course, humans are social life forms, evolving within social structures and communities, and one’s personality is always based on and shaped by interactions with other individuals. So, information related to a person always functions as a picture of the “social reality”, which cannot be placed solely in the hands of the person affected. Hence, limitations of the right to informational self-determination have to be made in the interest of third parties.

This last thought has two major consequences for the legal concept of a right to informational self-determination:

First, the existence of this right in no way results in an encompassing ownership of all the information that is out there. It covers information that relates to a specific, identifiable person or to facts or circumstances concerning an individual. The person affected by such information shall have the ability to determine who should know, collect or disclose this information and how it should be processed. Second, if there is an interest in processing the data on the part of a third party (another person, a company, a state body or the public), this interest has to be balanced against the right to informational self-determination. Informational self-determination, thus, is a concept carrying both individualistic values and the function of enabling a free and democratic society.


Principles for data protection laws

A right to informational self-determination has important consequences for its implementation in law:

Self-determination and autonomy are not tangible resources, and they cannot be granted by the lawmaker. To provide for a right to informational self-determination, the lawmaker rather has to implement a legal framework within which an individual is – theoretically – able to determine the processing of his or her data. The concept of informational self-determination as a right of self-determination aims at empowering and supporting people to actively use that right.

In order to constitute such actively exerted individual rights, therefore, the lawmaker needs to implement institutional, organisational and procedural rights.

From this theoretical background arise the main principles of modern data protection laws:

– The individual should have control over the decision whether data is collected, stored or processed. His consent is needed. Only in limited cases, where external interests outweigh the right to self-determination, should collection, storage or processing be allowed without it.

– Where a person declares his consent to data processing, the individual has to be informed in advance about the type of data being processed and the purpose of the processing. Without knowing the purpose, consent would fail in its function of letting the person oversee the actual or potential consequences for his self-determination.

– To keep track of the relevant personal data, the individual must have a right to information, i.e. a right of access to this information as against third parties.

– To minimize risks to informational self-determination, an organisational premise of data protection must be data security and the minimisation of data collection and processing (data protection by design).
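These principles can be read together as a small access-control recipe: no processing without consent, no collection beyond the stated purpose. A minimal sketch in Python; all names are hypothetical, and the purpose whitelist stands in for a real data policy:

```python
# Hypothetical illustration of consent-gated, purpose-bound, minimised
# collection. The per-purpose whitelist embodies data minimisation.
ALLOWED_FIELDS = {"newsletter": {"email"}}

class ConsentRequired(Exception):
    """Raised when processing is attempted without the individual's consent."""

def collect(record: dict, purpose: str, consent_given: bool) -> dict:
    """Store only the fields the stated purpose requires, and only with consent."""
    if not consent_given:
        raise ConsentRequired(f"no consent for purpose: {purpose}")
    allowed = ALLOWED_FIELDS.get(purpose, set())
    # data minimisation: everything outside the whitelist is discarded
    return {k: v for k, v in record.items() if k in allowed}

stored = collect({"email": "a@example.org", "birthdate": "1980-01-01"},
                 purpose="newsletter", consent_given=True)
# only the e-mail is retained; the birthdate is never stored
```

The design choice mirrors the principles above: consent is the gate, purpose defines the scope, and minimisation happens by construction rather than by later clean-up.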

Information society: Challenging the underlying assumptions

During the last 30 years, enormous technological and societal changes have taken place, challenging at least some of the assumptions made by informational self-determination:

For one thing, information asymmetries in the collection, storage and processing of data have worsened, despite regulation. Consumption goods and their distribution are becoming digital, services are offered electronically, customer retention strategies have spiked, all backed by huge ERP and CRM systems – let alone the new services and business models made possible by the internet. This has led to the present situation of “consent all over the place”, where contract terms, terms of service and end-user licence agreements usually include the user’s consent to the processing of his data, often extending beyond the sole purpose of fulfilling the contract or providing the service. Here, consumers’ general inertia regarding such mandatory terms “is a strong and pervasive limitation on free choice” (Schwartz 2000: 823), and permits notices to become an alibi for “take-it-or-leave-it” data processing (ibid.: 825). The result is a habitualization of one-directional one-click consent, without (a) the possibility to say “no” and still be able to use the service, and (b) conscious consideration of the potential consequences of each consent for the development of a self-determined personality. It is an uninformed or even negligent choice rather than a deliberate action. However, the concept of informational self-determination is based on the assumption that a person actually wants to actively decide who collects what data for what purpose.

With the exorbitant success (here from a social rather than an economic perspective) of social networking services (SNS), another aspect of the underlying concept is challenged: people sometimes want to publish their personal data in social worlds. The important function of SNS in self-display, within peer groups or before the general public, cannot be overestimated. Creating and “tuning” one’s perceived personality via SNS is closely related to the possibility to develop one’s own personality freely and autonomously. However, changing contexts and the underrated size of “public private spheres” in SNS also pose a threat to informational self-determination. In the end, these diffuse new spheres between public and private raise the question whether the implications for the right to informational self-determination are still foreseeable for the individual at all: when the objective of the right is to decide who knows what data, but the “who” is no longer predictable, the current situation at least undermines autonomous and final decisions, owing to the complexity or even obscurity of their consequences.

Third, the current concept of informational self-determination rests upon the expectation that the two most problematic counterparts of the right to self-determination are state bodies on one side and companies on the other. While this assumption still seems to be right, another phenomenon was not anticipated by the concept: data processing is being done more and more by private individuals. SNS again serve as examples. Sharing, liking, copying, forwarding or commenting on each other’s posts always includes the processing of a third party’s personal data – intentionally or incidentally. These social interactions elude the current notion of informational self-determination, as they usually fall within the sphere of necessary social interaction – as explained above – but nowadays tend to exceed traditional, framed and limited, social interaction. In practice, the increase in data processing by laymen or private third parties also poses the question whether the laws still aim in the right direction and whether they are written in the right language for the extended “target group”.

Informational self-determination in current communication practices: Empowerment of those affected

It has become clear that the current challenges do not question the idea of informational self-determination in general. The idea remains a good one. However, it seems necessary to reinterpret the concept in view of the modern information society, filling it with life again in the right areas. As personal information, personal and public communication and personal data become more and more intertwined, posing risks to informational self-determination and fostering personal liberties in parallel, they should also be considered together from a regulatory perspective. Especially at the intersection of information rights, information interests and personality rights, there is a need for a new information law that grants freedoms and limitations within one coherent framework, emphasizing the ambivalence of data handling for individual freedom, social interaction and public communication. Communication scientists already make use of the concept of informational self-determination, extending it towards an approach of “self-determination by informational means” in media literacy. In the field of SNS, this framework results in a differentiation of activities into the categories of “identity management”, “relation management” and “information management” (cf. Paus-Hasebrink, Lampert and Hasebrink 2007: 2).

With ever more complex data processing possibilities and increasing amounts of data arising en passant, making it easier to process and analyse personal data, an appropriate interpretation of informational self-determination has to use the same modern technology to simplify the information and control possibilities of individuals exerting their right, especially with regard to transparency, autonomy of decisions and the possible courses of action. Law, in this area, will soon meet its (national and theoretical) limits once a potential risk materializes in practice, as information tends to diffuse quickly and irreversibly. Hence, regulation will have to focus on setting incentives for implementing structures and techniques that aim both at minimizing the potential for privacy risks (“privacy by design”) and at maximizing the technical possibilities for the individual to protect his privacy in case of infringements (“privacy by tools”). Finally, new forms of governance have to be found in a global data processing environment to help implement such objectives: multi-stakeholder approaches and new control instruments (e.g. community-backed information and control, informational regulation tools like transparency obligations, and new forms of information obligations like icon-based privacy statements) might be the first steps towards implementing informational self-determination as a common global value (“privacy as an ethical standard”).


Flaherty, D. (1990): On the Utility of Constitutional Rights to Privacy and Data Protection. Case Western Reserve Law Review 41, pp. 831.

Hornung, G.; Schnabel, C. (2009): Data protection in Germany I: The population census decision and the right to informational self-determination. Computer Law & Security Review 25, pp. 84.

Kunig, P. (1993): Der Grundsatz informationeller Selbstbestimmung. Jura 1993, pp. 595.

Paus-Hasebrink, I.; Lampert, C.; Hasebrink, U. (2009): Social Network Sites – Challenges for Media Literacy. EU Kids Online Conference June 2009. London. Available at

Pitschas, R. (1998): Bedeutungswandel des Datenschutzes im Übergang von der Industrie- zur Informationsgesellschaft. In: Sokol (ed.), 20 Jahre Datenschutz – Individualismus oder Gemeinschaftssinn?, pp. 35.

Rouvroy, A.; Poullet, Y. (2009): The Right to Informational Self-Determination and the Value of Self-Development: Reassessing the Importance of Privacy for Democracy. In: Gutwirth et al. (eds.), Reinventing Data Protection?, pp. 45.

Schwartz, P. M. (2000): Internet Privacy and the State. Connecticut Law Review 32, pp. 815.

The Reform of the data protection legal framework

Panayota Boussis

The European Commission proposes a new, clear and uniform legislative framework, which will ensure a strong protection of the fundamental right to data protection throughout the European Union and at the same time, will strengthen the functioning of the Single Market.

Building trust in online environment: a challenge for the Commission

The phenomenal development of new technologies has an undeniable effect on the ever-increasing volume of personal data collected, accessed, used and transferred. By using smart cards, cloud computing or social networking sites, we leave digital traces with every “click” we make. At the same time, collecting and analyzing personal data has become a real asset for many companies whose economic activities are mainly based on the analysis of the data of potential customers. When disclosing their personal data, people are well aware that their data will be processed.

They feel, however, that they are not in complete control of it, and they are concerned that their personal data may be misused. This lack of confidence in online services definitely affects the growth and competitiveness of the digital economy within the European Union.

Building trust in the online environment seems essential to economic development. A reform of the current legislative framework was therefore required in order to ensure a high level of data protection, enhancing thus trust in online services and fulfilling the potential of the digital economy.

This reform is even more important given the central role of personal data protection in the Digital Agenda for Europe and in the Europe 2020 Strategy.

Current Legislative framework: Directive 95/46/EC

The existing legislation at European level on personal data protection is Directive 95/46/EC4, adopted in 1995 with a double objective: to protect the fundamental right to data protection and to guarantee the free flow of personal data between Member States. Directive 95/46/EC has been complemented by Framework Decision 2008/977/JHA as a general instrument at Union level for the protection of personal data in the areas of police co-operation and judicial co-operation in criminal matters5.

Nowadays, we are facing new challenges for the protection of personal data, principally due to technological developments. As the scale of data sharing and collection has increased considerably, the objectives and principles protected by the current legal framework need a strong and coherent protection more than ever. Indeed, the current legal framework has one main weakness: it has not prevented fragmentation in the way personal data protection is implemented across the Union.

Under Directive 95/46/EC, the ways in which individuals are able to exercise their right to data protection are not sufficiently harmonized across Member States. Nor are the powers of the national authorities responsible for data protection harmonized so as to ensure consistent and effective application of the rules within the European Union. This fragmentation may lead to legal uncertainty and, as a result, to the public perception that there are significant risks associated with online activity. Indeed, many Europeans consider that they are not properly informed about the processing of their personal data, and they do not know how to exercise their rights online.

A stronger and more coherent data protection framework within the European Union is therefore essential. It would put individuals in control of their own data, reinforce legal and practical certainty for economic operators and public authorities, and hence allow the digital economy to develop across the internal market.

The right to protection of personal data is enshrined as a fundamental right in Article 8 of the Charter of Fundamental Rights of the EU. Likewise, the Treaty on the Functioning of the European Union (TFEU) establishes in Article 16(1) the principle that everyone has the right to the protection of personal data concerning him or her, and introduces a specific legal basis (Article 16(2)) for the adoption of rules on the protection of personal data.

It is on that basis that the Commission proposes a new legal framework on data protection. After assessing the impact of different policy options, the European Commission proposes a strong and consistent legislative framework across Union policies, enhancing individuals’ rights and cutting red tape for businesses, thus strengthening the Single Market dimension of data protection.

One aspect of the reform is the nature of the legal text. Data protection requirements and safeguards will be set out in a Regulation with direct application throughout the Union.

The proposed legal framework consists of two legislative proposals:

– a proposal for a Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), and

– a proposal for a Directive of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data by competent authorities for the purposes of prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and the free movement of such data.

The right to protection of personal data

The right to protection of personal data is established by Article 8 of the Charter and Article 16 TFEU, as well as by Article 8 of the ECHR. According to the Court of Justice of the EU6, the right to the protection of personal data is not an absolute right, but must be considered in relation to its function in society. Data protection is closely linked to respect for other rights established by the Charter, such as freedom of expression (Article 11); the rights of the child (Article 24); the right to property and in particular the protection of intellectual property (Article 17(2)); and the prohibition of discrimination on grounds such as race, ethnic origin, genetic features, religion or belief, political or any other opinion, disability or sexual orientation (Article 21).

Objectives of the reform

Putting individuals in control of their personal data

One of the priorities of the new legal framework on data protection is to allow individuals to exercise effective control over their personal data. This meets the expectations of the many Europeans who, although they consider the disclosure of their personal data online inevitable, feel that they are not in control of their data, since they are not properly informed of what happens to their personal information once disclosed. Often, as already mentioned, they do not know how to exercise their rights online.

The reform of the EU data protection rules will notably ensure the “right to be forgotten” by introducing an explicit requirement obliging online social networking services to minimize the volume of users’ personal data that they collect and process. The proposal also foresees an explicit obligation for data controllers to delete an individual’s personal data if that person explicitly requests deletion and there are no other legitimate grounds to retain it. Moreover, it is foreseen that default settings shall ensure that data is not made public.
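The two mechanisms described here, privacy-protective default settings and deletion absent legitimate retention grounds, can be sketched in a few lines. A hypothetical illustration only: the names and the bare notion of “retention grounds” are simplifications of the proposal’s actual legal tests.

```python
from dataclasses import dataclass, field

@dataclass
class Profile:
    user_id: str
    data: dict = field(default_factory=dict)
    public: bool = False  # privacy by default: nothing is public unless opted in

STORE: dict[str, Profile] = {}

def request_erasure(user_id: str, retention_grounds: set) -> bool:
    """Honour a deletion request unless a legitimate ground requires retention."""
    if retention_grounds:  # e.g. a legal obligation to keep the record
        return False
    STORE.pop(user_id, None)
    return True

STORE["u1"] = Profile("u1", {"email": "a@example.org"})
# the default setting keeps the profile private; an explicit request deletes it
```

The point of the default-value design is that privacy does not depend on the user finding a setting: the protective state is the one that requires no action.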

Individuals’ ability to control their data will be improved by the proposed Regulation, which will ensure that, where their consent is required, it is given explicitly and freely, by a clear affirmative action of the person concerned.

In addition, the Regulation will strengthen the right to information, so that individuals fully understand how their personal data is handled, particularly where the processing activities concern children. It will also guarantee easy access to one’s own data and a right to data portability, i.e. the right to obtain a copy of the stored data from the controller and the freedom to move it from one service provider to another.

The new legal framework intends to reinforce the independence and powers of national data protection authorities, so that they are properly equipped to deal effectively with complaints, carry out investigations, take binding decisions and impose effective and dissuasive sanctions. It also aims at improving administrative and judicial remedies where data protection rights are violated.

The new text also provides for the possibility for qualified associations to bring actions before the courts on behalf of individuals.

Enhancing the accountability of data processors

The aim of the reform proposed by the Commission is to strengthen individuals’ rights, by informing them of the processing of their data and by allowing them to exercise their rights more effectively.

The reform of the EU’s data protection rules will thus oblige companies to strengthen their security measures in order to prevent and avoid breaches, and to notify data breaches both to the national data protection authority – within 24 hours of the breach being discovered – and to the individuals concerned without undue delay.

The Regulation also introduces the “Privacy by Design” principle, to make sure that data protection safeguards are taken into account at the planning stage of procedures and systems. Moreover, the new text requires organizations involved in risky processing to carry out Data Protection Impact Assessments.

In addition, the proposed Regulation introduces the concept of “risky processing” and requires data controllers to designate a Data Protection Officer in companies with more than 250 employees and in firms involved in processing operations which, by virtue of their nature, scope or purposes, present specific risks to the rights and freedoms of individuals.

Strengthening the functioning of the Single Market

The Commission proposes a clear and uniform legislative framework at European level, which will help to strengthen the potential of the Digital Single Market and promote economic growth and innovation. The chosen form of the legal text will put an end to the fragmentation of legal regimes across the Member States, thus removing obstacles to market entry. A Regulation directly applicable in all Member States will avoid the cumulative and simultaneous application of different national data protection laws. This will definitely simplify the regulatory environment and, as a result, cut red tape and eliminate formalities. It will particularly help micro, small and medium-sized enterprises, to which special attention is given in view of their considerable importance for the competitiveness of the European economy.

In addition, the Commission proposes to further enhance the independence and powers of national data protection authorities (DPAs) in order to make them more effective. They will be given the power to carry out investigations, to take binding decisions and to impose effective and dissuasive sanctions. Moreover, the Regulation will allow data controllers in the EU to deal only with the DPA of the Member State where the company’s main establishment is located. Hence, in the event of a violation of data protection, only the data protection authority of the Member State where the company has its main establishment will be responsible for deciding whether or not the company is acting within the law. At the same time, the Regulation aims at ensuring prompt and effective coordination between national data protection authorities, by creating the conditions for efficient cooperation between DPAs, including the obligation for one DPA to carry out investigations and inspections upon request from another, as well as the mutual recognition of each other’s decisions.

Case Law