What Changes Does the GDPR Bring to the European Union’s Privacy Protections?

On May 25th, 2018, the European Union’s highly anticipated General Data Protection Regulation (GDPR) goes into effect.  Since the Regulation’s publication in April 2016, companies around the world have spent nearly two years frantically preparing to comply with it.  Since the beginning of 2018, Google has begun to permit users to choose what information they wish to share with its products. Amazon, in turn, has improved the data encryption for its cloud storage service and enhanced the transparency of its privacy agreement with customers. Facebook launched a new global data privacy center to communicate to users who can see their information and why the social network serves them specific ads. Yet the requirements established and rights protected in the Regulation do not depart dramatically from the principles present in the Data Protection Directive (DPD) that the EU adopted in the mid-1990s.  Thus, the question remains as to why the GDPR, and not Europe’s existing privacy protections, has created such a panic in Silicon Valley and Seattle.

Perhaps the fact that the European Commission has decided to replace the DPD is indication enough to corporations that their handling of personal data has not met the EU’s expectations. In 1995, when the DPD went into effect, Amazon was only a year old, Google’s founding was three years away, and social networks like Friendster, Myspace, and Facebook would not emerge for another decade.  Now, “information multinationals” like Google, Facebook, Amazon, Twitter, and their peers have developed an immensely valuable personal data market in which they hold dominant positions.  Instead of diminishing the collection and exploitation of personal data, the DPD permitted the data revolution to flourish.  The financial consequences, targeted language, and new provisions present in the GDPR all signal the EU’s goal to curtail the uninhibited amalgamation of personal data.

At its core, the GDPR embraces the same fundamental set of rights for EU residents that the DPD espoused more than two decades earlier.  Organizations that process and control personal data remain obligated to collect data only for specified, explicit, and legitimate purposes (Art. 6(1)(b), Art. 5(1)(b)). Processing operations for personal data still must be publicized (Art. 21, Art. 5(1)(a)). The requirement that companies obtain consent to process personal data persists (Art. 7(a), Art. 6(1)), and data subjects retain the right to object to such processing (Art. 14(b), Art. 21(3))—including specific types of processing, like automated processing (Art. 15(1), Art. 22(1)). The right of individuals to request the personal information collected about themselves remains (Art. 12(a), Art. 15(3)), and in instances where personal data is incorrect, both pieces of privacy legislation entitle data subjects to have the information corrected (Art. 10(c), Art. 16). In short, the GDPR embraces the fundamental principles that the DPD already mandated EU Member States to impose on organizations that process or control personal data.  However, a few aspects of the Regulation set it apart from its predecessor and enhance the GDPR’s chances of having a profound impact.

The most fundamental difference between the DPD and the GDPR is apparent in the names of the different pieces of legislation—the DPD is a directive, while the GDPR is a regulation.  A directive sets goals for EU countries to achieve, but does not prescribe the methods of implementation.  The DPD provides sanctions to impose on Member States that refuse to incorporate the Union’s privacy goals into their domestic legal systems. Thus, under the DPD, although each country appointed its own privacy czar to enforce the domestic privacy laws passed in accordance with the Directive, enforcement varied from country to country. For example, Germany and Spain established reputations for being the strictest protectors of personal privacy, issuing fines of 250,000 euros and 300,500 euros for privacy violations respectively. Other Member States, in contrast, issued less harsh standard fines for noncompliance. A regulation, by comparison, is a binding piece of legislation that becomes enforceable throughout the community without relying on Member States to implement its mandates through domestic legislation.  As a result, the GDPR’s principles apply immediately, directly, and uniformly to companies who conduct business within the EU or with EU residents, without the possibility that individual nations will dilute their efficacy.

Because the GDPR will become immediately binding throughout the EU on May 25, the substantial fines that the Regulation imposes on violators undoubtedly have captured the attention of corporations around the world.  Administrative fines under the GDPR start at 10,000,000 EUR or two percent of the company’s annual worldwide turnover of the previous fiscal year, whichever is higher (Art. 83(4)). In cases where a violation is more severe or a company refuses to comply with an order by the supervisory authority, the administrative fine rises to 20,000,000 EUR or four percent of the company’s annual worldwide turnover of the previous fiscal year, whichever is higher (Art. 83(5), Art. 83(6)). Facebook, for example, could face a maximum possible fine of $1.1 billion under the Regulation. Last year, the European Commission fined the social media network $122 million for antitrust violations in relation to its acquisition of WhatsApp. At the time, the fine constituted one of the largest regulatory penalties that Facebook had ever been required to pay. Now that Facebook and its peers could face regulatory fines nine times the value of the most severe penalties that they encountered a year ago, the most important incentive to prepare for the Regulation is obvious.

The GDPR also grants private persons the right to an effective judicial remedy. If users prove that they have suffered material or non-material damage as a result of a company’s infringement, they have a right to receive compensation from that company for the damage suffered. With more than 350 million users in the European Union, the potential volume of people seeking judicial remedies for privacy violations is staggering.

In the GDPR, the EU has re-defined some of the DPD’s principles to clearly indicate the legislature’s distaste for some of the “information multinationals’” behavior. For example, with regard to the principle of transparency, the DPD merely required Member States to “take measures to ensure that processing operations are publicized” (Art. 21).  The GDPR, on the other hand, insists that organizations process personal data “lawfully, fairly, and in a transparent manner” (Art. 5(1)(a)). The Regulation also adds that organizations must provide data subjects with the necessary information relating to the processing of their data in a “concise, transparent, intelligible, and easily accessible form” (Art. 12(1)).  The redefined transparency requirement clearly targets the common practice among technology companies of presenting users with long, convoluted privacy policies.

The GDPR also further defines the meaning of the term “unambiguous consent” present in the DPD by clarifying that a data subject’s consent is only legitimate in the eyes of the Union if it is a “freely given, specific, informed and unambiguous indication of the data subject’s wishes” (Art. 4(11)).  The data subject’s indication must manifest in a clear, affirmative action that signifies agreement (Art. 4(11)).  The clarification of what qualifies as “unambiguous consent” addresses the standard practice among companies driven by personal data of requiring users to agree to complicated and lengthy terms of service before customers may access the product.  The European Commission’s unequivocal desire to change the behavior of the corporations processing personal data, coupled with the harsh potential administrative and judicial consequences, sends a clear message to Facebook, Google, and their peers that their present method of operation has not gone unnoticed.

The GDPR also includes new components not present in the DPD that specifically address the behaviors of Silicon Valley’s and Seattle’s leading tech companies.  The Directive required that a company gain a data subject’s explicit consent before processing certain types of sensitive information relating to a person’s race, religion, political opinions, membership in a trade union, health, or sex life (Art. 8(2)(a)).  Member States had the option to ban such activities even when subjects had given their consent, but the DPD did not require domestic laws to incorporate an outright prohibition (Art. 8(2)).  The GDPR creates an inalienable right to be free from profiling and provides no consent exceptions (Art. 9(1)).  Given that Facebook and Google hold a duopoly over online advertising in part because they offer advertisers such specific categories to target as “ethnic affinity,” purchasers of certain types of over-the-counter medications, and “political leanings,” to name a few, the Regulation sends a direct message that these methods of categorizing individuals will no longer be accepted within the Union.

Although the famous “Right to Be Forgotten” has existed as a right under EU case law since the Google Spain decision in 2014, the GDPR codifies this right as the “Right to Erasure” and attaches the Regulation’s substantial regulatory fines and judicial remedies to it.  In Google Spain SL v. Agencia Española de Protección de Datos, the European Court of Justice held that, upon request of a data subject, Google must delete links to search results relating to that subject, so long as there is no strong public interest that suggests the information remain public.  Under the Regulation, when data is no longer necessary for its processing purposes, the subject withdraws consent to the processing, or there are no longer legal grounds for the processing, data subjects have a right to erasure of that data, so that they may be “forgotten” (Art. 17(1)).  The right to erasure does not apply to processing that is necessary for exercising the right of “freedom of expression and information” (Art. 17(3)).  However, where the erasure requests meet the enumerated requirements, the erasure is obligatory and must be completed “without undue delay” (Art. 17(1)).

While most of the rights and requirements present in the GDPR already existed in the DPD, the Regulation adjusted the regulated behavior and the fines for noncompliance to apply more effectively to the post-data-revolution context.  The fines of a few hundred thousand euros that once rendered Germany and Spain the EU’s strictest privacy regulators no longer matter to a company like Google, whose parent company Alphabet just surpassed 80 billion euros in annual revenue this year.  Both Google and Facebook have generated over 95 percent of their annual revenue in the past several years from selling targeted advertising opportunities online.  Together, the Silicon Valley giants make up nearly 70 percent of the online advertising market, and their immense success in this area can largely be traced to the advantages that their comprehensive personal data sets give them.  As a result, for the most prominent companies in the behavioral advertising space, the incentives not to collect such information must outweigh the irresistible temptation to collect all potentially valuable personal data without inhibition.  The GDPR attempts to provide such an incentive by specifically addressing the common data collection practices of the world’s largest personal data hoarders and creating fines for violations that have already garnered their attention.  US tech giants are aware that the GDPR’s drafters wrote the Regulation with them in mind, and for this reason, they likely understand that inaction is not an option.

Sara Lynch is a second-year J.D. student at Columbia Law School and a Staff Editor of the Columbia Journal of Transnational Law. She holds a B.A. from Wesleyan University, where she studied History and German.