In March 2018, the lid blew off the Cambridge Analytica/Facebook scandal. Though data breaches at large companies are, unfortunately, no longer earth-shattering news, this was an unprecedented blow to data privacy at one of the largest social media companies, affecting 50 million users. What was so shocking this time was the departure from the familiar narrative of a hacker illegally obtaining information directly from a data controller or processor without its knowledge. Instead, citizens across the globe learned that data they had affirmatively consented to share with a third-party app provider through Facebook had been handed to an outside party and used to target voters and craft strategies in political campaigns. Users were even more shocked to learn that Facebook, despite knowing about the transfer for two years and having signed a consent decree with the Federal Trade Commission (FTC), had failed to warn its users that the information had been accessed, and had taken no steps, until the story was reported, to ensure that Cambridge Analytica destroyed the data. While much of the news coverage in the United States has focused on Cambridge Analytica, Facebook, and the Trump campaign, the story is much larger than that.
Cambridge Analytica is effectively a shell company of the SCL Group, a British company with multiple subsidiaries, such as AggregateIQ, which operated in the U.K. SCL's own website, at the time of publication, states that it has done work in over 60 countries. SCL has ties, either directly or through its subsidiaries, to political campaigns across the globe: the Leave campaign in the U.K.; elections in Indonesia, Thailand, Kenya, and India; and, in the U.S., both Ted Cruz's and Donald Trump's presidential campaigns. Potentially illegal and decidedly shady campaign tactics aside, Cambridge Analytica may have used illegally obtained private information from citizens across the globe to shape election outcomes. Even more worryingly, that data is still out there – some of it accessible to people with the means to find it, and some of it still sitting on Cambridge Analytica's servers, completely unencrypted.
Multiple suits and government investigations into the practices of both Facebook and Cambridge Analytica are currently underway.
In the United States, the FTC has opened an investigation into Facebook's privacy practices to determine whether Facebook violated the consent decree it entered into with the agency. If the FTC finds that Facebook breached the consent agreement, Facebook could be fined up to $40,000 per violation per day – potentially hundreds of millions of dollars. It remains unknown, however, whether the FTC will find a breach and, if it does, to what extent it will pursue penalties. State and local governments have also filed suits against Facebook for breach of data privacy under state and local law. In addition, two watchdog groups have filed suits against Cambridge Analytica and the Trump campaign, which hired the firm, alleging that Cambridge Analytica knowingly breached federal election law by allowing non-U.S. citizens to work on the campaign.
In Canada, the Privacy Commissioner, Daniel Therrien, has begun a formal investigation into Facebook to determine whether there was a breach of privacy laws. Commissioner Therrien has also called for more protective and comprehensive privacy laws, and a Canadian Parliamentary committee recently issued a report, championed by the Commissioner, calling for comprehensive changes to federal private-sector privacy law.
In the U.K., the Information Commissioner’s Office (ICO) executed a search warrant against Cambridge Analytica to search its offices. The ICO was seeking evidence before deciding on next steps in its “investigation into the use of personal data and analytics for political purposes.” Also in the U.K., a U.S. citizen, David Carroll, filed a suit in British courts alleging breach of his data privacy rights by SCL. His solicitor argued that this is a test case for the data privacy rights of private citizens in Britain.
In Colombia, with presidential elections impending in late May, the government took the drastic measure of blocking access to an app with ties to Cambridge Analytica's Mexican operations, to prevent user data from being shared and used to influence the upcoming election.
The suits and investigations make one thing very clear: the world has woken up to the need for more protective, expansive, and comprehensive data privacy protections. But something else is clear as well. The diversity, inconsistency, and uncertainty of these proceedings show that there is no settled answer as to who should be held accountable, by whom, and to what degree – they vary in the laws and regulations invoked, the parties targeted, and the consequences sought. Yet while the status quo may not offer clear direction, the world is not left without a compass, now that the risks and the need for regulation are known.
In April 2016, the E.U. passed the General Data Protection Regulation (GDPR), which goes into effect on May 25, 2018. The GDPR expands upon the previously enacted Data Protection Directive of 1995 – which was already more stringent than other international regulations. In a move that reflects consumer sentiment that the breached company, not just the breacher, is at fault, the GDPR holds liable both data controllers (the organizations that own data) and data processors (the organizations that manage it). While the E.U. obviously has domain only over its member states’ territories, it has effectively regulated almost all companies (about 92% of them) by expansively defining who can be held liable. The definition covers all companies with a presence in the E.U.; companies without an E.U. presence that process the personal data of people in the E.U.; companies with over 250 employees; and companies with fewer than 250 employees that engage in data processing impacting rights and freedoms more than occasionally, or that process certain types of sensitive data, including basic identity information, web data, health and genetic data, biometric data, racial or ethnic data, political opinions, and sexual orientation. The regulation also raises the maximum fine to a level that allows genuine deterrence and gives guidance for how penalties will be weighed: at the upper level, a company can be fined up to 4% of its annual worldwide turnover or 20 million euros, whichever is higher, and the GDPR lists ten criteria for determining the amount of the fine imposed on a non-compliant party. Further, the GDPR provides wide scope for individuals to bring private claims for compensation – with no requirement of financial loss – against both data controllers and processors, and allows for class-action-type suits, including suits by employees against employers for failing to protect their rights.
All in all, the GDPR greatly enhances individual protection over data and gives better guidance on which parties to hold liable, who should hold them liable, and to what degree and through what mechanisms. The Cambridge Analytica/Facebook scandal may have many dimensions, but the risk to the data privacy of private citizens is a clear issue on everybody’s minds. Action must be, and likely will be, taken in the coming months to better protect data privacy. Though the GDPR has yet to take effect and its success is yet to be proven, it already stands as a model for other countries to follow.
Kelly Tieu is currently a 2L at Columbia Law School. She completed her B.A. in Political Science and Women’s and Gender Studies at Vanderbilt University in Nashville, Tennessee in 2016.