Last week, France’s National Data Protection Commission (“CNIL”) imposed a €50 million (roughly $57 million) fine against Google LLC for violations of the EU’s General Data Protection Regulation (“GDPR”). With regard to onboarding new Android users, the agency said Google did not live up to two of the GDPR’s key mandates: 1) transparency in personal data processing and 2) specific and unambiguous user consent.
Sidebar: Quick & Dirty Overview of GDPR
The General Data Protection Regulation concerns the processing of personal data for commercial purposes. “Personal data” under the GDPR means any information that relates to an identified or identifiable living individual, such as a full name, address, ID number, location data, IP address, or a phone’s advertising identifier. “Processing,” also a defined term, covers a range of activities, such as collecting, recording, organizing, structuring, storing, adapting or altering, retrieving, consulting, using, disclosing by transmission, disseminating or otherwise making available, alignment or combination, restriction, erasure, or destruction of personal data. The GDPR’s principles apply to data “controllers” as well as data “processors.” Here, Google was considered the data “controller” because it “determines the purposes and means” of how the data is processed.
The Regulation is enforced by national data protection authorities (DPAs), which are independent public authorities located in EU Member States. The DPAs supervise the GDPR’s application through their investigatory and remedial powers. Individuals and nonprofit advocacy organizations can bring complaints to the attention of national DPAs and allege violations of the GDPR’s protections.
Here, the relevant GDPR provisions were found in Articles 6, 7, 9, 12, and 13. Under Article 6 of the GDPR, processing of data is lawful only if one or more of six enumerated bases applies:
- consent by the data subject;
- necessity for the performance of a contract;
- necessity for the data controller to comply with law;
- necessity in order to protect the “vital interests” of the data subject or other natural person;
- necessity for the performance of a task carried out in the public interest or in the exercise of official authority, or
- necessity for the purposes of legitimate interests pursued by the data controller or a third party (with exceptions).
Under Article 9, processing of sensitive personal data, such as health data or data pertaining to racial or ethnic origin or political opinions, is subject to a general prohibition with narrowly delineated exceptions: for Google’s purposes, “explicit consent” by the data subject was required.
Article 7 gives some contour to the notion of “consent”: specifically, the controller must be able to demonstrate the data subject’s consent. In addition, consent is not freely given if it is made a condition of performing a contract when the processing is not “necessary for the performance of that contract.”
Article 12 requires controllers to provide information concerning the collection of personal data in a manner that is “concise, transparent, intelligible and easily accessible […] using clear and plain language.” And Article 13 requires the disclosure of a host of information at the moment any personal data is obtained from an individual, including:
- the identity and the contact details of the controller and, where applicable, of the controller’s representative;
- the contact details of the data protection officer, where applicable;
- the purposes of the processing for which the personal data are intended as well as the legal basis for the processing;
- where the processing is based on point (f) of Article 6(1) [legitimate interests], the legitimate interests pursued by the controller or by a third party;
- the recipients or categories of recipients of the personal data, if any;
- where applicable, the fact that the controller intends to transfer personal data to a third country or international organisation and the existence or absence of an adequacy decision by the Commission, or in the case of transfers referred to in Article 46 or 47, or the second subparagraph of Article 49(1), reference to the appropriate or suitable safeguards and the means by which to obtain a copy of them or where they have been made available.
Complaint Against Google LLC
A complaint was filed by NOYB (an acronym for “none of your business,” a fitting title), an Austrian nonprofit digital rights organization. The group, founded by Max Schrems, focuses its resources on targeted, strategic litigation to ensure the tech industry’s compliance with the GDPR. Per the group’s website, it believes that the process for enforcing data privacy rights is too complicated and expensive for individuals.
Under Article 80, the GDPR allows data subjects to be represented by qualified nonprofit organizations to enforce the law. It also allows a nonprofit to lodge complaints directly “if it considers that the rights of a data subject under this Regulation have been infringed as a result of the processing.” Here, NOYB filed one complaint against Google based on issues relating to the Android operating system, and simultaneously filed three others against Facebook, Instagram, and WhatsApp. (A similar advocacy group also filed a complaint against Google with the French authority, and CNIL’s decision addresses both complaints.)
NOYB claimed Google failed to obtain effective consent from users before processing both sensitive and non-sensitive personal data in reliance on consent as the basis of its processing. The consent failed, NOYB argued, for two reasons. First, it was not specific to the type of processing being conducted. Second, the notion of “freely given” consent was negated by the fact that the only alternative to consenting in full was not using the Android product at all (there was no way for users to even use the phone without consenting).
Additionally, NOYB argued that Google’s approach to informing users of how their data would be processed boiled down to no more than Google saying they would be “processing any data, for any purpose, on any of four legal grounds, for any of [its] products.” NOYB said that even a trained reader would have been unable to discern which data would be processed, for which purpose, and on which legal basis. Furthermore, the group stated that Google’s request for consent to conduct certain types of processing was not distinguished from its non-consent-based processing, therefore violating Article 7(2)’s requirement that consent must be “clearly distinguishable” from other matters. NOYB requested CNIL conduct an investigation, prohibit the offending processing practices, and issue an “effective, proportionate, and dissuasive” fine.
The Response from France’s DPA, CNIL
CNIL found two types of breaches. First, Google violated the obligation of transparency and information (presumably a violation of Articles 12 and 13) because:
- Google failed to provide essential information, such as the type of data collected, how the data would be used, and how long it would be stored;
- The information about processing Google did provide was spread across several documents which required users to go hunting for the information;
- Information about Google’s data collection practices was not clear or comprehensive;
- Users were not able to fully understand the extent of the processing operations carried out by Google;
- The processing operations were particularly massive and intrusive due to the number of services Google offers and the amount of data it collects;
- The legal basis of processing operations for the ads personalization was not clearly conveyed; and
- Information about the retention period for some data was not provided.
Second, Google violated the obligation to obtain a user’s consent for the use of their data (Article 6.1) because:
- The users’ consent was not sufficiently informed due to the difficulty or impossibility for users to know the full extent of how and across what platforms the data would be used;
- The consent was insufficiently “specific” or “unambiguous,” because Google failed to distinguish consent-based processing from other processing bases; and
- Google had pre-configured the users’ default privacy settings to “consent”/“agree.” The GDPR requires an affirmative action by the user for effective consent; therefore, the boxes should have been left unchecked, allowing the user to make the choice to opt in.
The authority fined Google 50 million euros.
While the amount of the fine is relatively small for a company like Google LLC, it was large enough to accomplish the apparent purpose of grabbing everyone’s attention. By now everyone is somewhat familiar with the GDPR, but the manner of enforcement has been unclear. The fine looks to be a wake-up call to all data processing organizations, even those outside the European Union, and indicates that noncompliance will be costly. (Granted, the decision could be appealed—Google has four months to do so.)
What is still somewhat unclear is how much Google’s size and vast resources led to this result. In the complaint, NOYB requested CNIL consider Google’s actions “a willful, massive and profound violation by a major player within the data industry” which “must be adequately sanctioned to prevent similar violations of the GDPR…” CNIL agreed at least in part, stating the amount of the fine was justified given the severity of the violation and the core principles violated. In addition, it emphasized Google’s responsibility to properly implement the GDPR given the many benefits Google derived from powering its Android advertising business model with user data.
Many questions still remain: how will firms heed this first warning from one of the GDPR’s enforcement authorities? How high will the next fine be? Given that the amount of a fine is discretionary (capped at the greater of €20 million or 4% of global annual revenue, which in Google’s case would have been a cool EUR 3.8 billion, roughly 4% of the FY17 worldwide revenue of Alphabet Group), the amount could climb considerably.
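The cap arithmetic above can be sketched in a few lines. Note that the revenue figure used here is an approximation of Alphabet’s FY17 worldwide revenue converted to euros, included only to show how the roughly EUR 3.8 billion ceiling is derived:

```python
# Sketch of the GDPR's maximum-fine formula for the most serious
# violations: the greater of EUR 20 million or 4% of worldwide
# annual revenue. Integer arithmetic keeps the result exact.

def gdpr_fine_cap(annual_revenue_eur: int) -> int:
    """Return the maximum possible GDPR fine, in euros."""
    return max(20_000_000, annual_revenue_eur * 4 // 100)

# Approximate Alphabet FY17 worldwide revenue in euros (assumption).
alphabet_fy17_revenue_eur = 95_000_000_000

cap = gdpr_fine_cap(alphabet_fy17_revenue_eur)
print(f"Maximum fine: EUR {cap:,}")          # ~EUR 3.8 billion

actual_fine = 50_000_000
print(f"Actual fine as share of cap: {actual_fine / cap:.2%}")
```

For a small firm, the €20 million floor dominates; for a company of Alphabet’s scale, the 4%-of-revenue branch does, which is why the €50 million fine sits well below the theoretical maximum.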
There is certainly some tension in the idea that a company must be thorough with its disclosures yet concise and plain-speaking enough for average Internet users to understand. It’s likely that corporate paternalism won’t suffice in the face of an enforcement action. Perhaps this is the point of the GDPR: minimize companies’ use of data by raising the cost of compliance. Then again, if users’ choices over which personal data they share curtail the current models of web-based revenue, much more may change than simply the landscape of enforcement actions. Will these regulatory costs make internet advertising more expensive?
For business owners covered by the GDPR, a few baseline instructions emerge from this early decision:
- Clearly explain to users:
- which data will be collected,
- exactly how it will be used or processed, and
- how long it will be stored.
- Make this information easily available; don’t make users go hunting for the information across multiple pages.
- Identify how information will be shared with or across affiliated and third-party services.
- State the legal basis for processing data (usually user consent).
- If you ask for consent, make sure it covers each processing purpose. Don’t “bundle” user consents.
- Make sure you can demonstrate users took an affirmative action to give consent. Keep records.
- Do all of this before the user can access the service.
With the GDPR in place, even U.S. companies of all sizes need to be careful with how they communicate their personal data processing practices to users and how they permit users to access the company’s websites and services. For questions about this article or help with your policies, feel free to shoot us a note, or attend one of our upcoming workshops on GDPR and state law compliance. And with that, we’re off to revise our website policies.