Jul 31, 2018 2:00 PM
by Derek Lackey
In this Privacy Tracker series, we look at laws from across the globe and match them up against the EU General Data Protection Regulation. The aim is to help you determine how much duplication of operational effort you might avoid as you work toward compliance and help you focus your efforts. In this installment, Lydia De La Torre, CIPP/US, compares the new California Consumer Privacy Act 2018 to the GDPR.
We all found out the results of the World Cup July 15, but there is a different matchup in the data protection world, the results of which will remain unknown until 2020: the EU General Data Protection Regulation and the California Consumer Privacy Act 2018.
Most data protection professionals would agree that the GDPR sets the global “gold standard” for data protection and has forced companies across the globe to significantly update their data practices and ramp up their compliance programs. Many would likely dispute whether the CaCPA deserves to be placed at the same level; honestly, it may be too early to tell. As the first U.S. attempt at a comprehensive data protection law, the CaCPA has the potential to become as consequential as the GDPR. After all, California is the fifth-largest economy in the world, the home of many technology titans, and traditionally a trend-setting state for data protection and privacy in the U.S.
Although the CaCPA incorporates some concepts that data protection professionals are familiar with, it is not modeled after the GDPR. Thus, compliance with the GDPR does not equate to compliance with the CaCPA. This article compares the scope and main features of both laws.
The territorial scope of both the CaCPA and the GDPR extends well beyond the physical borders of their respective jurisdictions.
Under the GDPR, entities established in the EU are subject to the GDPR for all their processing activities (Article 3.1). Entities that are not established in the EU but offer goods and services to, or monitor the behavior of, individuals within the EU are subject to the GDPR only to the extent they process the personal data of those individuals (Article 3.2).
The CaCPA applies to certain controllers that “do business in the State of California” regardless of where they are located but only to the extent that they process data of California residents. In other words, the “do business in California” test is the CaCPA equivalent to the GDPR’s “activities of an establishment,” but it only subjects entities to the CaCPA to the extent they process data of California residents. There is an exception in the CaCPA for conduct that takes place wholly outside of California, but it is very narrow. Controllers that do not “do business in California” are outside of the scope of the CaCPA, even if they monitor the behavior of residents, so long as such monitoring cannot be considered “doing business in California.” Processors that provide services to controllers subject to the CaCPA are subject to the CaCPA themselves, but their obligations are limited.
Although both the GDPR and the CaCPA regulate the handling of personal information there are significant differences in terms of the material scope.
For starters, the CaCPA does not expressly limit its applicability to the automated processing of data, unlike most (if not all) other data protection laws around the world. It is possible, however, that the legislature will add this requirement or that courts will read it into the statute.
The GDPR is built on three roles: controller, processor and data subject. The distinction between controller and processor is based on a factual determination. Any entity that de facto “determines the purposes and means of the processing” of personal data takes the role of controller as to that data, and any entity that processes personal data on behalf of a controller takes the role of processor as to that processing. Controllers take on the bulk of data protection responsibilities under the GDPR, but there are many requirements that apply to processors, as well.
Under the CaCPA there are four concepts: “businesses,” “service providers,” “third parties” and “consumers.” Consumers are California residents and they have rights under the CaCPA vis-a-vis organizations that hold their data — whether they have a direct relationship with them or not.
Most of the CaCPA's obligations apply only to “businesses,” which are for-profit controllers (see reference to “alone, or jointly with others, determines the purposes and means of the processing” in Sec. 1798.140(c) of the California Civil Code) that meet certain thresholds (annual gross revenue over $25M; buys, sells or receives/shares for “commercial purposes” the data of 50,000 California residents; or derives 50 percent of revenue from “selling” personal data of California residents). Once an entity in a company group qualifies as a controller, parent companies and subsidiaries may automatically qualify even if they do not meet the thresholds or act as controllers.
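As a rough illustration, the threshold test described above is a simple disjunction: meeting any one prong is enough. The function name and parameters below are hypothetical, and this is a sketch of the article's summary of Sec. 1798.140(c), not legal advice.

```python
# Hypothetical sketch of the CaCPA "business" threshold test as summarized
# above. Names and figures mirror the article's description, not the statute's
# exact wording; this is an illustration only.

def qualifies_as_business(annual_gross_revenue_usd: float,
                          ca_residents_data_handled: int,
                          revenue_share_from_selling_data: float) -> bool:
    """Return True if any one of the three statutory thresholds is met."""
    return (
        annual_gross_revenue_usd > 25_000_000          # gross revenue over $25M
        or ca_residents_data_handled >= 50_000         # data of 50,000+ residents
        or revenue_share_from_selling_data >= 0.50     # 50%+ of revenue from "selling" data
    )

# A small company with modest revenue but a busy website can still qualify:
print(qualifies_as_business(2_000_000, 60_000, 0.0))  # True
```

Note that the prongs are independent: a company that sells no data and earns under $25 million still qualifies if it handles data of 50,000 or more California residents.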
A “service provider” is a processor to a “business” that receives the data for “business purposes” under a written contract containing certain provisions. Only for-profit entities can be “service providers” under the current drafting of the CaCPA.
“Third parties” are entities other than "businesses" or “service providers” and they are only subject to the CaCPA to the extent that they receive data from a “business.”
To summarize, if we were to translate the CaCPA into GDPR jargon, a “consumer” is a data subject, a “business” is a controller that meets certain requirements, and also includes some entities in the controller’s group; a “service provider” is a processor for a “business” that meets certain requirements; and a “third party” is any entity that is neither a “business” nor a “service provider.”
Another definitional difference concerns “personal data.” The definition of personal data is expansive in the CaCPA. The CaCPA states that personal data “identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household,” and then provides a long list of examples of specific pieces of information that are to be considered personal data — including not only IP addresses, cookies, beacons and pixel tags that can be used to recognize a data subject but also things like “probabilistic identifiers” and “gait patterns.” This definition is potentially broader than the definition of personal data under the GDPR.
Data processing principles
One of the most striking differences between the CaCPA and the GDPR is that the CaCPA does not contain data processing principles and, in fact, imposes few restrictions on what a “business” can do internally with personal data. However, the CaCPA authorizes the California Attorney General to issue guidance on the law. It would make sense for that guidance to describe the CaCPA's data protection principles, but we will have to wait and see.
The GDPR, like the 1995 Data Protection Directive, sets the rule that processing personal data is illegal unless the processing can be justified under one of six lawful bases. The CaCPA does not contain any similar provision; the general rule is that processing is allowed. It does, however, allow California residents to opt out of certain types of processing (what the CaCPA defines as a “sell”).
Data subject rights
The GDPR contains the traditional rights of access, rectification, erasure and objection, which are a common feature of most comprehensive data protection frameworks around the world. It also includes additional rights such as the right to data portability and the so-called “right to an explanation.”
The CaCPA confers six rights on California residents. The first one, the right to access personal data, is very similar to the access rights under the GDPR but the others are not. For example, the CaCPA contains a right to cancel (erase) data but it only applies to data that is collected by a “business” “from” the California resident exercising the right. What exactly that means is not clear at this point but we can anticipate a debate over whether data collected by CCTV cameras or data scraped from online public profiles is subject to the CaCPA's erasure right. One thing we can know for sure is that the CaCPA would not support a case like Spain's Costeja case, because Google did not collect the now famous (or infamous) newspaper bankruptcy report from Mr. Costeja but from a third party. One final point: The exceptions to the right to erase under CaCPA are also very different from the grounds that justify erasure and the balancing tests built into the GDPR and will require separate analysis.
The CaCPA contains two rights to know: The right to know what information has been collected, and the right to know what information has been shared. These rights are fairly prescriptive; however, the current version of the CaCPA contains contradictions that make providing a clear interpretation of exactly what will have to be disclosed impossible. What seems clear is that businesses will have to evaluate their practices to identify what sharing is to be considered for “business purposes” and what sharing is to be considered for “commercial purposes” under the CaCPA, as those two purposes will need to be separately disclosed.
As opposed to the GDPR, the CaCPA allows businesses to “sell” personal data but gives individuals the right to opt out of (or, in the case of minors under 16, the option to opt in to) the selling of their data (referred to as "the right to say no"). In GDPR terms, this right would be a limited version of the right to restrict processing under Article 18. The definition of a “sale” is not clear; it refers to transfers to “third parties” or “other businesses” for “monetary or other valuable consideration," and guidance from the California attorney general on this point is expected.
As with the GDPR, the CaCPA does not allow for discrimination against individuals who exercise their rights under the act. The CaCPA expressly allows for financial incentives so long as they are not “unjust, unreasonable, coercive, or usurious in nature.” The CaCPA's provisions on discrimination are unclear and somewhat contradictory. For example, the CaCPA states specifically that businesses are not prohibited from “charging a consumer a different price or rate, or from providing a different level or quality of goods or services to the consumer if that difference is reasonably related to the value provided to the consumer by the consumer’s data.” Exactly what value a consumer's own data provides to that consumer is unclear.
Similar to the GDPR, the CaCPA assigns responsibility for enforcement to a governmental authority: the California Attorney General’s Office. Civil penalties can be significant under the CaCPA as they may reach up to $7,500 per violation. We will have to wait and see whether the attorney general will pursue a hard-line approach to enforcement or whether it will be moderate —since the attorney general is an elected position, we can anticipate that the approach will be somewhat dependent on the political winds at the time.
As opposed to the GDPR, the CaCPA does not create a private right of action except for data breaches. Specifically, the CaCPA allows any consumer whose “nonencrypted or nonredacted personal information” is “subject to an unauthorized access and exfiltration, theft, or disclosure as a result of the business’s violation of the duty to implement and maintain reasonable security procedures and practices appropriate to the nature of the information” to sue to recover statutory damages between $100 and $750 per consumer per incident or actual damages, whichever is greater, and obtain other forms of relief. Service providers are not exposed to the private cause of action as it only applies to “businesses.” The plaintiff’s bar likely has high hopes for this provision. Companies that suffer a breach will see litigation on the basis of the CaCPA and face significant potential exposure in terms of damages awards (think “TCPA-plus”).
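The exposure arithmetic behind that "TCPA-plus" concern is straightforward: statutory damages of $100 to $750 per consumer per incident, or actual damages, whichever is greater. The sketch below is an illustrative back-of-the-envelope calculation; the function name and inputs are assumptions for illustration, not statutory terms.

```python
# Rough exposure range under the CaCPA breach cause of action described above:
# statutory damages of $100-$750 per consumer per incident, or actual damages
# if greater. Illustrative arithmetic only, not legal advice.

def breach_exposure(consumers_affected: int,
                    actual_damages_total: float = 0.0) -> tuple:
    """Return the (low, high) range of potential damages for one incident."""
    statutory_low = consumers_affected * 100
    statutory_high = consumers_affected * 750
    # A plaintiff recovers the greater of statutory and actual damages.
    return (max(statutory_low, actual_damages_total),
            max(statutory_high, actual_damages_total))

low, high = breach_exposure(100_000)
print(f"${low:,} - ${high:,}")  # a 100,000-record breach: $10,000,000 - $75,000,000
```

Even a mid-sized breach therefore produces eight-figure statutory exposure before any actual damages are considered, which is why the plaintiff's bar is watching this provision closely.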
The private cause of action has many requirements, the most important being that potential plaintiffs must first notify the attorney general of their desire to sue, and they cannot proceed with their lawsuits if the attorney general prosecutes within six months. There is debate about the legality of these requirements, and we will likely see them challenged in court by the plaintiff’s bar.
In short, the CaCPA is the first overarching U.S. data protection law but it is significantly different from other data protection laws like the GDPR. It will require companies doing business in California to invest in compliance. Nobody should assume that being GDPR compliant makes them CaCPA compliant.
Read The Full Article
Jul 9, 2018 1:00 PM
by Derek Lackey
Lothar Determann provides a well-written account of the new CCPA 2018 in an article on the IAPP website.
Broad data and business regulation, applicable worldwide
As of January 1, 2020, companies around the world will have to comply with additional regulations related to processing of personal data of California residents. Pursuant to the California Consumer Privacy Act of 2018, companies have to observe restrictions on data monetization business models, accommodate rights to access, deletion, and porting of personal data, update their privacy policies and brace for additional penalties and liquidated damages. The California Legislature adopted, and the governor signed, the bill on June 28, 2018, after an unusually rushed process, in exchange for proposed initiative measure No. 17-0039, the Consumer Right to Privacy Act of 2018 (also known as the "ballot initiative"), being withdrawn from the ballot the same day, which was the deadline for such withdrawals prior to the November 6, 2018 election.
In total, the California Consumer Privacy Act adds to the California Civil Code about 10,000 new words and Sections 1798.100 to 1798.198. In recitals, the legislature acknowledges some of the many existing data privacy laws that California has already enacted over the years. California has led the United States and often the world in codifying privacy protections, enacting the first laws requiring notifications of data security breaches (2002) and website privacy policies (2004). In the operative section of the new law, however, the California Consumer Privacy Act's drafters did not address any overlap or inconsistencies between the new law and any of California's existing privacy laws, perhaps due to the rushed legislative process, perhaps due to limitations on the ability to negotiate with the proponents of the Initiative. Instead, the new Cal. Civ. Code §1798.175 prescribes that in case of any conflicts with California laws, the law that affords the greatest privacy protections shall control. Notably, Cal. Civ. Code §1798.194 instructs courts that the new law "shall be liberally construed to effectuate its purposes."
Consequently, companies, privacy officers, lawyers and others will have to deal with an even more complex and fragmented privacy law landscape in California, and therefore in the United States and the world. Given that California is the world's fifth-largest economy, behind only the United States as a whole, China, Japan and Germany, most global companies will have to continue to do business in California.
This initial overview answers four key practical questions that my California Privacy Law guide and commentary already covers with respect to California's other privacy laws. It then offers some basic comparisons to the ballot initiative and the EU General Data Protection Regulation, and concludes with my outlook. I welcome your thoughts, questions and comments and look forward to additional views and insights.
Who and what data is protected?
Principally, all California residents are protected under the California Consumer Privacy Act with respect to any information that relates to them.
Californians are not just protected in their roles as consumers, but also as employees, patients, tenants, students, parents, children, etc. Cal. Civ. Code §1798.140(g) defines "consumer" as any "natural person who is a California resident, as defined in Section 17014 of Title 18 of the California Code of Regulations, as that section read on September 1, 2017, however identified, including by any unique identifier." According to these regulations, a “resident” "includes (1) every individual who is in the State for other than a temporary or transitory purpose, and (2) every individual who is domiciled in the State who is outside the State for a temporary or transitory purpose," subject to a number of clarifications and specifications set forth in Section 17014 of Title 18 of the California Code of Regulations.
Unlike other sections of the California Civil Code, however, and unlike most other privacy laws, Cal. Civ. Code §1798.140(o)(1) defines the term "personal information" broadly as "any information that ... relates to ... a particular consumer or household." Data can be protected even if it does not relate to a single individual (since "households" are covered) and if it does not contain a name. For example, annual water or energy consumption of a household, a particular employee's job description, an Internet Protocol address, web browsing history and "purchasing tendencies" will be regulated as personal information, even if no names are associated with it.
A number of limited and complex exceptions apply to this definition. For example, "publicly available information" (Cal. Civ. Code §1798.140(o)(2)) and "commercial conduct [that] takes place wholly outside of California" (Cal. Civ. Code §1798.145(a)(6)) are excluded.
Who must comply?
Companies around the world have to comply with the California Consumer Privacy Act if they receive personal data from California residents and if they — or their parent company or a subsidiary — exceed one of three thresholds: (a) annual gross revenues of $25 million; (b) obtaining personal information of 50,000 or more California residents, households or devices annually; or (c) deriving 50 percent or more of annual revenue from selling California residents’ personal information. Parent companies and subsidiaries using the same branding are covered in the definition of "business," even if they themselves do not exceed the applicable thresholds.
(a) Companies must comply if they have annual revenues in excess of U.S. $25 million. It is not clear whether this number needs to include only their California revenue or global sales. While Cal. Civ. Code § 1714.43(a)(1) defines the scope of the California Transparency in Supply Chains Act expressly in reference to "annual worldwide gross receipts," the new Cal. Civ. Code §1798.140(c)(1)(A) merely refers to "annual gross revenues" without an expanding reference to "worldwide." Yet, a limiting reference to California is also missing, as for example contained in Section 17942(a)(2) of the California Revenue and Taxation Code, which refers to "total income from all sources derived from or attributable to this state."
(b) A company would need to comply if it obtains personal information of at least 50,000 California residents annually. Companies may pass this threshold more quickly than anticipated because the scope of personal information is broad. Most companies operate websites and inevitably capture IP addresses. Notably, companies need to comply regardless of whether the website targets businesses or individual customers in California, given that the term "consumer" is defined to mean any "resident." Even individual bloggers and relatively small businesses outside California may find it difficult to ensure that they do not receive personal information of more than 50,000 California resident visitors to their websites annually, simply because those sites are passively accessible from there; and, within California, most retailers, fitness studios, music venues and other businesses will meet this threshold.
(c) Companies can also be subject to the law based on whether they sell California residents' personal information. A relatively small company in California may need to comply if it derives more than 50 percent of its annual revenue from selling California residents’ personal information. “Selling” is defined broadly to mean any disclosing or making available for monetary or other valuable consideration, subject to a number of exceptions set forth in Cal. Civ. Code §1798.140(t), including consumer-directed disclosures to third parties that do not sell the personal information, limited sharing with service providers, and business transfers in bankruptcy, M&A and similar transactions.
(d) A company without a physical presence or affiliate in California may be able to avoid complying with the statute, however, if it can ensure that its "commercial conduct takes place wholly outside of California" for purposes of Cal. Civ. Code §1798.145(a)(6) and that it is not doing "business in the State of California" for purposes of Cal. Civ. Code §1798.140(c)(1). Most U.S. companies will find it difficult to determine that they are not doing business in the state of California, because the term "doing business" is generally understood very broadly. According to California Revenue and Taxation Code Section 23101(a) an out-of-state company is "doing business in California if it actively engages in any transaction for the purpose of financial or pecuniary gain or profit in California." Furthermore, according to California Corporations Code Sections 191(a), 15901.02(ai) and 17708.03(a), companies outside of California and qualified to do business in California may be subject to the law if they enter "into repeated and successive transactions" in California, which could occur remotely and online.
How to comply?
Companies will need to take a number of affirmative steps to comply with the new requirements, including the following:
Prepare data maps, inventories or other records of all personal information pertaining to California residents, households and devices, as well as information sources, storage locations, usage and recipients, to add newly required disclosures to privacy policies, to prepare for data access, deletion, and portability requests, to secure prior consent for data sharing from parents and minors and to comply with opt-out requests to data sharing.
Consider alternative business models and web/mobile presences, including California-only sites and offerings, as suggested in Cal. Civ. Code §1798.135(b) and charges for formerly free services to address the complex and seemingly self-contradictory restrictions set forth in Cal. Civ. Code §1798.125 on a company's ability to impose service charges on California residents who object to alternate forms of data monetization.
Make available designated methods for submitting data access requests, including, at a minimum, a toll-free telephone number, pursuant to Cal. Civ. Code §1798.130(a).
Provide a clear and conspicuous “Do Not Sell My Personal Information” link on the business’ Internet homepage, that will direct users to a web page enabling them, or someone they authorize, to opt out of the sale of the resident’s personal information, per Cal. Civ. Code §1798.135(a)(1).
Fund and implement new systems and processes to comply with the new requirements, including to:
Verify the identity and authorization of persons who make requests for data access, deletion or portability.
Respond to requests for data access, deletion and portability within 45 days.
Avoid requesting opt-in consent for 12 months after a California resident opts out, per Cal. Civ. Code §1798.135(a)(5).
Update privacy policies with newly required information, including a description of California residents' rights per Cal. Civ. Code §1798.135(a)(2).
Determine the age of California residents to avoid charges that the company "willfully disregards the California resident’s age" and implement processes to obtain parental or guardian consent for minors under 13 years and the affirmative consent of minors between 13 and 16 years to data sharing for purposes of Cal. Civ. Code §1798.120(d); companies can try to obtain parental consent by providing a consent form to be signed by the parent and returned via U.S. mail, fax, or electronic scan.
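The age tiers in that last step can be summarized as a simple decision rule: under 13 requires parental or guardian opt-in, 13 to 16 requires the minor's own affirmative opt-in, and 16 and over may only opt out. The helper below is a hypothetical sketch of that rule as described above, not an implementation of the statute.

```python
# Illustrative decision helper for the Cal. Civ. Code Sec. 1798.120(d) age
# tiers described above. The function name and return strings are assumptions
# for illustration; consult the statute for the operative language.

def sale_consent_required(age: int) -> str:
    """Return the consent needed before "selling" a California resident's data."""
    if age < 13:
        return "parental or guardian opt-in consent"
    if age < 16:
        return "minor's own affirmative opt-in consent"
    return "none up front; consumer may opt out"

print(sale_consent_required(12))   # parental or guardian opt-in consent
print(sale_consent_required(15))   # minor's own affirmative opt-in consent
print(sale_consent_required(30))   # none up front; consumer may opt out
```

The practical consequence is that a business must be able to estimate a user's age reliably enough to route each consumer into the right tier, or risk a claim that it "willfully disregards the California resident's age."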
What sanctions and remedies do companies face?
According to the new Cal. Civ. Code §1798.155, companies can be ordered in a civil action brought by the California Attorney General's Office to pay penalties of up to $7,500 per intentional violation of any provision of the California Consumer Privacy Act or, for unintentional violations that the company fails to cure within 30 days of notice, $2,500 per violation under Section 17206 of the California Business and Professions Code. Twenty percent of such penalties collected by the State of California shall be allocated to a new "Consumer Privacy Fund" to fund enforcement.
According to the new Cal. Civ. Code §1798.150, companies that become victims of data theft or other data security breaches can be ordered in civil class action lawsuits to pay statutory damages between $100 and $750 per California resident and incident, or actual damages, whichever is greater, and any other relief a court deems proper, subject to an option of the California Attorney General's Office to prosecute the company instead of allowing civil suits to be brought against it.
Companies, activists, associations and others can be authorized to exercise opt-out rights on behalf of California residents according to Cal. Civ. Code §1798.135(c).
California Consumer Privacy Act and the initiative compared
The California Consumer Privacy Act implements many principles originally contained in the ballot initiative, but it notably moves the effective date from August 2019 to January 1, 2020. The California legislature can modify the California Consumer Privacy Act by simple legislative majority, whereas the initiative would have required another voter ballot or a 70 percent legislative majority and only allowed modifications that "are consistent with and further the intent of this Act." Also, the California Consumer Privacy Act scales back the options and incentives for enforcement through private litigation and provides greater differentiation in its restrictions regarding offers of charge-free and for-charge versions of services, depending on whether consumers opt out of or into data sharing (whereas the initiative contained an absolute prohibition of charges for consumers who opt out of data sharing). On the other hand, the California Consumer Privacy Act lowered the "big company threshold" to $25 million annual revenue whereas the Initiative had contemplated a $50 million threshold. Sponsors of the initiative have published additional comparison points.
California Consumer Privacy Act and EU GDPR compared
Companies around the world have been working feverishly on taking steps to comply with the EU General Data Protection Regulation, the first significant update of data protection laws in Europe for more than 20 years. The GDPR took effect on May 25, 2018, and required significant changes to documentation and data handling practices.
Some companies implemented many of their new privacy protection measures worldwide in the hopes of avoiding further jurisdiction-specific updates for a while. The passage of the California Consumer Privacy Act has now raised the question of whether those GDPR-related compliance measures will be sufficient for the California residents they reach. Unfortunately, the answer is largely, "No."
Global companies can and should try to address the requirements of the California Consumer Privacy Act, EU GDPR and other privacy regimes simultaneously and holistically in the interest of efficiency. But companies cannot just expand the coverage of their EU GDPR compliance measures to residents of California. For example, the California Consumer Privacy Act:
Prescribes disclosures, communication channels (including toll-free phone numbers) and other concrete measures that are not required to comply with the EU GDPR.
Contains a broader definition of "personal data" and also covers information pertaining to households and devices.
Establishes broad rights for California residents to direct deletion of data, with differing exceptions than those available under GDPR.
Establishes broad rights to access personal data without certain exceptions available under GDPR (e.g., disclosures that would implicate the privacy interests of third parties).
Imposes more rigid restrictions on data sharing for commercial purposes.
The EU GDPR leaves companies with the discretion to offer consumers a choice between for-charge services and charge-free services conditioned on informed, voluntary, specific and express consent to data monetization. Under Cal. Civ. Code §1798.125(a)(1), on the other hand, a "business shall not discriminate against a consumer because the consumer exercised any of the consumer’s rights ... including ... by ... charging different prices or rates ... including through the use of discounts or other benefits. ..." Under the California regime, if companies want to continue offering a charge-free service to Californians, companies cannot rely on revenue from data sharing or other usage to fund the service, because Californians can opt out of data sharing and demand data deletion. According to Cal. Civ. Code §1798.125(b)(3), companies may offer financial incentives to California residents, including compensation, for the collection or sale of their personal information, but only if they obtain prior opt-in consent which may be revoked by the customer at any time.
Companies around the world will need to start working right away to assess the California Consumer Privacy Act's impact on their business, systems and data handling practices. A year and a half is not a lot of time, as anyone who has been working on EU GDPR compliance knows well.
The California legislature should also start working right away on repealing, or at minimum simplifying and aligning, the provisions within the California Consumer Privacy Act, as well as the dozens of existing privacy laws that are now partially or fully obsolete and create unnecessary complexities for companies within and outside California. This is particularly true now that California has moved from sector- and harm-specific privacy legislation to a much broader, comprehensive privacy regime. To keep its leadership role in privacy protections, the California legislature also has to play its part in keeping privacy protections and doing business in California manageable.
At the same time, Congress should consider whether it is time for a federal privacy law that harmonizes or preempts the fragmented landscape of divergent state privacy laws, including the 49 different state laws on data breach notifications that followed California's 2002 law and make it unnecessarily difficult and costly for companies to deal with crisis situations when they become victims of cyber attacks and data theft. The U.S. deliberately decided against overbroad federal data regulations from the 1970s until recently, and many good reasons against preemption still exist. However, the California Consumer Privacy Act may drive the fragmentation of state law to a level at which consensus coalesces around suitable federal regulation.
Last but not least...
Read The Full Article
Lothar Determann practices and teaches international technology, commercial and intellectual property law. At Baker & McKenzie LLP in San Francisco and Palo Alto, he has been counseling companies since 1998 on taking their products, business models, intellectual property and contracts international, as well as on related commercial and compliance matters. He is admitted to practice in California and Germany. For more information see www.bakermckenzie.com. Prof. Determann has been a member of the Association of German Public Law Professors since 1999 and teaches Data Privacy Law, Computer Law and Internet Law at Freie Universität Berlin (since 1994), UC Berkeley School of Law (Boalt Hall, annually since 2004), Hastings College of the Law (since 2010), Stanford Law School (2011) and University of San Francisco School of Law (2000-2005). He has authored more than 100 articles and treatise contributions as well as five books, including Determann’s Field Guide to Data Privacy Law (2nd Edition, 2015) and California Privacy Law - Practical Guide and Commentary (2016).
Jul 9, 2018 8:00 AM
by Derek Lackey
Editor's note: This story was updated at 10:30 a.m. on July 6 to reflect comments from the Department of Commerce.
While the U.S. was busy celebrating Independence Day July 4 with barbecues and fireworks, the European Parliament was debating the future of the Privacy Shield deal. The conclusion? Today, Parliament voted for its suspension.
The non-binding resolution passed by 303 votes to 223, with 29 abstentions, and calls on the executive arm of the EU, the European Commission, to suspend the data-sharing deal "unless the U.S. is fully compliant" by Sept. 1.
Privacy Shield is the “gentlemen’s agreement” that came into force in 2016 after Safe Harbor was struck down. Like its predecessor, the arrangement allows the transfer of personal data from the EU to U.S. companies that have promised to adhere to European data protection standards.
However, Privacy Shield has been dogged by controversy since its inception, and Parliament’s own civil liberties committee found that the current Privacy Shield arrangement “does not provide the adequate level of protection.” This view has likely been reinforced by three recent hearings on the Facebook-Cambridge Analytica scandal where MEPs were left vexed by a lack of clear answers.
In their resolution, MEPs emphasized “the need for better monitoring of the agreement, given that both companies are certified under the Privacy Shield,” and expressed concern that “data breaches may pose a threat to democratic processes if data is used to manipulate political opinion or voting behaviour.”
The recent adoption of the U.S. Clarifying Lawful Overseas Use of Data Act (CLOUD Act), which allows police access to personal data across borders, is also a worry and potentially in contravention of EU data protection laws.
British MEP Claude Moraes, who chairs the civil liberties committee and spearheaded the action against Privacy Shield, was pleased with the vote.
“This resolution makes clear that the Privacy Shield in its current form does not provide the adequate level of protection required by EU data protection law and the EU Charter. Progress has been made to improve on the Safe Harbor agreement but this is insufficient to ensure the legal certainty required for the transfer of personal data. The law is clear and, as set out in the GDPR, if the agreement is not adequate, and if the US authorities fail to comply with its terms, then it must be suspended until they do.”
A spokesperson for the U.S. Department of Commerce told The Privacy Advisor that Commerce is "surprised and disappointed the European Parliament disregarded the considerable information we provided — at the Parliament's express request — regarding the Trump Administration's commitment to the full functioning of Privacy Shield." Commerce called the information in the resolution "inaccurate and misleading" and said it "creates uncertainty for both U.S. and EU companies and consumers, and puts at risk the world's largest commercial relationship."
Paul Breitbarth, director at Nymity, said of the September 1 date: "That's a pretty short deadline to renegotiate the Privacy Shield to make it GDPR compliant as well. The Shield is still based on the now defunct directive 95/46, and I doubt that they will be able to manage before the deadline. It takes two to tango, and the current U.S. administration does not seem to have privacy front of mind so far, unfortunately."
Under the GDPR, important new notions like the right to data portability and additional obligations on data controllers, including the need to carry out data protection impact assessments and to comply with the principles of privacy by design and privacy by default, should be included in the Privacy Shield. That would require a renegotiation and approval, all within a few weeks. Breitbarth was skeptical such a feat could be managed.
However, not all MEPs were so keen to see Privacy Shield overhauled to make it fit for purpose. ECR Group MEP Dan Dalton called the vote “irresponsible and unrealistic,” and said it “could leave EU citizens in legal limbo."
He added, "Ultimatums from the European side may sound good to some politicians and their supporters, but in practice would be a disaster for people and businesses."
However, even Dalton concedes that the resolution “does make a number of useful recommendations to improve implementation, such as appointing a permanent Privacy Shield Ombudsman.”
The Computer & Communications Industry Association also cautioned against “a rushed suspension of this arrangement.”
CCIA Europe Senior Manager Alexandre Roure said, “Privacy Shield has extended EU privacy standards globally while safeguarding international data flows which European firms and Europe’s economy rely on."
Read The Full Article
Jul 9, 2018 8:00 AM
by Derek Lackey
The brand-new California Consumer Privacy Act of 2018, which swept through the California legislature last week with startling speed as a compromise measure to head off an even stricter ballot initiative, will apply to more than 500,000 U.S. companies, the vast majority of which are small- to medium-sized enterprises. These figures were derived by an IAPP examination of the language of the law as applied to U.S. census data about American businesses.
The new act, which provides California residents with new rights, including a right to transparency about data collection, a right to be forgotten, a right to data portability, and a right to opt out of having their data sold (opt in, for minors), applies to businesses that collect consumers’ personal information, as well as to those that sell consumers’ personal information or disclose it for a “business purpose.”
The law defines the term “business” as a for-profit legal entity that collects consumers’ personal information and does business in the state of California. For purposes of our analysis, we assume that this law does not apply to nonprofit entities, although that is not entirely clear from the definition. We also assume, consistent with well-established jurisprudence on long-arm jurisdiction, that “doing business” in California applies to companies that sell goods or services to California residents even if the business is not physically located in the state.
In addition, to fall within the law’s jurisdiction, a business must meet one of the following conditions:
Have $25 million or more in annual revenue.
Possess the personal data of more than 50,000 “consumers, households, or devices.”
Earn more than half of its annual revenue selling consumers’ personal data.
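The three thresholds above are disjunctive: meeting any one of them brings a business within the law's reach. As an illustrative sketch only (the field names are hypothetical, not statutory terms), the test might be expressed as:

```python
from dataclasses import dataclass

@dataclass
class Business:
    annual_revenue: float           # total annual revenue, USD
    records_held: int               # personal data records: consumers, households, or devices
    revenue_from_data_sales: float  # annual revenue from selling personal data, USD

def is_covered(b: Business) -> bool:
    """A for-profit entity doing business in California falls under the
    CaCPA if it meets ANY one of the three conditions described above."""
    return (
        b.annual_revenue >= 25_000_000
        or b.records_held > 50_000
        or b.revenue_from_data_sales > 0.5 * b.annual_revenue
    )
```

For example, a small retailer with $1 million in revenue but a mailing list of 60,000 California households would be covered on the records threshold alone, despite falling far short of the revenue test.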
A “consumer” is defined as a natural person who is a California resident, which is very broadly defined in a separate statute as (1) every individual who is in the state for other than a temporary or transitory purpose, or (2) every individual who is domiciled in the state who is outside the state for a temporary or transitory purpose. This definition, therefore, includes California residents while they are traveling.
The law does not apply to information already regulated under the Health Insurance Portability and Accountability Act, the Gramm-Leach-Bliley Act, the Fair Credit Reporting Act, or the Driver's Privacy Protection Act; it still applies to entities covered by these laws to the extent they collect and process other personal information about California consumers.
The most objective measure in the definition is the application to companies with $25 million or more in annual revenue. Because finding information on annual revenue of privately held companies is challenging, we followed a rule of thumb in the business-reporting world that estimates a company’s revenue by the number of employees it has. Under this assumption, a company will gross an average of at least $100,000 per employee.
Using information from the U.S. Census Bureau, we find that in 2015, 121,687 California companies had more than 500 employees, which translates to more than $50 million in annual revenue. Another 36,818 companies had between 100 and 500 employees. Assuming conservatively that just 40 percent of them had at least 250 employees or an estimated $25 million in revenue, that leaves 136,414 companies in California in 2015 that likely fall under the jurisdiction of the new law. Excluding health care companies (approximately 18 percent of the U.S. GDP and thus 18 percent of companies), we’re left with 111,859 companies.
That’s a lot of companies. But this number accounts for just the companies in California that would be affected. The law, as explained above, has far broader reach: outside of purely local businesses, few if any American companies with individual customers do not process data about consumers from California, by far the largest U.S. state. According to the latest U.S. Census Bureau data, there are 1.2 million businesses in the United States with more than 500 employees, that is, under our assumption, 1.2 million businesses with at least $50 million in annual revenue.
To be conservative, we take into account just the proportion of these companies that are in the financial services, retail, professional services or information industries, thus highly likely to process consumers’ personal data. According to official statistics, those sectors account for 44 percent of the U.S. economy, which is 528,000 companies of the 1.2 million U.S. companies that gross more than $50 million and 49,280 businesses out of the 111,859 California companies we arrived at above. We added the 49,280 California businesses to the 528,000 U.S. companies but subtracted a number equal to California’s 13.3 percent share of the U.S. economy (so as not to double count), arriving at a grand total of 507,280 companies.
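The California portion of this arithmetic can be reproduced step by step. The constants below are the article's own figures and assumptions (2015 U.S. Census Bureau counts plus its $100,000-per-employee rule of thumb), not independent data:

```python
# Back-of-envelope reproduction of the article's estimate of covered
# California companies. All inputs come from the article itself.

LARGE_CA = 121_687           # CA companies with 500+ employees (est. $50M+ revenue)
MID_CA = 36_818              # CA companies with 100-500 employees
MID_OVER_25M_SHARE = 0.40    # conservative share of mid-size firms at $25M+ revenue
HEALTHCARE_SHARE = 0.18      # HIPAA-covered share excluded from the count

# Companies likely meeting the $25M revenue threshold
ca_in_scope = LARGE_CA + round(MID_CA * MID_OVER_25M_SHARE)

# Remove health care companies, which fall under HIPAA rather than the CaCPA
ca_non_health = int(ca_in_scope * (1 - HEALTHCARE_SHARE))

print(ca_in_scope)    # 136,414 companies likely within the law's jurisdiction
print(ca_non_health)  # 111,859 after excluding health care companies
```

The national extrapolation then proceeds the same way, applying the 44-percent sector share to the 1.2 million U.S. companies and netting out California's 13.3-percent share of the economy to avoid double counting.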
Read The Full Article
Jul 8, 2018 1:00 PM
by Derek Lackey
The day finally arrived: the General Data Protection Regulation is now in place. In its first month of operation, some curious GDPR-related practices evolved, and serious consequences of the new privacy legislation started to surface.
People vs. Google, Amazon, Facebook, Apple, LinkedIn, et al.
Immediately, on the first day of the GDPR, Google and Facebook were confronted with four complaints relating to their ‘take it or leave it’ approach to obtaining users’ consent to the use of their personal data.
Max Schrems and the None of Your Business organisation filed complaints against Facebook in Belgium, Germany and Austria, and against Google in France. Schrems claimed a rough total of 7 billion euros on behalf of unnamed users of Facebook, Instagram, WhatsApp and Android.
Schrems and NOYB argue that forcing users to agree to privacy settings without a real choice as to how these applications use their personal data represents ‘forced consent’ and is in clear violation of the new European data protection legislation.
Some days later, the French activist group La Quadrature du Net filed complaints with the CNIL (the French data protection authority) against a broader circle of companies on behalf of 12,000 individuals. Naturally, Google’s Gmail, YouTube and Search, as well as Facebook, became subject to proceedings in France. The group also targeted Apple (iOS), Amazon and LinkedIn. These complaints, too, are based on ‘forced consent’. The group has disclosed plans to launch proceedings against WhatsApp, Instagram, Android, Outlook and Skype in the future.
A recent study by the Norwegian Consumer Council, titled ‘Deceived by Design’, looked closely at how Google, Facebook and Microsoft handle their users’ private information. Google and Facebook specifically, and to a lesser extent Microsoft, offer privacy-unfriendly settings by default. At the same time, they make privacy-friendly settings hard to reach by requiring users to actively dig for these options through several layers of settings menus.
The NCC concludes that neither Google, Facebook nor Microsoft makes a genuine effort to comply with the GDPR, because the nudging tactics these companies (and many others) employ run counter to the ‘privacy by default’ and ‘privacy by design’ principles.
European authorities issue their first GDPR based decisions
Meanwhile, the first cases interpreting GDPR provisions have started to appear. The highest number, currently three, comes from Germany, and several provisions have been tested in recent weeks. Below are summaries of these decisions, with links to the full texts of the rulings.
* data minimisation
The Regional Court in Bonn was the first to issue an official interpretation of the GDPR. ICANN sued EPAG, a German ICANN-accredited registrar, for refusing to collect administrative and technical contact information upon new domain name registrations. Such data can usually be found in the WHOIS directory.
WHOIS and ICANN have long been criticised for exposing the personal data stored in the database to various malicious attacks, identity theft being one frequent example. EPAG argued that, based on the ‘data minimisation’ principle embodied in the GDPR, it could collect only the domain name registrant’s data.
The court sided with EPAG, ruling that ICANN could not credibly show the necessity of collecting administrative and technical contact information: the domain registrant’s personal data is sufficient for ICANN’s purposes, including in relation to criminal offences, security breaches or other infringements. ICANN has appealed to the higher court.
* data controllers
The Court of Justice of the European Union concluded that the administrator of a Facebook page shares responsibility with Facebook for protecting the personal data of the page’s visitors.
The CJEU case originated from a claim by the Independent Data Protection Centre for the Land of Schleswig-Holstein, Germany, which ordered a German education company, Wirtschaftsakademie Schleswig-Holstein GmbH, to deactivate its Facebook page because the page stored cookies on visitors’ hard drives and accessed them to collect personal data without explicit user consent.
The education service provider argued that ‘Facebook alone decided on the purpose and means of collecting and processing personal data used for the Facebook Insights function, Wirtschaftsakademie receiving only anonymised statistical information.’
The court did not agree. Creation of a Facebook page requires page administrators to define parameters of the page depending on target audience, objectives and promotion of the page.
Facebook page admins can actively set filters that request the processing of demographic data: trends relating to age, sex, relationship status and occupation, information on lifestyles and online purchasing habits, and other data.
Although Facebook ultimately transmits only anonymised data to page admins, because the page operators set these filters, they must be categorised as data controllers ‘responsible for that processing within the European Union, jointly with Facebook Ireland.’
Although the origins of this case predate the GDPR, the CJEU’s decision tracks the provisions of the new legislation: it distinguishes between data processors and controllers, drawing specific attention to the responsibility of both.
* privacy vs. public interest
The Higher Regional Court in Cologne examined the German Art Copyright Law (Kunsturhebergesetz, or KUG), testing whether its application is affected by the GDPR. The underlying dispute appears connected to a TV programme in which an individual was aired without consent. The case was heard first in the regional court and then appealed to the higher court.
Both courts ruled that the KUG remains applicable as a sector-specific regulation. It contains provisions establishing a so-called ‘media privilege’: photographers and videographers do not need to obtain explicit consent at a recorded public event if the recording is then disclosed as part of a journalistic assignment or to serve the public interest.
It remains unclear, however, how the GDPR will affect commercial image-making outside the sphere of public interest or journalism. After all, the KUG regulates only the publication of images, not the collection of data (i.e., the images themselves).
* right to access information
The most recent decision was issued by the Austrian data protection authority a few days ago. It ordered an Austrian bank to provide historical account information to its customers free of charge.
An individual wanted to access his bank account statements for the previous five years. Since the statements could not be downloaded online, he asked the bank to provide them in person. The bank demanded a fee of 30 euros for each year of statements.
The account holder then sent the bank a formal request to access information based on the GDPR Article 15 (right to access). The bank did not respond.
The Austrian data protection authority found the bank in violation of the GDPR and ordered it to provide the requested information, free of charge, within two weeks of the date of the decision.
In November and December 2014, Yahoo! came under a cyberattack that resulted in the exfiltration of approximately 500 million user accounts worldwide. The compromised personal data included names, email addresses, telephone numbers, dates of birth, hashed passwords and, in some cases, even security questions and answers, both encrypted and unencrypted. The breach was publicly announced only two years later, in 2016.
The investigation by the UK Information Commissioner’s Office found Yahoo! UK Services Ltd. responsible as a data controller. The ICO confirmed that Yahoo! UK Services failed, over a long period, not only to take appropriate measures to protect data and to ensure that its processor in the U.S. (Yahoo! Inc.) complied with data protection standards, but also to adequately monitor the credentials of its employees with access to customer data.
This June, the ICO issued a Monetary Penalty Notice against Yahoo! UK Services, ordering it to pay a fine of £200,000.
This is a cautionary example. The Yahoo! cyberattack happened before the GDPR was in place, when the maximum penalty available under UK legislation was £500,000. Today, the penalty for this type of breach would no doubt be much higher.
Read The Full Article
Jul 7, 2018 10:00 AM
by Derek Lackey
In a last-minute action, just a few hours before a looming deadline Thursday afternoon, the California legislature passed AB 375, the California Consumer Privacy Act of 2018. As a result of its passage, Alastair Mactaggart, the man behind a November ballot initiative to pass a similar law, has agreed to pull his bill from the ballot.
In a news conference held to celebrate the bill’s passage and signature by Gov. Jerry Brown, Assembly member Ed Chau, who leads the California Assembly’s Privacy Committee, called the bill a “historic step” for California consumers, “giving them control over their personal data.” The law, he said, “forges a path forward to lead the nation once again on privacy and consumer protection issues.”
California State Senator Bob Hertzberg was downright ebullient in striking a tone of victory: “This is a huge step forward for California,” he said, “for consumers all across the country.”
Mactaggart, whom Hertzberg compared to Nelson Mandela and Mahatma Gandhi, chuckled in saying it’s “not every day you see a law made so quickly.” Indeed, he said, not more than a month ago he was convinced the ballot initiative was the only way the privacy law could be made reality. Instead, the legislature engaged only a week ago and quickly passed this sweeping legislation that brings into being significant new privacy rights for consumers.
“We have achieved a significant accomplishment,” Mactaggart said. “This is the strictest privacy bill in the history of the country.”
Assuming the law is not amended before it comes into force on January 1, 2020, the California Consumer Privacy Act would make it so:
• Consumers have the ability to request a record of what types of data an organization holds about them, plus information about what's being done with their data in terms of both business use and third-party sharing.
• Businesses will have to have a verification process so consumers can prove they are who they say they are when they do their requesting.
• Consumers have a full right to erasure, with carve-outs for completion of a transaction, research, free speech, and some internal analytical use.
• Organizations will have to disclose to whom they sell data, and consumers will have the ability to object to the sale of their data. Businesses will have to put a special "Do Not Sell My Personal Information" button on their websites to make it easy for consumers to object.
• Sale of children's data will require express opt in, either by the child, if between ages 13 and 16, or by the parent if younger than that.
• Organizations cannot "discriminate against a consumer" based on the exercising of any of the rights granted in the bill. For example, you can't provide a different level or quality of service based on a consumer objecting to the sale of their data. However, organizations could offer higher tiers of service or product in exchange for more data as long as they're not "unjust" or "usurious."
• A covered "business" is defined as any for-profit entity that either does $25 million in annual revenue; holds the personal data of 50,000 people, households, or devices; or does at least half of its revenue in the sale of personal data.
• The law would be enforced by the Attorney General and create a private right of action for unauthorized access to a consumer's "nonencrypted or nonredacted personal information." Failure to address an alleged violation within 30 days could lead to a $7,500 fine per violation (which could be per record in the database, for example).
• Finally, the law protects any "consumer," defined as a "natural person who is a California resident," which is defined as "(1) every individual who is in the State for other than a temporary or transitory purpose, and (2) every individual who is domiciled in the State who is outside the State for a temporary or transitory purpose."
Asked if companies are likely to begin compliance now or wait until 2020, Hertzberg said he thinks the bill “sets a tone … Even though it will be delayed in implementation, you will have an impact just by the virtue of its existence.”
And what about talk that the legislature may make some adjustments to the law between now and 2020?
Read The Full Article
Jul 7, 2018 9:00 AM
by Derek Lackey
MEPs call on the EU Commission to suspend the EU-US Privacy Shield as it fails to provide enough data protection for EU citizens.
The data exchange deal should be suspended unless the US complies with EU data protection rules by 1 September 2018, say MEPs in a resolution passed on Thursday by 303 votes to 223, with 29 abstentions. MEPs add that the deal should remain suspended until the US authorities comply with its terms in full.
Data breaches and the Privacy Shield
Following the Facebook-Cambridge Analytica data breach, MEPs emphasize the need for better monitoring of the agreement, given that both companies are certified under the Privacy Shield.
MEPs call on the US authorities to act upon such revelations without delay and if necessary to remove companies that have misused personal data from the Privacy Shield list. EU authorities should also investigate such cases and if appropriate, suspend or ban data transfers under the Privacy Shield, they add.
MEPs are worried that data breaches may pose a threat to democratic processes if data is used to manipulate political opinion or voting behaviour.
Concern over new US law
MEPs are also worried about the recent adoption of the Clarifying Lawful Overseas Use of Data Act (CLOUD Act), a US law that grants the US and foreign police access to personal data across borders.
They point out that the US law could have serious implications for the EU and could conflict with EU data protection laws.
Civil Liberties Committee Chair and rapporteur Claude Moraes (S&D, UK) said: "This resolution makes clear that the Privacy Shield in its current form does not provide the adequate level of protection required by EU data protection law and the EU Charter. Progress has been made to improve on the Safe Harbor agreement but this is insufficient to ensure the legal certainty required for the transfer of personal data.”
"In the wake of data breaches like the Facebook and Cambridge Analytica scandal, it is more important than ever to protect our fundamental right to data protection and to ensure consumer trust. The law is clear and, as set out in the GDPR, if the agreement is not adequate, and if the US authorities fail to comply with its terms, then it must be suspended until they do.”
The Privacy Shield is...
Read The Full Release
REPORT FROM THE COMMISSION TO THE EUROPEAN PARLIAMENT AND THE COUNCIL on the first annual review of the functioning of the EU–U.S. Privacy Shield