We previously reported on Brexit's impact on data protection here and here.
Shortly before Christmas, the draft Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2019 ("Exit Regulations", available here) were laid before Parliament. In this blog, we outline the changes under the Exit Regulations and consider what impact they will have if the UK leaves the EU without a deal.
Preparing for a no-deal Brexit
As a brief reminder of the current legislative landscape in the UK, so long as the UK is in the EU the GDPR has direct effect. The Data Protection Act 2018 ("DPA") must be read alongside the GDPR and has multiple functions. Firstly, it supplements the GDPR and contains derogations that are permitted by the GDPR (such as providing for additional conditions around the processing of special categories of personal data and exemptions in respect of data subject rights). The DPA also applies a broadly equivalent regime to certain data processing activities which fall outside the scope of the GDPR. These relate to, for example, the processing of personal data for immigration purposes and manual unstructured data held by a public authority covered by the Freedom of Information Act 2000. Finally, the DPA covers processing by law enforcement bodies and UK intelligence services.
In the event that the UK leaves the EU without a withdrawal agreement, the GDPR will form part of UK domestic law as 'retained EU law' ("UK GDPR") by virtue of section 3 of the EU (Withdrawal) Act 2018 ("EUWA"). However, in its current form the UK GDPR will not function effectively on the day that the UK leaves the EU ("exit day", currently scheduled for 29 March 2019) due to the numerous references to EU laws and institutions and the fact that the UK will cease to be a Member State of the EU. The Exit Regulations, made under powers conferred in the EUWA, will be required to ensure that the UK's legal framework for data protection continues to function.
If there is a withdrawal agreement, then the Exit Regulations will not come into force and the UK will instead enter the "transition period" where EU law will continue to apply as if the UK were still an EU Member State (subject to certain exceptions).
1. Territorial scope and UK representative
By virtue of the Exit Regulations, the UK GDPR will apply to controllers and processors based in the UK as well as those outside the UK which offer goods or services to individuals in the UK or monitor the behaviour of individuals in the UK. The UK GDPR will therefore continue to have extra-territorial effect in the same way as the GDPR currently does.
Many controllers and processors, whether based outside of the EEA or in one of the remaining EEA countries, will have already considered whether they are subject to UK data protection law. However, they will now also need to consider the requirement to appoint a representative in the UK. This will impact both non-EEA controllers and processors that may have already appointed a representative in an EU Member State other than the UK, and EEA controllers and processors without a UK presence (for whom this will be an entirely new obligation).
Equally, the ICO has indicated that a UK-based controller or processor with no offices or establishments in the EEA, but which offers goods or services to, or monitors the behaviour of, individuals in the EEA, will need to consider appointing a European representative.
2. The merger of the GDPR and "applied GDPR"
As previously mentioned, the DPA currently provides for two separate regimes for general processing: one for processing that falls within the scope of the GDPR and a separate, broadly equivalent regime for processing that falls outside the scope of the GDPR (the so-called "applied GDPR"). Given that EU law will not apply in the UK after Brexit, there will no longer be a need to distinguish between processing within the competence of EU law and that which is governed solely by UK law. The Exit Regulations will therefore merge these two regimes on exit day to create a single, unified regime for all general processing activities.
It is worth mentioning, however, that this will not be a complete merger. Under s6 of the EUWA, any question around the interpretation of retained EU law (including the UK GDPR) must be decided in accordance with EU case law and general principles of EU law as they apply immediately before the UK leaves the EU. The Exit Regulations indicate that this may not be the case for processing under the applied GDPR, which governs the processing of personal data in areas where the EU has no competence.
3. Data transfers outside the UK
Currently, any transfer of personal data from the UK to a country outside the EEA may only be made if that country has been granted adequacy status by the EU Commission or by using one of the "appropriate safeguards" described under Article 46 of the GDPR (such as the EU Commission's standard contractual clauses or approved BCR).
The Exit Regulations maintain the same restrictions for data transfers outside the UK (whether to a non-EEA country or a remaining member of the EEA) but ensure that data flows are not disrupted on exit day. They specify that certain countries and bodies are considered to have adequate status: these include all of the remaining EEA countries as well as Gibraltar, non-EEA countries granted adequacy status by the EU Commission before exit day, and the EU institutions and bodies. The Exit Regulations also provide that the EU's authorised standard contractual clauses and approved BCR may continue as potential mechanisms for transfers outside the UK, whether in their original form or as modified for a UK-specific context. Finally, the existing derogations under Article 49 of the GDPR will continue to be available.
Once the UK has left the EU, the Secretary of State will have sole authority to grant adequacy status (by way of regulations) in respect of transfers outside the UK and will be required to publish a list of those countries and territories it has deemed adequate. The ICO will continue to authorise BCR and will also be able to issue new UK-only standard clauses.
4. Co-operation and consistency
From exit day, the ICO will no longer be able to take part in the existing co-operation mechanism between EU supervisory authorities. Equally, the European Commission and European Data Protection Board will not have competence over the regulation of personal data in the UK. Unsurprisingly, therefore, Chapter VII - which lays the foundations of the co-operation and consistency mechanism - will be redundant and is removed entirely from the UK GDPR. Article 50, which addresses broader aims of international co-operation and mutual assistance in the area of data protection, will be retained.
Another expected amendment is the removal of provisions addressing the co-operation of Member State courts. Currently, under Article 81 of the GDPR, where proceedings are issued in a UK court against the same controller or processor and in relation to the same subject matter as a case already pending in another EU Member State, the UK court may either decline jurisdiction or suspend those proceedings until the other court has made its determination. Arguably, the removal of these provisions increases the possibility of concurrent claims in the UK and the EU.
5. ICO fines
The Exit Regulations confirm that the ICO will continue to be able to issue the same level of fines as provided under the GDPR. In particular, they state that from exit day the ICO will be able to administer fines of up to £8.7m or 2% of total worldwide annual turnover (whichever is higher) for less serious breaches and £17.5m or 4% (whichever is higher) for more serious breaches.
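As a rough illustration of how the "whichever is higher" cap operates (a hypothetical helper function, not part of the Regulations and not legal advice), the applicable maximum can be sketched as:

```python
def max_fine_gbp(annual_turnover_gbp: float, serious_breach: bool) -> float:
    """Illustrative sketch only: the cap is the higher of the fixed amount
    and the relevant percentage of total worldwide annual turnover."""
    fixed_cap = 17_500_000 if serious_breach else 8_700_000
    pct_cap = annual_turnover_gbp * (0.04 if serious_breach else 0.02)
    return max(fixed_cap, pct_cap)
```

For a group turning over £2bn, the serious-breach cap would be 4% of turnover (£80m) rather than the fixed £17.5m; for a small firm, the fixed amount governs.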
What will the new year bring in privacy and data protection legislation? Well, to name just a few highlights, we've got a handful of EU member states still needing to pass laws addressing the General Data Protection Regulation, India is in the midst of debate over a new law, Brazil's law will get the enforcement body it has been lacking, and there are talks of a U.S. federal privacy law. But that's just the tip of the iceberg. This week's Privacy Tracker roundup consists of contributions from IAPP members across the globe outlining their expectations (and occasionally their hopes) for privacy legislation in the coming year. With more than 30 contributions, this year's global legislative predictions issue is our most comprehensive yet.
By Pablo Palazzi
The year 2019 will see an important landmark in Argentina's history of data protection law. In September 2018, the government sent to Congress a data protection bill based on the EU General Data Protection Regulation. Now it is up to Congress (first the Senate, then the House of Representatives) to openly debate the bill and approve it. Argentina was the first country in Latin America to adopt a full-fledged EU-style data protection law and the first country to be considered adequate by the EU. Now, nearly 20 years later, Argentina again has the chance to follow EU law.
By Tim Van Canneyt, CIPP/E
2019 will be another important year for data protection in Belgium. First, we should finally see the appointment of the directors of the new data protection authority. At the moment, Belgium has an interim supervisory authority which is pretty much forced to act on a day-to-day management basis. When the directors of the DPA are appointed in 2019, the authority will be able to adopt its strategic vision, publish more guidelines to help companies and offer better protection to citizens. In addition, we should hopefully see the implementation of the NIS Directive into national law. Furthermore, the Belgian Supreme Court will have to assess the lawfulness of the recent government decision obliging every Belgian resident to provide their fingerprints for inclusion on the ID card's chip. Finally, it will be interesting to follow the class action brought against Facebook by consumer protection body TestAankoop.
By Renato Leite Monteiro, CIPP/E, CIPM
2019: The year of compliance and the Brazilian Data Protection Authority!
What a year! Nobody could have guessed at the beginning of 2018 that Brazil would finally have its own General Data Protection Law, known as the LGPD (I have written this column for the last three years, and I always thought my predictions were just wild guesses!). And then, out of the blue, it was approved in August. However, the president vetoed one of its pillars: the creation of a national data protection authority, even though some provisions would only take effect once the authority was created. The lack of a DPA left the LGPD weakened.
Then, at the very end of the year, on Dec. 28, 2018, Executive Order n° 869/18 made several alterations to the law. One of the most important was the creation of the Brazilian National Data Protection Authority (ANPD). It also extended the vacatio legis period for the LGPD to 24 months, moving the enforcement date from February 2020 to August 2020. During this period, the ANPD must exercise collaborative and consultative functions, assisting organisations in the process of complying with the new law. With the creation of the DPA, businesses will know whom to turn to and what to expect, and will have a direct channel of communication. The ANPD should bring a much more stable application of the law and, consequently, more legal certainty, which will probably spur technological and economic development.
Nonetheless, even with the DPA, enforcement by other bodies might continue. The Distrito Federal and Territories Public Prosecution Office has been actively investigating data breaches and other matters involving personal data. Recently, the Minas Gerais Public Prosecution Office fined a drugstore chain for exchanging customers' personal taxpayer ID numbers for discounts on products, which is in fact a common practice in Brazil. The total amount of the fine was R$7,930,801.72 (BRL), the largest related to data protection yet. The case was widely reported in mainstream media, both national and international. Such actions are likely to continue.
Also, since the LGPD will enter into force in August 2020, 2019 will be the year companies rush to become compliant, and compliance work has already become a new niche market. Consulting companies and law firms are investing heavily in personnel and privacy software to take advantage of the escalating demand. As proof, the IAPP has partnered with the first official training center in Brazil: Data Privacy Brasil will provide training courses for both the CIPP/E and CIPM certifications.
Therefore, we can say that 2019 will be an interesting year for the data protection scenario in Brazil!
UK software company Syrenis is delighted to announce the recent appointment of its new Canadian reseller, Newport Thomson. Newport Thomson serves the US, Canada and the EU, helping organisations manage risk by operationalising data and privacy best practices.
Syrenis is renowned for its Preference Centre, which has recently been expanded and rebranded to accommodate changes to the personal data and privacy landscape. Now known as Cassie, the personal information platform currently handles almost a billion marketing preferences for 118 million customer contacts worldwide, and securing knowledgeable new partners such as Newport Thomson brings the platform to more businesses with currently unmet personal data needs.
‘The level of expertise within Newport Thomson became clear from our very first conversations with them,’ says Glenn Jackson, CEO of Syrenis. ‘Naturally we’re delighted to have them on board and we look forward to a long partnership with them.’
‘Record keeping is the biggest compliance issue for any of these new laws being implemented globally, from GDPR to CCPA, including PIPEDA and CASL,’ states Derek Lackey, Managing Partner, Newport Thomson. ‘Our clients’ data and privacy infrastructure was never designed to prove consent claims or any of the other details required by these new laws. In order to be compliant, an organization must re-think the way it manages data, privacy, consent and data subject rights, and Syrenis is an exceptional solution at the right price.’
The system has been engineered to be flexible and intuitive for users, allowing for better brand consistency and a positive preference management experience. It now offers improved support functions: previously hidden features have been made more accessible, such as a bank of FAQs for reference, and users can also submit and manage support requests from within their portal interface.
‘The changing privacy landscape on the North American continent is something that we have been watching with great interest,’ Glenn continues. ‘Having a partner with such strong bases in both the US and Canada allows us to work together to deliver the best possible personal data solutions for those markets.’
More information about Newport Thomson can be found at www.newportthomson.com.
With only days to go before Vermont’s data broker regulation law takes effect, the Vermont Attorney General has finally issued guidance that explains how businesses can comply with the law and the nature of the obligations it imposes on them.
The Vermont Statute
This past May, the Vermont legislature passed the first law in the United States specifically regulating data brokers, effective January 1, 2019. Data brokers must register with the state by January 31, 2019, and annually thereafter, and must provide specified information to the state when they register.
The law also imposes certain minimum data security standards on data brokers, and prohibits data brokers – and everyone else – from acquiring certain personal information of consumers through fraudulent means or with the intent to commit wrongful acts.
What Is a Data Broker?
As we discussed in a previous Alert, the Vermont law defines “data broker” as a business that knowingly collects and sells or licenses to third parties “brokered personal information” of a consumer with whom the business does not have a direct relationship. The new guidance from the Vermont Attorney General amplifies the definition by listing four questions that can determine if a particular business is a data broker for purposes of the law. If a business answers “yes” to these questions, and if its activities do not fall within certain very limited exceptions, the business is a data broker.
The questions are:
1. Does the business handle the data of consumers with whom it does not have a direct relationship?
Data brokers collect and sell or license the data of consumers with whom they do not have a direct relationship. For example, a retailer that sells information about its own customers is not a data broker because it has a relationship with its customers.
2. Does the business both collect and sell or license the data?
A business that collects data for its own use or analysis is not a data broker. As an example, an insurance company that buys data about individuals to set rates and develop new products but that does not resell or license the data, is not a data broker.
The guidance makes clear that “collection” is a broad term, and can include the purchase or license of data from someone else, or the collection from original sources such as court records, surveys, or internet search histories.
According to the guidance, sale or license does not include a one-time or occasional sale of the assets of a business as part of a transfer of control of those assets that is not part of the ordinary conduct of the business. It also does not include a sale or license that is “merely incidental to the business.”
3. Is the data about individuals residing in Vermont?
Vermont’s data broker regulation does not apply to a company that has no data on Vermont residents or that is not otherwise subject to jurisdiction in Vermont. Importantly, the guidance suggests that a national data broker has a “non-trivial chance” of possessing Vermonters’ data. It states that if a data broker does not maintain the state of residence of individuals whose data it collects, it might presume that there may be at least one Vermonter in its data set.
4. Is the data brokered personal information (BPI)?
BPI is defined broadly. It must be computerized as well as categorized or organized for dissemination to third parties. According to the guidance, data is BPI if it contains one or more of a person’s name, address, date of birth, place of birth, mother’s maiden name, biometric information, name or address of a member of the consumer’s immediate family or household, or Social Security number or other government-issued identification number.
The guidance also provides that BPI includes “other information that, alone or in combination with the other information sold or licensed, would allow a reasonable person to identify the consumer with reasonable certainty.”
By contrast, BPI does not include publicly available information to the extent that it is related to a business or profession. For example, a doctor’s office address or phone number is not BPI, but a doctor’s home phone number (assuming it is not used for business) is BPI.
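Putting the Attorney General's four questions together, the test's overall logic can be sketched as follows (a purely hypothetical illustration with invented parameter names; an actual determination turns on the statute and its exceptions):

```python
def is_vermont_data_broker(handles_third_party_data: bool,
                           collects_and_sells_or_licenses: bool,
                           data_on_vermont_residents: bool,
                           data_is_bpi: bool,
                           exception_applies: bool = False) -> bool:
    """A business is a data broker only if the answer to all four
    questions is yes and no statutory exception applies."""
    return (handles_third_party_data
            and collects_and_sells_or_licenses
            and data_on_vermont_residents
            and data_is_bpi
            and not exception_applies)
```

So, for example, the retailer selling information about its own customers fails the first question and falls outside the definition, as does the insurer that buys data but never resells or licenses it.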
It has been a busy year for privacy and cybersecurity. Here is a look back at the highlights of 2018 and a preview of what 2019 may have in store in the United States, Europe, and China:
Privacy: This year saw a proliferation of state and federal legislative proposals aimed at protecting consumer privacy and bolstering cybersecurity protections. Notably, California passed the most sweeping privacy law in the country thus far (the California Consumer Privacy Act of 2018), and amendments to the law will continue well into next year until the law enters into force in 2020. Following the CCPA, many members of Congress and the administration began proposing their own federal-level privacy laws (including Senators Ron Wyden (D-OR) and Brian Schatz (D-HI)). It is unclear which of these proposals stands the greatest chance of moving forward in 2019, or how many additional proposals will be introduced by members of the new Congress.
Cybersecurity: As of 2018, all 50 states (plus the District of Columbia, Puerto Rico, Guam, and the U.S. Virgin Islands) have their own state breach notification laws. In addition, Ohio Senate Bill 220 entered into force in November. The new law creates a “safe harbor” from certain types of tort-based liability for any “covered entity” that implements a cybersecurity program that satisfies certain requirements. These new laws were enacted against the backdrop of the continually increasing frequency of, and costs associated with, cyber incidents.
Federal Trade Commission: The year 2018 ushered in a brand new Federal Trade Commission, which began to signal its enforcement priorities through the “Hearings on Competition and Consumer Protection in the 21st Century.” The hearings suggest that the FTC will continue to focus on privacy, and may pay closer attention to the intersection of privacy and competition. 2019 may be a particularly interesting year for the agency, as many federal privacy legislative proposals include provisions that would expand the scope of the agency’s authority and provide the agency with rulemaking authority and/or the ability to levy civil penalties.
Surveillance Law: The CLOUD Act, which was signed into law in March 2018, created a framework for government access to data held by tech companies worldwide. Next year, we may see that framework put into action as the United States considers how to approach entering into bilateral executive agreements with certain countries. In addition, the Supreme Court’s decision in Carpenter v. United States held that law enforcement must get a warrant in order to obtain cell-site location information from cell phone providers. Going into 2019, debate over the scope of the decision will continue as federal courts consider what, if any, additional types of information held by third parties may require a warrant.
Privacy in the Courts: In October 2018, a New Jersey federal court dismissed an eight-count class action complaint against smart TV makers, including a claim that the makers had violated the Video Privacy Protection Act (VPPA). In addition, in 2019 the Illinois Supreme Court will decide the statutory standing requirements under the Illinois Biometric Information Privacy Act (BIPA)—the only state biometric law that contains a private right of action.
Of course, the story of the year was the General Data Protection Regulation (GDPR) entering into force on May 25, 2018. The law radically overhauled the European Union’s data protection framework, and may have inspired similar laws and legislative proposals in countries such as Brazil and India. European regulators already are intensifying their enforcement of the GDPR, with several investigations launched and fines levied in the past few months alone.
In addition, in December 2018 the European Commission published its report on the second annual review of the EU-U.S. Privacy Shield. The report concluded that the Privacy Shield “continues to ensure an adequate level of protection” for personal data transferred from the EU to the United States. Separately, the International Trade Administration’s Privacy Shield Team released new guidance regarding how a Privacy Shield participant may rely on the Privacy Shield to receive personal data from the United Kingdom following its planned withdrawal from the EU. In particular, the guidance advised that companies that wish to receive data from the United Kingdom will need to update their privacy policies to do so.
The EU also continued to consider the privacy implications of next-generation technologies such as artificial intelligence. The Declaration on Ethics and Data Protection in Artificial Intelligence was issued at the 40th International Conference of Data Protection and Privacy Commissioners in Brussels in October 2018, and in December 2018 the EU High-Level Expert Group on AI published new draft guidance on “AI Ethics”. The non-binding guidance stresses that AI must be developed and implemented with a “human-centric approach” that results in “Trustworthy AI”, including by respecting privacy.
In 2019, the story of the year will likely be Brexit, with the United Kingdom scheduled to leave the European Union. As of the date of this post, we are unsure whether there will be a transition period, or whether the departure will be a “hard” Brexit. However, on December 13, 2018, the Information Commissioner’s Office (ICO) issued guidance on the state of UK data protection law in the event of a “hard” Brexit.
In 2018, China issued the national standard on protection of personal information (GB/T 35273-2017 Information Technology – Personal Information Security Specification), which entered into force on May 1, 2018.
For those who may have missed it, here is the public letter to Minister Bains from our Federal Privacy Commissioner regarding Canada's lack of action in the area of privacy laws. Daniel Therrien is not mincing words.
November 23, 2018
The Hon. Navdeep Singh Bains, P.C.
Minister of Innovation, Science and Economic Development (ISED)
235 Queen Street
Ottawa, Ontario K1A 0H5
Dear Mr. Bains:
Subject: ISED’s National Digital and Data Consultations
I am writing you in the context of the National Digital and Data Consultations you launched this past summer, and further to my last discussion with Deputy Minister John Knubley this fall. I have been reflecting a great deal on the Government’s overall strategy to position Canada as a global leader in an increasingly fast-paced digital and data-driven economy, and I would like to offer some views within this context.
The digital revolution is causing us to examine some of the most fundamental questions of our time. It is not an exaggeration to say that the digitization of so much of our lives is reshaping humanity. There are lofty ambitions for the power of digital technologies and big data, and its anticipated ability to drive productivity, growth and competitiveness, and improve our lives in various ways. Yet, at the same time, we have reached a critical tipping point upon which privacy rights and democratic values are at stake. Recent events have shed light on how personal information can be manipulated and used in unintended, even nefarious, ways. I am growing increasingly troubled that longstanding privacy rights and values in Canada are not being given equal importance within a new digital ecosystem eagerly focused on embracing and leveraging data for various purposes. Individual privacy is not a right we simply trade away for innovation, efficiency or commercial gain.
Global opposition to the mass collection of personal data for commercial and political purposes is growing rapidly, and even tech giants are recognizing that the status quo cannot continue. Apple Chief Executive Tim Cook recently spoke of a “data industrial complex” and warned that, “our own information, from the everyday to the deeply personal, is being weaponized against us with military efficiency.” He added, “(t)his is surveillance.” Likewise, Facebook’s Mark Zuckerberg admitted that his company committed a “serious breach of trust” in the Cambridge Analytica matter. Both companies have expressed support for a new U.S. law, similar to Europe’s General Data Protection Regulation (GDPR). You know that the ground has shifted and that we have reached a crisis point when the tech giants have become outspoken supporters of serious regulation. Now is the time to ensure we adopt the best approach for Canadians.
ISED launched its National Digital and Data Consultations this past summer with the message that, to spur digital innovation, investment, and job creation in Canada, citizens must have trust and confidence that their data and privacy will be protected. On privacy and trust, ISED asked Canadians how government should achieve the right balance between protecting privacy and innovation, as well as ways to increase citizens’ trust and confidence on data use “while not impeding innovation.” I am wary of this discourse as it suggests to Canadians that privacy is at odds with innovation, or similarly, that privacy is at one end of the spectrum and digital innovation at the other.
The Government rightly points out that Canadians must have trust and confidence that their data and privacy will be protected. However, I strongly believe that the trust needed to allow the digital economy to flourish, and the social license the government will need from Canadians to innovate with their personal data, hinges on having an appropriate legal framework in place. Yet, when it comes to effecting real legislative change in this context, the Government has been slow to act, putting at continued risk the trust Canadians have in the digital economy and confidence that our Canadian values will be preserved.
We should remember that the Canadian Charter of Rights and Freedoms and the federal Privacy Act were concurrently debated in Canada and born of the realization that privacy rights are intrinsic to other fundamental rights and values including liberty, dignity, and freedom from government intrusion. Privacy is more than a set of technical rules and administrative safeguards; it is certainly not a barrier as is often implied. Instead, it is a necessary precondition for the protection of fundamental values in Canada and worthy of legal protections. At a time when new and intrusive targeting techniques are already influencing democratic processes, and data analytics, automated decision-making technologies, and artificial intelligence are raising important ethical questions that have yet to be answered, Canadians need stronger privacy laws, not more permissive ones. Our laws should protect us when organizations fail to do so.
Under PIPEDA, organizations have a legal obligation to be transparent and accountable, but Canadians cannot rely exclusively on companies to manage their information responsibly. Transparency and accountability are necessary, but they are not sufficient. The reality is that our principles-based law is quite permissive and gives companies wide latitude to use personal information for their own benefit. While our law should probably continue to be principles-based and technologically neutral, it must be rights-based and drafted not as an industry code of conduct but as a statute that confers rights, while allowing for responsible innovation.
There is such a model emerging in the U.S., with Democratic Representatives pushing for an Internet Bill of Rights. It would be principles-based but would also establish rights for consumers. The list of rights includes opt-in consent for collection and sharing of data with a third party, a right related to data portability, a right to have personal information secured and to be notified following a security breach, a right to have an entity that collects personal information to have reasonable business practices and accountability to protect privacy, and probably most importantly, a right not to be unfairly discriminated against or exploited based on one’s personal data. To be sure, these rights would have to be supported by more comprehensive legislation and real remedies, but it is refreshing to see these proposals. They are a simple and clear way to frame principles-based legislation for privacy, compared to our industry code of practice-inspired Act which the courts have said is often difficult to interpret, and importantly, apply.
The position paper for ISED’s national consultation suggested we need an “intentional and agile approach to legislation and regulation that can assist in unlocking the full potential of the digital and data revolution.” Indeed, but I would stress that we cannot allow Canadian democracy to be disrupted, nor can we permit our institutions or rights to be undermined in a race to digitize everything and everyone, simply because technology makes this possible. Canada should simultaneously pursue privacy and innovation, and Privacy by Design is an excellent way to achieve both.
In my recent appearance before the Standing Committee on Access to Information, Privacy and Ethics (ETHI) on the study of the breach of personal information involving Cambridge Analytica and Facebook, I commented that while the EU GDPR is a major development in data protection and offers several excellent solutions, we should seek to develop an approach that reflects the Canadian context and values, including our close trading relationships within North America, with Europe, and the Asia Pacific region. Along these lines, I proposed that a new Canadian law include the following important aspects. It should:
Continue to be technology-neutral and principles-based, because these features enable the law to endure over time and create a level playing field, but it should mostly be drafted as a rights-based statute, meaning a law that confers enforceable rights on individuals, while also allowing for responsible innovation.
Maintain an important place for meaningful consent, but also provide other ways to protect privacy where consent may not work, for instance in certain circumstances involving the development of artificial intelligence. The concept of 'legitimate interest' in the GDPR may provide one such alternate approach.
Empower a public authority to issue binding guidance or rules that would clarify how general principles and broadly framed rights are to apply in practice. Principles-based legislation has important virtues, but it does not by itself provide an adequate level of certainty to individuals and organizations. Binding guidance or rules would ensure a more practical understanding of what the law requires. They could also be amended more easily than legislation as technology evolves.
Confer on the OPC stronger enforcement powers, including the power to make orders and impose fines for non-compliance with the law. These powers should include the right to independently verify compliance, without grounds, to ensure organizations are truly accountable to Canadians for the protection of their personal information.
Give the OPC the ability to choose which complaints to investigate, in order to focus limited resources on issues that pose the highest risk or may have the greatest impact for Canadians. At the same time, to ensure no one is left without a remedy, give individuals a private right of action for PIPEDA violations.
Allow different regulators to share information. Meaningful protection of consumers and citizens in the fast-paced digital and data-driven economy understandably must involve several regulators, and they must be able to better coordinate their work.
Finally, it is absolutely imperative for privacy laws to be applied to Canadian political parties.
I believe the best way for Canada to position itself as a digital innovation leader is to demonstrate how we can establish a framework for innovation that also successfully protects Canadian values and rights, and protects our democracy. I offer this feedback in an effort to promote a more balanced approach in Canada, and ensure we assign equal importance to the treatment of data as a valuable asset and the value of privacy in our society. I look forward to hearing the outcomes of your consultations with Canadians. Please note that I am ready to discuss these important issues further, and to engage on legislative reform.
Arguably the most important right the California Consumer Privacy Act provides to California residents is the right to opt-out of data sales. “Sale” is defined as “selling, renting, releasing, disclosing, disseminating, making available, transferring, or otherwise communicating orally, in writing, or by electronic or other means, a consumer’s personal information to another business or a third party for monetary or other valuable consideration.” Valuable consideration is not defined under CCPA, but the act authorizes the attorney general to provide guidelines in furtherance of the CCPA’s purpose, and it is expected that a public consultation period will open in 2019.
Absent guidelines, this article proposes a possible framework to interpret “valuable consideration” in light of existing California law.
Under contract law, one of the requirements for the formation of a contract is the existence of valuable consideration. California law defines consideration as “[a]ny benefit conferred, or agreed to be conferred, upon the promisor, by any other person, to which the promisor is not lawfully entitled, or any prejudice suffered, or agreed to be suffered, by such person, other than such as he is at the time of consent lawfully bound to suffer, as an inducement to the promisor, is a good consideration for a promise.” Moreover, where the agreement is in writing, California law provides that “[a] written instrument is presumptive evidence of a consideration.” There are many examples of contract formation with non-monetary consideration. For example, in a non-disclosure agreement, one party agrees to allow another access to confidential information (a detriment) in exchange for a service (a benefit).
Article 3 of the GDPR was written to address when the GDPR applies. In fact, it left more questions than answers, and the EDPB has delivered the long-awaited guidelines for public consultation in order to clear up some of the confusion.
Does the EDPB answer frequently asked questions on territorial scope?
The European Data Protection Board (EDPB, the successor to the Article 29 Working Party) has issued guidelines (for consultation) on one of the key foundation elements of the General Data Protection Regulation (GDPR); namely, Article 3 on territorial scope.
Article 3 is supposed to answer the important questions of when GDPR applies (depending on the location of an entity processing personal data, or of the individuals whose data is being processed). Unfortunately, Article 3 was drafted in a way that left many key concerns unanswered.
The Guidelines 3/2018 on the territorial scope of the GDPR adopted on 16 November 2018 (Guidelines) seek to answer some of those concerns.
The EDPB was somewhat delayed in issuing this much-trumpeted document. It was supposedly agreed in principle (subject to legal checks) at its plenary meeting over three months ago. Perhaps those legal checks found some issues, since it was not until the next plenary meeting (on 16 November) that the document was issued.
Thankfully, it was worth the wait – since there is some valuable guidance for those trying to navigate difficulties inherent in the drafting of Article 3.
Before turning to the Guidelines it is worth recapping Article 3. It is in two (main) parts:
Article 3(1) (the "establishment" criteria) provides that GDPR applies to processing "in the context of an establishment" of a controller or processor in the EU.
Article 3(2) (the "targeting" criteria) provides that GDPR applies to non-EU controllers or processors in two situations (i) those that offer goods or services to individuals in the EU ("targeting by selling") and (ii) those who monitor the behaviour of individuals in the EU ("targeting by monitoring").
We are an EU company, does GDPR apply to us?
Yes. Any entity incorporated or registered within the EU is, of course, "established" there.
My company is incorporated in, say, Mexico, but I have a branch or office in the EU - does GDPR apply?
Very likely, yes. Whilst "establishment" is not in fact defined, Recital 22 makes clear that
“[e]stablishment implies the effective and real exercise of activities through stable arrangements. The legal form of such arrangements, whether through a branch or a subsidiary with a legal personality, is not the determining factor in that respect"
The Guidelines reiterate this. What is important is that there is some permanent ("stable") presence, and a branch office of a non-EU company will generally fulfil this requirement. Indeed, the Guidelines suggest that a single employee or agent may be enough to indicate such a presence.
My company is a processor and incorporated in the EU, but all customers are non-EU entities – does GDPR apply?
According to the Guidelines, GDPR applies to the processor (subject to the data being processed "in the context" of the establishment) since the processor is indeed established in the EU. It is irrelevant that the controller is not in the EU for the purposes of the processor's compliance. However, using a processor in the EU does not, automatically, make the non-EU controller subject to GDPR. See below!
We are a controller, but not in the EU. However, we do have an EU sales affiliate, but that entity does not actually process personal data itself – so presumably we are both outside of scope?
Not necessarily. The Guidelines support and restate the decisions of the Court of Justice of the European Union that it is possible even for non-EU entities to be "established" in the EU.
The processing need not be by the entity which has an establishment in the EU (in this example, the EU sales affiliate); GDPR will apply to any entity involved if the processing is "in the context" of the establishment in the EU.
This is the same outcome as in the Google Spain case. All that is required is an "inextricable link" between the non-EU entity and the EU establishment. If that exists, then in effect the EU affiliate is also an establishment of the non-EU entity – and GDPR applies to the non-EU entity even if the EU affiliate plays no actual role in processing. The EDPB makes clear that the language in Article 3(1) must be understood in the context of that decision (and other decisions such as Weltimmo).
My company is established in the EU, but we only sell to individuals out of the EU – does GDPR apply?
Yes. The processing of the data about individuals is in the "context of the establishment" of your company, the controller, in the EU. The Guidelines reiterate that it is irrelevant that the data subjects are not in the EU. GDPR is in this respect "nationality blind".
The Guidelines give an example of a French company selling to individuals in North Africa – GDPR applies.
We are an EU company but outsource all our processing activities to entities outside of the EU
GDPR still applies. The processing remains in the context of the EU establishment. The location of the actual processing is irrelevant.
We are a processor outside of the EU, but our customers are within the EU
GDPR does not directly apply to the processor. This is a situation where it had been possible to read Article 3(1) as extending GDPR to a non-EU entity merely because it serves EU controllers. The Guidelines helpfully end this line of interpretation.
Whilst GDPR does not directly apply to the processor, the Guidelines emphasise the indirect application through Article 28. The controller within the EU is obliged to ensure (under Article 28) that certain data protection obligations are accepted by the processor under contract.
We are a controller outside of the EU, but we are using an EU processor
GDPR does not apply to the controller simply because it chooses to use a processor in the Union.
This is also helpful from the EDPB as, again, it had been possible to read Article 3(1) more widely (as meaning that the processor being within the EU was sufficient to make the controller subject to GDPR).
The Guidelines clarify that such a controller is outside of scope of GDPR on the "establishment" criteria (but of course if the data of individuals in the EU is processed then Article 3(2) might apply). The EU processor, however, will be subject to the GDPR (see above).
We are that EU processor (our customer is outside the EU), do we have to comply with all parts of GDPR?
There was a worry that, if the customer was not subject to GDPR, the processor might be responsible for such things as ensuring a legal basis and other controller responsibilities (since no other entity was within the EU).
The Guidelines (again) helpfully make clear that the processor only has to comply with processor obligations.
We are NOT an EU company, so GDPR does not apply to us
Not necessarily. Even if you are established outside the EU, you may still be caught by the GDPR under Article 3(2). Keep reading.
We are outside the EU and selling goods and services into the EU
Yes, clearly: under Article 3(2) it is enough that you are targeting your goods or services at individuals in the EU (see further below on "targeting").
But our services are only targeted to non-EU nationals (the diaspora of our country)
Again, GDPR is nationality blind. The Guidelines make clear that presence in the EU is enough.
OK, but we are only providing our service to US tourists whilst on vacation in the EU
This depends on whether there is targeting towards those individuals whilst in the EU or if the fact that they are within the EU is only incidental. If the key feature is to provide the service to individuals because they are within the EU, then GDPR will apply and the fact that they are only there temporarily is irrelevant.
But if tourists just happen, say, to read a US news website whilst in the EU, that will not make the site subject to GDPR. This is in fact an example given by the EDPB, and perhaps intended to discourage some well-publicised US news companies from geo-blocking EU visitors because of GDPR (see a BBC news story here).
We provide our online services from outside the EU to individuals within the EU but do not charge for them
The Guidelines reiterate that the fact that a service is free is irrelevant: GDPR will still apply if the services are targeted at individuals in the EU.
Now that six months have passed since the EU General Data Protection Regulation went into effect, gauging the potential for enforcement action is top of mind here in Brussels. Threaded throughout this opening day of the IAPP Europe Data Protection Congress have been insights from some of the EU's top data protection regulators: from European Data Protection Board Chairwoman Andrea Jelinek and newly renamed Data Protection Commissioner for Ireland Helen Dixon to representatives from the French data protection authority, the CNIL, the EDPB and the data protection wing of the European Commission.
The big takeaway? Get ready for some enforcement action in 2019.
During her interview with IAPP Chief Knowledge Officer Omer Tene, Dixon said major GDPR-related fines will not come down the pike in 2018, but it's safe to expect some fines in 2019. This notion was foreshadowed earlier in the day by the EDPB's Jelinek during her keynote address. She said the board is already working on a number of cross-border enforcement cases — Dixon separately noted there are 14 — but those cases are complicated and resolutions will come in "a few months from now."
Notably, both Jelinek and Dixon said no cross-border cases have been escalated to the EDPB. Jelinek explained that national regulators thus far have been able to collaborate without triggering any EDPB resolutions or mediation.
But that doesn't mean enforcement is far away. During a panel session on GDPR enforcement, CNIL Director of Rights Protection and Sanctions Directorate Mathias Moulin did not mince words, warning that the time for the GDPR's transition "is coming to an end," and that it's "time for action" and there will be "teeth."
Romain Robert, legal advisor to the EDPB, fleshed out what the board has been up to in the last six months. He said the EDPB is currently communicating about 350 cases on the IMI system — a network built for the supervisory authorities to exchange information. Robert also said there are 280 mutual assistance requests under Article 61 and 22 local case requests under Article 56.
No doubt, DPAs across the EU have been busy. Complaints are up, as are breach notifications. Jelinek noted that complaints have more than doubled, and notifications tripled, at the Austrian DPA. It's clear, however, that the EDPB has been focused on building its one-stop shop mechanism and seeking to set groundwork for harmonization, consistency and proportionality. Karolina Mojzesowicz, the deputy head of data protection at the European Commission, said proportionate fines and sanctions are often discussed among the regulators and that harmonization is important so there is not a difference in fining levels among national authorities.
Jelinek said "the GDPR has substantially changed the way national DPAs" work together. She also pointed out that DPAs now wear two hats: one as national regulators and the other as members of the EDPB. This "high frequency of meetings, which requires resources," helps to ensure a harmonized approach, which, in turn, she argued, will increase legal certainty for businesses.
Not everything, however, will be about enforcement in 2019. Jelinek said the EDPB knows there's a demand for more guidance. "We will continue to work with stakeholders in a more structured manner next year. ... We do not believe in an ivory-tower approach" to regulating.
"The rubber hasn't hit the road with one-stop shop, yet," Ireland's Dixon said. "We haven't had a case that requires the consensus of all 28 DPAs. It really is a case in progress ... and there are clear challenges that are involved and complex."
In addition to publishing more case studies and best practices, Ireland's Dixon said supervisory authorities need to start exploring certifications and seals under the GDPR. She also suggested that it will be helpful for DPAs to highlight good examples of GDPR compliance as well as bad ones to help the business community.
Dixon also praised some of the GDPR's near-term effects on industry: "We're seeing demonstrable efforts at accountability." And though the 72-hour breach notification requirement "has some issues," the fact there is now mandatory breach reporting "has opened our eyes to breach trends we wouldn't have been aware of" previously. For example, in the more than 3,000 data breaches it's been notified of, the DPCI has found that a large proportion of breach notifications are related to coding errors.
In her keynote address, Jelinek reflected on where the EDPB might be a few years from now: "It will be a well-established body" that will be transparent and efficient, concluding, "I'm convinced, as data protection continues to go mainstream, the IAPP members will be the ambassadors."
This is an update to our previous blogs on Brexit.
EU leaders have signed off the withdrawal agreement between the UK and the EU, as well as the political declaration on the framework for the future relationship between the UK and the EU. The political declaration is an outline of what a future EU-UK trade agreement might look like. But the trade agreement has yet to be negotiated and that process won't start until the UK has left the EU on 29th March 2019. If negotiations are quick (and successful) then the intention is that the future trade agreement between the EU and the UK would come into force at the end of the transition period (31st December 2020, but the transition period could be extended).
Next month the withdrawal agreement and the framework for the future EU-UK relationship will be put to the UK Parliament (likely on the 12th December), which will vote on whether to approve them. At the moment, the Parliamentary arithmetic looks worrying for Theresa May. If Parliament votes the deal down then the UK looks to be heading for a constitutional crisis. But we're not there yet, and the Prime Minister is doing her utmost to convince MPs and the public to back the deal. She has also staved off a potential vote of no confidence by Brexit supporting MPs in her own party, unhappy with the withdrawal agreement and the political declaration.
At the same time, the Court of Justice of the European Union is to hear a case on 27th November on the question of whether under Article 50 of the Treaty on European Union, a Member State which has given notice of its intention to withdraw from the EU can unilaterally revoke that notice.
What do the withdrawal treaty and the framework for the future relationship mean for data protection?
If the deal is secured, then data flows between the UK and the EU (as well as the rest of the world) will continue as normal between the UK's departure from the EU (29th March 2019) and the end of the transition period (i.e. until 31st December 2020, unless this period is extended).
For the future relationship (i.e. after the transition period), the intention as set out in the political declaration is that data transfers should take place on the basis of an adequacy decision. An adequacy decision means that the European Commission has determined that a country offers an adequate level of data protection, taking into account its domestic legislation and international commitments. This enables personal data to flow freely from the EU to that country. Examples of countries which already benefit from an adequacy decision include Argentina, New Zealand, Canada, and the US (for transfers made to organisations that have certified compliance with the EU-US Privacy Shield).