It has been a busy year for privacy and cybersecurity. Here is a look back at the highlights of 2018 and a preview of what 2019 may have in store in the United States, Europe, and China:
Privacy: This year saw a proliferation of state and federal legislative proposals aimed at protecting consumer privacy and bolstering cybersecurity protections. Notably, California passed the most sweeping privacy law in the country thus far (the California Consumer Privacy Act of 2018), and amendments to the law will likely continue well into next year, until the law enters into force in 2020. Following the CCPA, many members of Congress and the administration began proposing their own federal privacy laws (including Senators Ron Wyden (D-OR) and Brian Schatz (D-HI)). It is unclear which of these proposals stands the greatest chance of moving forward in 2019, or how many additional proposals will be introduced by members of the new Congress.
Cybersecurity: As of 2018, all 50 states (plus the District of Columbia, Puerto Rico, Guam, and the U.S. Virgin Islands) have their own state breach notification laws. In addition, Ohio Senate Bill 220 entered into force in November. The new law creates a “safe harbor” from certain types of tort-based liability for any “covered entity” that implements a cybersecurity program satisfying certain requirements. These new laws were enacted against the backdrop of the continually increasing frequency of, and costs associated with, cyber incidents.
Federal Trade Commission: The year 2018 ushered in a brand new Federal Trade Commission, which began to signal its enforcement priorities through the “Hearings on Competition and Consumer Protection in the 21st Century.” The hearings suggest that the FTC will continue to focus on privacy, and may pay closer attention to the intersection of privacy and competition. 2019 may be a particularly interesting year for the agency, as many federal privacy legislative proposals include provisions that would expand the scope of the agency’s authority and provide the agency with rulemaking authority and/or the ability to levy civil penalties.
Surveillance Law: The CLOUD Act, which was signed into law in March 2018, created a framework for government access to data held by tech companies worldwide. Next year, we may see that framework put into action as the United States considers how to approach entering into bilateral executive agreements with certain countries. In addition, the Supreme Court’s decision in Carpenter v. United States held that law enforcement must obtain a warrant to get cell site location information from cell phone providers. Going into 2019, debate over the scope of the decision will continue as federal courts consider what, if any, additional types of information held by third parties may require a warrant.
Privacy in the Courts: In October 2018, a New Jersey federal court dismissed an eight-count class action complaint against smart TV makers, which included a claim that the makers allegedly violated the Video Privacy Protection Act (VPPA). In addition, in 2019 the Illinois Supreme Court will decide the statutory standing requirements under the Illinois Biometric Information Privacy Act (BIPA)—the only state biometric law that contains a private right of action.
Of course, the story of the year was the General Data Protection Regulation (GDPR) entering into force on May 25, 2018. The law radically overhauled the European Union’s data protection framework, and may have inspired similar laws and legislative proposals in countries such as Brazil and India. European regulators already are intensifying their enforcement of the GDPR, with several investigations launched and fines levied in the past few months alone.
In addition, in December 2018 the European Commission published its report on the second annual review of the EU-U.S. Privacy Shield. The report concluded that the Privacy Shield “continues to ensure an adequate level of protection” for personal data transferred from the EU to the United States. Separately, the International Trade Administration’s Privacy Shield Team released new guidance regarding how a Privacy Shield participant may rely on the Privacy Shield to receive personal data from the United Kingdom following its planned withdrawal from the EU. In particular, the guidance advised that companies that wish to receive data from the United Kingdom will need to update their privacy policies to do so.
The EU also continued to consider the privacy implications of next-generation technologies such as artificial intelligence. The Declaration on Ethics and Data Protection in Artificial Intelligence was issued at the 40th International Conference of Data Protection and Privacy Commissioners in Brussels in October 2018, and in December 2018 the EU High-Level Expert Group on AI published new draft guidance on “AI Ethics.” The non-binding guidance stresses that AI must be developed and implemented with a “human-centric approach” that results in “Trustworthy AI,” including by respecting privacy.
In 2019, the story of the year will likely be Brexit, with the United Kingdom scheduled to leave the European Union. As of the date of this post, we are unsure whether there will be a transition period, or whether the departure will be a “hard” Brexit. However, on December 13, 2018, the Information Commissioner’s Office (ICO) issued guidance on the state of UK data protection law in the event of a “hard” Brexit.
In 2018, China’s national standard on the protection of personal information (GB/T 35273-2017, Information Technology – Personal Information Security Specification) entered into force, taking effect on May 1, 2018.
For those who may have missed it, below is the public letter to Minister Bains from Canada’s federal Privacy Commissioner regarding the country’s lack of action on privacy law reform. Daniel Therrien is not mincing words.
November 23, 2018
The Hon. Navdeep Singh Bains, P.C.
Minister of Innovation, Science and Economic Development (ISED)
235 Queen Street
Ottawa, Ontario K1A 0H5
Dear Mr. Bains:
Subject: ISED’s National Digital and Data Consultations
I am writing you in the context of the National Digital and Data Consultations you launched this past summer, and further to my last discussion with Deputy Minister John Knubley this fall. I have been reflecting a great deal on the Government’s overall strategy to position Canada as a global leader in an increasingly fast-paced digital and data-driven economy, and I would like to offer some views within this context.
The digital revolution is causing us to examine some of the most fundamental questions of our time. It is not an exaggeration to say that the digitization of so much of our lives is reshaping humanity. There are lofty ambitions for the power of digital technologies and big data, and their anticipated ability to drive productivity, growth and competitiveness, and improve our lives in various ways. Yet, at the same time, we have reached a critical tipping point at which privacy rights and democratic values are at stake. Recent events have shed light on how personal information can be manipulated and used in unintended, even nefarious, ways. I am growing increasingly troubled that longstanding privacy rights and values in Canada are not being given equal importance within a new digital ecosystem eagerly focused on embracing and leveraging data for various purposes. Individual privacy is not a right we simply trade away for innovation, efficiency or commercial gain.
Global opposition to the mass collection of personal data for commercial and political purposes is growing rapidly, and even tech giants are recognizing that the status quo cannot continue. Apple Chief Executive Tim Cook recently spoke of a “data industrial complex” and warned that, “our own information, from the everyday to the deeply personal, is being weaponized against us with military efficiency.” He added, “(t)his is surveillance.” Likewise, Facebook’s Mark Zuckerberg admitted that his company committed a “serious breach of trust” in the Cambridge Analytica matter. Both companies have expressed support for a new U.S. law, similar to Europe’s General Data Protection Regulation (GDPR). You know that the ground has shifted and that we have reached a crisis point when the tech giants have become outspoken supporters of serious regulation. Now is the time to ensure we adopt the best approach for Canadians.
ISED launched its National Digital and Data Consultations this past summer with the message that, to spur digital innovation, investment, and job creation in Canada, citizens must have trust and confidence that their data and privacy will be protected. On privacy and trust, ISED asked Canadians how government should achieve the right balance between protecting privacy and innovation, as well as ways to increase citizens’ trust and confidence on data use “while not impeding innovation.” I am wary of this discourse as it suggests to Canadians that privacy is at odds with innovation, or similarly, that privacy is at one end of the spectrum and digital innovation at the other.
The Government rightly points out that Canadians must have trust and confidence that their data and privacy will be protected. However, I strongly believe that the trust needed to allow the digital economy to flourish, and the social license the government will need from Canadians to innovate with their personal data, hinges on having an appropriate legal framework in place. Yet, when it comes to effecting real legislative change in this context, the Government has been slow to act, putting at continued risk the trust Canadians have in the digital economy and confidence that our Canadian values will be preserved.
We should remember that the Canadian Charter of Rights and Freedoms and the federal Privacy Act were concurrently debated in Canada and born of the realization that privacy rights are intrinsic to other fundamental rights and values including liberty, dignity, and freedom from government intrusion. Privacy is more than a set of technical rules and administrative safeguards; it is certainly not a barrier as is often implied. Instead, it is a necessary precondition for the protection of fundamental values in Canada and worthy of legal protections. At a time when new and intrusive targeting techniques are already influencing democratic processes, and data analytics, automated decision-making technologies, and artificial intelligence are raising important ethical questions that have yet to be answered, Canadians need stronger privacy laws, not more permissive ones. Our laws should protect us when organizations fail to do so.
Under PIPEDA, organizations have a legal obligation to be transparent and accountable, but Canadians cannot rely exclusively on companies to manage their information responsibly. Transparency and accountability are necessary, but they are not sufficient. The reality is that our principles-based law is quite permissive and gives companies wide latitude to use personal information for their own benefit. While our law should probably continue to be principles-based and technologically neutral, it must be rights-based and drafted not as an industry code of conduct but as a statute that confers rights, while allowing for responsible innovation.
There is such a model emerging in the U.S., with Democratic Representatives pushing for an Internet Bill of Rights. It would be principles-based but would also establish rights for consumers. The list of rights includes opt-in consent for the collection and sharing of data with a third party, a right related to data portability, a right to have personal information secured and to be notified following a security breach, a right to have entities that collect personal information maintain reasonable business practices and accountability to protect privacy, and, perhaps most importantly, a right not to be unfairly discriminated against or exploited based on one’s personal data. To be sure, these rights would have to be supported by more comprehensive legislation and real remedies, but it is refreshing to see these proposals. They are a simple and clear way to frame principles-based legislation for privacy, compared to our industry code of practice-inspired Act, which the courts have said is often difficult to interpret and, importantly, to apply.
The position paper for ISED’s national consultation suggested we need an “intentional and agile approach to legislation and regulation that can assist in unlocking the full potential of the digital and data revolution.” Indeed, but I would stress that we cannot allow Canadian democracy to be disrupted, nor can we permit our institutions or rights to be undermined in a race to digitize everything and everyone, simply because technology makes this possible. Canada should simultaneously pursue privacy and innovation, and Privacy by Design is an excellent way to achieve both.
In my recent appearance before the Standing Committee on Access to Information, Privacy and Ethics (ETHI) on the study of the breach of personal information involving Cambridge Analytica and Facebook, I commented that while the EU GDPR is a major development in data protection and offers several excellent solutions, we should seek to develop an approach that reflects the Canadian context and values, including our close trading relationships within North America, with Europe, and the Asia Pacific region. Along these lines, I proposed that a new Canadian law include the following important aspects. It should:
Continue to be technology-neutral and principles-based, because these features enable the law to endure over time and create a level playing field, but it should mostly be drafted as a rights-based statute, meaning a law that confers enforceable rights on individuals, while also allowing for responsible innovation.
Maintain an important place for meaningful consent but it should also consider other ways to protect privacy where consent may not work, for instance in certain circumstances involving the development of artificial intelligence. The concept of ‘legitimate interest’ in the GDPR may provide one such alternate approach.
Empower a public authority to issue binding guidance or rules that would clarify how general principles and broadly framed rights are to apply in practice. Principles-based legislation has important virtues, but it does not bring an adequate level of certainty to individuals and organizations. Binding guidance or rules would ensure a more practical understanding of what the law requires. They could also be amended more easily than legislation as technology evolves.
Confer to the OPC stronger enforcement powers, including the power to make orders and impose fines for non-compliance with the law. These powers should include the right to independently verify compliance, without grounds, to ensure organizations are truly accountable to Canadians for the protection of their personal information.
Give the OPC the ability to choose which complaints to investigate, in order to focus limited resources on issues that pose the highest risk or may have greatest impact for Canadians. At the same time, to ensure no one is left without a remedy, give individuals a private right of action for PIPEDA violations.
Allow different regulators to share information. Meaningful protection of consumers and citizens in the fast-paced digital and data-driven economy understandably must involve several regulators, and they must be able to better coordinate their work.
Finally, it is absolutely imperative for privacy laws to be applied to Canadian political parties.
I believe the best way for Canada to position itself as a digital innovation leader is to demonstrate how we can establish a framework for innovation that also successfully protects Canadian values and rights, and protects our democracy. I offer this feedback in an effort to promote a more balanced approach in Canada, and ensure we assign equal importance to the treatment of data as a valuable asset and the value of privacy in our society. I look forward to hearing the outcomes of your consultations with Canadians. Please note that I am ready to discuss these important issues further, and to engage on legislative reform.
Arguably the most important right the California Consumer Privacy Act provides to California residents is the right to opt out of data sales. “Sale” is defined as “selling, renting, releasing, disclosing, disseminating, making available, transferring, or otherwise communicating orally, in writing, or by electronic or other means, a consumer’s personal information to another business or a third party for monetary or other valuable consideration.” Valuable consideration is not defined under the CCPA, but the act authorizes the attorney general to provide guidelines in furtherance of the CCPA’s purpose, and it is expected that a public consultation period will open in 2019.
Absent guidelines, this article proposes a possible framework to interpret “valuable consideration” in light of existing California law.
Under contract law, one of the requirements for the formation of a contract is the existence of valuable consideration. California law defines consideration as “[a]ny benefit conferred, or agreed to be conferred, upon the promisor, by any other person, to which the promisor is not lawfully entitled, or any prejudice suffered, or agreed to be suffered, by such person, other than such as he is at the time of consent lawfully bound to suffer, as an inducement to the promisor, is a good consideration for a promise.” Moreover, where the agreement is in writing, California law provides that “[a] written instrument is presumptive evidence of a consideration.” There are many examples of contract formation with non-monetary consideration. For example, in a non-disclosure agreement, one party agrees to allow another access to confidential information (a detriment) in exchange for a service (a benefit).