Blockchain technology is on a collision course with EU privacy law

 Feb 28, 2018 1:00 PM
by Derek Lackey

Those who have heard of "blockchain" technology generally know it as the underpinning of the Bitcoin virtual currency, but there are myriad organizations planning different kinds of applications for it: executing contracts, modernizing land registries, even providing new systems for identity management.

There's one huge problem on the horizon, though: European privacy law.

The bloc's General Data Protection Regulation, which will come into effect in a few months' time, says people must be able to demand that their personal data be rectified or deleted in many circumstances. A blockchain is essentially a growing, shared record of past activity that's distributed across many computers, and the whole point is that this chain of transactions (or other fragments of information) is in practice unchangeable – this is what ensures the reliability of the information stored in the blockchain.

For blockchain projects that involve the storage of personal data, these two facts do not mix well. And with sanctions for flouting the GDPR including fines of up to €20 million or 4 percent of global revenues, many businesses may find the ultra-buzzy blockchain trend a lot less palatable than they first thought.

"[The GDPR] is agnostic about which specific technology is used for the processing, but it introduces a mandatory obligation for data controllers to apply the principle of 'data protection by design'," said Jan Philipp Albrecht, the member of the European Parliament who shepherded the GDPR through the legislative process. "This means for example that the data subject's rights can be easily exercised, including the right to deletion of data when it is no longer needed.

"This is where blockchain applications will run into problems and will probably not be GDPR compliant." — Jan Philipp Albrecht, MEP

Altering data "just doesn't work on a blockchain," said John Mathews, the chief finance officer for Bitnation, a project that aims to provide blockchain-based identity and governance services, as well as document storage. "Blockchains are by their nature immutable. The GDPR says you must be able to remove some data, so those two things don't square off."

There are two main types of blockchain: private or "permissioned" blockchains that are under the control of a limited group (such as the Ripple blockchain that's designed to ease payments between financial services providers); and public or "permissionless" blockchains that aren't really under anyone's control (such as the Bitcoin or Ethereum networks).

It is technically possible to rewrite the data held on a blockchain, but only if most nodes on the network agree to create a new "fork" (version) of the blockchain that includes the changes — and to then continue using that version rather than the original. That's relatively easy on a private blockchain, if not ideal, but on a public blockchain it's a seismic and exceedingly rare event. At least as the technology is currently designed, there is little to no scope for fixing or removing bits of information here and there on an ongoing basis.
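Mathews' point about immutability follows from how blocks are linked: each block commits to the cryptographic hash of its predecessor, so altering any historical entry breaks every link after it. A minimal Python sketch of the mechanism (illustrative only; no real network's block format looks like this):

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first block

def block_hash(block: dict) -> str:
    """Hash a block's contents, including the previous block's hash."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def build_chain(records: list) -> list:
    """Link records into a minimal hash chain."""
    chain = []
    prev = GENESIS
    for data in records:
        block = {"data": data, "prev_hash": prev}
        prev = block_hash(block)
        chain.append(block)
    return chain

def verify(chain: list) -> bool:
    """Check that every block still points at the hash of its predecessor."""
    prev = GENESIS
    for block in chain:
        if block["prev_hash"] != prev:
            return False
        prev = block_hash(block)
    return True
```

Editing any early entry after the fact makes `verify` fail, because every later block still references the old hash. "Fixing" a single record honestly would mean recomputing, and persuading the network to accept, every block that follows it – which is exactly the fork scenario described above.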

"From a blockchain point of view, the GDPR is already out of date," Mathews said. "Regulation plays catch-up with technology. The GDPR was written on the assumption that you have centralized services controlling access rights to the user's data, which is the opposite of what a permissionless blockchain does." 

Jutta Steiner is the founder of Parity.io, a startup that develops decentralized technologies, and the former security chief for the Ethereum Foundation. She agrees with Mathews that "the GDPR needs a proper review."

"From a practitioner's perspective, it sounds to me that it was drafted by trying to implement a certain perspective of how the world should be without taking into account how technology actually works," Steiner said. "The way [public decentralized network] architecture works means there is no such thing as the deletion of personal data. The issue with information is once it's out, it's out."

"Given the stage where the technology is at, I think there's time to hopefully adjust certain things in the GDPR," Steiner added. "I can't see the regulators being so stubborn as to not adjust the regulation. … They'll just see the other countries will use the technology and Europe is at a disadvantage."

That seems unlikely to happen anytime soon. The GDPR is a new regulation, and EU laws tend to last for a long time before revision — the Data Protection Directive that preceded the GDPR was drafted way back in 1995.

"Certain technologies will not be compatible with the GDPR if they don't provide for [the exercising of data subjects' rights] based on their architectural design," Albrecht insisted. "This does not mean that blockchain technology in general has to adapt to the GDPR, it just means that it probably cannot be used for the processing of personal data. This decision is the responsibility of every organization that processes personal data."

Although the clash between the GDPR and blockchain technology has received little attention so far, it has occurred to some people.

The Interplanetary Database was, until its main funder recently pulled support, a project that aimed to build a blockchain-based database system – it was to be a sort of hybrid private-public blockchain, where the nodes in the network were preselected, but anyone could send transactions to the network or read the data stored on it. According to IPDB Foundation co-founder Greg McMullen, the Berlin-headquartered team was well aware of the problems posed by the GDPR.

One problem, McMullen said, was the inability to modify or delete data stored in a blockchain. But there was another issue, too.

"The GDPR is written for a cloud services model where, say, I'm a startup and I collect restaurant order data and I store it all on Amazon Web Services, and they do my hosting for me, so I have to have a contract with Amazon that passes on my privacy obligations to them," McMullen said. "It works really well when there's one or two providers, but when you start having a decentralized network it breaks down entirely. You can't have a contract with [all] the nodes on the Ethereum network. It's unfeasible."

So who actually is liable for data protection in a decentralized network? After all, one of the big attractions of such networks is that they are resistant to censorship, because there's no central body – no Amazon or Facebook – for enforcers to go after, and because the nodes or users that make up the network are scattered around the world.

According to Albrecht, if it's a private blockchain, GDPR compliance is the responsibility of the organization that's deploying it. "For decentralized and public blockchain applications, it would be the responsibility of each user who puts personal data in the distributed ledger to ensure this is GDPR compliant," the parliamentarian said. "Which in most cases it won't [be]."

The liability issue will scare many businesses off using blockchains, McMullen warned. "It's true that the regulations will need to catch up with the technology, but you have to be realistic about the fact that the GDPR is a real thing and it's happening, and there will be enforcement of it," he said. "When you're asking companies to use blockchains, they're not going to take that risk with their customers' data – or at least they shouldn't be."

According to McMullen, the IPDB Foundation had been working on various ideas for dealing with the data protection problem. One was a system of "blacklisting" certain data so that, even if it wasn't deleted from the network when this was required, it wouldn't be served when requested. 
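McMullen doesn't describe an implementation, but one way to picture the blacklisting idea (the names and structure here are hypothetical) is an append-only log paired with a suppression list that is consulted on every read:

```python
class BlacklistingStore:
    """Append-only log whose reads skip blacklisted entries.

    Records are never physically deleted; an erasure request only
    suppresses a record from being served.
    """

    def __init__(self):
        self._log = []           # append-only record of (id, data) pairs
        self._blacklist = set()  # ids that must no longer be served

    def append(self, record_id: str, data: dict) -> None:
        self._log.append((record_id, data))

    def blacklist(self, record_id: str) -> None:
        """Honour an erasure request by suppressing the record on read."""
        self._blacklist.add(record_id)

    def read_all(self) -> list:
        return [data for rid, data in self._log if rid not in self._blacklist]
```

The obvious caveat, and the open legal question, is that the blacklisted data still exists in the underlying log; it is merely no longer served.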

Another idea was to...

Read The Full Article


Legislating privacy by design in Canada

 Feb 24, 2018 12:00 PM
by Derek Lackey

With a shout out to Tim Banks, here is his recent update on PIPEDA and Privacy By Design as published on the IAPP website.

"The Standing Committee on Access to Information, Privacy and Ethics is ready to table its report following its months-long review of Canada’s Personal Information Protection and Electronic Documents Act. The committee adopted its report, entitled “Towards Privacy by Design: A Review of Personal Information Protection and Electronic Documents Act (PIPEDA)” Feb. 13 and ordered that the chair of the committee table the report to the House of Commons. The House of Commons will resume sitting Feb. 26, and the report could be tabled soon afterwards.

What should we expect to see in the report? The title provides a clue to its major theme. We expect to see a recommendation that PIPEDA be amended to expressly require “privacy by design.” What will be interesting is whether the committee will recommend Parliament give the Office of the Privacy Commissioner teeth by providing it with order-making powers. The combination of “privacy by design” and “order-making” would bring PIPEDA a giant step toward substantive equivalency with the European Union’s General Data Protection Regulation. This would satisfy one of the goals of many privacy advocates: to ensure that Canada retains its adequacy designation when PIPEDA is reviewed by the EU against the GDPR once the GDPR goes into effect.

Privacy by design comes home

The concept of “privacy by design” can be traced back to the work of Ann Cavoukian when she was the Information and Privacy Commissioner of Ontario. In its original form, privacy by design has seven foundational principles, including a proactive approach to embed privacy protective measures in design and to ensure that privacy is the default setting in systems.

The possibility that the committee will recommend legislating “privacy by design” would not be surprising. The committee heard from several witnesses who claimed that PIPEDA will be at risk of losing its “adequacy” designation when it is reviewed by the EU under the lens of the GDPR. These advocates cited “privacy by design” as a gap. Privacy by design is embedded in Article 25 of the GDPR. Article 25 requires data controllers to consider privacy impacts early in the design stage and to ensure that data minimization occurs. This may involve ensuring that user settings default to the minimum amount of data sharing necessary to provide the services requested by the individual. The organization should deploy pseudonymization and other techniques to limit the impact of the collection and use of personal information on an individual.

Although a comparison of PIPEDA with the GDPR was a consistent theme in the committee hearings, the committee had a special opportunity to hear directly from the data protection authority for the European Union institutions, bodies and agencies. On June 13, 2017, the committee held a special meeting to hear from Giovanni Buttarelli, the European Data Protection Supervisor. During his testimony, Buttarelli cited Article 25 of the GDPR as an important legislated addition to EU data protection law. He also noted that Canada’s Privacy Commissioner Daniel Therrien cited “privacy by design” as a key difference between the GDPR and PIPEDA. Moreover, throughout the committee’s hearings, Liberal Member of Parliament Raj Saini consistently asked witnesses whether it would be important to embed an express requirement for privacy by design in PIPEDA. Witnesses generally stated that an express legislative requirement would help.

Legislating privacy by design won’t be the committee’s only recommendation; however, it may be an organizing principle for several other possible recommendations. For example, the committee heard from many witnesses on the issue of children. Legislating an age for consent could be vulnerable to constitutional challenges, though, as impermissibly interfering in an area that is within provincial jurisdiction. Privacy by design could become a less constitutionally controversial method by which the federal government could require organizations to design online products and services to minimize data collection from children, to pseudonymize that data, and to delete it when it is no longer required.

If the committee does recommend legislating privacy by design and Canada’s federal government obliges, it would be a homecoming for this Canadian-made principle.

Order-making powers as a companion to privacy by design?

A recommendation that the OPC receive order-making powers seems very likely at this point. Currently, the OPC can make recommendations to an organization following an investigation or enter into a compliance agreement. The authority to enter into binding compliance agreements is relatively new. However, it is not yet clear what would motivate an organization to enter into a compliance agreement without the OPC having order-making powers. Currently, if an organization refuses to enter into a compliance agreement or abide by recommendations, the OPC would have to go to Federal Court. The OPC is not given deference before the Federal Court and would have to convince the Federal Court to come to the same conclusion as the OPC. This creates significant litigation risk for the OPC.

Many privacy advocates argued that the OPC requires order-making powers, as did former federal Privacy Commissioner Jennifer Stoddart in her testimony. For his part, Buttarelli commented that the Canadian “ombudsman approach seems to be much less effective” than the approach in Europe, where most data protection authorities will likely have direct order-making powers. In Commissioner Therrien’s final appearance before the committee, the Liberal Member of Parliament Nathaniel Erskine-Smith asked Therrien directly whether providing greater enforcement powers, particularly the power to make orders, would be “getting ahead of international practice.” Therrien was succinct in his response: “It is not at all getting ahead. We are behind, so it would be more consistent with what is becoming the norm.”

If PIPEDA is amended to require privacy by design and to provide the OPC with order-making powers, the OPC’s effectiveness in addressing systemic issues would be radically enhanced. Once armed with a legislative requirement to implement privacy by design, the OPC could investigate if it had a reason to believe that an organization was not implementing privacy as the default in its design of services. This may prove to be a more flexible basis to investigate systemic issues. The OPC could then directly order remediation.

A recommendation to give the OPC order-making powers would be contrary to the counsel of the Canadian Bar Association and most lawyers in private practice representing commercial organizations subject to PIPEDA. The CBA’s written submissions cautioned that conferring order-making powers on the OPC could result in a violation of the principles of fundamental justice. The OPC would be the investigator, the advocate and the decision-maker. Even if the OPC could be restructured to ensure separation of activities, the CBA argued this new power could have a “chilling effect” on open and cooperative dialogue. Instead, the CBA recommended that the prudent approach would be to wait for more experience with the new power to enter into binding compliance agreements.

Next Steps

As is customary, the committee has requested that the federal government table a comprehensive response to the report. The Liberal government has shown itself willing to grant the information commissioner order-making power. However, it also charged the Standing Committee on Industry, Science and Technology with a review of Canada’s Anti-Spam Legislation, a review that quickly escalated amid what businesses perceived as disproportionately aggressive activity by the regulator. Order-making is not, therefore, an inevitable outcome.

Timothy M Banks CIPP/C, CIPM is a lawyer supporting the Amazon Web Services business in Canada. In this capacity, Tim provides advice to AWS on issues relating to the cloud computing business in Canada, including the negotiation of enterprise agreements for AWS’s cloud services. Tim previously developed and led the privacy and cybersecurity practice at Dentons Canada LLP and was part of the global leadership team for the privacy and cybersecurity practice at Dentons. While a partner at Dentons, Tim advised clients on the full range of privacy and cybersecurity issues, including private, public and health sector privacy laws, online behavioral advertising, data breaches, consumer litigation, and cloud computing assurance. He has contributed regularly to the Privacy Tracker and was formerly a KnowledgeNet Toronto chapter volunteer. He counts the IAPP as the most important professional organization of which he is a member.


GDPR + e-Privacy = :-(

 Feb 24, 2018 12:00 PM
by Derek Lackey

At some point in your life, you’ve probably had the experience of meeting someone who you feel you ought to like but, no matter how hard you try, you just can’t seem to gel with them - awkward silences creep into conversations and you find that, while you may share similar values, the ways you each go about approaching things are just different. Ultimately, despite both your best efforts, there’s just no chemistry.

That’s what I imagine Europe’s GDPR and e-Privacy Directive would be like as playmates, if only they were people. They share common values - the protection of individuals’ fundamental rights to privacy and to data protection - and yet, try as they might, they just don’t play together all that nicely. 

Unambiguous consent for cookies?

Nowhere is this more apparent than when it comes to the issue of cookie consent. The e-Privacy Directive is a lex specialis (meaning a law that deals with a specific subject matter - in this case, the preservation of privacy over electronic communications channels). It sits alongside the current Data Protection Directive / soon-to-be-in-effect GDPR (I’ll just say GDPR from here on), setting out special rules that deal with things like the privacy of communications content and metadata, e-marketing, and - of course - cookie requirements. The GDPR applies to any wider data protection issues concerning personal data which aren’t addressed by the e-Privacy Directive.

So far, so good, but the treatment of cookies under these two laws raises a real conundrum. In 2009, the e-Privacy Directive was updated to require “consent” for all non-essential cookies.  This led to a flurry of activity all across online Europe, as websites everywhere hurriedly erected cookie consent banners. It also led to heated debate between regulators, industry, lawyers, and civil advocacy groups as to whether consent could be “implied” through the mere display of a cookie banner and continued browsing of a website, with cookies being dropped at the same time the cookie banner was displayed. Whatever the rights and wrongs of that debate, implied consent quickly became the norm.

Time passes by, and along comes the GDPR. The GDPR doesn’t directly affect cookies, because cookies are addressed under the e-Privacy Directive. It does have an indirect effect though - namely by re-defining consent to say that any consent given must be “unambiguous”. And, because “consent” under the e-Privacy Directive is interpreted by reference to the definition of “consent” under GDPR, it follows that cookie consent banners post-GDPR must now collect visitors’ unambiguous consent.

What practical effect does that have? While it’s not entirely certain, it seems that the use of implied cookie consent mechanisms is, at least in principle, still possible - even if not what regulators would really like to see. Unambiguous consent requires a “clear affirmative action” on the part of the website visitor - and, so, if a website makes sufficiently clear that continued navigation amounts to consent, and a visitor continues to navigate a website (the affirmative action) after having been given this information and the opportunity to decline cookies, then there is at least a decent argument that an unambiguous consent was given.

I say “decent argument” because the ability to maintain that an implied consent is unambiguous depends upon at least a couple of critical factors: first, the prominence of the cookie banner itself - a banner which is buried out of sight, or which uses font sizes or colouring that make it near impossible to read will not serve to sufficiently inform the visitor that their continued use of the website will amount to consent, and so no unambiguous consent can be obtained; second, the timing of the cookie drop - if cookies are dropped at the same time as the banner, as is very often the case today, then it’s more-or-less impossible to maintain any argument that the visitor “unambiguously” consented to those cookies, given that they only learned about them after the cookies had already been served. To have a decent argument for unambiguous implied consent, the user at least needs to be informed about, and have the opportunity to decline, cookies before they get served.

The “consent + legitimate interests” debate

There is a more challenging and technical problem, however, and this is the interplay of the need to get cookie “consent” under the e-Privacy Directive and the requirement to have a lawful basis for processing personal data under Article 6 of the GDPR.  

The problem is something like this: if we need consent to use cookies under the e-Privacy Directive, does it follow that the appropriate lawful basis for processing any personal data collected using those cookies under Art. 6 GDPR must also be consent? Or could, for example, you use consent to serve the cookies (under e-Privacy), but rely upon legitimate interests (or another lawful ground) to process the personal data collected using the cookies (under GDPR)?

This might seem like a somewhat academic debate, but it has some important regulatory and practical implications. For one thing, if your lawful basis for processing personal data under GDPR is consent, then - at least, according to regulatory guidance - there are greater obligations to identify by name (rather than by category) the third parties with whom data may be shared. For another, you also need to keep verifiable consent records (not a requirement for legitimate interests). Next, the Right to be Forgotten becomes more powerful where consent is the lawful basis under the GDPR (the individual simply has to withdraw consent). And data portability rights are also triggered with consent, whilst they don’t apply when processing is based upon legitimate interests.

This inevitably will lead some businesses to prefer a “consent (e-Privacy) + legitimate interests (GDPR)” approach, and again there are grounds for considering this a reasonable thing to do. The e-Privacy Directive, while it complements the GDPR, is a separate piece of legislation, and its consent requirements serve a subtly different purpose to the requirement to have lawful processing grounds under the GDPR (consent under e-Privacy is for access to or storage of information on an end user’s terminal equipment, while a lawful basis under GDPR is needed for processing of personal data).  

For any business reliant upon cookie use...

Read The Full article on Fieldfisher's site

 

GDPR, ePrivacy