Freedom and democracy cannot exist without privacy

 Jan 28, 2019 4:00 PM
by Derek Lackey

On this international Data Privacy Day, and after a year of severe abuses, it is worth reflecting on why it is essential to protect privacy.  

Privacy is often cast as an abstract or undervalued concept, associated with a desire to keep certain aspects of our activities or our personality to ourselves.

This is a very narrow outlook. In fact, privacy is nothing less than a prerequisite for freedom:  the freedom to live and develop independently as a person, away from the watchful eye of a surveillance state or commercial enterprises, while still participating voluntarily and actively in the regular, day-to-day activities of a modern society.

Data-driven technologies undoubtedly bring great benefits to individuals.  They can be fun and convenient but they can also be powerful tools for personal development.  They open the door to huge opportunities for improving health care and hold the promise for a future built on artificial intelligence (AI) in which the possibilities seem endless.

On the other hand, these technologies also create new risks. For example, some AI applications, which rely on the massive accumulation of personal data, also put other fundamental rights at risk.

One such risk is the potential for discrimination against people resulting from decisions made by artificial intelligence systems. These systems are generally non-transparent and some have been found to rely on data sets that contain inherent bias, in violation of privacy principles. Such discrimination could potentially result in the restriction of availability of certain services, or result in the exclusion of people from certain aspects of personal, social and professional life, including employment.

In December, AI ethics researchers released the Montreal Declaration for the Responsible Development of Artificial Intelligence – a set of 10 principles for developers and organizations that implement AI, as well as the individuals subject to it.

While this ethical framework marks an important, made-in-Canada development that should help guide this emerging sector, I would agree with the Declaration’s authors who say it is only a first step, and that public authorities now need to act.  Governments and legislators in particular have an important role to play in drawing on ethical principles to create an enforceable legal framework for AI that formally requires relevant actors to act fairly and responsibly.

We have also seen in recent years, and in particular in 2018, how privacy breaches can adversely impact the exercise of our democratic rights. The massive accumulation of personal data by certain state and commercial actors makes our democracies vulnerable to manipulation, including by foreign powers. It is unfortunate that the 2019 federal election will take place without any significant strengthening of our personal data protection laws.

In 2019, as the federal government and legislators consider what should be Canada’s national data strategy and laws for the modern age, it is important as a society to remember privacy’s role in protecting other fundamental rights and values, including freedom and democracy. If this happens, we will have drawn the right lessons from 2018.

Link to original 


PhD Candidate Robbert van Eijk measures privacy component in online advertising

 Jan 28, 2019 10:00 AM
by Derek Lackey

You check out Facebook to see if one of your friends or someone in your family has done something interesting. Your attention is drawn to a holiday advert. That’s a coincidence, you think, because just before you went to Facebook you had been searching the internet for a holiday destination. But this is no coincidence: dozens of parties are looking over your shoulder to see what you are getting up to on the internet, and this influences which adverts you get to see and where. PhD candidate Robbert van Eijk investigated this process and the observance of privacy legislation in European countries. He will defend his doctoral thesis on 29 January.

The technology which facilitates online advertising is called 'real-time bidding' (RTB). When you visit a website, within a few tenths of a second the advert space on that page is ‘auctioned’: on the basis of data saved in cookies, it is determined which adverts are most relevant for you. The advertiser who places the highest bid ‘wins’ and - upon payment, of course - is given the space on the page to promote its product. 'The motive to write this doctoral thesis came from the desire to investigate real-time bidding at the intersection between technology and the law’, Van Eijk explains. 'I wanted to find out more about what happens when, as a visitor to a website, you get to see adverts which appear to be tracking your steps. This topic is relevant in light of the application of the General Data Protection Regulation (GDPR) and the current cookie legislation, whose rules are laid down, among other places, in Article 11.7a of the (Dutch) Telecommunications Act.'
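
To make the auction mechanism concrete, here is a minimal sketch of the bidding logic in Python. It is an illustration only, not Van Eijk's model or any real exchange's protocol; the bidder names, profile fields and prices are invented.

```python
# Simplified sketch of a real-time bidding (RTB) auction. Bidder names,
# profile fields and prices are hypothetical.

def run_rtb_auction(user_profile, bidders):
    """Ask each bidder to price the impression based on cookie-derived
    profile data, then award the ad slot to the highest bidder."""
    bids = {name: bid_fn(user_profile) for name, bid_fn in bidders.items()}
    winner = max(bids, key=bids.get)
    return winner, bids[winner]

# Cookie-derived interests drive how much the impression is worth.
profile = {"recent_searches": ["holiday destination"], "country": "NL"}

bidders = {
    "travel_ads_inc": lambda p: 2.40 if "holiday destination" in p["recent_searches"] else 0.10,
    "generic_retail": lambda p: 0.55,
}

winner, price = run_rtb_auction(profile, bidders)
print(f"{winner} wins the ad slot at {price}")  # travel_ads_inc wins at 2.4
```

All of this happens between the page request and the moment the advert renders, which is why the holiday advert seems to follow you around.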

In his research Van Eijk demonstrates that this privacy component can be measured. 'I combine law and data science in my research by applying mathematical algorithms to the network traffic picked up between the browser and the websites visited. Taking a network-science perspective on the privacy component of RTB is new: it makes it possible to distinguish the networks of partners involved in an advertisement system when an advert is displayed on the website an internet user visits. These advertisement networks partly overlap one another. This new way of observing the process also shows which role partners play in an advertisement network in collecting and sharing the data of website visitors.'

Dominant companies

Van Eijk demonstrates in the research that two kinds of algorithms make the mutual collaboration arrangements (the betweenness) transparent. 'These are cluster edge betweenness and node betweenness. The first is a measure based on the shortest paths between the partners in an advertisement network. The algorithm solves an important issue: which RTB partners are clustered in an RTB system? The second solves another important issue: who are the dominant companies in a network of RTB partners? Node betweenness helps us to distinguish between the companies.'
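
As a rough illustration of the two measures, the snippet below computes node betweenness and an edge-betweenness-based clustering (Girvan-Newman) on a toy graph of RTB partners. The graph and partner names are invented; this is not Van Eijk's dataset or code.

```python
# Toy RTB partner graph: an edge means two parties exchanged data while
# an advert was served. Names are hypothetical.
import networkx as nx
from networkx.algorithms.community import girvan_newman

g = nx.Graph()
g.add_edges_from([
    ("publisher", "ssp_a"), ("ssp_a", "dsp_1"), ("ssp_a", "dsp_2"),
    ("publisher", "ssp_b"), ("ssp_b", "dsp_2"), ("dsp_2", "data_broker"),
])

# Node betweenness: how often a partner lies on shortest paths between
# the others - a proxy for which companies dominate the network.
dominance = nx.betweenness_centrality(g)
print(sorted(dominance.items(), key=lambda kv: -kv[1]))

# Girvan-Newman repeatedly removes the edge with the highest edge
# betweenness, splitting the graph into clusters of RTB partners.
clusters = next(girvan_newman(g))
print(clusters)
```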

In addition, the researcher provides transparency concerning differences between European countries. 'I show that a Graph-Based Methodological Approach (GBMA) can indicate the situation concerning differences in permission in 28 European countries; for example, differences in cookie notifications and cookie walls. In Europe we see two permission mechanisms: an implicit one (where tracking cookies are already installed before the end user has given permission) and a strict one (where the legal requirements are implemented to the extent that no tracking cookies are, or are allowed to be, installed on the end user's equipment, nor information read from it, when he visits a webpage). In this way, countries with implicit mechanisms can be compared to countries where strict mechanisms predominate. This leads to unequal rights.'

Through his research...

Read The Full Article


European Commission adopts adequacy decision on Japan

 Jan 23, 2019 12:00 PM
by Derek Lackey

The Commission has adopted today its adequacy decision on Japan, allowing personal data to flow freely between the two economies on the basis of strong protection guarantees.

This is the last step in the procedure launched in September 2018, which included the opinion of the European Data Protection Board (EDPB) and the agreement from a committee composed of representatives of the EU Member States. Together with its equivalent decision adopted today by Japan, it will start applying as of today.

Věra Jourová, Commissioner for Justice, Consumers and Gender Equality said: “This adequacy decision creates the world's largest area of safe data flows. Europeans' data will benefit from high privacy standards when their data is transferred to Japan. Our companies will also benefit from a privileged access to a 127 million consumers' market. Investing in privacy pays off; this arrangement will serve as an example for future partnerships in this key area and help setting global standards.” 

The key elements of the adequacy decision

Before the Commission adopted its adequacy decision, Japan put in place additional safeguards to guarantee that data transferred from the EU enjoys protection guarantees in line with European standards. These include:

  A set of rules (Supplementary Rules) that will bridge several differences between the two data protection systems. These additional safeguards will strengthen, for example, the protection of sensitive data, the exercise of individual rights and the conditions under which EU data can be further transferred from Japan to another third country. These Supplementary Rules will be binding on Japanese companies importing data from the EU and enforceable by the Japanese independent data protection authority (PPC) and courts.

  The Japanese government also gave assurances to the Commission regarding safeguards concerning the access of Japanese public authorities for criminal law enforcement and national security purposes, ensuring that any such use of personal data would be limited to what is necessary and proportionate and subject to independent oversight and effective redress mechanisms.

  A complaint-handling mechanism to investigate and resolve complaints from Europeans regarding access to their data by Japanese public authorities. This new mechanism will be administered and supervised by the Japanese independent data protection authority. 

The adequacy decisions also complement the EU-Japan Economic Partnership Agreement, which will enter into force in February 2019. European companies will benefit from free data flows with a key commercial partner, as well as from privileged access to the 127 million Japanese consumers. The EU and Japan affirm that, in the digital era, promoting high privacy and personal data protection standards and facilitating international trade must and can go hand in hand.

Next steps

The adequacy decision – as well as the equivalent decision on the Japanese side – will start applying as of today.

After two years, a first joint review will be carried out to assess the functioning of the framework. This will cover all aspects of the adequacy finding, including the application of the Supplementary Rules and the assurances for government access to data. Representatives of the European Data Protection Board will participate in the review regarding access to data for law enforcement and national security purposes. Subsequently, a review will take place at least every four years.

Background

The mutual adequacy arrangement with Japan is a part of the EU strategy in the field of international data flows and protection, as announced in January 2017 in the Commission's Communication on Exchanging and Protecting Personal Data in a Globalised World.

The EU and Japan successfully concluded their talks on reciprocal adequacy on 17 July 2018 (see press release). They agreed to recognise each other's data protection systems as adequate, allowing personal data to be transferred safely between the EU and Japan.

In July 2017, President Juncker and Prime Minister Abe committed to adopting the adequacy decision, as part of the EU and Japan's shared commitment to promote high data protection standards on the international scene (see statement).

The processing of personal data in the EU is based on the General Data Protection Regulation (GDPR), which provides for different tools to transfer personal data to third countries, including adequacy decisions. The European Commission has the power to determine whether a country outside the EU offers an adequate level of data protection. The European Parliament and the Council can request the European Commission to maintain, amend or withdraw these decisions. 

For More Information 


WTF!?! CNIL Fines GOOGLE $57 Million under GDPR

 Jan 22, 2019 10:00 AM
by Derek Lackey

The early fines against American tech firms will reveal another level of guidance from the Data Protection Authorities. First you should read the LAW. Then seek clarity from the official guidance documents. Then, finally, look to the details of the violations. WHAT they fine for is critical information for operations people setting new practices. HOW MUCH they fine is critical for business risk analysis.

The recent fine from CNIL for GOOGLE is based on the finding that "the infringements observed deprive the users of essential guarantees regarding processing operations that can reveal important parts of their private life since they are based on a huge amount of data, a wide variety of services and almost unlimited possible combinations."

GDPR is all about TRANSPARENCY. INFORMATION. CONSENT.

CNIL claims that when setting up an Android device (GOOGLE), consents for processing data must be:

  • clear, 
  • unambiguous, 
  • easy to understand, 
  • easily accessible, 
  • communicate a legal basis for processing,  and 
  • must show a positive action on the part of the data subject.
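
Taken together, these requirements describe what a compliant consent record has to capture. Below is a hypothetical sketch of such a record; the field names are invented, not CNIL's checklist or Google's schema.

```python
# Hypothetical consent record reflecting the properties listed above;
# field names are invented, not a CNIL or Google schema.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    purpose: str                # one specific purpose, not a bundle
    legal_basis: str            # e.g. "consent" (GDPR Art. 6(1)(a))
    retention_period_days: int  # how long the data will be kept
    affirmative_action: bool    # True only if an unticked box was checked
    timestamp: datetime

def looks_valid(record: ConsentRecord) -> bool:
    """Fails records that were passive (pre-checked boxes), lack a
    stated legal basis, or state no retention period."""
    return (
        record.affirmative_action
        and bool(record.legal_basis)
        and record.retention_period_days > 0
    )

rec = ConsentRecord("ads_personalisation", "consent", 395,
                    affirmative_action=True,
                    timestamp=datetime.now(timezone.utc))
print(looks_valid(rec))  # True
```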

Specifically, when signing up for an Android account, the purpose of processing your personal data is far too generic and described in a "vague manner". The same could be said for communicating "the CATEGORIES of data processing for various purposes". With more than 20 different service offerings, GOOGLE's consent requests are not easily understood. It must be clear for each type of consent which legal basis for processing is being claimed AND how long GOOGLE plans to keep that information.

Therefore the consent GOOGLE believes it has is not considered valid by the CNIL. While it "is possible to configure the display of personalized ads", CNIL determined that users were "not sufficiently informed regarding the extent of the consents requested". The consents obtained were neither "specific" nor "unambiguous". In fact, many of those consents were not easily accessed, and when they were, the boxes were pre-checked, therefore no positive action was required on the part of the data subject.

Finally, in order to set up an account, the user must agree to the Terms of Service - "I agree to Google’s Terms of Service" as well as "I agree to the processing of my information as described above and further explained in the Privacy Policy". GDPR requires "specific consent". It is only valid if it is "provided distinctly for each purpose."

It is important to note: "the violations are continuous breaches of the Regulation as they are still observed to date. It is not a one-off, time-limited, infringement." Look for GOOGLE to be fined again for these very same activities if these practices are not corrected immediately.


I Mentored Mark Zuckerberg. I Loved Facebook. But I Can't Stay Silent About What's Happening.

 Jan 19, 2019 8:00 AM
by Derek Lackey

Written by Roger McNamee in TIME magazine:

"I am really sad about Facebook.

I got involved with the company more than a decade ago and have taken great pride and joy in the company’s success … until the past few months. Now I am disappointed. I am embarrassed. I am ashamed. With more than 1.7 billion members, Facebook is among the most influential businesses in the world. Whether they like it or not–whether Facebook is a technology company or a media company–the company has a huge impact on politics and social welfare. Every decision that management makes can matter to the lives of real people. Management is responsible for every action. Just as they get credit for every success, they need to be held accountable for failures. Recently, Facebook has done some things that are truly horrible, and I can no longer excuse its behavior."


Nine days before the November 2016 election, I sent the email above to Facebook founder Mark Zuckerberg and chief operating officer Sheryl Sandberg. It was the text for an op-ed I was planning to publish about problems I was seeing on Facebook. Earlier in the year, I noticed a surge of disturbing images, shared by friends, that originated on Facebook Groups ostensibly associated with the Bernie Sanders campaign, but it was impossible to imagine they came from his campaign. I wanted to share with Sandberg and Zuckerberg my fear that bad actors were exploiting Facebook’s architecture and business model to inflict harm on innocent people.

I am a longtime tech investor and evangelist. Tech has been my career and my passion. I had been an early adviser to Zuckerberg–Zuck, to many colleagues and friends–and an early investor in Facebook. I had been a true believer for a decade. My early meetings with Zuck almost always occurred in his office, generally just the two of us, so I had an incomplete picture of the man, but he was always straight with me. I liked Zuck. I liked his team. I was a fan of Facebook. I was one of the people he would call on when confronted with new or challenging issues. Mentoring is fun for me, and Zuck could not have been a better mentee. We talked about stuff that was important to Zuck, where I had useful experience. More often than not, he acted on my counsel.

When I sent that email to Zuck and Sheryl, I assumed that Facebook was a victim. What I learned in the months that followed–about the 2016 election, about the spread of Brexit lies, about data on users being sold to other groups–shocked and disappointed me. It took me a very long time to accept that success had blinded Zuck and Sheryl to the consequences of their actions. I have never had a reason to bite Facebook’s hand. Even at this writing, I still own shares in Facebook. My criticism of the company is a matter of principle, and owning shares is a good way to make that point. I became an activist because I was among the first to see a catastrophe unfolding, and my history with the company made me a credible voice.

This is a story of my journey. It is a story about power. About privilege. About trust, and how it can be abused.

The massive success of Facebook eventually led to catastrophe. The business model depends on advertising, which in turn depends on manipulating the attention of users so they see more ads. One of the best ways to manipulate attention is to appeal to outrage and fear, emotions that increase engagement. Facebook’s algorithms give users what they want, so each person’s News Feed becomes a unique reality, a filter bubble that creates the illusion that most people the user knows believe the same things. Showing users only posts they agree with was good for Facebook’s bottom line, but some research showed it also increased polarization and, as we learned, harmed democracy.

To feed its AI and algorithms, Facebook gathered data anywhere it could. Before long, Facebook was spying on everyone, including people who do not use Facebook. Unfortunately for users, Facebook failed to safeguard that data. Facebook sometimes traded the data to get better business deals. These things increased user count and time on-site, but it took another innovation to make Facebook’s advertising business a giant success.

From late 2012 to 2017, Facebook perfected a new idea–growth hacking–where it experimented constantly with algorithms, new data types and small changes in design, measuring everything. Growth hacking enabled Facebook to monetize its oceans of data so effectively that growth-hacking metrics blocked out all other considerations. In the world of growth hacking, users are a metric, not people. Every action a user took gave Facebook a better understanding of that user–and of that user’s friends–enabling the company to make tiny “improvements” in the user experience every day, which is to say it got better at manipulating the attention of users. Any advertiser could buy access to that attention. The Russians took full advantage. If civic responsibility ever came up in Facebook’s internal conversations, I can see no evidence of it.

The people at Facebook live in their own bubble. Zuck has always believed that connecting everyone on earth was a mission so important that it justified any action necessary to accomplish it. Convinced of the nobility of their mission, Zuck and his employees seem to listen to criticism without changing their behavior. They respond to nearly every problem with the same approach that created the problem in the first place: more AI, more code, more short-term fixes. They do not do this because they are bad people. They do this because success has warped their perception of reality. They cannot imagine that the recent problems could be in any way linked to their designs or business decisions. It would never occur to them to listen to critics–How many billion people have the critics connected?–much less to reconsider the way they do business. As a result, when confronted with evidence that disinformation and fake news had spread over Facebook and may have influenced a British referendum or an election in the U.S., Facebook followed a playbook it had run since its founding: deny, delay, deflect, dissemble. Facebook only came clean when forced to, and revealed as little information as possible. Then it went to Plan B: apologize, and promise to do better.

Thanks to Facebook’s extraordinary success, Zuck’s brand in the tech world combines elements of rock star and cult leader. He is deeply committed to products and not as interested in the rest of the business, which he leaves to Sandberg. According to multiple reports, Zuck is known for micromanaging products and for being decisive. He is the undisputed boss. Zuck’s subordinates study him and have evolved techniques for influencing him. Sheryl Sandberg is brilliant, ambitious and supremely well organized. Given Zuck’s status as the founder, the team at Facebook rarely, if ever, challenged him on the way up and did not do so when bad times arrived. (A Facebook spokesperson replies: “People disagree with Mark all the time.”)

You would think that Facebook’s users would be outraged by the way the platform has been used to undermine democracy, human rights, privacy, public health and innovation. Some are, but nearly 1.5 billion people use Facebook every day. They use it to stay in touch with distant relatives and friends. They like to share their photos and their thoughts. They do not want to believe that the same platform that has become a powerful habit is also responsible for so much harm. Facebook has leveraged our trust of family and friends to build one of the most valuable businesses in the world, but in the process, it has been careless with user data and aggravated the flaws in our democracy while leaving citizens ever less capable of thinking for themselves, knowing whom to trust or acting in their own interest. Bad actors have had a field day exploiting Facebook and Google, leveraging user trust to spread disinformation and hate speech, to suppress voting and to polarize citizens in many countries. They will continue to do so until we, in our role as citizens, reclaim our right to self-determination.

We need to begin to reform Facebook and Big Tech in these key areas:

Read The Full Article in TIME


Gartner Predicts for the Future of Privacy 2019

 Jan 18, 2019 12:00 PM
by Derek Lackey

Security and risk management leaders, including CISOs and privacy professionals, must recognize maturing privacy regulations to ensure a privacy-friendly operation.

Privacy is a business-critical discipline for many organizations, enforced by multiple regulations. Most recently, the European Union’s General Data Protection Regulation (GDPR) has driven a global movement of maturing privacy and data protection laws with stricter requirements.

“Multiple countries are implementing regulations inspired by the GDPR principles, a movement that is likely to continue into the foreseeable future,” says Bart Willemsen, Senior Director Analyst, Gartner. “These privacy requirements dramatically impact an organization’s strategy, purpose and methods for processing personal data. Furthermore, breaches of these requirements carry financial, reputational and regulatory implications.”

Security and risk management leaders must take note of the following Gartner 2019 predictions for privacy to ensure transparency and customer assurance.

By 2020, the backup and archiving of personal data will represent the largest area of privacy risk for 70% of organizations, up from 10% in 2018

Today, organizations hold backups of large volumes of personal data that is both sensitive and vulnerable with no clear intentions of using it. Because the sensitivity is a constant characteristic and the vulnerability is arguably equivalent, the volume dictates the level of risk, and represents the largest area of privacy risk today. Additionally, privacy regulations have introduced penalties and stiff fines for violations, making the risk of holding unused personal data potentially very expensive.

Over the next two years, organizations that don’t revise data retention policies to reduce the overall data held, and by extension the data that is backed up, will face a huge sanction risk for noncompliance as well as the impacts associated with an eventual data breach. GDPR, for example, introduced regulatory fines of up to 4% of annual global turnover or €20 million, whichever is greater, for noncompliance.
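
As a quick illustration of how that ceiling works, here is a sketch with made-up turnover figures; it is not a compliance calculator.

```python
# GDPR ceiling for the most serious infringements: 4% of annual global
# turnover or EUR 20 million, whichever is greater. Figures are made up.
def gdpr_max_fine(annual_global_turnover_eur: float) -> float:
    return max(0.04 * annual_global_turnover_eur, 20_000_000)

print(gdpr_max_fine(100_000_000))    # 20000000.0 - the flat cap dominates
print(gdpr_max_fine(2_000_000_000))  # 80000000.0 - 4% of turnover dominates
```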

By 2022, 75% of public blockchains will suffer “privacy poisoning” — inserted personal data that renders the blockchain noncompliant with privacy laws

Blockchain is a promising technology; however, businesses looking to implement blockchain technology must determine whether the data being used is subject to any privacy laws. For example, public blockchains need an immutable data structure, meaning once data is recorded, it cannot easily be modified or deleted. Privacy rights granted to individuals include the option for customers to invoke the “right to be forgotten.” In many such cases, personal data processed about them must be deleted.

This raises immediate concerns, as entries in a public blockchain poisoned with personal data can’t be replaced, anonymized or structurally deleted. Businesses are therefore unable to reconcile the need to keep records with their obligation to comply with privacy laws. Organizations that implement blockchain systems without managing privacy issues by design will run the risk of storing personal data that can’t be deleted without compromising chain integrity.
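
The conflict is easy to demonstrate with a toy hash chain (a sketch, not any particular blockchain's implementation): once personal data is written into a block, "deleting" it invalidates every later block.

```python
# Toy hash chain: each block's hash commits to the previous hash plus its
# data, so redacting an old entry breaks every hash that follows it.
import hashlib

def block_hash(prev_hash: str, data: str) -> str:
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

def build_chain(entries):
    hashes, prev = [], "genesis"
    for data in entries:
        prev = block_hash(prev, data)
        hashes.append(prev)
    return hashes

entries = ["tx1", "alice@example.com (personal data!)", "tx3"]
original = build_chain(entries)

entries[1] = "<redacted>"          # attempt to honour a deletion request
rewritten = build_chain(entries)

print(original[0] == rewritten[0])  # True  - blocks before the edit survive
print(original[1] == rewritten[1])  # False - the edited block changes...
print(original[2] == rewritten[2])  # False - ...and so does every later one
```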

By 2023, over 25% of GDPR-driven proof-of-consent implementations will involve blockchain technology, up from less than 2% in 2018

Although the GDPR has been in effect since 25 May 2018, organizations are at different levels of compliance. The pressure to fully comply is increasing, driving organizations in or doing business with the EU to further evaluate their data collection processes. However, most are struggling with integration costs and technologies that can help speed up compliance.

“The application of blockchain to consent management is an emerging scenario at an early stage of experimentation,” says Willemsen. “Various organizations have started exploring the use of blockchain for consent management because the potential immutability and tracking of orthodox blockchains could provide the necessary tracking and auditing required to comply with data protection and privacy legislation.”

Read The Full Report at Gartner (membership required)


A look at the UK's data protection law in a no-deal Brexit situation

 Jan 12, 2019 8:00 AM
by Derek Lackey

We previously reported on Brexit's impact on data protection here and here.

Shortly before Christmas, the draft Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2019 ("Exit Regulations", available here) were laid before Parliament. In this blog, we outline the changes under the Exit Regulations and consider what impact they will have if Brexit leaves the UK in a no-deal scenario. 

Preparing for a no-deal Brexit

As a brief reminder of the current legislative landscape in the UK, so long as the UK is in the EU the GDPR has direct effect. The Data Protection Act 2018 ("DPA") must be read alongside the GDPR and has multiple functions. Firstly, it supplements the GDPR and contains derogations that are permitted by the GDPR (such as providing for additional conditions around the processing of special categories of personal data and exemptions in respect of data subject rights). The DPA also applies a broadly equivalent regime to certain data processing activities which fall outside the scope of the GDPR. These relate to, for example, the processing of personal data for immigration purposes and manual unstructured data held by a public authority covered by the Freedom of Information Act 2000. Finally, the DPA covers processing by law enforcement bodies and UK intelligence services.

In the event that the UK leaves the EU without a withdrawal agreement, the GDPR will form part of UK domestic law as 'retained EU law' ("UK GDPR") by virtue of section 3 of the EU (Withdrawal) Act 2018 ("EUWA"). However, in its current form the UK GDPR will not function effectively on the day that the UK leaves the EU ("exit day", currently scheduled for 29 March 2019) due to the numerous references to EU laws and institutions and the fact that the UK will cease to be a Member State of the EU. The Exit Regulations, made under powers conferred in the EUWA, will be required to ensure that the UK's legal framework for data protection continues to function. 

If there is a withdrawal agreement, then the Exit Regulations will not come into force and the UK will instead enter the "transition period" where EU law will continue to apply as if the UK were still an EU Member State (subject to certain exceptions).

1. Territorial scope and UK representative

By virtue of the Exit Regulations, the UK GDPR will apply to any controllers and processors based in the UK as well as those outside the UK but which offer goods and services in the UK or monitor the behaviour of UK individuals. The UK GDPR will therefore continue to have extra-territorial effect in the same way as the GDPR currently does.

Many controllers and processors, whether based outside of the EEA or in one of the remaining EEA countries, will have already considered whether they are subject to UK data protection law. However, they will now also need to consider the requirement to appoint a representative in the UK. This will impact both non-EEA controllers and processors who may have already appointed a representative in a non-UK Member State as well as EEA controllers and processors who don't have a UK presence (and for whom this will represent an entirely new obligation).

Equally, the ICO has also indicated that a UK-based controller or processor that does not have any offices or establishments in the EEA but offers goods or services to or monitors the behaviour of EEA individuals will need to consider appointing a European representative.

2. The merger of the GDPR and "applied GDPR"

As previously mentioned, the DPA currently provides for two separate regimes for general processing: one for processing that falls within the scope of the GDPR and a separate, broadly equivalent regime for processing that falls outside the scope of the GDPR (the so-called "applied GDPR"). Given that EU law will not apply in the UK after Brexit, there will no longer be a need to distinguish between processing within the competence of EU law and that which is governed solely by UK law. The Exit Regulations will therefore merge these two regimes on exit day to create a single, unified regime for all general processing activities.

However, it is worth mentioning that this will not be a complete merger. Under s6 of the EUWA, any question around the interpretation of retained EU law (including the GDPR) must be decided in accordance with EU case law and general principles of EU law as they apply immediately before the UK leaves the EU. The Exit Regulations indicate that this may not be the case for processing under the applied GDPR, which governs the processing of personal data in areas where the EU has no competence.

3. Data transfers outside the UK

Currently, any transfer of personal data from the UK to a country outside the EEA may only be made if that country has been granted adequacy status by the EU Commission or by using one of the "appropriate safeguards" described under Article 46 of the GDPR (i.e., the EU Commission's standard contractual clauses or approved BCR).

The Exit Regulations maintain the same restrictions for data transfers outside the UK (whether to a non-EEA country or a remaining member of the EEA) but ensure that data flows are not disrupted on exit day. They specify that certain countries and bodies are considered to have adequate status: these include all of the remaining EEA countries as well as Gibraltar, non-EEA countries which have been granted adequacy status by the EU Commission prior to exit day, and the EU institutions and bodies. The Exit Regulations also provide that the EU's authorised standard contractual clauses and approved BCR may continue as potential mechanisms for transfers outside the UK, whether in their original form or as modified for a UK-specific context. Finally, the existing derogations under Article 49 of the GDPR will continue to be available.

Once the UK has left the EU, the Secretary of State will have sole authority to grant adequacy status (by way of regulations) in respect of transfers outside the UK and will be required to publish a list of those countries and territories it has deemed adequate. The ICO will continue to authorise BCR and will also be able to issue new UK-only standard clauses.

4. Co-operation and consistency

From exit day, the ICO will no longer be able to take part in the existing co-operation mechanism between EU supervisory authorities. Equally, the European Commission and European Data Protection Board will not have competence over the regulation of personal data in the UK. Unsurprisingly, therefore, Chapter VII - which lays the foundations of the co-operation and consistency mechanism - will be redundant and is removed entirely from the UK GDPR. Article 50, which addresses broader aims of international co-operation and mutual assistance in the area of data protection, will be retained.

Another expected amendment is the removal of provisions addressing the co-operation of Member State courts. Currently, under Article 81 of the GDPR, where proceedings are issued in a UK court against the same controller or processor and in relation to the same subject matter as a case already pending in another EU Member State, the UK court may either decline jurisdiction or suspend those proceedings until the other court has made its determination. Arguably, the removal of these provisions increases the possibility of concurrent claims in the UK and the EU.

5. ICO fines

The Exit Regulations confirm that the ICO will continue to be able to issue the same level of fines as provided under the GDPR. In particular, they state that from exit day the ICO will be able to administer fines of up to £8.7m or 2% of total worldwide annual turnover (whichever is higher) for less serious breaches, and £17.5m or 4% (whichever is higher) for more serious breaches.

6. Amendments to the PECR

Finally,

Read The Full Article


2019 Global Legislative Predictions

 Jan 7, 2019 2:00 PM
by Derek Lackey

What will the new year bring in privacy and data protection legislation? Well, to name just a few highlights, we've got a handful of EU member states still needing to pass laws addressing the General Data Protection Regulation, India is in the midst of debate over a new law, Brazil's law will get the enforcement body it has been lacking, and there are talks of a U.S. federal privacy law. But that's just the tip of the iceberg. This week's Privacy Tracker roundup consists of contributions from IAPP members across the globe outlining their expectations (and occasionally their hopes) for privacy legislation in the coming year. With more than 30 contributions, this year's global legislative predictions issue is our most comprehensive yet.

Argentina

By Pablo Palazzi
The year 2019 will see Argentina reach an important landmark in its history of data protection law. In September 2018 the government sent to Congress a data protection bill based on the EU General Data Protection Regulation. Now it is up to Congress (first the Senate, then the House of Representatives) to openly debate the bill and approve it. Argentina was the first country in Latin America to adopt a full-fledged EU-style data protection law and the first country to be considered adequate by the EU. Now, nearly 20 years later, Argentina again has the chance to follow EU law.

Belgium

By Tim Van Canneyt, CIPP/E
2019 will be another important year for data protection in Belgium. First, we should finally see the appointment of the directors of the new data protection authority. At the moment, Belgium has an interim supervisory authority which is pretty much forced to act on a day-to-day management basis. When the directors of the DPA are appointed in 2019, the authority will be able to adopt its strategic vision, publish more guidelines to help companies and offer better protection to citizens. In addition, we should hopefully see the implementation of the NIS Directive into national law. Furthermore, the Belgian Supreme Court will have to assess the lawfulness of the recent government decision obliging every Belgian resident to provide their fingerprints for inclusion on the ID card's chip. Finally, it will be interesting to follow the class action brought against Facebook by consumer protection body TestAankoop.

Brazil

By Renato Leite Monteiro, CIPP/E, CIPM
2019: The year of compliance and the Brazilian Data Protection Authority!

What a year! Nobody could have guessed at the beginning of 2018 that Brazil would finally have its own General Data Protection Law, known as LGPD (I have written this column for the last three years and always thought my predictions were little more than a wild guess!). And then, out of the blue, it was approved in August. However, the president vetoed one of its pillars: the creation of the national data protection authority, even though some provisions would only have effect if the authority were created. This lack of a DPA made the LGPD weak.

Then, at the very end of the year, on Dec. 28, 2018, Executive Order No. 869/18 made several alterations to the law. One of the most important was the creation of the Brazilian National Data Protection Authority (ANPD). It also extended the vacatio legis period of the LGPD to 24 months, changing the enforcement date from February 2020 to August 2020. During this period, the ANPD must exercise collaborative and consultative functions, aiming to provide assistance in the process of compliance with the new law. With the creation of the DPA, businesses will know to whom to turn and what to look for. They will have a direct channel of communication. The ANPD will provide for a much more stable application of the law and thus more legal certainty, which will probably spur technological and economic development.

Nonetheless, despite the DPA, enforcement actions might continue. The Distrito Federal and Territories Public Prosecution Office has been heavily conducting investigations into data breaches and other issues regarding personal data. Recently, the Minas Gerais Public Prosecution Office fined a drugstore chain for exchanging customers’ personal taxpayer ID numbers for discounts on products, which is in fact a common practice in Brazil. The total amount of the fine was R$7,930,801.72 (BRL), the largest related to data protection yet. The penalty was widely reported in mainstream media, national and international. Such actions are likely to continue.

Also, since the LGPD will enter into force in August 2020, 2019 will be the year companies rush to become compliant, a practice that has already become a new niche market. Consulting companies and law firms are investing heavily in personnel and privacy software to take advantage of the escalating demand. Proof is that the IAPP has partnered with the first official training center in Brazil: Data Privacy Brasil will provide training courses for both the CIPP/E and CIPM certifications.

Therefore, we can say that 2019 will be an interesting year for the data protection scenario in Brazil!

Canada...


The balance of this article requires membership to IAPP. If you are interested in a specific country/region, please email us and we will provide details.


Syrenis Appoints Canadian Reseller

 Jan 4, 2019 10:00 AM
by Derek Lackey

UK software company Syrenis is delighted to announce the recent appointment of their new Canadian reseller, Newport Thomson. Newport Thomson serves the US, Canada and the EU, helping organizations manage risk by operationalising data and privacy best practices.

Syrenis is renowned for its Preference Centre, which has recently been expanded and rebranded to accommodate changes to the personal data and privacy landscape. Now known as Cassie, the personal information platform currently handles almost a billion marketing preferences for 118 million customer contacts worldwide, and securing knowledgeable new partners such as Newport Thomson brings the platform to more businesses with currently unmet personal data needs.

‘The level of expertise within Newport Thomson became clear from our very first conversations with them,’ says Glenn Jackson, CEO of Syrenis. ‘Naturally we’re delighted to have them on board and we look forward to a long partnership with them.’

‘Record keeping is the biggest compliance issue for any of these new laws being implemented globally, from GDPR to CCPA, including PIPEDA and CASL,’ states Derek Lackey, Managing Partner, Newport Thomson. ‘Our clients’ data and privacy infrastructure was never designed to prove consent claims or any of the other details required by these new laws. In order to be compliant, an organization must re-think the way it manages data, privacy, consent and data subject rights, and Syrenis is an exceptional solution at the right price.’

The system has been engineered to be flexible and intuitive for users, allowing for better brand consistency and a positive preference management experience. It now offers improved support functions: previously hidden features have been made more accessible, such as a bank of FAQs for reference, and users can also submit and manage support requests from within their portal interface.

‘The changing privacy landscape on the North American continent is something that we have been watching with great interest,’ Glenn continues. ‘Having a partner with such strong bases in both the US and Canada allows us to work together to deliver the best possible personal data solutions for those markets.’

More information about Newport Thomson can be found at www.newportthomson.com.


Happy New Year, Data Brokers! Now, Register With Vermont

 Jan 4, 2019 10:00 AM
by Derek Lackey

With only days to go before Vermont’s data broker regulation law takes effect, the Vermont Attorney General has finally issued guidance that explains how businesses can comply with the law and the nature of the obligations it imposes on them.

The Vermont Statute

This past May, the Vermont legislature passed the first law in the United States specifically regulating data brokers, effective January 1, 2019. Data brokers must register with the state by January 31, 2019, and annually thereafter, and must provide specified information to the state when they register.

The law also imposes certain minimum data security standards on data brokers, and prohibits data brokers – and everyone else – from acquiring certain personal information of consumers through fraudulent means or with the intent to commit wrongful acts.

What Is a Data Broker?

As we discussed in a previous Alert, the Vermont law defines “data broker” as a business that knowingly collects and sells or licenses to third parties “brokered personal information” of a consumer with whom the business does not have a direct relationship. The new guidance from the Vermont Attorney General amplifies the definition by listing four questions that can determine if a particular business is a data broker for purposes of the law. If a business answers “yes” to these questions, and if its activities do not fall within certain very limited exceptions, the business is a data broker.

The questions are:

1. Does the business handle the data of consumers with whom it does not have a direct relationship?

Data brokers collect and sell or license the data of consumers with whom they do not have a direct relationship. For example, a retailer that sells information about its own customers is not a data broker because it has a relationship with its customers.

2. Does the business both collect and sell or license the data?

A business that collects data for its own use or analysis is not a data broker. As an example, an insurance company that buys data about individuals to set rates and develop new products but that does not resell or license the data, is not a data broker.

The guidance makes clear that “collection” is a broad term, and can include the purchase or license of data from someone else, or the collection from original sources such as court records, surveys, or internet search histories.

According to the guidance, sale or license does not include a one-time or occasional sale of the assets of a business as part of a transfer of control of those assets that is not part of the ordinary conduct of the business. It also does not include a sale or license that is “merely incidental to the business.”

3. Is the data about individuals residing in Vermont?

Vermont’s data broker regulation does not apply to a company that has no data on Vermont residents or that is not otherwise subject to jurisdiction in Vermont. Importantly, the guidance suggests that a national data broker has a “non-trivial chance” of possessing Vermonters’ data. It states that if a data broker does not maintain the state of residence of individuals whose data it collects, it might presume that there may be at least one Vermonter in its data set.

4. Is the data brokered personal information (BPI)?

BPI is defined broadly. It must be computerized as well as categorized or organized for dissemination to third parties. According to the guidance, data is BPI if it contains one or more of a person’s name, address, date of birth, place of birth, mother’s maiden name, biometric information, name or address of a member of the consumer’s immediate family or household, or Social Security number or other government-issued identification number.

The guidance also provides that BPI includes “other information that, alone or in combination with the other information sold or licensed, would allow a reasonable person to identify the consumer with reasonable certainty.”

By contrast, BPI does not include publicly available information to the extent that it is related to a business or profession. For example, a doctor’s office address or phone number is not BPI, but a doctor’s home phone number (assuming it is not used for business) is BPI.
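
Putting the four questions together, the test reads as a simple conjunction. The sketch below is a hypothetical illustration of that logic, not legal advice; the type and field names are invented.

```python
# Hypothetical sketch of the Vermont AG's four-question test; assumes no
# statutory exception applies. Field names are invented. Not legal advice.
from dataclasses import dataclass

@dataclass
class BusinessProfile:
    direct_consumer_relationship: bool    # Q1 (a "no" points toward broker)
    collects_and_sells_or_licenses: bool  # Q2
    has_vermont_resident_data: bool       # Q3
    data_is_bpi: bool                     # Q4

def is_vermont_data_broker(b: BusinessProfile) -> bool:
    """Data broker status requires a 'yes' on all four questions."""
    return (
        not b.direct_consumer_relationship
        and b.collects_and_sells_or_licenses
        and b.has_vermont_resident_data
        and b.data_is_bpi
    )

# Example: a national list vendor with no relationship to the consumers.
vendor = BusinessProfile(False, True, True, True)
print(is_vermont_data_broker(vendor))  # True - registration required
```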

Registration

Data brokers must register with...

Read The Full Article