
AccessPrivacy Podcast: Special Thought Leadership Roundtable on Privacy and the COVID-19 Pandemic

May 27, 2020

May 2020 AccessPrivacy Call

May's monthly call focused on global privacy themes, challenges and opportunities arising from the COVID-19 pandemic.

A special one-hour teleconference call was moderated by Adam Kardash and featured the following expert panelists:

  • Bojana Bellamy, President of the Centre for Information Policy Leadership at Hunton Andrews Kurth LLP
  • Jules Polonetsky, CEO of the Future of Privacy Forum
  • Martin Abrams, Foundation Executive Director and Chief Strategist of the Information Accountability Foundation
  • Patricia Kosseim, Current Co-Leader of AccessPrivacy and Incoming Information and Privacy Commissioner of Ontario

Listen to the privacy call now




PRESENTER: Good morning, ladies and gentlemen. Welcome to the AccessPrivacy monthly conference call. I would now like to turn the meeting over to Mr. Adam Kardash and Miss Patricia Kosseim. Please go ahead.

ADAM KARDASH: Thank you. And hello, everyone. My AccessPrivacy co-lead Pat Kosseim and I want to welcome you to our May monthly privacy call. For colleagues who are not able to join us today, note that all of our AccessPrivacy calls are incorporated under the Resources tab of our online platform for subscribers to listen to any time at their convenience.

While we don't have an opportunity to answer questions during our call, subscribers can find more information on today's topics and many others in this month's monthly scan, also available under the Resources tab of our new subscription platform. The scan is intended to provide subscribers with a convenient one-page snapshot: a consolidated list of hyperlinks to recent decisions, guidance documents, and other notable developments in the privacy arena, all in one place, to help keep you in the know and hopefully save you valuable time.

This month's call is a special one-hour roundtable to mark Pat's last AccessPrivacy call before she becomes the new Information and Privacy Commissioner of Ontario and begins her five-year term on July 1st. As many of you know, Pat has played a pivotal role in Osler's privacy practice over the past two years and has been instrumental in bolstering our AccessPrivacy knowledge offerings and thought leadership platform.

And on a personal note, it's just been a privilege and invaluable to have Pat as part of our team as a superb lawyer, a former regulator, thought leader, and overall just a phenomenal colleague. So Pat, congratulations from all of us again. We wish you nothing but the best in your new role. And as so many have mentioned to me over the last few weeks, we are all confident that the province of Ontario-- the country, actually, will benefit greatly from your deep experience, deep expertise, just great background in the privacy and data space. Congratulations, Pat.

PATRICIA KOSSEIM: Thank you, Adam. Thank you so much for those words.

ADAM KARDASH: So this call will focus on global privacy themes, challenges, and opportunities arising from the COVID-19 pandemic. And for this discussion, we've assembled a complete powerhouse of global thought leaders in the privacy and data space.

Our thought leadership roundtable will feature Bojana Bellamy, who is the president of the Centre for Information Policy Leadership at Hunton Andrews Kurth. Jules Polonetsky, the CEO of the Future of Privacy Forum. Marty Abrams, foundation executive director and chief strategist of the Information Accountability Foundation. And last, but definitely not least, Pat Kosseim, who in addition to being counsel at the firm over the past two years has co-led our firm's AccessPrivacy Thought Leadership Platform.

Thank you all for joining us. Let's kick off the discussion with Bojana. So Bojana, let's begin with your reflections on the major privacy challenges for global companies as they prepare to reopen the economy with a-- without the guarantee of a vaccine any time soon. Very interested in your comments in that regard.

BOJANA BELLAMY: Thank you, Adam. And hello, everybody. Very pleased to be here. It's a very sunny day; we're still in lockdown in London. I think I'm talking to many global organizations on this call, so the things that I talk about will not be surprising to you. But what is probably worth a moment of reflection for all of us is that this current situation is almost forcing us to live a real-life case study, right?

On data protection, on how to balance challenges that we've never had before, and new opportunities. It really has been a time, I think, for us in privacy practice to think tactically but also strategically about how the world is going to change because of COVID. And I think organizations will be reporting a number of changes.

The first immediate concern, I think, is all about business continuity and resilience. How do we enable remote work in lockdown and then the return to work? How do we also enable our business to continue? So something around COVID reimagining the workforce has been a big trend in all global organizations: enabling remote working, enabling even internal testing and tracing and checks and temperature screening, managing safety, well-being, and mental [INAUDIBLE] mental well-being of employees in the workforce.

And of course, monitoring people, systems, devices. And now, as we are all coming back to work, how do we reimagine this new office? How do we schedule? I hear companies are going to be at 10% to 20% occupancy. How do we make sure we actually schedule people to be in? How do we make sure visitors come back? Will there be immunity passports, and so on and so forth?

So all of this, again, has raised a number of data protection challenges, and I'll talk about that a bit later. Another sort of new trend has been that [INAUDIBLE] companies have had to reimagine and reinvent themselves to continue to operate in this new world. So we have seen a phenomenal digital transformation. Every brick-and-mortar company has gone online, and they're going to stay there, right?

New services and features to address the pandemic and the additional needs of customers and changes in behavior -- consumers as well as business customers. So companies really have to reinvent themselves. And you've seen, Uber is not just a logistics company moving people. Uber now transports medicines and NHS or other health professionals around the world as well.

The third area of change has been the realization and call that companies need to be seen as contributing to fighting the pandemic. The obvious one was data for good. And to me, that's shorthand for saying that companies have been increasingly trying to find ways to use data for beneficial purposes -- from contact tracing apps, to mobility reports like Google and Apple have done, to research into vaccines and medicine, to contributing location data.

And there's been this real sense that employees and all the stakeholders require companies to be able to say: what have you been doing during this COVID crisis? What have you done? And then, of course, the final point was that we have seen increased calls for data sharing on all levels: between public sector organizations, between the public and private sectors both ways, and within the private sector. And I think that is the new reality that companies simply have to deal with.

Now, all of these changes, of course, have a real impact on data protection compliance, both tactically and strategically. The new realities of monitoring and managing the workforce, and of course health data as well, have been a real struggle for many companies, particularly as national laws differ so much and employment or labor law complicates things further.

And really, companies who are global have had trouble actually having a one-stop shop, as I call it -- one way to deal with their employees in lockdown and now in the return to work. And some have actually decided to ignore some of the local divergences and adopt, again, as I say, a one-stop shop, one single approach to coming back to work and to taking temperature checks or dealing with employees' data. This new employee data reality is going to be with us for some time, even in countries where employee data is not covered by data protection laws. I think we're going to see more issues arise as well.

The second challenge from a data protection point of view has been the legal basis for processing, right? Those countries that have got legal bases like the GDPR [INAUDIBLE]. What do the vital interests of individuals mean? What does public interest mean when private sector companies are now delivering public interest services? The greater need to rely on legitimate interest. And then, the role of consent, of course -- again, in the employment field, almost impossible to obtain because of the imbalance of power.

And again, a deeply robust debate about whether consent is the appropriate legal basis, from temperature checks of employees to contact tracing apps or some of the more intrusive surveillance measures in society. Do we actually have the right to refuse to take part in the [INAUDIBLE] data for good use, right?

And so, I think none of this has actually been answered so far. The discussion between the use of personal data versus anonymized, de-identified data -- and the shock and horror when we heard that some of the regulators think there is no such thing as anonymous data, that all data is always personal and therefore should never be shared, with things like location data, in the current context.

And then, another challenge from a data protection point of view has been the use of data for analytics, scientific, and other research. Big data, AI -- but particularly, those of you who are GDPR geeks will follow the Article 89 exemption.

How do we apply this exemption in times of crisis, when there is a need to use data beyond just research and academic institutions, but in companies as well? And then, how do we interpret some of the traditional principles, like data minimization or not using data for incompatible purposes?

I think we are seeing, really, a challenge to all of these principles and notions under existing data protection laws and existing regulatory guidance as well. And then, the final point is that everything that organizations have been doing in the context of AI has accelerated tenfold. So AI applications and all the tension that they bring in terms of legal roles have been emphasized or multiplied in this COVID time.

So those are the impacts on compliance. Now, I do have some good news, but maybe [INAUDIBLE] the good news for later questions, Adam.

ADAM KARDASH: Bojana, so interesting, and so striking that so many of the themes and challenges you mentioned aren't really new -- they're just exacerbated within the pandemic crisis. We've been talking about so many of those for years.

But let me turn to another question. What are some of the common themes emerging from recent EDPB guidelines and guidance from individual EU DPAs in the context of COVID-19?

BOJANA BELLAMY: Right. So I mean, DPAs have been impacted by this, of course, just like organizations have. And it took some time for them to find their voice, I think, in this debate. We saw quite conservative guidance emerging from some national data protection authorities in the beginning. And that has now [INAUDIBLE] somewhat.

So there has been an explosion of guidance from national DPAs. But again, one of the big [INAUDIBLE] geopolitical impacts of COVID is this de-globalization, right? We're going back to nation states -- we have seen all the responses be national. And the same has been true of the data protection authorities. They have forgotten all of those techniques and methodologies and [INAUDIBLE] that they had to work together. They've all retracted to their own nation states and have been addressing [INAUDIBLE] from a national point of view -- both employee data processing and contact tracing and the use of mobile data, telco data, for good.

And so the explosion of guidance has been, as I say, conservative at the beginning, a little softer at the moment, but it still does not always converge. Those of you who deal with employee data processing globally have seen that, for example, in France or Belgium there is a reticence, and the advice is not to collect and use health data -- effectively, not to do temperature checks. In the Anglo-Saxon world, but even in Italy, it is possible to do temperature checks or [INAUDIBLE] before coming back to work as well.

So we still have inconsistent guidance from national data protection authorities. Now, at the European level, the Board, again, took some time to act. But they have acted. There are three particular sets of guidance. One was just a statement which said, look, the GDPR does not prevent data being used responsibly for the purpose of fighting COVID.

The second -- and that was really more helpful, I think -- was around the use of health data for research purposes. And again, there they call upon some of the traditional principles, but also allow [INAUDIBLE] a broader interpretation of those principles. And then, the final guidance from the Board was on the use of geolocation and tracing tools in the context of the outbreak.

And as you do know, there is a big debate now going on in every country around what particular apps are going to be developed by the government and deployed. Some will be deployed locally using local centralized tools. Some will be built on the Apple and Google infrastructure. And of course, DPAs are very busy at the moment reviewing these.

The second big thing, I think, that happened with the DPAs is that, overnight, they had to reassess their regulatory oversight and enforcement approach. And again, you might not be surprised that the UK Information Commissioner has been very pragmatic in leading, in being that voice of reason -- and some of the Asian DPAs, I think, were pragmatic as well.

In the UK, for example, she has said that she is going to take a much more [INAUDIBLE] and pragmatic approach to privacy regulation and enforcement and oversight as well. They will consider hardship in the situation when issuing fines. And in fact, we do expect that the [INAUDIBLE] and BA fines, which are still to be determined -- you remember one was 200 million, the other almost 190 [INAUDIBLE] -- are going to be substantially reduced because those companies are in hardship [INAUDIBLE].

I hear that they're going to focus on areas that are likely to cause the greatest harm to the public -- and particularly, on those who are looking to exploit, if you like, the public health emergency. And again, that's really [INAUDIBLE] trying to go beyond what the [INAUDIBLE] have done and say: we cannot be tone deaf, and we cannot be blind to what's happening, as well.

And now you are seeing some of the priorities shifting even further beyond oversight. In addition to that regulatory forbearance policy, the idea, as I said, is that priorities are going to shift. So, we are going to stop investigating [INAUDIBLE], for example. And I'm not surprised, because I don't think that is the biggest harm we have to deal with, right? Even though the business [INAUDIBLE] an ecosystem that needs reforming and changing, whether people receive an ad or not is not the biggest harm we have now.

And so [INAUDIBLE] is going to be talking about focusing on protecting vulnerable citizens, the [INAUDIBLE] and, importantly, economic growth, [INAUDIBLE] including enabling small businesses to come back -- bringing back the digital economy, shaping proportionate surveillance, and enabling good practice in AI, transparency, and business continuity.

So these are all, what I call, realizations by some of the DPAs that they've got this dual role to play in society: not just blindly protecting rights, but actually enabling responsible use of data. And frankly, by that, raising their own value to the government and to society, because they are becoming a solution-making regulator.

Those were the three shifts that I have seen [INAUDIBLE] the immediate impact, as we have seen with the data protection authorities, Adam.

ADAM KARDASH: Thank you, Bojana. Let's turn now to Marty. Marty, you've begun -- well, you're continuing, but really focused now on -- some interesting work and thinking about the concept of data proportionality, which is just critical in the Canadian context for the privacy law consideration of whether a given initiative or data processing activity would be deemed reasonable and appropriate in the particular circumstances.

So in your view, how is the response to the COVID-19 pandemic testing the boundaries of these concepts of necessity and proportionality? And how are they being applied across different jurisdictions?

MARTY ABRAMS: So thank you very much, Adam, for the question. And again, my congratulations to Pat for her new office. I think she's going to do a great job as the Information Commissioner in Ontario.

The question of proportionality is actually a question that we have been indirectly addressing for the last four years. All of the work that we've done in Canada with Canadian business, with the Office of the Privacy Commissioner, and with the Ministry of Innovation, Science and Economic Development is really pushing towards this concept of how you determine what's necessary and what it means for a use of data to be proportional.

And what we've learned from that research and what we've learned from the process of companies that are adopting the methodologies that we've talked about is that you need to start with stakeholder analysis. You have to begin with the sense of who is going to be impacted by the use of data, and what are the risks related to those parties? What are the benefits? How do you parse out this balance of risks and benefits and what you're trying to achieve with the use of data?

And it means that it's not just about focusing in a linear fashion on an individual as a source of data and an organization as an observer and user of that data. It's really about thinking about all the stakeholders. And in this case, the stakeholders are society as a whole, individuals as data subjects, individuals as organic beings that want to survive -- being able to restore the economy to its prior position. All of these things come together as you do stakeholder analysis.

And what we're learning is that to make sound decisions one really needs to do analysis around that balancing of interests as it relates to stakeholders. What is the impact on health? What is the impact on the ability to bring people back to work in safe fashion? What is the impact on society as a whole? But also keeping in mind that individuals do want a safe space where they're not observed.

So it comes down to this ability to do sound analysis and be able to defend what you do based on that sound analysis. So the question of whether processing is necessary is based on this ability to do stakeholder analysis, be able to identify who those stakeholders are, and document who they are and when processing is necessary for them.

It also comes down to the type of data and the amount of data necessary to do so. Any processing that is insufficient to meet your objectives is not meeting a necessity test. So this balance between data minimization and an adequate level of data to make sound decisions comes into play in the whole question of necessity.

When we turn to the question of proportionality, we run into the question of: proportional to what? Historically in data protection, we typically talk about the amount of data being proportional to the processing that you're going to do. But if we look at what the European Court of Justice has been considering in the cases before it, proportionality is a much bigger question. It comes down to the balancing of multiple interests that are fundamental.

So we have historically in Europe a balancing between the fundamental right to privacy, the fundamental right to data protection, the fundamental right to free expression, and the fundamental right to organize and conduct a business. And increasingly, I think, the fundamental right to life itself comes into play. So when you're doing an analysis of whether something is proportional, it's not, again, linear processing. It is a process of identifying the fundamental interests and rights in play, being able to defend the decisions you make, and making sure that those decisions are based on the stakeholders in play.

I think we see this in some of the leading companies in Canada, in the announcements they've made about data for good. Very much, they're bringing to bear the fact that they have a sound system of analysis that, in the end, determines what is necessary to do, and when, and how you apportion the risks and benefits across the full range of fundamental rights that come into play with proportionality.

ADAM KARDASH: Marty, what are the challenges organizations face as they attempt to balance multiple stakeholder interests in data involved in responding to the COVID-19 pandemic?

MARTY ABRAMS: So the first thing is getting over their own reticence to risk. And we've seen this with organizations that are in positions of being highly observational. They've been reticent to use that data, putting pressure on public policy to give them cover for bringing that data forward. In fact, I've seen organizations that are resisting data for good because of past experience of providing data. It goes to what Jules is going to talk about in terms of the use of data for national security. In the matter of health, they show reticence in part because of what they learned in the past when they supplied data to government.

The second is that you really do need to have a sound process for doing the analysis and being able to defend that analysis. This ability to be answerable when the question is raised -- is it truly proportional? -- is something that requires a sound process and documentation of that process. This is increasingly pushing us to questions of how we certify processes to make sure that they're competent and done with integrity. So that question comes to the forefront even more.

And the third is just a general mistrust of organizations related to data in an observational world. We have the sense that people are over-observed. So when we talk about the application of new apps that gather data that is particular to this particular pandemic, the concern is that that data will be misused because of a lack of trust in organizations to show any level of restraint when it comes to the use of the data that they have.

ADAM KARDASH: Marty, thank you so much for that. Let's turn to Jules. Jules, the current crisis has kicked off a digital acceleration at full, if not warp, speed. What are the key features of this emerging landscape, and what are the corresponding data protection issues being exacerbated as part of this acceleration?

JULES POLONETSKY: Thanks [INAUDIBLE] the chance to join the conversation with you. And I'm really excited to see what is ahead for Pat, because, clearly, someone who's been a regulator and has been on the private practice side, who can step back with deep knowledge of how data really moves and really works as both an insider and an outsider, is really critical. So I'm looking forward to exciting leadership.

Let me talk about the digital acceleration in two ways. One, this is a sophisticated audience -- lots of privacy practitioners. I think nobody needs to spell out that we're largely working remotely, and that is obviously coming with a whole range of challenges. The audience is sophisticated about those.

The delivery services, the tools that we are all depending on, are obviously in so many cases data driven, data dependent, technology driven, and obviously many of those will stay with companies and organizations in the new normal, whenever that happens and whatever that looks like. So let me skip that, because I think we've all been seeing those issues, and many of us are working on them today.

Let me talk about what has actually happened in a number of other times when we've gone through these sorts of dramatic stress or change periods. We see a digital acceleration, in that many trends that were happening slowly -- certain sectors were starting to rise, technologies were starting to trend -- accelerate, even outside of the intense demand for very specific services.

Companies that were struggling are now going to be pushed out of business more quickly because of the economic pressure. Companies that were stronger and have resources who were already more technically sophisticated and already more data driven or already leading in one particular sector or another are very likely to gain even more of an upper hand.

And so I think the challenge for those of us who are looking a couple of years out is to ask: what are those trends? Where are those technologies and practices that are suddenly going to become commonplace, rapidly the majority, and what does that mean for people who work in this space? So let me take a couple of the takeaways that I think are perhaps useful.

We've seen slow evolution of trusted models for access to data -- for research, for good, for sharing. The technologies that have been emerging have been awkward. Some of them have been used; we've got good examples in some places. But the vast number of organizations, whether public sector or private sector, are still very unsure about what they can share and where they can share it -- beyond, I mean, the natural business chains, right? That has always been driven by the business imperative.

But the models of "I'd like to share this data for research" across every sector, not just health care -- what do data trusts look like? What do trusted intermediaries look like? What do ethical review structures look like? All the things that we and others have been working on -- suddenly, I think, there was the realization that we didn't have those, and we didn't have easy answers for "here's how we'll get what we need."

Some of these things happened. There were weeks of debate around large-scale, aggregated, well-anonymized data sharing or data analysis to show population movement. You saw the [? swell, swell, swell. ?] And then, people moved forward, because it was clear that the data was useful and it was clear that, when well aggregated, it was very low risk.

But we didn't have the easy answer. It wasn't an obvious no-brainer. And there are dozens of other areas like that where holes were punched through because of the crisis, and that got us to answers. So I think that's a trend: even after the immediate crisis goes, if one shows up and says, look, we need to do a better job of being able to share data for this sort of purpose, for this sort of reason, for this sort of value -- we've demonstrated that it can be done and that it's necessary.

Another area where I would like to think we're going to see real change is what we call "bring in the experts" -- that the DPA is not the sole arbiter of whether a particular use of data is actually effective, is actually societally beneficial. I think, in many jurisdictions around the world, some of our colleagues on the data protection authority side have seen their regulation as the regulation of all data, even though there might be a very significant technical health issue, even though there might be a very sophisticated, detailed question of what is really needed for this specific purpose.

And I think we, at least, saw some degree of deference by data protection authorities to the fact that governments or epidemiologists or public health folks were saying, yes, this is actually needed and it is necessary. And you saw a bit of an accommodation to that after some period of time. And I'd like to think that there will be at least a greater appreciation that, in so many sectors, the DPA may need to be the legal balancer at the end of the day, assessing the data protection rights balance.

But it actually may be, of course, experts in those sectors, in government or out of government, who are the definitive people arguing that this will help with climate change -- that this is actually what is needed and what is necessary.

I think we'll see an acceleration of flexibility for research. I can tell you that on the US side, getting research exemptions right has been a challenge as we go through this process of moving to legislation, which I think will be accelerated. One of the frustrations that we certainly had, as states have proposed legislation, or even as members of Congress have put forward legislation, is that we very often had to be the ones to say, yeah, what about research? Because your exemption is very limited, and it doesn't actually work, and it only covers this sector and not that sector.

You even see today that the California Consumer Privacy Act has a very limited exception to the rights that are mandatory, for [INAUDIBLE] research that has gone through a Common Rule, federally regulated type of IRB. And so much of the other research that exists, particularly some of the important health research that may not be subject to those rules, is technically subject to different deletion rights and different requirements that it only be consent-based, and the like, with no exceptions. Forget about legitimate interest. Forget about carved-out exceptions for every other specific area.

So I think we're going to, I hope, see a bit more openness and awareness. Look, I think for a lot of people, tech as infrastructure has accelerated. We've obviously always known that certain aspects of telecom and the like have been regulated in a very specific way because of their unique role. But I think the dependence that so many have developed on different models of tech is going to even further force the rules that need to be set, or that some argue should be set, around those areas.

What else can I tell you? One other piece that I think many of us will see is the understanding that it really is about data -- that data is the key tool for making decisions about how to act, while monitoring every day the number of this, the number of that. And we actually do need to know that in a very precise way that may risk identifiability. But we actually need that data. We need race data because we're worried about how it might impact different groups. We need income data if we're going to understand how society is being affected. Detailed data.

And so there's the argument of, we don't collect it because it's sensitive, or it'll come with a certain set of restrictions -- clearly a core principle, but one where we need to parse out when data is truly in a protected category and when it may need to be used for more nuanced purposes.

So all of those were trends, I think, that were happening. But I think we're going to see them now moving ahead far more rapidly and aggressively. And that's why, for just about everybody who works in privacy, our heads are spinning -- not just because of all the COVID issues, but because of, I think, companies and governments that are thinking ahead, who suddenly have a whole plethora of issues that all need to be figured out now, and quickly, so people can make smart planning decisions.

ADAM KARDASH: Jules, what are some of the parallels you are seeing with national security measures instituted post 9/11 that have become widely accepted as standard and commonplace today?

JULES POLONETSKY: Well, I'll start a bit higher up. Let's look at the whole post-9/11 situation. Governments around the world rolled out new measures in order to respond. And here is the way it played out, I think, for many of us.

The first short period of time was why didn't we know? Why didn't the government put together the pieces when it could have and when it should have? And obviously, see the parallel here, too. Did we know what was happening in China? Did we properly screen and have the data? Could we have done more with what we knew?

So the blame, the shame, the "we could have done more, we had the information if we had only handled it better." And so many of the initial post-9/11 measures were data integration measures: how do we join up all of the databases, and how do we ensure that the regulations allow that sort of broad collection and that sort of broad analysis?

And that's what we saw then. And that's, I think, what we're going to be seeing as we react post-pandemic, right? The blame, for instance, when we have new governments and new administrations, will be: what did they know? How did they screw up? What can we do better next time by actually having the information and being able to respond and understand what it means?

We then saw very quickly a trend afterwards by advocates and others saying, what did you just do? Did we now put in place long-term measures that are violating rights? And we saw efforts to undo and expire and set limits. And I think some of the learning that we're seeing today is people being very clear to say, if we do something, it needs to be limited. It needs to expire. It needs to come with automatic data deletion.

One big mistake, I think, that was made in some countries was around transparency. We actually didn't understand what government thought some of the authorities it got were, right? And the [INAUDIBLE] revelations and other measures that we're still seeing being revealed showed that by not having that transparency, the government lost trust. Steps were taken that were not publicly known and were not publicly debated, and even [? illegal ?] interpretations of what statutes meant were secret and were used far more broadly.

And so I think there are obviously a couple of lessons beyond "don't keep the data when it was only collected for the emergency" or "don't pass rules meant only for the emergency time." Understand that this needs to be done in a way that is limited, and that when that balance shifts, the balance has to shift back.

But I think that focus on transparency means that we actually understand what is and what isn't changing. But let me point out something else that happened. I was at DoubleClick at the time, chief privacy officer at the tech company. And we were the focus of lots of the privacy debate at the time around cookies and tracking and so forth.

And post 9/11, the ad market crashed. So companies weren't running around doing the sexiest things that they were all talking about. And the public debate shifted a bit away from advertising and tracking to these big questions of civil liberties and 9/11 and who's going to be locked up and who can't travel.

And so you can imagine what we will see. And Bojana pointed out the delay from the UK ICO because of the COVID issues. And maybe that is simply because of timing and allocation of priorities. But I think it's going to be important to see whether or not the focus of work shifts away from the areas that are not all that exciting and interesting anymore.

Now look, part of this has changed in that lots of the advertising in tech is Amazon, is Google, is big companies who are going to be a lot of the focus and continue to be a lot of the focus. But there's that long tail of publishers who are struggling, many of whom, despite incredible efforts to cover the news and be trusted sources, have also seen their revenues plummeting and the layoffs in newsrooms that are continuing.

We may see less interest, for instance, in what this particular app is doing. Today's headline story of some private app leaking information is front page news. The regulators and the media attention is likely to shift to which employees were kept out of the workplace, which health app was leaking data about health status, which insurance companies were misusing COVID data to deny benefits. Who do we need to get data to because they need to do this analysis?

And, oh my God, did we actually give this company or this research project too much, and are they using it in appropriate ways? This was a big part of the shift after 9/11. And it was only years later, as 9/11 receded and the ad market came back and the consumer market came back, that those issues once again came to the forefront. So I predict that just the focus of attention and time and energy ends up naturally changing because of these big drivers.


ADAM KARDASH: Thank you so much for that. Pat, we're going to turn now to you. Let's begin with this question. What are your views on the challenges of COVID-19 from your perspective as a former and now incoming Canadian privacy regulator?

PATRICIA KOSSEIM: Thank you, Adam. And certainly a very interesting question. As you know, we've been monitoring regulatory responses to the COVID-19 data challenges since early March. And I want to make a few overarching observations about the role of Canadian data protection commissioners' offices, informed, as you say, by my past experience and also my future role.

And mirroring very much what Bojana described as the EU experience, I would say that the responses have really been at two levels. First operationally, and then second, more substantively. So operationally, Canadian commissioners responded initially, as all responsible organizations would, with an explanation of the impact the pandemic was having on their own operations and a general plea for Canadians to be patient with their expectations as commissioners offices cope with staff reductions and technological challenges of moving to an all virtual workplace.

And I think it was a good reminder that regulators are not almighty immortals. They are no more immune to external challenges than we all are. And they too have employment responsibilities, HR issues, health and safety obligations, et cetera.

Operationally, we also saw Canadian regulators express some level of sympathy for [? ATF ?] officers who have statutory obligations to respond within strict time limits that have been nearly impossible to meet in the current context. They've also shown some flexibility in the normal level of safeguarding standards they usually expect of organizations, recognizing that the ability to meet those usual high standards is under severe stress in the current environment.

And so, to the extent they can, regulators have shown themselves willing to extend time limits or show more flexibility and pragmatism, as Bojana said, in terms of their expectations, to the extent that their legislation allows them to. And I think this is a good comparative exercise, to see which offices have this legislative flexibility and which don't and would require a statutory amendment or a ministerial directive to intervene.

At a more substantive level, we saw some DPAs respond initially with the usual refrain that while privacy laws continue to apply, they do not, nor should they be used to, block data sharing in times like these. And they provided a surprising [INAUDIBLE] of consent exemptions that apply in the circumstances to allow the private sector and governments to share personal data in exceptional times of emergency, when the lives or health of people are at stake or the public interest requires it.

And this, too, was a good refresher of just how many consent exemptions apply in times of pandemic like these, what those exemptions look like across jurisdictions, and the related conditions that attach thereto. The second round of the standard responses provided more, I'd say, how-to guidance and frameworks for governments and private sector organizations to engage in responsible data sharing.

And this was particularly helpful as public health officials, researchers, and private sector entrepreneurs were already moving to the action phase in launching new contact tracing apps, research data repositories, et cetera. And more recently, we've seen a slight reframing of those same guiding principles, this time presented as FPT-- federal-provincial-territorial-- joint statements, showing a more common and united front among the data protection commissioners, all backing this guidance framework.

And in light of Marty's and Jules' comments, let me just make a couple of remarks about the challenges I see in applying some of these framework principles, and in particular, two of them. One is the principle of necessity and proportionality, which Marty spoke to. The challenge is that not all statutes state this explicitly as a requirement. Now, how, in Canadian Charter parlance, the four-part Oakes test-- which is really the necessity and proportionality test as we understand it in Canadian law-- gets applied more systematically and operationally across both public and private sector organizations, and consistently across Canadian jurisdictions, really remains to be seen.

On the one hand, you can say public sector organizations as state actors are subject to the charter, and therefore, need to take necessity and proportionality into account when they're collecting personal information. And you could argue that, for private sector organizations, necessity and proportionality are part of the reasonableness test, which is the overarching requirement. So only time will tell. But I think it's a wonderful opportunity to start to put some flesh around these concepts in the current context.

The other limitation, or framework principle, that I think will pose a challenge is the time limitation, where the DPAs have said that exceptional measures should be time-limited and any personal information collected during this period should be destroyed when the crisis ends and the application is decommissioned.

And while this is very true and in keeping with traditional data protection principles, I think it's going to be challenged by the current context for much the same reason that 9/11 challenged our way of thinking. And Jules alluded to this. Right now, we're dealing with what we need to do to deal with this current pandemic.

But eventually and slowly, this conversation will shift gradually towards how to deal with the risk of all future pandemics. The 9/11 experience, of course, elicited exceptional security measures to deal with what we thought was an imminent attack and all the events surrounding 9/11. But it eventually became a conversation about how to avoid or prevent such attacks from ever happening again in the future. And so many of those measures just continued in perpetuity as part of the normal expectations and common requirements.

So how we move from dealing with the current pandemic to measures that will be accepted for avoiding all future pandemics also remains to be seen, Adam.

ADAM KARDASH: Pat, what are your thoughts on the impacts of this crisis on incoming legislative reform in Canada?

PATRICIA KOSSEIM: So I think that the current pandemic has really driven home some major flaws with the statutes, which when pressed to their limits really begin to buckle under the pressure. And I'll give you just a couple of examples at the federal level. The first is the Privacy Act. Of course, the threshold for collection of personal information is the threshold of directly related. And as we've spoken about in past calls, this threshold of directly related is a far cry from the much more thoughtful analysis of necessity and proportionality that Marty laid out for us.

So on the one hand, we have the OPC clearly stating its expectations, as aspirational as they may be, and even [? TBF ?] guidance continuing to refer to the necessity criterion. Yet we have a recent Federal Court of Appeal decision that states unequivocally that directly related is not necessity.

So I think, for one, this has to absolutely get resolved and this pandemic is a good opportunity to really press home how important this threshold issue is to resolve. Also for years, the federal government has been talking about the need to break down silos between federal departments and ministries to facilitate better data sharing and integration to improve services to Canadians or combine different data sources in order to help solve some of society's most wicked problems.

And again, as we've mentioned on past calls, the recent data integration units that have been introduced into public sector privacy and access law are a Canada first and really present a good model for the federal government on how to do that responsibly in the future, pressed on by the current pandemic.

Just an equivalent thought under PIPEDA. That law, too, is currently being stressed by the current pandemic. It's already been under stress for all the reasons we've discussed over the past several years, particularly as a consent-based regime. And it began to show cracks in its armor a long time ago.

And while we can debate the virtues of allowing private sector organizations to innovate with data for commercial profit, whether economic development is a public good or not, and whether that should require consent or at least some measure of individual control, that is an ongoing debate.

But the case of dealing with a pandemic, which clearly has health, social, and economic impacts, is a much more pressing, compelling, and urgent condition. And so, again, we're going to have to see some movement on PIPEDA to deal with this. And in particular, there is a huge opportunity, not to be lost, to allow public-private partnerships through research. We've talked about that earlier.

And unless PIPEDA is modified-- in particular the research exemption, which, as far as I know, until 2018 had never been used and has been quite useless-- that opportunity will be lost. I think this really paves the way to reforming that research exemption in a way that is much more practical to apply in practice and which incorporates the level of ethical assessment that is necessary for any research, particularly for all the reasons that Jules pointed out earlier. So I'll leave it at that, Adam.

ADAM KARDASH: Pat, thank you. Now, for the last part of this roundtable and extraordinary discussion, we're going to look forward and focus on what will be, hopefully, a silver lining of the COVID crisis. And to do so, we're going to ask each panelist to offer their response to the two queries. And just for the purposes of timing, I'm asking folks, let's keep it brief and almost rapid fire. But all call attendees are going to really welcome your comments in this regard.

And my two questions are as follows. First, what opportunities does this pandemic provide us relating to the respectful leveraging and use of data, and how do we realize those opportunities? And Bojana, I'll begin with you.

BOJANA BELLAMY: Excellent. Well, I think I'm actually going to be following what I've heard from all of you, the other panelists. I think companies are going to be realizing the potential of data, and they will never forget this moment when the world has shifted online. When they have reimagined their businesses, they have reimagined their business operations and models, gone digital, gone through this data [INAUDIBLE]. I think companies are just absolutely going to continue to do so.

And in doing so, they're going to have to ask themselves one question: how do we keep on doing that in a way that nobody gets scared, right? So we don't have a post-9/11 situation, but actually, we keep the trust of the people. So I predict that this will be an opportunity for companies to play more of this role of corporate digital responsibility. There's been a shift to stakeholder capitalism. I think we're going to see much more of that doing more for good and making sure we are a trusted business partner, a responsible data manager and user.

And I think accountability is going to be paramount for organizations. And in fact, we've actually put this [INAUDIBLE] together where we sort of talk about 12 steps of accountability to enable this big data sharing. Because I think everybody will want to share data. They will not forget what data technology [INAUDIBLE] in the time of COVID.

And for DPAs, I think there is this-- to me, I hope it's an opportunity for them. And we have Pat here, who is going to be an enlightened and smart DPA. I know that. But I think they can also graduate to this more mature model of having a dual, more balanced role of enabling responsible innovation and promoting accountability, incentivizing good behavior.

I'm sick to death of listening to people in Europe awaiting the [INAUDIBLE]. That's not the way to actually change the world. The way to change the world is to actually incentivize what good looks like. And so I hope that we're going to see more regulatory [? sandboxes, ?] more innovative oversight through data review boards, perhaps with some of these big, huge infrastructure data sharing and data use projects.

And I hope we're going to end up with more risk-based and transparent regulators as a result of this. So that's my optimistic view.

ADAM KARDASH: Bojana, thank you. Marty.

MARTY ABRAMS: I'm going to build on what Bojana said. I believe that this concept of proportional use and looking beyond proportionality as a line, but rather, as an algorithmic process, which brings into consideration multiple stakeholders and multiple interests. I believe that organizations are going to have to master-- and they will master if they want to be digitally driven-- the skills to determine what's appropriate and what's not and how to defend it and how to demonstrate it.

I believe regulators are going to become more mature-- and must become more mature-- in looking at multiple interests and trusting analysis and testing it to see whether it's done competently and with integrity. And I believe that this concept needs to be addressed in the upcoming legislation. And Canada is a prime candidate to really look at data protection on a more mature basis, and particularly at freeing up data for research.


JULES POLONETSKY: I think there's going to be an intense interest going forward to understand what works because we obviously don't know exactly what works. Did masks help? Did temperature screening of employees help? Did different strains of the virus spread in different ways? Why were different jurisdictions affected differently?

There's going to be intense interest in data analysis at a public level. And to be honest, most of us are not living in a public [? level. ?] We're talking typically about wonky things, data protection issues that only get to the headlines when there's a scandal or a feature that ends up being criticized.

The big analysis of data sets is not usually front page news. But every single day, we're looking at those numbers. And it'll be even more intense afterwards, to actually analyze once we have a little bit of distance and to look in the rearview mirror. So data crunching, data [? moving, ?] good or bad, right? The good will be that there will be an appreciation that big data doesn't just mean big bad. It doesn't just mean figure out how to regulate.

Insight and knowledge won't be, oh, that's just for advertising and marketing. That will be indelibly set. And I think that's to the positive. But that will bring with it another intense desire to figure out how to make sure the rules are around it.

We also are all looking to government to save us, to save the economy, to save individual companies, to save people who are missing paychecks. And I think that leads to a bit more of a paternalistic direction for society. We've been in some balance, and it's obviously different in different countries. But I think we've all just gotten a very big push toward government playing a very important safety net role, not just for the poorest, but for all of us.

And I think that leads to more interest in government setting rules, again, for better or worse, around data protection in a sophisticated way. So just some ideas of [INAUDIBLE]. And then, the last piece I'll say is we're also looking to our employers to keep us safe by understanding who they can allow into a workplace and who they will tell to go home.

And that just changes the dynamic in ways that are likely to stick with us. The expectation that the employer needs to figure out how to keep me safe in my environment is something that maybe was only enormously applicable before to those of us who worked in a dangerous environment-- in a mine, in a place where there were day-to-day risks. And even when the day-to-day risk goes away, the idea is, I think, going to impact how the broader public thinks about these issues.

ADAM KARDASH: And Pat, I'm going to turn it over to you for both your responses to the opportunities and your reflections on how we realize those opportunities. But also to close out this May monthly call.

PATRICIA KOSSEIM: Thank you very much, Adam. Really, I just want to echo the excellent remarks by Bojana, Marty, and Jules. And my two rapid-fire responses are really very similar and fall in the same vein. One is, I've been so heartened-- I think we all have-- by the tremendous public-private partnerships that we've seen emerging. And so many companies have stepped up as social actors, have taken on selfless roles, and have become critical, responsible contributors in helping all of us in society respond to this pandemic.

So I think, as regulators, both as a past and future regulator, I think it's not lost on us that we're not talking about big bad evil corporations. We're talking about social actors, like all of us, who have a critically important role to play. And to the extent that data is an enabler, to Bojana's point, we need to find ways of enabling that responsible leveraging of data.

And finally, to use the common refrain, we've seen we're all in this together. I think that's true similarly of commissioners' offices that have to work collaboratively with governments and the private sector to find solutions that are in all of our interests at the global societal level and that everyone can get behind. And I want to echo particularly Bojana's comments on all the tools available-- regulatory sandboxes and others-- to help promote and support that kind of inquiry for solutions. And I certainly look forward to adopting that collaborative approach over the next five years.

So that brings us to the end of this AccessPrivacy monthly call. And we hope you found it interesting and useful. On behalf of Adam and myself, we'd like to thank Bojana, Marty, and Jules for participating today. And it was Adam's brilliant idea to have this one-hour special monthly call to mark my departure-- thank you, Adam. And within an hour of my writing to Bojana, Marty, and Jules inviting them, they had all accepted by reply email.

So it's a wonderful tribute to the collegiality that we have with colleagues, really, across the Atlantic and south of the border-- good colleagues and friends. So thank you to all three of you on behalf of Adam and myself.

And because this is my last AccessPrivacy call while at Osler, I need to take this opportunity to thank my co-host, my colleague, and my dear friend Adam, who, by the way, I've known for nearly 20 years, for the absolute honor and privilege it's been to work with him and the whole AccessPrivacy team and the tremendous Osler lawyers I've gotten to know and work with across many different practice groups, all united behind a culture of excellence and impeccable client service. So thank you.

Particularly, these past 2 and 1/2 years have been a true hallmark in my career, as I've said on many occasions. It's been truly humbling to be able to understand data privacy challenges from the perspective of those who are regulated and have to apply the law, policy, and regulatory guidance in practice while responding to operational demands, competitive pressures, economic constraints, and client or individual expectations.

I want to thank Adam in particular for this wonderful opportunity, for the warm welcome he extended to me when I joined Osler, and most importantly, for his generous offer, and the honor and privilege, of joining him as his colleague and partner at AccessPrivacy. Adam, you've taught me so much about the practice and about being practical, and you reminded me that while law is an important and noble profession, as lawyers, we should never take ourselves too seriously.

So on that note, Adam, thank you and all of my colleagues for your collegiality, your good humor, and your friendship and goodbye to everyone. Thank you.

PRESENTER: Thank you. The conference has now ended. Please disconnect your lines at this time. We thank you for your participation.