
In this roundtable discussion, members of the firm’s Cybersecurity and Privacy Practice Group discuss a wide range of important topics, including steps to take when a data breach is suspected, trends in artificial intelligence (AI) legislation, and privacy impact assessments. Simon McMenemy (Managing Partner, London) and Ben Perry (Of Counsel, Nashville), who are co-chairs of the practice group, are joined by Tracey Kinslow (Of Counsel, Nashville), Nicola McCrudden (Of Counsel, London), Erin Schachter (Associate, Montréal), and Lauren Watson (Associate, Raleigh). The speakers cover developments in the United States, Canada, and Europe.

Transcript

Announcer: Welcome to the Ogletree Deakins podcast, where we provide listeners with brief discussions about important workplace legal issues. Our podcasts are for informational purposes only, and should not be construed as legal advice. You can subscribe through your favorite podcast service. Please consider rating this podcast so we can get your feedback and improve our programs. Please enjoy the podcast.

Lauren Watson: Welcome to the Ogletree Deakins first-ever Cybersecurity and Privacy Law AMA. As some of you may know, an AMA, or Ask Me Anything, is just an informal interview where the participants are invited to ask the host anything that they want to know, no holds barred. Ogletree’s attorneys have spent the past week connecting with clients in our networks to find out exactly which cybersecurity and privacy questions they most want answered. We’ve had some really great questions that we’re excited to talk through here today. I’m Lauren Watson. I’m an Associate at Ogletree’s Raleigh office, practicing privacy and cybersecurity full time. I will be hosting this podcast with the help of my esteemed colleagues, so if we could just go around the table and introduce ourselves briefly.

Simon McMenemy: So I’m Simon McMenemy, from the London Office of Ogletree, and I’m also the Joint Head of the Privacy and Cybersecurity Practice Group.

Ben Perry: I’m Ben Perry in the Nashville office. I am also the Co-Chair, along with Simon, and delighted to be here.

Erin Schachter: I’m Erin Schachter. I’m from the Montreal office and Canadian Counsel with Ogletree Deakins.

Nicola McCrudden: I’m Nicola McCrudden. I’m with the London office, and I’m a Data Counselor.

Tracey Kinslow: Good afternoon. I’m Tracey Kinslow. I’m part of the Data Privacy and Cybersecurity Group at Ogletree Deakins. I’m probably one of the most recent members to join the team, but I’m glad to be here.

Lauren Watson: Well, now that our listeners know who we are, let’s jump in and see what they want to know the most. So the first question we got is, “What are the immediate first steps that a business may want to take when they suspect they are experiencing a data breach?”

Simon McMenemy: You’ve got to find out what the bad actors have got, if there are bad actors, I guess. And I guess that’s where you’ll find out fairly quickly if it’s a sort of ransomware attack or whether it’s just some other kind of breach. But yeah, the first thing that everyone wants to do is try and find out what’s been breached and what’s out there, where it’s gone.

Lauren Watson: Yeah. I think that’s key. And part of it also too, that’s the containment part, right? And hopefully the client does not destroy all of the evidence that we need for forensics in the process of doing that, because otherwise it can make the process a lot more costly and time-consuming than it otherwise might be.

Erin Schachter: Yeah. I think one of the biggest things too is it really helps to have a structure in place, because it’s so difficult to have to just respond to a breach and try to figure out what to do in the moment. So I would say if you have a plan, start taking a look at it and see what those first steps are. Perhaps it’s contacting the insurance. Perhaps you have a forensic team that you’ve already made contact with. You have a law firm signed up where you know that they’re going to be available and be able to respond quickly. So the more things you can put in place beforehand so that you can actually follow through with those steps, the better. That’s really helpful.

Lauren Watson: Yeah. And especially with respect to contacting your insurance carrier, I think you want to have a good sense of what’s going on and maybe make a decision up front about whether you want to contact the carrier or whether it’s something that you maybe want to get more information about, because you will need to make a decision pretty quickly about whether you’re going to report or whether you’re going to move forward without seeking coverage for the incident.

Erin Schachter: Yeah. And I think there’s also considerations about what is the extent of the breach? Is it still occurring? So being able to have people with an expertise come in as quickly as possible, because unfortunately, I have seen a situation where we think the breach is contained and it’s not. And then, we have a backup that’s put up and two months later it’s another breach. And it can be really devastating to have to deal with repeat incidents, so really trying to get all that information as quickly as possible.

Lauren Watson: Yeah, I think we’ve all seen that, and there’s a lot of value to being absolutely certain that the breach has been contained.

Simon McMenemy: I don’t think you should sort of seek perfection. I think perfection can sometimes be sort of the enemy of the sufficient. You’ll find out down the line more and more and more, but often clients come to us and say, “Oh yeah, we’ve had this breach and we’ve discovered X, Y, and Z. And it’s great because we now know.” And you go, “Well, when was this discovered?” And they go, “Oh yeah, three weeks ago.” And it’s like, “Well, great. But actually, it was kind of more important that perhaps you notified either the regulator or in fact the data subjects much earlier than that.” And I’m sure we’ll come on to talk about the different time periods that are required internationally for notification. But as long as you tell your regulator, “Look, we’ve had this happen. We’re not sure. We’re investigating.” No one’s going to criticize you for that. Whereas if you say, “Well, we weren’t a hundred percent sure, so we held off for like 3, 4, 5 weeks before we told anyone.” That isn’t going to look good.

Lauren Watson: Moving to our next question, some of the people who reached out wanted to know what trends this group is seeing with respect to AI legislation. Any trends you’re seeing in Europe? Anything in Canada? I’m sure we can speak to the US.

Simon McMenemy: I’m not sure if we can call it a trend. It is pretty nascent, isn’t it, globally? I was at a conference earlier this year, the UN are working on guidance and rules. I wouldn’t call it legislation. Obviously, the EU has been one of the front-runners in this area in bringing in the EU AI Act, which actually is very sort of general concepts rather than getting very specific. And you’ve got different gradings, different levels of risk, really. It’s like risk-based, but it’s a start. And then, I’m not even going to talk about the States. I’ll let you guys do that. But we are seeing the advent, the birth of AI legislation in the US as well, aren’t we?

Ben Perry: Yeah. It’s almost a bit comical because legislators are acting like this technology was just invented last year. Just because they’re just now learning about generative AI and things like ChatGPT and automated decision-making in general, which has been around for a really long time, does not mean that this is a new technology. And in fact, they’re passing a lot of AI-specific laws in knee-jerk fashion and capturing a lot of technologies that I don’t think they’re intending to. And sometimes they are, sometimes they’re not, but we’re seeing a lot of that just being passed on a state-by-state basis.

Simon McMenemy: Do we think that’s because the big, big tech firms were for a long time against countries bringing in legislation, but then were kind of warned, “Well, look. You guys need to sort out your own house, and if you don’t, we will.” And that’s exactly what’s happened. They kind of took the view, “Well, people should be free to use this technology and we shouldn’t regulate it.” And then, obviously we’ve started to get more and more worried about the implications of what it can do. And so, as I say, the EU’s jumped in there and even the UN is now on board and interested in regulating this space.

Erin Schachter: Yeah. In Canada, we have a bill that went through its second reading, so we’ll see what happens because politics are changing. But we do have a model code that came in. It’s a voluntary code of conduct on artificial intelligence that people have signed up to. And we have legislation that specifically refers to what they’re calling artificial intelligence systems. So it’s less focused on generative AI, so using large language models, predictive language, but more based on areas of artificial intelligence that would be making predictions. So if we’re thinking about recruiting employees, using artificial intelligence to try to choose candidates. So a lot of the concerns and what’s in the voluntary code of conduct is looking at how can you remove bias? Are we being discriminatory? Trying to take into consideration things like evaluating employees, so avoiding having any type of technology that’s going to be relying on information that’s using intellectual property of other organizations. So we’re looking at a lot of things which, interestingly, I find are sometimes covered by other legislation.

Simon McMenemy: Is that coming this year?

Erin Schachter: Yes.

Simon McMenemy: Is that new this year?

Erin Schachter: Yes.

Simon McMenemy: Okay.

Erin Schachter: It is new to this year, so it’s fairly recent. And we have unfortunately one large bill that had things around privacy that we’re going to update privacy law in Canada, so that’s our Bill C-27, and then it had artificial intelligence in it. And a lot of these information systems were put in with the privacy, so the whole legislation is a little bit at a standstill because there’s some controversial elements in the bill.

Simon McMenemy: Is that just Montreal or is that Canada-wide?

Erin Schachter: Canada-wide.

Simon McMenemy: Canada-wide, yeah.

Erin Schachter: So it’s federal legislation.

Simon McMenemy: Yeah.

Erin Schachter: So yeah. And then, in Quebec, so if we’re speaking of the province of Quebec, we already have Law 25, which is our privacy act in Quebec. So when that bill passed, there were already areas of it that speak to any automated decision-making that would be taking into consideration sensitive information. So a lot of the legislation we have in place is kind of already dealing with things of artificial intelligence. And we have a Copyright Act and other legislation as well, so yes, artificial intelligence legislation is coming through. But what we’re seeing in our practice is that we’re also sometimes able to use other legislation to address the concerns associated with artificial intelligence. So I would say in Canada, if you’re going to be working with artificial intelligence, it’s important to take a look at what the requirements are. There is, especially in Quebec, a requirement to do a privacy impact assessment, meaning that you’re evaluating the technology before using it on individuals. So I think that it’s important to be aware that even if there’s not specific legislation in place, it’s possible that you will be subject to certain regulations.

Simon McMenemy: And you’ve got really memorable names for your legislation as well.

Erin Schachter: Yes.

Nicola McCrudden: But I think on a practical level in the UK and the EU, like you said, Erin, people and organizations need to be looking at what they already have in place. If they’re going to be refreshing those contracts where they have service providers providing AI that they haven’t really thought about as AI previously, but it is coming into scope of the AI Act, they need to be making sure that they are complying with other laws that are affected by the AI Act such as the GDPR. So if the AI services that they’re going to be procuring and bringing into their business are going to be processing personal data of employees, of their customers, of their clients, they need to ensure that they are doing things like data privacy impact assessments at the outset, that they’re doing transfer impact assessments if they have to, putting SCCs in place with those service providers, particularly in the EU and the UK if they’re going to be transferring EU and UK individuals’ data to the service providers of the AI and things like that. So yeah, on a practical level, I think organizations are having to do a lot more now because of the AI Act and its intertwinement with other laws.

Simon McMenemy: Do you have the equivalent of privacy impact assessments in the US?

Lauren Watson: In some situations, yes. Some states are mandating them, particularly with the use of automated decision-making technologies.

Simon McMenemy: Is that like California-led?

Ben Perry: Well, yeah.

Lauren Watson: Yeah.

Ben Perry: But the problem is we have this thing called the First Amendment, and there have been some challenges to those sorts of requirements recently. And specifically in California, there’s some litigation going on where they invalidated that provision of one of the laws because they were essentially equating it to forced speech. And so, they invalidated that portion of the law and sent it back down.

Simon McMenemy: That’s quite weird.

Lauren Watson: Yeah.

Simon McMenemy: Yeah.

Lauren Watson: There’s some push and pull for sure, but it’s something that clients may want to consider, particularly if they are operating in both the US and the EU. And if they’re going to be using similar technologies across the board, they might want to have it extend because it may become something that’s required.

Simon McMenemy: Is there anything in the California Privacy Rights Act that is AI-specific? Obviously, you had the CCPA come in and then the CPRA kind of updated it. Talking about trends, the CCPA didn’t really cover AI and the CPRA did?

Ben Perry: It covers automated decision-making, but they haven’t finalized the regulations on those. They submitted them for comment. We thought they would be finalized this year, but they basically have gone back to the drawing board because I think they got so much pushback on them. And yeah, there’s just no indication of when those will even be, I guess, proposed in final form.

Simon McMenemy: And presumably, the pushback is coming from the Palo Alto, Silicon Valley areas?

Lauren Watson: Yeah.

Ben Perry: Anybody who’s left doing business in California is pushing back on it, I’d imagine.

Simon McMenemy: And after California, what other states are kind of leading the way on this in the US?

Lauren Watson: So Utah recently passed an AI act, but it’s not as broad as I think it was initially advertised as being when it was first passed. Colorado has also passed an AI act. However, when the governor signed that act, he indicated that he expected before it becomes effective, major revisions would take place. So the act exists, but probably not in its final form.

Simon McMenemy: But nothing on the East Coast?

Lauren Watson: Yeah, no, we’re catching up.

Ben Perry: In some ways, none of the requirements of some of these laws are new themes. You could never discriminate against somebody on the basis of certain protected characteristics. And that whole idea of you’re responsible for the output, even if you don’t know how it’s generated, that’s why the disparate impact analysis exists. And so, we’re starting to see kind of those similar themes just being implemented in automated decision-making laws, but now we’re having new requirements like the bias audits that people I think are trying to figure out how to navigate. And also, there are some new technologies that are using stuff in different ways, like we’ve heard of companies having people interview with a computer basically, and they are not talking to a person the entire time. And it’s just analyzing them and figuring out whether or not they’d be a “good fit” for the company. Obviously, resume-scanning software has been around for a long time, and it’s unclear what it’s looking at in making those decisions in terms of whether to vet somebody out. So a lot of old use cases and new ones, but the themes I think are remaining the same.

Lauren Watson: Nicola, earlier you were talking about some of the high-level considerations for using AI tools as part of your business in Europe. Are there other high-level considerations that companies looking to expand operations to Europe should be aware of?

Nicola McCrudden: Yes, absolutely. So everyone will be aware of, most people will be aware of the GDPR. And it does place quite a lot of obligations on organizations who are going to be processing the personal data of EU individuals. So whether they’re established in the EU or they’re offering their services and products and goods within the EU, they need to be considering how they are going to be managing any personal data that comes along. As a starting point, it’s a matter of whether they have operations in the EU or the UK. If they have employees in the EU and the UK, then you think about putting in place a whole catalogue of policies, notices, and procedures in order to handle their employee data. Then if they’re going to have clients in the EU and the UK, they also need to have certain policies and procedures in place for that type of personal data.

They also need to consider what sort of third parties that they may be interacting with in order to provide their services and provide the products and things like that. So they want to think about if they have the right contractual obligations in place with their third parties. If they have international transfers related to those interactions with third parties, do they have the right contractual obligations in place, such as standard contractual clauses if they’re within the EU?

The international data transfer agreement if they’re within the UK, or an addendum to the SCCs, they want to think about doing the additional privacy assessments that go along with those types of contractual obligations working within the EU. Handling EU data, organizations need to ensure that they are mapping that data, that they know where all the data they’re handling is going, so that in the event of the likes of a data subject access request, where an individual wants to know what an organization is doing with their personal data, that organization can quite readily find that data and provide a response within the regulatory deadlines to an individual. Because in the likes of the UK and EU, the regulatory deadlines are very, very short. They’re 30 days. It can be extended by an additional two months, but the regulators are quite strict on the requirement that DSRs are handled in a timely fashion.

We see a lot of regulatory reprimands and fines being handed out to organizations for not handling DSRs and things like that appropriately. Then there’s the data breach scenario again: organizations want to know where their data is, to understand what data has been affected, and how that data breach may have happened. Did it come through a third party, their supply chains? Did it happen because of their internal software and cybersecurity controls? Those are all things that companies and organizations operating within the EU and the UK need to consider, along with the requirement to have the right policies in place.

Erin Schachter: A lot of things.

Ben Perry: Speaking of SCCs, there was a big new set of SCCs dropped recently, right, for the recipients that are already subject to the GDPR?

Nicola McCrudden: Exactly. So for the past few years, people have been putting in place SCCs that aren’t actually fit for purpose or aren’t actually required if an organization is directly subject to the GDPR. The EU commission has promised for a long time that they will send out a new set of SCCs that cover that scenario. But yes, you’re right. Those new SCCs are coming out. So that may lead to some changes for organizations that may need to re-paper some of their contracts, look at what contracts they have in place, understand their data, where their data is going, their data flows, do their data mapping, and decide if they need to put in place these new SCCs before the regulatory deadline, in the same way that there was a regulatory deadline in 2022 for previously released new SCCs.

Ben Perry: Some companies were just finishing last month?

Nicola McCrudden: Probably. Yes.

Simon McMenemy: But that’s interesting you say finishing, because I think a lot of companies think they’re compliant with the international data transfer requirements of the EU and the UK if they’ve got the standard contractual clauses in place. But you have to ask the question then, “Well, what are the standard contractual clauses?” And they are promises, and they’re promises to the EU countries and the UK that if you send us, in the United States for example, the personal data of your citizens, we’ll look after it to the GDPR standard. That’s what you’re promising.
So it’s not enough just to put those clauses in a data transfer agreement or a master services agreement or something like that and then forget about it and put it in a drawer. You can still be looked into, you can still be fined by a regulator. You can still end up having your reputation damaged for not looking after that data properly if you don’t fulfill those promises, which means you do need to give your people who are handling that data in the United States some training on how to do that. How else will they know what their obligations are? And that includes the things that Nicola was just talking about there, like data retention. If you don’t need that data anymore, let’s say someone’s resume, once you’ve made a decision whether to take them on or not, do you still need their resume? Well, clearly not if you haven’t taken them on. And yet, I think a lot of companies say, “Oh, well, we may have another role for them in the future, so we’ll hang onto it.” No. That’s not allowed under GDPR.

And so, if you are collecting resumes here in the United States from people in Europe, that rule applies over here, which it might not do necessarily to a US applicant. So I think let’s absolutely get the right documentation in place and have everyone sign it, but let’s not then forget about what the promises are that you’ve made in signing those clauses.

Erin Schachter: And I think too, a lot of the times, sometimes we forget when we have a head office in the US that when we’re recruiting and then all we’re doing is sending data to the head office, that in certain situations that’s still a transfer. So you can fall subject to the GDPR even though in your mind you’re not sending it to anybody else.

Simon McMenemy: I think that’s a really good point, because with HR information systems, you’ve got one system, Workday, whatever it is, and people can access that anywhere. It seems to be quite a hard concept sometimes for people to grasp, but that is a data transfer. Exactly what you’re saying, yeah.

Erin Schachter: Exactly. I think too, sometimes I’ve seen things like working with an EOR, like an employer of record, and we’re just using this third party and we’re not necessarily realizing that we might be transferring. And I think when we have, let’s say a merger or acquisition, we’re purchasing a company in another jurisdiction, and then we start to say, “Well, we’re going to have all these people as employees. Do we have certain things in place? Are we allowed to transfer the data out of that country to us?” And it might be Europe, but it could also be a country in Latin America, it could be a country that is in Asia. So I think lots of times we have this big vision of a merger or acquisition, “Let’s get all the data to our head office.” But in those moments, we really have to consider jurisdiction. And it’s not only Europe that has certain requirements for these types of contractual clauses.

Simon McMenemy: Yeah. I think with those PEO master services agreements, it’s often very, very interesting reading. Because you are kind of deciding there who’s responsible for this personal data and who’s responsible if there’s a breach and all the rest of it. And you get into the concept that we have under GDPR of who’s the controller of the data and who’s the processor. And you’ve got this really interesting sort of triangular relationship between the PEO, the employee, and if you like, us as the client company. And you do get often people trying to pass the buck of, “Oh, well, they’re not really our employees.” “Oh, but they’re working for you” and that kind of thing. So actually, yeah, I think it’s kind of leveled off a bit, but in the early days after GDPR was brought in, in those master services agreements, there’s some really interesting data privacy addendums, which you still see from time to time.

Announcer: Thank you for joining us on the Ogletree Deakins podcast. You can subscribe to our podcasts on Apple Podcasts or through your favorite podcast service. Please consider rating and reviewing so that we may continue to provide the content that covers your needs. And remember, the information in this podcast is for informational purposes only and is not to be construed as legal advice.
