Announcer: Welcome to the Ogletree Deakins Podcast, where we provide listeners with brief discussions about important workplace legal issues. Our podcasts are for informational purposes only and should not be construed as legal advice. You can subscribe through your favorite podcast service. Please consider rating this podcast so we can get your feedback and improve our programs. Please enjoy the podcast.
Lauren Watson: Thanks for joining us for part two of our Cybersecurity AMA podcast. The discussion continues now. So, if we want to pivot and maybe think about entities that don’t have a full-scale cybersecurity program in place, but they do want to stand one up, what are some practical issues that you guys think they might face trying to do this? I mean, there are the usual obvious things like budget and getting management buy-in, but are there any things you guys think companies can do to address those issues, or any non-obvious considerations?
Simon McMenemy: It’s having a plan, isn’t it? We might have already referred to that, but I think a lot of clients we come across still don’t have the plan. But again, don’t just have a plan and then put it in a drawer; have a plan and practice it, so everyone knows, if it happens, what their role is. So, people in marketing know what they’re doing. People can deal with the press. HR can tell employees if their data’s affected. Have drafts of the memos that go out, or the statements that go out to the press, that kind of thing. So, I think it’s having a plan and practicing it. It seems very easy to say, but actually, when you sit down and say to someone, “Well, that’s your responsibility,” you’ll then find there’s perhaps a bit of an internal debate; they’ll say, “Oh, well, actually no, I think you’re better placed to do that.” So, it’s no good someone in legal, for example, or someone in IT writing a plan and saying, “Hey, it’s fine. We’ve got a plan.” Because actually, if they haven’t exercised it, they don’t know whether it works and whether the people they’ve given responsibilities to are actually the best people.
Nicola McCrudden: Exactly, and to complement the point you’re making, it’s about choosing the right people and making sure they’re trained. We have some interesting case law that came out of Canada where they chose a secretary who didn’t have any qualifications and then tried to say, “Well, you were a privacy officer, so we have a requirement-“
Simon McMenemy: That’s mean.
Nicola McCrudden: Yes. And it did not go over well with the judge, as you can imagine. So, I think you have to understand that in some jurisdictions you have an obligation to name a privacy officer, but you need to make sure that person has real authority and knowledge within the organization.
And you brought up involving HR and other departments. One of the first things I’ll often recommend to clients is, “Collaborate, create a committee.” If you have different departments, you want to have somebody who is perhaps in charge, but you want to have a committee, and you want those people to meet. We were talking about how in some jurisdictions there are obligations to do a privacy impact assessment. But what happens if HR signs up for a service without informing the privacy officer? Well, then a simple lack of communication means we’ve not respected our own policy.
So, it’s really understanding, as Simon was mentioning, that those policies have to be a real part of the corporate structure. And it’s having the right people in the right place, and they need to be meeting, they need to be applying it, they need to be trained. And there are some great programs available; there are courses that people can take online to become privacy officers. So, there are lots of ways to become educated. Or you choose a new employee, you hire somebody specifically for that role. But having the right people in the right place is really important.
Simon McMenemy: I think people probably know more than they think they know, and often it’s, “Oh, well, we’ll bring someone in from the outside to give us the training, or to run through it, or to show us the holes in our systems,” and all this kind of thing. But actually, I think you’re absolutely right. When you put a group of people together, they’re the experts. They know their company, they know what they’ve got, they know who they’ve got. And actually, you’re probably 90% of the way there. Yes, fine, you may need to bring in a cybersecurity consultant to help you … I talked to someone recently about how hackers get in, and he described it as people leaving the doors open in the house. If you get someone in to help you go around shutting the doors, you can get a long way toward preventing a hack. But actually, you can do so much with the people you’ve already got in most medium- to large-sized entities.
Lauren Watson: I’d be curious to know, it’s my understanding that there’s some member state case law in the European Union that says you really shouldn’t be using your corporate executives as your DPOs. How do you strike that balance between finding someone who has the requisite knowledge but is also well-placed to have that role?
Simon McMenemy: You don’t want to put your own job on the line by going to the board and saying, “Look, you really need to change this, and you’re really not complying and you’re not listening.” It’s really difficult for people to do that internally, which is why in Germany, for example, it’s nearly always an external data protection officer.
The downside of that, as I’ve just sort of mentioned, is that they don’t necessarily know the organization as well as someone who’s working within it. So, in the UK, for example, there isn’t that requirement to have someone from the outside, but across the whole of Europe, it’s got to be someone who has the knowledge and the skill on the legal side. So, they’ve got to know GDPR pretty well and what the requirements are. But I would say if they also know how your organization works, they know where the data flows, and they know who the right people are with access and that kind of thing, they’re going to be a much more effective DPO than an outside consultant.
Nicola McCrudden: Yeah, and I think that’s an excellent point. A really practical example is onboarding and offboarding employees. If the person handling it doesn’t understand how the organization really works: “Oh, that employee actually was given a laptop. Did we get it back?” “They used their own cellphone, but they had email on it. Did we make sure that was wiped? Who should we be talking to?” IT needs to work with HR. If you have somebody who doesn’t really understand that structure, there are boxes that might not get checked.
And I think it’s really important, when you’re looking at your structure, to think of those moments that are really key privacy incident moments: when somebody comes in, what access they get, and when somebody goes out. And so, when you have people within the organization who are already thinking of these things, you can just create policies that complement your existing policies, and you’re just making sure you’re respecting the legislation in place.
Simon McMenemy: So often you get people in an organization going, “I really don’t want to be the data protection officer.” Actually, in Europe, it’s pretty hard to fire someone who’s a data protection officer if they’ve taken on that responsibility.
Lauren Watson: So, are we going to see the death of the external DPO?
Simon McMenemy: No, I don’t think so. As I say, I think it is a balance, and if you can get the right person internally, I think that’s great. But I can’t see there being a change, for example, in Germany, I think you’re always going to have a very, very healthy external DPO consultancy practice there.
Lauren Watson: Tracey, you’ve got a somewhat unique legal background compared to some of us here in the room. Do you mind filling us in on your background?
Tracey Kinslow: My background in this area is primarily from my military experience, having served 24-plus years in the Air Force, both on active duty and in the Tennessee Air National Guard. I helped stand up two units involved in cybersecurity with our Tennessee Air National Guard and served as their legal advisor for eight years.
Lauren Watson: So, it sounds like you probably have some thoughts about what businesses that are looking to stand up a cybersecurity program might want to do.
Tracey Kinslow: Well, standing up the program involved a lot of training, and that would be critical, I would imagine, for any business, but particularly one that’s starting out: making sure that its employees are properly educated and trained, not just in cyber, but at their particular level in cyber.
Lauren Watson: So, it sounds like role-based training is particularly important.
Tracey Kinslow: Absolutely. Because, as I’m sure you guys have mentioned, we all know that sometimes the work email an employee has might be the only email they’ve ever had. So it depends on their role, responsibilities, and duties. It’s about making sure that the training is structured in a way that each and every employee can understand how vital their connectivity to the system is and how it can impact the whole company.
Lauren Watson: Right. And I think there are also some considerations around the particular legal regime that a business is subject to. For example, if you are a HIPAA-covered entity or a business associate, there’s going to be some additional training that you’re going to need to do just by virtue of operating in that space. So, I think that’s also something really important to be aware of in the early days.
Tracey Kinslow: Oh, absolutely. Absolutely. Whenever you’re dealing with critical information, personally identifiable information, in particular if you’re a company that works in any type of investigative role, particularly from my background, when you’re doing surveillance and other things that involve someone’s personal data and information, there can be a lot of liability associated with how you go about handling that information, how long you hold that information, and whether you should be handling that information at all. But yeah, that’s all critical for employees to understand if they’re in a company that deals in those areas.
Ben Perry: It seems like these types of phishing and other types of threats are not going away, and in fact, they’re getting more sophisticated with the proliferation of generative AI and it just being publicly available, people being able to create videos of other people speaking. So, Tracey, I know you love your hypotheticals. Give us a hypothetical of what one of these potential attacks might look like.
Tracey Kinslow: Well, it’s interesting you say that because I know all of you have probably now recently experienced the phone call where the person doesn’t identify what company they’re calling from. They just simply call you and say, “Hey, good evening, this is so-and-so, and so-and-so. Am I speaking to Tracey?” And of course, the threat actor just wants you to say yes, so they can record your voice saying yes. So, then they can use that in order to access your personal data and information when they call, because now they’ve put all the pieces together. They know a little bit about you, your family name, your children’s names, which you’re probably using as your passwords. And also, then they have an idea of your date of birth and where you might live. And so, any of that information can be pulled together.
And then, the other piece is they see your business profile, so they know where your job is located and that address, and they’re able to research that as well. And so, all of a sudden now they have your voice, they have your birthdate, they have your children’s names. And all of that stuff you’re probably using when you call in to make whatever security clearance that you do with whatever agency, bank account, or any other private information that you want to keep confidential.
Ben Perry: Well, don’t forget that your social is also on the dark web somewhere.
Tracey Kinslow: Yeah, that too. So, when they call me, I say, “Mm-hmm.” “Is this Tracey?” “Mm-hmm.” “I couldn’t quite hear you. Is this Tracey?” “Mm-hmm.”
Lauren Watson: I like the [inaudible 00:11:50].
Tracey Kinslow: But that’s just one simple example. And that’s not to mention the phishing emails. I don’t know how true it is, but there’s the one where folks receive AI-generated phone calls: “A loved one has just been in an accident, and I need money right now, because if I don’t get this money, they’re going to lock me away in jail.” And then they end up calling that relative, who says, “I’m not even in that city, and I don’t even know what you’re talking about.”
But Ben and I were talking about how it’s not just the business perspective, which we are very intricately involved in; we have to understand how it’s interrelated with our family lives, and that’s important too. So, we have to understand how that’s connected, and it’s not only how we train our employees, but also letting our employees know, “Hey, you need to train your family as well, because they have access to the same information, and whatever your internet setup is at home could lead to the same thing.”
Lauren Watson: There needs to be an element of empathy and understanding in any training that you do. It’ll help with getting employee buy-in, but it’ll also just make you a better employer.
Announcer: Thank you for joining us on the Ogletree Deakins Podcast. You can subscribe to our podcasts on Apple Podcasts or through your favorite podcast service. Please consider rating and reviewing so that we may continue to provide the content that covers your needs. And remember, the information in this podcast is for informational purposes only and is not to be construed as legal advice.