
In this podcast, Shareholder Michael McKnight (Raleigh) and Associate Lauren Watson (Raleigh) discuss the primary privacy challenges that manufacturers face, including pitfalls and best practices surrounding employee monitoring, biometric data collection, and information storage, especially when employers use tools enabled with artificial intelligence (AI) to surveil employees. In addition, Lauren and Michael discuss how manufacturers can comply with various state and sector-specific privacy laws and provide practical tips for manufacturers responding to data breaches. Michael and Lauren offer valuable insights on how manufacturers can balance the need to comply with the various privacy laws, protect their employees’—and the employers’ own—data and devices, and efficiently run their manufacturing businesses in an increasingly tech-forward but regulated environment.

Transcript

Announcer: Welcome to the Ogletree Deakins Podcast, where we provide listeners with brief discussions about important workplace legal issues. Our podcasts are for informational purposes only and should not be construed as legal advice. You can subscribe through your favorite podcast service. Please consider rating this podcast so we can get your feedback and improve our programs. Please enjoy the podcast.

Michael McKnight: Hello, everyone, and thank you for joining us today in what I hope will be the first of a series of podcasts related to privacy issues affecting manufacturers. And this topic came about because of some collaboration I had done with my colleague Lauren Watson, who is one of our privacy specialists here at the firm. My name is Michael McKnight. I am a shareholder here in the Raleigh office of Ogletree Deakins. I also co-chair our manufacturing industry group and work with a lot of manufacturing clients in my practice. And working with Lauren on some of those issues, I was surprised at the number of privacy issues that impact manufacturers.
Lauren, why don’t you introduce yourself, and then we’ll get right to the meat of it.

Lauren Watson: Sounds good. As Michael said, my name’s Lauren Watson. I’m also an attorney in the Raleigh office of Ogletree Deakins. I’m one of the attorneys in our Cybersecurity and Privacy Practice Group, and my entire practice is focused on helping companies comply with the various cybersecurity and privacy laws and regulations that are being passed across the country. We work with everything from the comprehensive state privacy laws that are coming into effect to more traditional and sector-specific privacy laws, so things like HIPAA, things that deal with biometric privacy, employee monitoring issues, all that good stuff. So, thanks for inviting me to join you for the podcast, Michael.

Michael McKnight: Sure, thanks for being here. Well, let’s get right into it. What do you think are some of the biggest privacy issues out there right now that impact manufacturers?

Lauren Watson: So, thinking of manufacturers in the employment context, they have just a vast amount of personal information associated with their employees. And I think something that I see not just in manufacturing but across the board is that companies are, as we’re seeing more regulation in this space, really having to take steps to get a good handle on the types of personal information they have, how and where they’re storing it, how and where they’re using it. And they’re having to do that on a scale and at a speed that they’ve not had to do previously. So that’s a major issue.
And then I think as we move into this age of AI, as we have more wonderful tools available to us, we want to be able to do more and more with the personal information that we have. And so, things like monitoring our employees, things like using artificial intelligence-enabled tools to surveil employees, or collecting biometric information to do things like keep time records. As we want to do these things, that’s wonderful, we should be doing that, but these manufacturers are having to think about privacy issues in a way that they haven’t necessarily had to do before. So, it’s becoming a major part, I think, of many, many manufacturers’ day-to-day.

Michael McKnight: Now, where does this law come from, Lauren? Is there some kind of national law or national guidelines on this, or is the law primarily coming from the states?

Lauren Watson: So, it is primarily coming from the states. There is no federal privacy law that’s sort of across all states, across all sectors. We have, of course, sectoral privacy laws at the federal level. I’ve mentioned HIPAA earlier, but for the most part, manufacturers are facing the challenge of dealing with a patchwork of state laws affecting privacy in different ways.

Michael McKnight: And what do you think is driving that? These privacy issues seem to be an area where there’s interest in regulating, and it doesn’t really matter what the political orientation of those in different states or of the regulators might be; it seems like there are reasons people on both sides want to look at this issue. Some people view it as an issue of, this is a company, or this is an employer, who’s taking my private information and using it to make a profit, and that’s not fair. And then I think others are coming more from a privacy approach: this is my stuff, I don’t want my privacy impinged upon, that kind of thing. What do you think is driving these laws?

Lauren Watson: I think in legislatures across the country there is an existent and growing respect for people’s personal information and for privacy generally. But I also think that there’s an understanding that the tools that collect and use personal information are growing and that they are very valuable tools. So, I think there’s a real desire on both sides of the aisle to make sure that people are protected when we’re using these tools, but we’re also helping businesses to use them in a way that helps them grow and helps the economy do the best that it can.

Michael McKnight: Sounds like these issues aren’t going away no matter who’s in charge.

Lauren Watson: No, I don’t think so. I think we may see different interpretations of different laws, we may see different priorities in terms of enforcement and legislation, but this is definitely not an area that’s going away, and it’s not an area that companies can really afford to ignore anymore.

Michael McKnight: Well, let’s hit some of the high points so that our listeners can do some issue spotting if they have these issues going on in their own companies. Let’s first talk a little bit about employee monitoring. That was one of the topics that you mentioned earlier. As I understand it, there are some states, at least, where an employer cannot even look at things on an employee’s computer or a company phone without first giving them notice. I always thought if an employer owned a particular device, they certainly had the right to look at it. That kind of flowed from almost a property rights idea: if you own the device, you own what’s on it. A lot of times there are disclaimers that pop up when you open up a computer or other electronic device that’s owned by a company, and they say you have no expectation of privacy.
Is that good enough in a lot of states? Or what do you have to do in order to protect yourself as an employer if you are going to be looking at things on an employee’s computer or phone or some other kind of electronic device?

Lauren Watson: Yeah. So, there’s a little bit of a distinction here between when it’s a company-owned device or a personally owned device. With company-owned devices, you absolutely should still be giving that warning, that pop-up that I think we all have on our computers for work that says, hey, anything you do on this device can be monitored. It’s our property, it’s ours, and we’re going to be able to look at it. That’s important.
There are at least three states, Connecticut, Delaware, and New York, that have passed some additional laws targeting electronic employee monitoring. The pop-up isn’t necessarily going away in those states, but they do have some more substantive requirements for employers who want to monitor employees and essentially do these computer-based monitoring activities. And what’s interesting about those, and what I think is a place where people get tripped up with this is you have to think about computer really broadly in this context.
So, it’s not just keeping an eye on what employees are doing on their laptops. Rather, we are addressing these issues, I think, more and more frequently with employers who are looking at doing things like installing artificial intelligence-enabled surveillance cameras, particularly in manufacturing or other spaces. In some situations, because technology has evolved so much, that could be considered, if we’re taking a conservative approach, to be computer-based monitoring. So, we need to be really, really expansive in our views of what could constitute this monitoring and what could require that sort of additional consideration of the notice that you’re giving to your employees there.
So, that is one of the approaches that we take with respect to company-owned devices. When it comes to personally owned devices, it gets a little more complicated in those situations. What we find is most useful, and what we encourage clients to do regularly, is if an employee is going to be bringing their own device, and that could be something as simple as like a phone that they’re using on the job, you really want to have a bring your own device policy in place and communicate it to your employees so that they understand exactly what they can and cannot do with respect to company information on the phone.
You also ideally should have some sort of endpoint management tool installed on their devices; that’s important because it essentially allows you to wipe company information. I can’t tell you how many situations we’ve had where a phone gets lost or an employee quits or is terminated under less than ideal circumstances, and then it can be difficult to get that appropriate confirmation that company confidential or personal information has been removed. So, just making sure that you have taken the steps to protect your information when an employee is using a personally owned device is absolutely critical.

Michael McKnight: And one of the things that I think you mentioned or brought up here that sparked my interest was surveillance cameras. And I know that most manufacturers have some kind of surveillance cameras in their facility, and they often refer to those if there’s been an incident involving employees, or there’s a problem with the product, they’re looking to see what happened, what went wrong. Do any of these laws impact employers’ abilities to use surveillance cameras in the workplace?

Lauren Watson: So, by and large, you can use surveillance cameras in the workplace. There are some jurisdictions, like California, that have, I honestly can’t recall if it’s a statute or if it’s just a body of case law that’s developed, but there are places that you really can’t use surveillance because they consider it to be an invasion of privacy. Every state has that to a degree, but California, of course, is a little bit more stringent. Places like break rooms in California, you really shouldn’t be surveilling those.
To be very clear, when we’re talking about surveillance, I’m thinking of just video, not audio surveillance because that, of course, has its own separate set of issues.
Where I think it gets a little bit dicier, and I see this a lot not just with surveillance cameras, again, those AI-enabled surveillance cameras, but also with AI-enabled dash cams in fleet vehicles, is once you are using artificial intelligence to essentially analyze an individual. With the dash cams in particular, the tool does things like look for what they say are indicators of drowsiness, indicators that you’re not paying attention while you’re driving, that kind of thing. Once you start collecting and analyzing that information, you can be in kind of a gray area in terms of biometric privacy.
The way that most of these work is they collect the information, and then it’s converted to essentially a mathematical representation of your biometric identifier. It’s not super clear how courts will treat the mathematical identifier argument, especially in Illinois; I think there’s some case law going both ways. And so oftentimes we tell our clients that the risk of litigation is very high in this area, in Illinois in particular, and Illinois does have statutory damages, so it can be quite impactful if you get dinged for it. So, what we recommend is that they take steps to comply with biometric privacy laws, and that is the preparation of a biometric privacy-specific policy, and then obtaining consent from those employees whose biometric information (if it even is biometric information) could be processed in connection with this sort of surveillance.

Michael McKnight: So, these laws don’t mean that you can’t monitor employees, you can’t have surveillance, it’s basically a notice of rights and a disclosure about what you’re doing, and as long as everyone knows what’s going on, you can generally do it. Is that right?

Lauren Watson: Yeah, you can generally do it. The real, real push, and we’re seeing this across the board, not just with respect to biometric or surveillance laws, but the real push is for an increase in transparency, so people are aware of what’s being collected on them and how it’s being used. The consent piece is also important just because it, I think, emphasizes this awareness. It affirms that they have actually been told exactly what’s being collected and how it’s being used.
And I’ll note that even in jurisdictions where it’s not strictly required, I think I’ve seen a number of employers opt anyway to provide a notice and then get just acknowledgement. It’s a little bit of a belt and suspenders approach, but it can go a long way towards smoothing employee relations, frankly, so that they don’t feel that they’re being sort of covertly surveilled or held to some standard they’re not aware of, but also just helping with that transparency piece.

Michael McKnight: Sure, that makes sense to me. And I know that some states, like North Carolina, really haven’t gone out on a limb with adopting any type of particular privacy law yet, but we’ve had for a long time an anti-wiretapping law, and I’ve seen cases where someone left a recorder in a room, and they were not a party to the conversation, and that resulted in a lawsuit against the employee who left the recorder in the room and also against the employer.
Now, in a lot of cases, the courts are going to find that in a workplace you don’t have an expectation of privacy, but it seems to me that employers would be better off letting people know that, letting employees know that there is no expectation of privacy or that you may be being recorded. And then that way, when something like that happens, people are not able to use those laws in order to sustain some kind of cause of action against an employer because they’ve been told you don’t have an expectation of privacy. Would you agree with that?

Lauren Watson: Yeah, absolutely. And in all areas, really, across all industries. Something that I think we’re seeing with respect to wiretapping is more and more frequently employees are coming to meetings and using sort of AI chatbot transcription tools to just transcribe, like, a Teams meeting. That’s one place that I think it’s incredibly important to make sure that notice is provided, particularly with respect to wiretap laws, because they do still apply. And if you are recording someone in a jurisdiction that is a two-party consent state, or, as we sometimes call them, all-party consent states, and you don’t have the right consent, then you very well could be violating that law, and the penalties are quite steep for violations. So, it’s just another thing that I think maybe isn’t as frequently thought about but should be on people’s radars.

Michael McKnight: Yeah. And one term that you used earlier was biometric data.

Lauren Watson: Yes.

Michael McKnight: And I know that that’s a very common term in the privacy world. For the uninitiated, what could that include? Because I think of things like fingerprints maybe, maybe retinal scans. What else does that include?

Lauren Watson: Yeah, I mean, you’re exactly right. Those would fall within the definition. Unfortunately, I’m going to give you a very lawyer answer: it depends. But for the most part, when we are dealing with biometric privacy issues, and we’ve got employers collecting what we consider to be biometric information, it’s usually fingerprint scans, hand scans, iris scans, retinal scans, or, especially in the case of these AI-enabled dash cams, it’s often a facial scan. So, they’re getting specific identifiers tied to different pieces of facial geometry.

Michael McKnight: Very good to know. Before we wrap up here, one of the things that has always gotten my attention, because this happens so frequently, are these situations where people’s identity is stolen, or maybe their identity isn’t stolen but their identifying information is leaked in a data breach, and sometimes that leads to identity theft. Maybe let’s talk about that a little bit. What should an employer do if, for example, they learn that someone’s hacked their computer system and gotten a hold of data related to their payroll or to employees’ medical records or any of that kind of thing? What should they do in those circumstances?

Lauren Watson: Yeah, I mean, those circumstances are always really difficult. Nobody wants to get the call, hey, we’ve had a data incident. I will say that this is a situation where having some upfront preparation is going to help you out so very much. We very frequently recommend to our clients that they prepare something called an incident response plan, and what that is, is just a written document that walks you through essentially how you’re going to handle a data security incident from the moment you find out about it, tells you who that’s going to be escalated to, tells you who the team that should be handling that incident is, and then just walks through essentially everything you would need to know: who you call; if you have cyber insurance, who to call at your insurer; and, whether you do or do not have cyber insurance, who your preferred counsel is to handle those types of issues.
And once you’ve hopped on the call with your attorneys, you can get a better sense of what else needs to be done. Sometimes we need to do forensic investigations. If it’s a bigger breach and it’s something involving ransomware, that’s pretty common. And that usually entails bringing in an outside vendor who will go into the system, figure out exactly what was accessed, whether anything was taken or whether it was just viewed, and then give you back a report to help you understand exactly what happened. Once you understand what happened, you can then work with your counsel to figure out which laws are implicated and whether you have notice obligations. And then you’ll move forward, if you need to, with notifying individuals of what happened and what could have been taken. Very often companies opt, whether they’re legally required to or not, to provide some sort of credit monitoring in relation to these. And I’m sure you’ve seen that; I feel like we’ve all gotten these letters, right?

Michael McKnight: Sure, sure. You can just cancel your regular credit monitoring.

Lauren Watson: Yeah. Right.

Michael McKnight: Because your data gets breached so much, you get six months here, six months there before mine’s about to run out–

Lauren Watson: I do actually think that, yeah.

Michael McKnight: … I usually can just, I can count on getting another letter and getting another offer of credit monitoring. But the bottom line is that you’ve got to be on top of this because those notice requirements they have some strict timeframes, do they not?

Lauren Watson: They do. Yeah. So it totally depends on… Actually, let me walk back and say this: I think there’s a fairly common misconception among companies that have experienced a data security incident. The law of the jurisdiction in which you operate is not necessarily the law that is going to govern your data breach and your data breach notification obligations. The way most data breach notification statutes are drafted, it is the law of the jurisdiction where the impacted person resides. So, you can operate in maybe a handful of states, but depending on where your workers reside, or if it’s a consumer breach, depending on where the people who’ve bought products from you live, those laws are going to come into play. And to your point, some of them do have very, very short notification windows. Puerto Rico is 10 days. Vermont, I think, has an initial notification obligation of about 15 days, with a longer window to give some more fulsome information.
But I am going through all of this to sort of emphasize that it’s not the kind of thing you want to be figuring out when it happens. It’s incredibly important to think it through and have a plan in place. And then what we encourage people to do, and we actually help out with this, is run a tabletop exercise. It doesn’t have to be every year, although very frequently larger companies choose to do it at least every year. But set aside a day, get your important people in a room, and run through your plan. Figure out what’s working and what isn’t working so that you are not the person on Christmas Eve who finds out that they’ve had a data breach and can’t get ahold of anybody to respond to it. That happens all the time. The bad guys are not choosing the most convenient time for companies to respond. They’re actively targeting holidays, weekends, the Super Bowl, times that they know people are not paying attention, so that they can get into the systems, get as much information as possible, and then use it, oftentimes to exploit the companies to pay a ransom.
So you really, really need to know what you’re doing because it’s not going to be a good time when it happens. It’s just not.

Michael McKnight: Never a good time for a data breach.

Lauren Watson: No, never.

Michael McKnight: But very good advice on that. With that, we’ll wrap it up from here. I think we’ve given everybody a good overview of the types of issues a manufacturer can expect to encounter in this area. And then I hope in the future you’ll rejoin me, and we can break these down even further.

Lauren Watson: I’d be happy to. Thank you so much for inviting me on the podcast.

Michael McKnight: All right, thanks again, and thanks for listening, everybody.

Announcer: Thank you for joining us on the Ogletree Deakins Podcast. You can subscribe to our podcast on Apple Podcasts or through your favorite podcast service. Please consider rating and reviewing so that we may continue to provide the content that covers your needs. And remember, the information in this podcast is for informational purposes only and is not to be construed as legal advice.

Practice Group

Cybersecurity and Privacy

The attorneys in the Cybersecurity and Privacy Practice Group at Ogletree Deakins understand that data now accumulates quickly and transmits easily. As the law adapts to technical advancements, we effectively advise our clients as they work to comply with new developments and best practices for protecting the privacy of the data that their businesses collect and retain.
