Announcer: Welcome to the Ogletree Deakins podcast, where we provide listeners with brief discussions about important workplace legal issues. Our podcasts are for informational purposes only and should not be construed as legal advice. You can subscribe through your favorite podcast service. Please consider rating this podcast so we can get your feedback and improve our programs. Please enjoy the podcast.
Sam Sedaei: Hello, this is an episode of Ogletree’s The AI Workplace podcast, where we discuss the latest developments in the use of artificial intelligence in the workplace. My name is Sam Sedaei. I am an attorney in the Chicago office of Ogletree Deakins. I am a member of the firm’s technology practice group and cybersecurity practice group. I advise employers on the use of technology and artificial intelligence in the workplace, help them manage risks associated with the use of such tools, and prepare policies for them that govern the use of AI tools by their employees.
I’m joined by my colleague Simone Francis, who is the managing shareholder of Ogletree’s St. Thomas office and who also works with our New York City office. She is an experienced civil litigator and a member of the firm’s technology practice group. She also routinely counsels employers concerning the complex landscape of legal considerations related to the use of automated decision tools and other technology in the workplace.
In today’s episode, we are going to discuss AI notetakers and potential legal issues associated with their use in the workplace. Simone, welcome.
Simone Francis: Thank you, Sam. I am delighted to be joining you for this timely discussion about AI notetakers.
Sam Sedaei: Now, can you, Simone, please tell our listeners what AI notetakers are and how they are used?
Simone Francis: Certainly. These are applications that are appended to virtual meeting platforms, and they generate a written transcript of the meeting conversation.
Sam Sedaei: So, like many rapidly advancing technologies, there could be legal risks and issues associated with the use of AI notetakers. Can you briefly discuss what those risks and issues are?
Simone Francis: Yes. Employers may wish to be mindful of several issues that can arise from the use of these tools: ensuring that meeting participants are aware of and consent to the transcription; establishing guidelines for the distribution and retention of meeting transcripts that take into account whether any trade secret or confidential business information has been discussed and whether the content of the discussion may be protected by the attorney-client privilege; and issues under the National Labor Relations Act and any state-specific laws.
Sam Sedaei: So, let’s talk about the consent issue first. That is something that I see come up, and it’s a concern that many clients have. How could the issue of consent be implicated in the use of AI notetakers?
Simone Francis: With respect to consent, I think there are at least two issues that need to be considered. The first is the legal issue arising from the fact that in 10 states and the District of Columbia, the consent of all participants is necessary to record a meeting or a call. And because AI notetakers function in the same manner, it is important to be aware of the laws of those states that require all parties to consent. That is particularly important given, as we know from experience, that meetings tend to include attendees located in multiple jurisdictions, and indeed the locations of attendees at the time of a meeting may not be known in advance.
Secondly, apart from the question of whether there are any applicable laws that may require consent, some organizations have internal policies that may require notice of the recording and consent to record a meeting or conversation. And so, those organization-specific policies also will be implicated by the use of AI notetakers.
Sam Sedaei: That is certainly true, Simone. Consent is one of the elements that many of us have been including in the AI policies we have been writing for employers. And I always wonder how many people actually read those policies. I hope everyone does, but I am concerned that sometimes not everyone is fully familiar with the AI policies that may apply to their situation. Now, can you please talk a little bit about the risk associated with the collection of personal information from individuals participating in meetings where AI notetakers are used?
Simone Francis: Certainly. In some states, such as California, residents are entitled to know what categories of personal information are collected about them. Thus, to the extent that a notetaker may be capable of collecting any personal information, an employer will want to be aware of those laws so that the required notices can be provided. Additionally, there are privacy laws outside of the United States that may be implicated when meetings include global audiences, and those also should be considered in deploying these tools.
Sam Sedaei: Another issue is how a recording is handled once created. A user must decide whether the recording should be encrypted, who should be permitted to access it, and how long it should be retained. I also note that such recordings could become the subject of a litigation hold, which would require a party in litigation to preserve and later produce any such recordings. Simone, could employers also consider whether the information collected during a meeting could be used to train the note-taking tool?
Simone Francis: Yes, that is a very important issue, both with respect to notetakers and any other AI tools that may be deployed in the workplace. It is essential to understand precisely what data is being collected and whether that data becomes available in any way to developers or other third parties, particularly when the conversations used to train the notetaker include confidential or proprietary information. So really understanding how the tool functions in the background is an important issue for employers.
And as a litigator, I also want to mention, and those members of our audience who are frequently involved in litigation will certainly understand, that distribution of and access to the transcript is an important consideration. How and where it is maintained, and whether and to what extent recordings may be reviewed and corrected by one or more of the meeting participants close in time to the meeting, are all details that should be considered in advance. One could easily imagine a situation where a transcript is mostly accurate but contains some inaccuracies. Do you then have multiple people reviewing and editing that transcript? Do you keep those edited versions separately from the original? These are all practical issues that employers may wish to consider at the time that they are deploying or allowing these tools to be used in the workplace.
Sam Sedaei: Those are all really important points. In recent months, I’ve been assisting clients by preparing AI policies for them and many are concerned about the very issues we have been discussing today. Some want to ban AI notetakers altogether, while others want to place restrictions on how the technology can be used. Simone, in your view, could companies benefit from putting in place policies that could govern the use of AI tools and in particular here, AI notetakers?
Simone Francis: Yes, absolutely. As with other technologies such as ChatGPT and other OpenAI systems, employers that do not currently have policies may wish to consider developing guidelines with appropriate input from legal counsel and various operational stakeholders. And once those guidelines are developed, it may be appropriate to review them periodically and update them based on experience, enhancements to the technology being used, or the emergence of new technologies.
But without clear guidelines in place that ensure, to the greatest extent practical, uniformity in the use and retention of the notes across the organization, employers could face challenges during litigation, arbitration, or other proceedings where such transcriptions may be relevant to the entity’s claims or defenses.
Additionally, in developing such policies, employers may wish to consider how and to what extent it may be advisable to establish different policies for different groups or functions within the company. For example, an organization may decide that individuals involved in internal investigations should use notetakers as a matter of course, or it may determine that routine use of notetakers in investigation interviews is not something it wishes to allow or encourage.
But thinking ahead to those issues certainly can be helpful and appropriate in managing and mitigating risks. And employers may also wish to distinguish between internal meetings and meetings that include external participants. Employers may also wish to consider how they will handle requests for exceptions to their policies, including by individuals who may seek such exceptions as part of an interactive process related to a disability.
In my experience, Sam, and I’d be curious to hear whether your experience is similar or different, once these technologies exist, people are going to use them, either because they exist on company platforms or because people are joining meetings from their own devices with transcription capabilities embedded in the software. And so a complete ban, or the absence of policies, can pose some challenges.
So, I think my experience, and certainly the experience of many of our clients, is that it is best to get ahead of these issues and really consider how to formulate policies that best serve business interests, account for the areas of the country or the world where the organization operates, and address legal risk and other appropriate considerations.
Sam Sedaei: I agree with you. And I’ve had a similar experience of seeing companies try to ban the use of AI tools, or certain AI tools, altogether. Sometimes that doesn’t work, because people are interested in using AI tools and other productivity tools in ways that could benefit them at work. So, it seems much more effective to try to control how a tool is used rather than to prohibit it altogether.
And the other point that you made I think is also really important, which is that on the one hand, you want to have policies that are specific enough that they’re highly relevant to specific subgroups within your organization. On the other hand, you don’t want to make the policy so complex that a non-attorney user would need to hire a lawyer just to understand what they can and cannot do because of all the exceptions and branches within the policy.
So, we always find when we’re preparing an AI policy that it is really a case-by-case determination where we have to strike the right balance between specificity and generality. We want the policy to be specific enough to apply to the common scenarios a particular employer or company faces, but general enough that people do not get confused, so that they can read the policy and ultimately understand what they can and cannot do without having to consult with many other people.
So, Simone, I think this was a really awesome conversation. The use of AI notetakers is one of those topics that come up all the time, and I’m glad that we were able to dedicate an episode of The AI Workplace podcast to it. Do you have any final thoughts?
Simone Francis: No. I think we’ve summarized it well, and I know that we continue, both you and I and others within our group, to monitor these developments and to work to understand the various legal implications. So, I welcome the opportunity to join you for another episode of this podcast as we wrestle with other automated tools that are being used in our workplaces across the country.
Sam Sedaei: I cannot wait to do more episodes. So, thank you again, Simone, for this very informative session. And I’ll say goodbye to all of our listeners. I hope you have a great week.
Announcer: Thank you for joining us on the Ogletree Deakins podcast. You can subscribe to our podcasts on Apple Podcasts or through your favorite podcast service. Please consider rating and reviewing so that we may continue to provide the content that covers your needs. And remember, the information in this podcast is for informational purposes only and is not to be construed as legal advice.