The launch of ChatGPT on November 30, 2022, ushered in an explosion of interest among businesses seeking to incorporate large language model artificial intelligence applications into the workplace. To capitalize on the efficiencies this technology presents, many employers have implemented or are considering chatbots to serve human resources functions. Such a program can meet a wide range of needs, from gathering job application information and conducting basic candidate screening to acting as an initial point of contact that answers employee questions on topics such as employee benefits and company policies or directs users to other resources.

While few jurisdictions have passed laws directly regulating artificial intelligence applications to date, introducing a human resources chatbot to the workplace still carries the risk of violating any number of established labor and employment laws. Over the course of 2022, federal labor and employment law stakeholders, including the U.S. Equal Employment Opportunity Commission (EEOC) and the general counsel of the National Labor Relations Board (NLRB), published guidance addressing how artificial intelligence tools, including chatbots, can run afoul of the Americans with Disabilities Act (ADA) and the National Labor Relations Act (NLRA). Employers can thus expect conflicts between employee legal protections and artificial intelligence to draw strong interest from enforcement agencies.

Some of the legal risks associated with chatbots may be more readily apparent than others. One issue highlighted by the EEOC’s May 12, 2022, publication, “The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees,” relates to health condition questions. Among the potential violations the EEOC explained is that the ADA restricts an employer’s ability to conduct disability-related inquiries and medical examinations. If an employer’s chatbot directly questions a candidate or employee about his or her health condition, or raises inquiries that are likely to elicit information about a health condition, that dialogue may infringe on the individual’s rights under the ADA.

Other legal risks presented by the introduction of chatbot applications may be less obvious. For example, a chatbot deployed as an initial point of contact to assist employees with human resources needs may be used by employees to attempt to report concerns or complaints related to discrimination, harassment, or retaliation. Many human resources professionals are well aware of the importance of a timely response to such issues to preserve invaluable defenses, among other worthy objectives.

Yet many employers have also experienced communication challenges arising from some employees’ distinctive use of emojis and slang, particularly in remote work settings that rely heavily on electronic communication. A chatbot that is not equipped to understand or redirect a potential report of misconduct expressed in this language may miss the need, and the opportunity, to address problematic behavior. This issue highlights the importance of maintaining a human presence in human resources functions and avoiding the temptation to rely entirely on artificial intelligence for this important company department.

Additionally, risks may arise from the information that employees communicate through a chatbot. The latest large language model chatbots use machine learning to improve performance over time. That learning process may involve assimilating the information employees submit and sharing it externally to expand the data available to the chatbot, thereby increasing the application’s accuracy. Companies may question what happens to that information once an employee submits it to the chatbot. A business may risk losing confidentiality and trade secret protections if the information that employees share with the chatbot is disclosed to third parties through the machine learning process.

A chatbot application may raise other legal concerns related to the personal health information that employees submit to it. An employee may voluntarily disclose to a chatbot the details of a health condition when seeking benefits-related information. Even assuming there is no issue, such as that discussed above, with the employer soliciting information from the employee, the employer may have obligations to protect and retain the employee’s communication as confidential medical information. This hypothetical also presents the possibility that the employer received sufficient notice of a serious health condition supporting a need for leave under the Family and Medical Leave Act (FMLA) or a request for accommodation pursuant to the ADA. If the chatbot does not respond appropriately, the employer may face a claim of interference with or denial of rights under these laws.

Employers may want to be mindful of how artificial intelligence tools such as chatbots are received by employees and to communicate how and why a chatbot is being introduced. A failure to help a workforce understand the benefits of these applications and to address individual concerns may have a detrimental impact on efforts to maintain positive employee relations. Without a concerted investment in messaging strategy, an employer’s latest and greatest artificial intelligence tools may become a focus for employees seeking outside assistance to address their concerns.

These issues highlight the importance of proceeding carefully when implementing chatbot technology and understanding how the technology works to address foreseeable questions and issues that may arise. Additionally, contracts that cover the terms and conditions of third-party technology may be critical to establish the company’s rights to information and resources to defend against employee claims. Lastly, employers may want to have plans in place to closely monitor chatbot functionality to ensure compliance with labor and employment law requirements and avoid assumptions that the application works as intended when initially implemented.

A version of this article first appeared in Legaltech News.
