
On January 31, 2023, the U.S. Equal Employment Opportunity Commission (EEOC) held a public hearing, “Navigating Employment Discrimination in AI and Automated Systems: A New Civil Rights Frontier,” to receive panelist testimony concerning employers’ use of automated systems, including artificial intelligence, in employment decisions. The hearing convened with statements by the four EEOC commissioners—Chair Charlotte A. Burrows, Vice Chair Jocelyn Samuels, Commissioner Keith E. Sonderling, and Commissioner Andrea R. Lucas—followed by panelist testimony that included prepared statements and question-and-answer periods with each commissioner. The panelists included higher education professors, nonprofit organization representatives, attorneys, and a workforce consultant.

The EEOC invited the public to submit written comments on any issues discussed at the hearing through February 15, 2023. The agency and its staff will review and consider those comments as they continue their work on these issues.

Panelist Concerns

The testimony addressed a number of shared concerns, though the panelists diverged in their recommendations about the role the EEOC should play to address them.

Critical evaluation of data. The testimony delivered to the EEOC consistently cited the importance of data to artificial intelligence. Concerns included how the scope and quality of that data can affect which individuals are selected or excluded by algorithm-based tools.

Validation and auditing. Auditing artificial intelligence tools for bias was a recurring concern among the panelists. Testimony debated whether audits should be required or merely recommended and whether they should be independent or self-conducted. Panelists also questioned whether vendors should share liability for the artificial intelligence tools they promote for commercial gain.

Transparency and trust. Multiple panelists raised concerns over the extent to which individuals subjected to artificial intelligence tools know that such applications are being used. These concerns led the panelists to express doubt about how an individual with a disability affected by an artificial intelligence application could know whether, when, and how to request an accommodation. Further, the panelists consistently identified as a priority that the EEOC support a framework in which artificial intelligence applications are trustworthy.

Applicable or necessary laws. Testimony critiqued the application of traditional antidiscrimination analysis to the use of artificial intelligence as a hiring and screening tool. Although current disparate treatment analysis prohibits a decision-maker from considering race when selecting a candidate, panelists suggested that some consideration of race and other protected characteristics should be permitted as a strategy to de-bias automated systems and ensure an artificial intelligence model is fair to all groups. The panelists also addressed the applicability of the Uniform Guidelines on Employee Selection Procedures to automated decision tools and the potential for the use of analyses other than the “four-fifths rule” to evaluate the potential disparate impact of such tools, as illustrated in the sketch below.
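For context, the four-fifths rule compares each group’s selection rate to the selection rate of the group with the highest rate; a ratio below 80 percent is generally regarded as evidence of adverse impact under the Uniform Guidelines. The short sketch below illustrates only that arithmetic, using hypothetical group names and figures; it is not a statement of the EEOC’s methodology or of any particular audit standard.

```python
# Minimal sketch of the "four-fifths rule" arithmetic from the Uniform
# Guidelines on Employee Selection Procedures. Group names and counts
# below are hypothetical, for illustration only.

selections = {
    "Group A": {"applicants": 100, "hired": 50},
    "Group B": {"applicants": 100, "hired": 30},
}

# Selection rate = number hired / number of applicants, per group.
rates = {group: d["hired"] / d["applicants"] for group, d in selections.items()}
highest_rate = max(rates.values())

for group, rate in rates.items():
    impact_ratio = rate / highest_rate
    # A ratio below 0.8 (four-fifths) is generally treated as evidence
    # of adverse impact under the Guidelines.
    status = "potential adverse impact" if impact_ratio < 0.8 else "within four-fifths"
    print(f"{group}: selection rate {rate:.0%}, impact ratio {impact_ratio:.2f} ({status})")
```

In this hypothetical, Group B’s selection rate (30 percent) is only 60 percent of Group A’s rate (50 percent), which falls below the four-fifths threshold; panelists debated whether this ratio test or other statistical analyses are better suited to evaluating automated decision tools.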

Panelist Recommendations

Multiple panelists called for the EEOC to have a role in evaluating artificial intelligence applications for bias. Commissioner Sonderling suggested the EEOC consider taking an approach similar to the one taken by the U.S. Department of Agriculture, under which the agency would approve artificial intelligence products and certify them for use. Other panelists urged the EEOC to issue guidance addressing compliance with Title VII of the Civil Rights Act of 1964 and the Age Discrimination in Employment Act when utilizing artificial intelligence tools and suggested that the EEOC work with other federal regulators to address the use of these tools.

Key Takeaways

The EEOC is likely to issue one or more additional publications following the hearing’s testimony to provide guidance for employers and individuals on the application of equal employment laws to artificial intelligence applications. The hearing was part of EEOC Chair Burrows’s Artificial Intelligence and Algorithmic Fairness Initiative. One of the stated goals of the initiative is to “[i]ssue technical assistance to provide guidance on algorithmic fairness and the use of AI in employment decisions.” On May 12, 2022, the EEOC issued its first technical guidance under this initiative, “The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees.” While this technical guidance focused on the application of the Americans with Disabilities Act (ADA) to artificial intelligence tools, the scope of the testimony at the hearing was significantly broader than this single law.

Further, the EEOC’s hearing took place as the New York City Department of Consumer and Worker Protection continued to develop enforcement regulations for the city’s automated employment decision tools law. New York City’s law is the first of its kind in the United States to impose a bias audit requirement on artificial intelligence applications. While future EEOC publications may address the role of a bias audit in employer decision-making tools, such an audit is unlikely to be required by the EEOC in the absence of a new federal law or a notice of proposed rulemaking.

Ogletree Deakins’ Technology Practice Group will continue to monitor developments with respect to artificial intelligence in employment-related matters and will post updates on the firm’s Cybersecurity and Privacy, Employment Law, and Technology blogs as additional information becomes available. Important information for employers is also available via the firm’s webinar and podcast programs.
