Quick Hits
- A proposed class action alleges that a widely used AI-powered tool violates the federal Fair Credit Reporting Act (FCRA) and California’s Investigative Consumer Reporting Agencies Act (ICRAA) by compiling sensitive, individualized personal information on job applicants without their consent.
- The complaint contends that by evaluating applicants based on extensive data sources—such as LinkedIn profiles, publications, and job application history—the tool generates consumer reports subject to the disclosure, authorization, notification, and certification requirements of both statutes.
- The lawsuit could be the first of a new wave of class action litigation targeting AI-powered employment tools and similar automated decision-making technology.
On January 20, 2026, a pair of job applicants filed a proposed class action lawsuit against Eightfold AI Inc. in a California state court. The complaint alleges that the company unlawfully compiles sensitive personal information on job applicants—including social media profiles (e.g., LinkedIn), location data, internet and device tracking data, and data from online cookies—to build profiles about applicants and assess their “likelihood of success” for the job without their knowledge.
The lawsuit seeks to bring nationwide and California class claims under the FCRA—a federal law that regulates how employers collect and use third-party background check information and that is designed to ensure accuracy, fairness, and privacy in hiring—as well as under California’s similar ICRAA.
Eightfold is one of a growing number of companies developing software powered by AI and similar automated decision-making technology to aid in employment decisions, such as job applicant screening tools that employers are increasingly adopting to improve efficiency. A recent LinkedIn study found that 93 percent of recruiters say they plan to increase their use of AI in 2026, and 59 percent say it is already helping them discover candidates with skills they would not have found before. Two-thirds of recruiters (66 percent) plan to increase their use of AI for pre-screening interviews in 2026, according to the survey.
The lawsuit raises significant new questions about these tools and the large volume of sensitive information they collect about job applicants and employees from outside sources and other employers, sometimes without the individuals’ knowledge or consent and beyond what applicants provide during the hiring process.
Consumer Reporting Agency Allegations
According to the complaint, Eightfold generates reports on applicants using “AI-powered tools that assemble and evaluate information on prospective employees” and assess their “suitability” for a job based on factors such as “work history, projected future career trajectory, culture fit, and other personal characteristics.” The company then allegedly “sells these reports to employers for use in making employment decisions.”
To generate these reports, the complaint alleges, information is fed into the company’s large language model (LLM), which incorporates over 1.5 billion data points from job titles, skills, and “the profiles of more than 1 billion people working in every job, profession, [and] industry.”
The complaint asserts that “Eightfold’s Evaluation Tools then evaluate and rank job applicants using the data gathered from job applicants during the application process, the employer’s internal data, external data, and Eightfold’s proprietary LLM.”
Specifically, the complaint alleges that the evaluation includes not only the candidate’s profile and resume, but “supplemental candidate data gathered from public sources about the candidate’s professional history (such as blogs, publications, conferences, job application history, etc.),” data from other comparable employees, predictions about the candidate, and “data used to train Eightfold’s AI.”
Further, the complaint alleges that once an applicant applies for a job with an employer using the Eightfold tool, Eightfold retains that applicant’s data and uses it to evaluate other applicants for the same job, unrelated positions, or “for that same job applicant for other positions in the future.”
FCRA Protections
The FCRA and state equivalents such as the ICRAA regulate how employers obtain and use “consumer reports” (or background checks) for “employment purposes,” including hiring, promotion, or retention. The FCRA requires employers to provide stand-alone written disclosures to employees and job applicants, and to obtain written authorization before obtaining a report. Before taking “adverse action,” such as rejecting a job applicant based on information contained in a consumer report, an employer must comply with the FCRA’s additional pre-adverse action and adverse action notice requirements, which give the applicant an opportunity to correct inaccurate information.
The FCRA further regulates “consumer reporting agenc[ies],” which are defined as entities that assemble or evaluate information on consumers for the purpose of furnishing consumer reports related to the consumer’s (i.e., applicant’s or employee’s) “character, general reputation, personal characteristics, or mode of living”—broad terms that encompass many applicant or employee attributes—to third parties. The FCRA also provides consumers with the right to access, dispute, and correct information in their consumer reports.
Implications for Employers
While AI tools have already faced lawsuits and regulatory scrutiny over allegations that they can lead to unlawful employment discrimination, the new Eightfold lawsuit raises novel allegations and questions about whether these tools may implicate the FCRA and other background check laws—a very different legal framework than employment discrimination statutes.
Next Steps
The Eightfold lawsuit could be the first of a new type of class action litigation targeting employment tools that use AI or automated or algorithmic decision-making processes that retrieve or use applicant- or employee-specific data. However, the case is in its early stages, and it is not clear whether the claims will survive litigation.
Employers may want to monitor this case and similar cases as they develop. In the meantime, they also may wish to review their use of AI-powered tools for recruitment and hiring. If such tools are used, employers may further want to evaluate how those tools function, including which specific data is collected and used in the evaluation of job applicants and employees.
Ogletree Deakins’ Background Checks Practice Group, Cybersecurity and Privacy Practice Group, and Technology Practice Group will continue to monitor developments and will provide updates on the Background Checks, California, Class Action, Cybersecurity and Privacy, Employment Law, and Technology blogs as additional information becomes available.