
Quick Hits

  • The California Privacy Protection Agency recently finalized regulations governing automated decisionmaking technologies (ADMT), representing California’s first set of regulations outside of the anti-discrimination context that place guardrails around AI systems that evaluate or make decisions about individuals.
  • The new regulations apply to significant automated decisions affecting employment, finance, housing, and education, and include enhanced obligations for extensive profiling through systematic observation like Wi-Fi tracking or video recording.
  • California individuals now have the right to opt out of ADMT processing in certain contexts and to access details about how ADMT was used to make significant decisions affecting them.

What Constitutes an ADMT?

Under the final regulations, an ADMT is defined as technology that processes personal information using computation to “replace human decisionmaking or substantially replace human decisionmaking.” The phrase “substantially replace human decisionmaking” means using the output of such a technology to make a decision without human involvement. An organization may nonetheless use these technologies without triggering the definition of an ADMT—and thus avoid the burdensome compliance activities described herein—as long as there is sufficient human involvement in the decisionmaking process. This is known as the “human involvement” exception.

To rely upon the human involvement exception under the CCPA regulations, companies must ensure that human reviewers are trained to interpret and utilize the technology’s outputs effectively, actively review and analyze both the automated output and other relevant information when making decisions, and have the explicit authority to make or alter decisions based on that analysis. In practice, this likely means the burden of reviewing relevant outputs will fall on relatively “high-up” individuals at each company, such as upper management personnel.

The rules explicitly include profiling within the definition and exclude routine tools like web hosting, spam filtering, and simple data organization, unless those tools are used to replace or substantially replace human decisionmaking.

Significant Decisions and Extensive Profiling

The regulations impose obligations on businesses using ADMT to make “significant decisions,” including those involving employment, financial services, housing, insurance, education, criminal justice matters, and essential goods or services. For employers, this means that automated decisions related to hiring, job assignments, promotions, demotions, compensation, and termination are covered. Similarly, businesses engaging in extensive profiling of individuals, including systematic observation in workplaces, educational settings, or publicly accessible places, will encounter enhanced compliance obligations.

The term “profiling” includes any form of automated processing of personal information to evaluate or predict aspects of an individual’s performance, behavior, or interests. It specifically includes practices such as any automated processing to evaluate “performance at work,” reliability, behavior, locations, or movement, whereas the term “systematic observation” refers to “methodical and regular or continuous observation” using technologies such as Wi-Fi or Bluetooth tracking, video or audio recording, location tracking technologies, and “technologies that enable physical or biological identification or profiling.” These broad definitions encompass technologies used to monitor employee productivity, behavior, or communication patterns in a wide variety of circumstances. When profiling is systematic and ongoing, it may require pre-use notice, opt-out rights, and, in some cases, access rights—discussed in greater detail below—unless an exception applies. In particular, employers using AI-driven monitoring tools or productivity analytics may need to evaluate whether their use constitutes profiling or systematic observation under the regulations.

Pre-Use Notices

The finalized regulations elevate consumer privacy protections, emphasizing California individuals’ rights to informed consent, transparency, and recourse. Depending on their existing practices, businesses that are in-scope for these regulations may need to reassess how they communicate with consumers (including employees) and integrate these rights into their operational workflows.

Before deploying ADMT, companies are now obligated to provide consumers with notice that explains the technology’s purpose, its operational scope, and its potential impacts. Importantly, this notice must be presented in plain language to ensure accessibility. We previously detailed the pre-use notice requirements of the initial draft of these regulations. The final regulations are similar in some respects, but include several new material changes:

  • Businesses are no longer required to provide the logic used in the ADMT. However, this is still a required data element in response to an individual’s request to access, as discussed below, so a thorough understanding of how each ADMT operates remains an important element of the diligence process.
  • Businesses must provide information regarding “how the ADMT processes personal information to make a significant decision,” such as the categories of personal information used to generate an output.
  • If human review does not satisfy the human involvement exception, the business must provide a description of the human reviewer’s role in the process, such as where the reviewer lacks sufficient authority to overrule the ADMT’s output.
  • Businesses must describe the alternative process available to individuals who opt out, unless an exception to the opt-out right applies.

The regulations also clarify the form in which these notices must be provided, such as when providing a single, consolidated notice may be appropriate. As an example, the regulations specifically state that a single notice could cover both automated resume-screening software and tools evaluating an applicant’s vocal intonation, facial expressions, and gestures to make hiring decisions.

Right to Opt Out

California individuals now have the right to opt out of ADMT processing in certain circumstances. However, the right to opt out is not without its limits, as the final regulations include several exceptions.

For instance, businesses are not required to offer an opt-out if the ADMT is used exclusively for detecting security incidents, preventing fraud, or ensuring physical safety. Additionally, a business may be able to avoid offering a consumer the right to opt out of ADMT processing if it instead provides a right to appeal the decision to a qualified human reviewer who can reverse it, provided the specific requirements of the human appeal exception are met.

Importantly, there is also an exception for “admission, acceptance, or hiring decisions,” provided that those tools are used solely to assess a person’s ability to perform in a job or educational program, align exclusively with the business’s objectives, and do not result in unlawful discrimination based on protected characteristics. However, in today’s hiring landscape, where businesses may utilize such tools for a variety of purposes in vetting job applicants, such as scrutinizing applicants for evidence of fraud or identity theft, these exceptions may not always neatly apply. Separately, there is a limited exception for allocation/assignment of work and compensation decisions if the business uses the ADMT solely for allocation/assignment of work or compensation, and the ADMT works “for the business’s purpose” (i.e., the ADMT’s intended purpose) and does not unlawfully discriminate based on protected characteristics.

Finally, these exceptions do not apply when ADMT is used for behavioral advertising or training machine learning systems. In such cases, consumers must always be given the ability to opt out. As such, businesses may also need to consider how any employee or individual data is being used by the ADMT, such as whether it is training on those individuals’ data, to ascertain their compliance obligations.

As with other consumer rights under the CCPA, there are statutory deadlines for responding to requests to opt out, even if the ultimate outcome is a denial of the request. There are also specific requirements governing how the opt-out right must be offered, including the type and number of opt-out methods, the information a consumer may be required to provide to exercise the right, and how the opt-out right is presented. The regulations require at least two submission methods, one of which must reflect how the business primarily interacts with consumers (e.g., an online form or a toll-free number). The process must be easy to use, clearly labeled, and free of dark patterns. If a request is denied as fraudulent, the business must have a documented basis for the denial and provide an explanation—a requirement that may be particularly relevant where an ADMT is used to evaluate job applicants who may be misrepresenting their identity.

Right to Access Information About the ADMT

In addition to the right to opt out, the final regulations give individuals the right to access information about a business’s use of ADMT when it is used to make a “significant decision” about a consumer. As discussed above, this term generally means a decision that affects a person’s rights, or access to or eligibility for important opportunities or essential goods and services, such as financial, housing, healthcare, independent contracting, and employment-related opportunities.

When a consumer exercises this right, the business must provide a clear and plain-language explanation of how the ADMT was used in relation to that individual. This includes describing the specific purpose of the ADMT, the logic involved in its decisionmaking, and how the output was used in making a decision about the individual. If the system produces a score or recommendation, the business must explain how that output factored into the final decision and whether a human was involved in the process. The business must also disclose the key inputs or parameters that influenced the decision, and how those inputs were applied to the consumer. While businesses are not required to reveal trade secrets or information that could compromise security or fraud-prevention functions, the disclosure must nevertheless be substantive enough, even without that information, to allow a consumer to understand how the technology affected them.

If ADMT may be used multiple times with respect to the same individual, such as for repeated employee evaluations or loan decisions, the business may also provide information regarding how the output will be used to make significant decisions about that individual in the future. But in all cases, the information must be accessible, accurate, and delivered using reasonable security measures.

Looking Forward

The finalized regulations provide a staggered compliance schedule for these new requirements. Businesses that use ADMT for significant decisions must comply with the ADMT requirements by January 1, 2027.

California’s finalized ADMT regulations represent a major shift in California privacy rights and employee protections under the CCPA, demanding new levels of transparency and accountability from businesses and employers. Businesses may want to review their existing processes and technologies well in advance of the January 1, 2027, deadline, and may wish to build ongoing evaluation of any newly developed or acquired ADMT into their diligence procedures to operationalize these novel requirements, particularly because California regulators have signaled a strong interest in enforcement against businesses of all sizes.

Ogletree Deakins’ Cybersecurity and Privacy Practice Group and Technology Practice Group will continue to monitor developments and will provide updates on the California, Cybersecurity and Privacy, and Technology blogs as additional information becomes available.
