
Quick Hits

  • The California Privacy Protection Agency released draft regulations on the rights to opt out of and access information about a business’s use of automated decisionmaking technology and AI.
  • The draft regulations would apply to any decision producing legal or similarly significant effects concerning consumers, including employers’ use of such technology to make employment or compensation decisions.
  • Employers would have to notify job applicants and employees that an employment decision was based on ADMT; employers would also have to notify these individuals that they have a right to access information about the technology’s use.

The draft ADMT regulations would govern businesses’ use of such technology by requiring “pre-use notice” to consumers (including job applicants and employees), reinforcing rights to opt out of and access information about businesses’ use of ADMT, and requiring businesses to conduct risk assessments in certain situations.

The draft regulations would apply to “consumer[s],” defined as California residents, a term that also includes employees, job applicants, and other individuals in the business-to-business or employment context. In particular, the draft regulations would require employers that use ADMT to notify job applicants and employees that an employment decision (e.g., denial of an employment opportunity or lowered compensation) was based on the use of ADMT and that the individual has a right to access information about how the technology was used.

The draft regulations are part of the CPPA’s regulatory authority established in amendments to the California Consumer Privacy Act (CCPA), which were approved by California voters in 2020 as part of Proposition 24, the California Privacy Rights Act of 2020, or the so-called CCPA 2.0.

However, while the draft regulations would put California at the forefront of regulating the use of ADMT, including AI, with respect to individual privacy concerns, it is important to note that they are merely a draft of potential regulations, and the formal rulemaking process has not yet begun.

The draft regulations were published to facilitate public comment and will be discussed at the CPPA’s upcoming board meeting on December 8, 2023, along with discussion of previously released proposed regulations regarding cybersecurity audits and risk assessments. As a result, these proposed regulations are likely subject to change before they are finalized.

The following is an overview of the key aspects of the draft regulations.

Draft ADMT Regulations

The regulations would apply to “automated decisionmaking technology,” which is defined as “any system, software, or process—including one derived from machine-learning, statistics, or other data-processing or artificial intelligence—that processes personal information and uses computation,” in whole or in part, to make or facilitate a decision.

Such decisions include “profiling,” which the regulations define as “any form of automated processing of personal information to evaluate certain personal aspects relating to a natural person” and analyze or predict the person’s performance, behaviors, whereabouts, and other attributes such as reliability, economic situation, or health.

Profiling would include, among other things, the use of some productivity monitoring tools, including keystroke loggers, attention monitors, facial- or speech-recognition technology, and social media and web-browsing monitoring applications. It would also include profiling of consumers while they are in a publicly accessible place.

1. Pre-use Notice

The draft regulations would require businesses that use ADMT to provide consumers with advance notice that informs them about the use of the technology, the decisionmaking process and outputs, and consumers’ rights to opt out of, and access information about, the use of such technology.

The notice would be required to include: (1) a “plain language explanation of the purpose” of the technology’s use; (2) a description of consumers’ rights to opt out and how to exercise those rights, or an explanation of why consumers cannot opt out; (3) a description of the consumers’ right to access information about the use of such technology; and (4) “a simple and easy-to-use method (e.g., a layered notice or hyperlink)” for consumers to get additional information about the use of the technology.

Such additional information includes: (a) the logic used by the ADMT (including factors used to generate decisions), (b) the decisionmaking result (such as a numerical score), (c) how the business intends to use the output (including whether there is human involvement in the process), and (d) whether a business’s use of ADMT “has been evaluated for validity, reliability, and fairness, and the outcome of any such evaluation.”

2. Right to Opt Out

Under the draft regulations, businesses would be required to allow consumers to opt out of the use of ADMT for “a decision that produces legal or similarly significant effects concerning a consumer,” which the proposed regulations define to include employment opportunities or compensation.

The draft regulations would provide certain exceptions to the opt-out right, such as where the ADMT is used to prevent or investigate security incidents or fraudulent or illegal actions directed at the business, for safety purposes, where the use is requested by the consumer, and where there is no reasonable alternative method of processing. There would be a rebuttable presumption that a reasonable alternative method of processing exists, and the business would bear the burden of establishing otherwise.

Additionally, the draft regulations would require businesses to provide employees, job applicants, and independent contractors with the ability to opt out of “profiling,” including the use of “keystroke loggers, productivity or attention monitors, video or audio recording or live-streaming, facial- or speech-recognition or -detection, automated emotion assessment, location trackers, speed trackers, and web-browsing, mobile-application, or social-media monitoring tools.”

The profiling provisions also allow consumers to opt out of profiling “while they are in a publicly accessible place,” which would include the use of “wi-fi or Bluetooth tracking, radio frequency identification, drones, video or audio recording or live-streaming, facial- or speech-recognition or -detection, automated emotion assessment, geofencing, location trackers, or license-plate recognition.”

Businesses using ADMT would have to provide two or more methods for consumers to submit opt-out requests, taking into account how the business ordinarily interacts with the consumer, the manner in which the ADMT is used, and the ease with which consumers may opt out. At least one of those methods would have to “reflect the manner in which the business primarily interacts with the consumer.”

For example, a business that interacts with consumers both in person and online may provide an online opt-out form and an in-person method for opting out. Other potential methods include, but are not limited to, a toll-free number, a designated email address, and a form to be submitted through the mail.

3. Requests to Access Information

The proposed regulations would give consumers the “right to access information about the business’s use of automated decisionmaking technology” and require businesses to provide access to information about their use of ADMT. If a business makes an ADMT decision that denies “goods or services,” including an employment decision such as the denial of an employment opportunity or the lowering of an employee’s compensation, then it would be required to notify the consumer that the business made the decision, that the consumer has a right to access information about the business’s use of ADMT and how the consumer can exercise that access right, and that the consumer can file a complaint with the CPPA and the California attorney general. The business would also be required to include a link on its website to the CPPA and California attorney general complaint forms.

In responding to a consumer’s request for access, a business would be required to provide: (1) the purpose for which the ADMT was used; (2) the output of the technology; (3) how the output was used, or how the business plans to use it, to make a decision with respect to that consumer, as applicable; (4) how the technology worked, including how its logic, assumptions, and limitations were applied to the consumer, and the key parameters affecting the output; (5) “[a] simple and easy-to-use method by which the consumer can obtain the range of possible outputs,” such as aggregate information regarding decisionmaking for other consumers; (6) instructions for how to exercise the consumer’s CCPA rights; and (7) instructions for how to submit a complaint to the business about its use of ADMT.

Businesses would be allowed to provide consumers with the option “to allow specific uses” of ADMT so long as an option to opt out of all uses is also offered. Businesses would be required to wait at least twelve months from the date an opt-out request was received before asking consumers whether they want to consent to the use of ADMT.

4. Risk Assessments

The proposed ADMT regulations would work in tandem with the previously released proposed regulations governing risk assessments, which will also be discussed at the upcoming December 8, 2023, board meeting. The proposed risk assessment regulations include language “for additional Board consideration” that would require businesses to conduct risk assessments where the “processing of consumers’ personal information presents significant risk to consumers’ privacy.”

Such processing in the employment context would include, among other things, any of the following five purposes as they pertain to ADMT and similar technology: (1) using ADMT to make or facilitate a decision producing legal or similarly significant effects concerning a consumer; (2) profiling an employee, independent contractor, job applicant, or student, such as through keystroke loggers or other monitoring techniques discussed above; (3) profiling consumers (including employees, if applicable) in publicly accessible places; (4) profiling for the purpose of behavioral advertising; and (5) the processing of consumers’ personal information to train AI or ADMT.

Next Steps

Employers in California may want to consider how the draft regulations would impact their businesses and current employment practices and policies. However, the draft regulations are far from being finalized and there will be further opportunities for discussion and comment.

Ogletree Deakins’ Cybersecurity and Privacy Practice Group will continue to monitor developments and will provide updates on the California, Cybersecurity and Privacy, and Technology blogs.
