On March 29 and 30, 2022, the California Privacy Protection Agency (“CPPA”) held two public pre-rulemaking informational sessions via video conference regarding the California Privacy Rights Act (“CPRA”). During the sessions, members of the California Attorney General’s Office and various privacy and cybersecurity experts led discussions on topics such as the sale and sharing of personal information, dark patterns, data privacy impact assessments, cybersecurity audits and automated decision-making. The CPPA Board has not yet responded to the views expressed by the experts at the meetings.

The goal of the first meeting on March 29 was to provide an overview of personal information and the CPRA. Among other highlights, Supervising Deputy Attorney General Stacey Schesser advocated for the retention of the current California Consumer Privacy Act regulations (“CCPA Regulations”) regarding user-enabled global privacy controls. The existing CCPA Regulations provide that businesses must treat user-enabled global privacy controls as a valid request to opt out of the sale of personal information. Relatedly, Deputy Attorney General Lisa Kim posited that the CPRA’s right to opt out of sharing for cross-context behavioral advertising applies to real-time bidding in advertising auctions and recommended that businesses give consumers the right to opt out of these auctions.

Separately, Jennifer King, Privacy and Data Policy Fellow at Stanford’s Institute for Human-Centered Artificial Intelligence, discussed using new terminology with respect to dark patterns and suggested that toggle switches for “CCPA Do Not Sell” requests may be considered a dark pattern (i.e., a tactic used by a company to trick consumers into making certain choices). Lior J. Strahilevitz, professor at the University of Chicago Law School, summarized recent studies of dark patterns, which found that mild dark patterns significantly increase users’ acceptance of a program and are particularly coercive among less educated populations.

The second meeting on March 30 focused on risk assessments and consumer rights with respect to automated decision-making under the CPRA. UCLA Professor Safiya Noble highlighted the importance of addressing structural racism when developing rules and technologies related to automated decision-making. During a later presentation, Andrew Selbst, professor at UCLA School of Law, advocated for transparency in automated decision-making among developers, consumers and regulators.

Gwendal LeGrand, who serves as Head of Activity for Enforcement Support and Coordination for the European Data Protection Board, explained the requirements for privacy risk assessments under the EU General Data Protection Regulation, which may serve as a helpful case study for the privacy risk assessments soon to be required under the CPRA. Under the CPRA and its forthcoming regulations, businesses will need to regularly submit to the CPPA a risk assessment regarding their processing of personal information. The assessment must consider whether the processing involves sensitive personal information, and must identify and weigh the risks and benefits of the processing to the business, the consumer, the public and other stakeholders.

The CPPA also recently announced additional meetings with stakeholders via teleconference beginning May 4, 2022. Stakeholders can sign up to participate in the meetings by completing the CPPA’s Stakeholder Session Request Form by April 22, 2022.