The California Privacy Protection Agency’s (CPPA) highly anticipated regulations for automated decision-making technology and risk assessment requirements are likely far from final. The CPPA met at the beginning of the month but did not come to a consensus on what the final regulations should look like.

The CPPA’s vote was expected to be procedural, but the final review needed to begin formal rulemaking will now not start until the summer. The CPPA’s General Counsel, Phil Laird, stated that the rulemaking process may not be completed until sometime in 2025.

The CPPA will continue developing the final rules to govern how developers of automated decision-making technology (ADMT), which includes artificial intelligence (AI), and businesses using such technology can obtain and use personal information. The rules are also expected to include specific details on how to collect opt-outs and when risk assessments must be conducted. Risk assessments would be required when training ADMT or AI models that will be used for significant decisions, profiling, generating deepfakes, or establishing identity.

Further, personal information of minors would be classified as sensitive personal information under the California Consumer Privacy Act/California Privacy Rights Act. In addition, “systematic observation” (the consistent tracking of individuals through Bluetooth, Wi-Fi, drones, or livestream technologies that can collect physical or biometric data) would qualify as “extensive profiling” when used in a work or educational setting.

So, where do we stand on these potential requirements? Without a unanimous vote from the CPPA on the proposed regulations, the agency will take roughly another two months to rework the rules and bring all members into alignment. We’ll continue to monitor the progress.

Kathryn Rattigan


Kathryn Rattigan is a member of the Business Litigation Group and the Data Privacy and Security Team. She concentrates her practice on privacy and security compliance under both state and federal regulations and advising clients on website and mobile app privacy and security compliance. Kathryn helps clients review, revise and implement necessary policies and procedures under the Health Insurance Portability and Accountability Act (HIPAA). She also provides clients with the information needed to effectively and efficiently handle potential and confirmed data breaches while providing insight into federal regulations and requirements for notification and an assessment under state breach notification laws. Prior to joining the firm, Kathryn was an associate at Nixon Peabody. She earned her J.D., cum laude, from Roger Williams University School of Law and her B.A., magna cum laude, from Stonehill College. She is admitted to practice law in Massachusetts and Rhode Island. Read her full rc.com bio here.