California Age-Appropriate Design Code (CAADC): Business Guide

Summary
  • Signed into law in September 2022, the CAADC (AB 2273) was set to take effect July 1, 2024. Court injunctions have blocked enforcement continuously since September 2023.
  • In March 2026, the Ninth Circuit upheld injunctions against five of six challenged provisions, finding them likely unconstitutional under the First Amendment. A geolocation data restriction for minors was allowed to stand.
  • The CAADC applies to any business operating an online service “likely to be accessed by children”, not just platforms aimed at minors. That includes general-audience websites, apps, and connected devices.
  • Key requirements include privacy-by-default settings, data minimization, restrictions on profiling minors, a ban on dark patterns, and Data Protection Impact Assessments (DPIAs) before launching services children may access.
  • Penalties reach USD 2,500 per affected child per negligent violation and USD 7,500 per affected child per intentional violation, with no cap per incident.
  • Even with enforcement blocked, obligations under COPPA and the CCPA remain active. The CAADC’s design principles also reflect the broader direction of the regulatory landscape.

California’s Age-Appropriate Design Code Act was designed to be a landmark in children’s online privacy. Signed into law in 2022 and modeled on UK law, it imposes sweeping obligations on any business offering online services likely to be accessed by users under 18.

However, it has not yet taken effect. Courts have blocked enforcement since before its July 2024 effective date, and the Ninth Circuit’s ruling in March 2026 upheld most of those injunctions.

But the CAADC is not over, and the compliance landscape for businesses that handle children’s data is not static. Other state laws are in force, federal pressure is building, and California has made clear it intends to protect minors’ data one way or another. 

Understanding what the CAADC would require, why it’s blocked, and what businesses should be doing now is essential for any company that operates online and reaches a general audience.

What Is the California Age-Appropriate Design Code Act?

The California Age-Appropriate Design Code Act (AB 2273) was passed in the California legislature unanimously before being signed by Governor Gavin Newsom on September 15, 2022. It was modeled on the United Kingdom’s Age Appropriate Design Code, which is itself grounded in the UK GDPR.

Enforcement authority rests with the office of the California Attorney General rather than the California Privacy Protection Agency (CalPrivacy). The AG’s office must give businesses a 90-day period to cure violations before any formal action, though that cure window does not cap the underlying penalties. There is no private right of action.

The law applies to businesses that:

  1. Meet the definition of a “business” under the California Consumer Privacy Act (CCPA)
  2. Develop or provide an online service, product, or feature that is “likely to be accessed by children” (defined as any user under 18)

Who Does the CAADC Cover?

“Likely to be accessed by children” is broader than it may sound. The CAADC does not require that a business market to or intend to serve minors. Indicators that a service is likely to be accessed by children include:

  • Subject matter that appeals to children
  • Music, games, or animations children commonly use
  • Celebrities or influencers with large child audiences
  • Language or reading-level indicators suggesting a younger audience
  • Advertisements directed at children

This coverage definition sweeps in general-audience platforms: social media, video streaming, online gaming, music platforms, connected devices, shopping apps, and more. If minors reasonably could and do access a service, the CAADC would apply.

That breadth is one reason the litigation has been so consequential. Courts found that because the law’s coverage depends on evaluating the content of a service to determine whether it appeals to children, it constitutes a content-based regulation and is subject to strict First Amendment scrutiny.

What the CAADC Would Require

The CAADC imposes obligations across several categories. Most remain enjoined, but they define the compliance baseline businesses should understand.

Privacy by Default

Covered businesses must configure default privacy settings at the highest level of privacy available unless they can demonstrate a compelling reason that a different setting is in the best interests of children. This is the inverse of how most consumer platforms currently operate under U.S. law, where data collection is typically on by default and users must actively opt out.
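To make the privacy-by-default principle concrete, here is a minimal sketch of a consent state that starts at the most protective settings and only relaxes on explicit opt-in. The category names and function are illustrative assumptions, not any specific consent platform's API or the statute's own terms.

```python
# Most protective defaults: everything off except what the service
# strictly needs to function. Category names are illustrative.
DEFAULT_CONSENT = {
    "strictly_necessary": True,   # required for the service to work at all
    "analytics": False,           # off until the user opts in
    "advertising": False,         # off until the user opts in
    "personalization": False,     # off until the user opts in
}

def effective_consent(user_choices=None):
    """Start from the most protective defaults; apply explicit opt-ins only."""
    state = dict(DEFAULT_CONSENT)
    state.update(user_choices or {})
    state["strictly_necessary"] = True  # can never be toggled off
    return state
```

The design choice the CAADC would mandate is visible in the first dict: absent any user action, `effective_consent()` returns a state where no optional processing occurs.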

Data Minimization

Businesses may only collect, sell, share, retain, or use personal information about a child to the extent reasonably necessary to provide the service the child is using. Data minimization dictates that personal data collected for one purpose cannot be repurposed for others.

Restrictions on Profiling and Targeted Content

Businesses cannot use a child’s personal information to serve content or advertising that is not in the child’s best interests. Profiling of minors, which involves using data to build behavioral or interest models, is restricted unless the business can demonstrate it is necessary for the service and in the child’s interest.

No Dark Patterns

The CAADC prohibits using design techniques, commonly called dark patterns or nudging, to manipulate children in a number of ways, including:

  • Leading or encouraging children to provide more personal information than necessary
  • Nudging them toward choices that are not in their interests
  • Keeping them engaged with a platform beyond what would be beneficial to them, including through infinite scroll, autoplay, streak mechanics, and similar engagement-maximizing features designed to exploit users’ behavioral tendencies

Data Protection Impact Assessments

Before launching any new online service, product, or feature likely to be accessed by children, businesses must complete a Data Protection Impact Assessment (DPIA).

Existing services had to complete assessments by July 1, 2024. DPIAs must evaluate risks to children across eight factors, including exposure to harmful content, contact risks, behavioral tracking, and data collection practices.

This requirement is the core of what courts have found constitutionally problematic. Courts blocked enforcement in September 2023 on these grounds, and the Ninth Circuit’s March 2026 ruling affirmed that the DPIA requirement compels businesses to “opine on potential harm to children” and function as de facto content editors for the state. It likely fails First Amendment scrutiny as a result.

Age Estimation

The CAADC requires businesses to implement age estimation for all users or to treat all users as if they were minors. Courts found this provision less clearly unconstitutional than the DPIA requirement, but the district court enjoined it as part of its broader injunction against the full statute.

Penalties

Violations carry civil penalties of USD 2,500 per affected child for each negligent violation and USD 7,500 per affected child for each intentional violation. There is no per-incident cap. For a platform with millions of minor users, the exposure from a single non-compliant feature could be substantial.
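The per-child, uncapped structure described above is what drives that exposure, and the arithmetic is worth seeing directly. The function below is a hypothetical illustration using the penalty figures stated in this article; it is not a legal calculation and ignores the 90-day cure period and any judicial discretion.

```python
# Penalty rates as described for the CAADC (USD per affected child).
NEGLIGENT_PENALTY = 2_500
INTENTIONAL_PENALTY = 7_500

def caadc_exposure(affected_children, intentional=False):
    """Return the maximum statutory exposure in USD for one violation.

    With no per-incident cap, exposure scales linearly with the number
    of affected minor users.
    """
    rate = INTENTIONAL_PENALTY if intentional else NEGLIGENT_PENALTY
    return affected_children * rate

# A single negligent non-compliant feature reaching 1 million minors:
print(caadc_exposure(1_000_000))  # 2500000000, i.e. USD 2.5 billion
```

The linear scaling is the point: even the lower negligence rate, multiplied across a large minor user base, dwarfs typical privacy fines.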

The Litigation: Where Things Stand

NetChoice, a trade association whose members include Google, Meta, Amazon, and TikTok, filed suit against California Attorney General Rob Bonta in December 2022, challenging the law on constitutional grounds.

Courts blocked enforcement in September 2023, finding that the CAADC’s requirement for businesses to assess and mitigate potential harm to children compelled speech in violation of the First Amendment. The California AG has appealed at every stage.

The Ninth Circuit’s most recent ruling, issued March 12, 2026, is the most detailed yet. Of the six challenged provisions, five remain blocked (enjoined) pending further proceedings. The court reversed the lower court on two points — the coverage definition and the age estimation requirement — finding neither unconstitutional on its face. 

One provision survived outright: restrictions on collecting, using, selling, and disclosing minors’ geolocation data. The data protection and dark patterns provisions remain blocked, but on vagueness grounds rather than First Amendment grounds, which is a meaningful distinction, as it signals that redrafted versions of those provisions could survive constitutional review.

The CAADC cannot be enforced while injunctions remain in effect. But the litigation does not make the law disappear. California has not abandoned it, the Ninth Circuit has indicated its core data protections may be constitutionally sound if redrafted, and similar design codes are advancing in Maryland, Vermont, Nebraska, and South Carolina.

What’s in Force Now: Existing Obligations for Businesses With Minor Users

Businesses handling data from minors face real obligations now, regardless of where CAADC litigation ends.

COPPA

The federal Children’s Online Privacy Protection Act (COPPA) requires operators of websites and online services directed to children under 13, or general-audience services with actual knowledge of users under 13, to obtain verifiable parental consent before collecting personal information. 

COPPA was last significantly updated in 2013; however, the Federal Trade Commission has since finalized amendments that took effect on June 23, 2025, with a compliance deadline of April 22, 2026. 

Key changes include a requirement for separate parental opt-in consent before children’s data is shared for targeted advertising, new data retention limits, and expanded definitions of personal information to include biometric identifiers.

A separate legislative proposal, the Children and Teens’ Online Privacy Protection Act (COPPA 2.0), passed the U.S. Senate by unanimous consent in March 2026. The bill would extend COPPA’s protections to teenagers under the age of 17 and ban targeted advertising directed at minors. As of early April 2026, it awaits action in the House.

How COPPA and the CAADC Relate

COPPA and the CAADC are complementary rather than duplicative. COPPA sets the federal floor, including parental consent before collecting data from children under 13 for services directed at that age group. 

The CAADC was designed to go further: extending protections to all users under 18, applying to any general-audience service children are likely to use, and imposing design-level obligations — privacy by default, no dark patterns, impact assessments — that COPPA doesn’t touch. 

Where COPPA asks whether a business obtained consent, the CAADC asks whether the product was built with children’s interests in mind from the start. Businesses that are COPPA-compliant are not necessarily CAADC-compliant, and the gap between the two frameworks is where most of the CAADC’s obligations live.

CCPA and CPRA Children’s Data Provisions

The California Consumer Privacy Act, like most U.S. privacy regulations, already imposes heightened requirements on data about minors. Businesses cannot sell or share the personal information of children under 16 without affirmative authorization, which requires opt-in consent from the child if they are aged 13 to 15, and parental consent if under 13. 
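The CCPA's age tiers form a simple decision rule, sketched below. The function name and return labels are illustrative assumptions for clarity, not statutory language or any library's API.

```python
def required_consent(age):
    """Map a California user's age to the authorization needed before
    selling or sharing their personal information under the CCPA/CPRA."""
    if age < 13:
        return "parental opt-in"   # a parent or guardian must authorize
    if age < 16:
        return "child opt-in"      # the 13-to-15-year-old may self-authorize
    return "opt-out only"          # 16 and over: standard CCPA opt-out model

print(required_consent(12))  # parental opt-in
print(required_consent(14))  # child opt-in
print(required_consent(16))  # opt-out only
```

Note how the default flips at 16: below that age the business needs affirmative opt-in before any sale or sharing, while at 16 and above the standard opt-out model applies.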

The California Privacy Protection Agency (CalPrivacy) has explicitly identified children’s data as an enforcement priority for 2026 and beyond, and the Consortium of Privacy Regulators, which now includes eight state regulators in addition to CalPrivacy and the California AG, lists it among its shared enforcement priorities.

The PlayOn Sports enforcement action in March 2026, in which CalPrivacy imposed a USD 1.1 million penalty partly because the platform’s youth-sports context created a “captive audience” of minor users, illustrates how seriously the agency takes children’s data exposure even under existing law.

Where businesses serve users in California who may be minors, consent banner design matters. Dark patterns, which can include pre-ticked boxes, confusing or missing opt-out flows, and misleading language, are already actionable under the CCPA. 

CalPrivacy uses automated scanning to detect non-compliant consent interfaces on public-facing websites, and its Audits Division can and does open investigations without a consumer complaint.

What Businesses Should Do Now About the CAADC

The CAADC’s enforcement status will continue to evolve. Businesses should not treat the current injunction as a reason to defer action on children’s data practices. The direction of regulation is clear, and the costs of building compliant infrastructure reactively are higher than building it proactively.

Beyond regulatory pressure, consumer expectations also increasingly drive adoption of best practices in handling all personal data, not just children’s.

Map Your Minor User Exposure

The first step is understanding whether minors access your services, and how. This is not always obvious. A general-audience platform may have significant under-18 usage without ever having marketed to that audience. 

Conducting a data audit that identifies where children’s data may enter your systems is the foundation of everything else. Entry points can include registration, browsing behavior, purchase history, device signals, and other means.

Default to Privacy-Protective Settings

Privacy by design and default is the direction that children’s data regulation is heading. Reviewing your consent banner and data collection defaults to ensure that the most privacy-protective settings are the default positions your business ahead of the compliance curve and consumer expectations. It also reduces the risk that CalPrivacy’s automated scanning flags your site for non-compliant consent behavior.

Audit for Dark Patterns

Dark pattern restrictions are among the most durable elements of the CAADC. The Ninth Circuit’s March 2026 ruling found them enjoined only on vagueness grounds, not because restricting manipulative design is unconstitutional in principle. 

The same dark pattern restrictions appear in CCPA enforcement guidance and are an active priority for CalPrivacy. Reviewing your consent interface and UX flows against dark pattern criteria is a compliance step that applies regardless of the CAADC’s ultimate fate.

Build Toward DPIA Readiness

Even if the CAADC’s DPIA requirement never takes effect in its current form, Data Protection Impact Assessments are embedded in CPRA’s risk assessment requirements. These are in force for high-risk processing activities and are standard practice under GDPR for any processing that is likely to result in high risk to individuals. 

Businesses that develop assessment processes now are better positioned under existing obligations and ready for whatever form the CAADC’s requirements ultimately take.

Monitor State-Level Developments

California has not abandoned the CAADC, the Ninth Circuit has indicated its core data protections may be constitutionally sound if redrafted, and similar design codes have now been enacted in Maryland, Nebraska, Vermont, and South Carolina.

All four states drafted their laws to be narrower than California’s to avoid the duty-of-care and compelled-speech provisions that drew First Amendment challenges. However, NetChoice has challenged both the Maryland and South Carolina laws in court, and further litigation is anticipated.

How Usercentrics Supports California Age-Appropriate Design Code Compliance

The CAADC’s core compliance requirements to avoid dark patterns, enforce data minimization, and promote privacy by design are not unique to this law. They reflect the principles that underpin the GDPR, CPRA, and children’s data regulation globally. 

Businesses that approach those principles as a design philosophy and customer experience priority, rather than a legal checklist, are better positioned across the entire regulatory landscape.

Consent management is the operational layer where those principles become practice. The settings users see when they first encounter your site, the choices they’re offered, the data that flows depending on those choices — these are where CAADC compliance would be centered, where CCPA compliance already lives, and where enforcement scrutiny is concentrated.

Usercentrics provides the consent management infrastructure that makes those requirements operable. Get fully customizable consent banners built to regulatory specifications and configurable for your jurisdictional needs. Rely on consent logs that create an audit-ready record for every user. And use signal integration to help ensure that consent flows correctly across platforms and data systems.

For businesses with exposure to audiences that include children, the ability to demonstrate documented, consent-grounded data practices is increasingly not optional.

Build the consent infrastructure children’s data rules require

Regulators and parents are watching how businesses handle children’s data. Start your free trial today and see how Usercentrics can help you meet the standard.

William Newmark
Senior Legal Counsel, Usercentrics