The United States does not have a comprehensive federal data privacy law that governs how businesses access or use individuals’ personal information. Instead, privacy protections and regulation are currently left to individual states. California led the way in 2020 with the California Consumer Privacy Act (CCPA), later strengthened by the California Privacy Rights Act (CPRA). As of January 2025, 20 states have passed similar laws. The variations in consumers’ rights, companies’ responsibilities, and other factors make compliance challenging for businesses operating in multiple states.
The American Data Privacy and Protection Act (ADPPA) sought to simplify privacy compliance by establishing a comprehensive federal privacy standard. The ADPPA emerged in June 2022 when Representative Frank Pallone introduced HR 8152 to the House of Representatives. The bill gained strong bipartisan support in the House Energy and Commerce Committee, passing with a 53-2 vote in July 2022. It also received amendments in December 2022. However, the bill did not progress any further.
As proposed, the ADPPA would have preempted most state-level privacy laws, replacing the current multi-state compliance burden with a single federal standard.
In this article, we’ll examine who the ADPPA would have applied to, its obligations for businesses, and the rights it would have granted US residents.
What is the American Data Privacy and Protection Act (ADPPA)?
The American Data Privacy and Protection Act (ADPPA) was a proposed federal bill that would have set consistent rules for how organizations handle personal data across the United States. It aimed to protect individuals’ privacy with comprehensive safeguards while requiring organizations to meet strict standards for handling personal data.
Under the ADPPA, an individual is defined as “a natural person residing in the United States.” Organizations that collect, use, or share individuals’ personal data would have been responsible for protecting it, including measures to prevent unauthorized access or misuse. By balancing individual rights and business responsibilities, the ADPPA sought to create a clear and enforceable framework for privacy nationwide.
What data would have been protected under the American Data Privacy and Protection Act (ADPPA)?
The ADPPA aimed to protect the personal information of US residents, which it referred to as covered data. Covered data is broadly defined as “information that identifies or is linked, or reasonably linkable, alone or in combination with other information, to an individual or a device that identifies or is linked or reasonably linkable to an individual.” In other words, covered data is any information that identifies, or could be traced to, a person or to a device linked to an individual. This includes data derived from other information as well as unique persistent identifiers, such as those used to track devices or users across platforms.
The definition excludes:
- Deidentified data
- Employee data
- Publicly available information
- Inferences made exclusively from multiple separate sources of publicly available information, so long as they don’t reveal private or sensitive details about a specific person
Sensitive covered data under the ADPPA
The ADPPA, like other data protection regulations, would have required stronger safeguards for sensitive covered data that could harm individuals if it was misused or unlawfully accessed. The bill’s definition of sensitive covered data is extensive, going beyond many US state-level data privacy laws.
Protected categories of data include, among other things:
- Personal identifiers, including government-issued IDs like Social Security numbers and driver’s licenses, except when legally required for public display.
- Health information, including details about past, present, or future physical and mental health conditions, treatments, disabilities, and diagnoses.
- Financial data, such as account numbers, debit and credit card numbers, income, and balance information. The last four digits of payment cards are excluded.
- Private communications, such as emails, texts, calls, direct messages, voicemails, and their metadata. This does not apply if the device is employer-provided and individuals are given clear notice of monitoring.
- Behavioral data, including sexual behavior information when collected against reasonable expectations, video content selections, and online activity tracking across websites.
- Personal records, such as private calendars, address books, photos, and recordings, except on employer-provided devices with notice.
- Demographic details, including race, color, ethnicity, religion, and union membership.
- Biological identifiers, including biometric information and genetic information, precise location data, login credentials, and information about minors.
- Security credentials, such as login details or security or access codes for an account or device.
Who would the American Data Privacy and Protection Act (ADPPA) have applied to?
The ADPPA would have applied to a broad range of entities that handle covered data.
Covered entity under the ADPPA
A covered entity is “any entity or any person, other than an individual acting in a non-commercial context, that alone or jointly with others determines the purposes and means of collecting, processing, or transferring covered data.” This definition is comparable to the term “controller” used in US state privacy laws and the European Union’s General Data Protection Regulation (GDPR). To qualify as a covered entity under the ADPPA, the organization would have had to be in one of three categories:
- Businesses regulated by the Federal Trade Commission Act (FTC Act)
- Telecommunications carriers
- Nonprofits
Although the bill did not explicitly address international jurisdiction, its reach could have extended beyond US borders. Foreign companies would have needed to comply if they handled US residents’ data for commercial purposes and met the FTC Act’s jurisdictional requirements, such as conducting business activities in the US or causing foreseeable injury within the US. This type of extraterritorial scope is common among other international data privacy laws.
Service provider under the ADPPA
A service provider was defined as a person or entity that does either of the following:
- Collects, processes, or transfers covered data on behalf of a covered entity or government body
OR
- Receives covered data from or on behalf of a covered entity or government body
This role mirrors what other data protection laws call a processor, including most state privacy laws and the GDPR.
Large data holders under the ADPPA
Large data holders were not considered a third type of organization. Both covered entities and service providers could have qualified as large data holders if, in the most recent calendar year, they had gross annual revenues of USD 250 million or more, and collected, processed, or transferred:
- Covered data of more than 5,000,000 individuals or devices, excluding data used solely for payment processing
- Sensitive covered data from more than 200,000 individuals or devices
Large data holders would have faced additional requirements under the ADPPA.
Third-party collecting entity under the ADPPA
The ADPPA introduced the concept of a third-party collecting entity, which refers to a covered entity that primarily earns its revenue by processing or transferring personal data it did not collect directly from the individuals to whom the data relates. In other contexts, they are often referred to as data brokers.
However, the definition excluded certain activities and entities:
- A business would not be considered a third-party collecting entity if it processed employee data received from another company, but only for the purpose of providing benefits to those employees
- A service provider would also not be classified as a third-party collecting entity under this definition
An entity is considered to derive its principal source of revenue from data processing or transfer if, in the previous 12 months, either:
- More than 50 percent of its total revenue came from these activities
or
- The entity processed or transferred the data of more than 5 million individuals that it did not collect directly
Third-party collecting entities that process data from more than 5,000 individuals or devices in a calendar year would have had to register with the Federal Trade Commission by January 31 of the following year. Registration would require a fee of USD 100 and basic information about the organization, including its name, contact details, the types of data it handles, and a link to a website where individuals can exercise their privacy rights.
Exemptions under the ADPPA
While the ADPPA potentially would have had a wide reach, certain exemptions would have applied.
- Small businesses: Organizations with less than USD 41 million in annual revenue or those that process data for fewer than 50,000 individuals would be exempt from some provisions.
- Government entities: The ADPPA would not apply to government bodies or their service providers handling covered data. It also excluded congressionally designated nonprofits that support victims and families with issues involving missing and exploited children.
- Organizations subject to other federal laws: Organizations already complying with certain existing privacy laws, including the Health Insurance Portability and Accountability Act (HIPAA), the Gramm-Leach-Bliley Act (GLBA), and the Family Educational Rights and Privacy Act (FERPA), among others, were deemed compliant with similar ADPPA requirements for the specific data covered by those laws. However, they would have still been required to comply with Section 208 of the ADPPA, which contains provisions for data security and protection of covered data.
Definitions in the American Data Privacy and Protection Act (ADPPA)
Like other data protection laws, the ADPPA defined several terms that are important for businesses to know. While many — like “collect” or “process” — can be found in other regulations, there are also some that are unique to the ADPPA. We look at some of these key terms below.
Knowledge under the ADPPA
“Knowledge” refers to whether a business is aware that an individual is a minor. The level of awareness required depends on the type and size of the business.
- High-impact social media companies: These are large platforms that are primarily known for user-generated content, with at least USD 3 billion in annual revenue and 300 million monthly active users over 3 months in the preceding year. They would be considered to have knowledge if they were aware or should have been aware that a user was a minor. This is the strictest standard.
- Large data holders: These are organizations that have significant data operations but do not qualify as high-impact social media companies. They have knowledge if they knew or willfully ignored evidence that a user was a minor.
- Other covered entities or service providers: Those that do not fall into the above categories are required to have actual knowledge that the user is a minor.
Some states — like Minnesota and Nebraska — define “known child” but do not adjust the criteria for what counts as knowledge based on the size or revenue of the business handling the data. Instead, they apply the same standard to all companies, regardless of their scale.
Affirmative express consent under the ADPPA
The ADPPA uses the term “affirmative express consent,” which refers to “an affirmative act by an individual that clearly communicates the individual’s freely given, specific, and unambiguous authorization” for a business to perform an action, such as collecting or using their personal data. Consent for data collection would have to be obtained after the covered entity provides clear information about how it will use the data.
Like the GDPR and other data privacy regulations, consent would have needed to be freely given, informed, specific, and unambiguous.
Under this definition, consent cannot be inferred from an individual’s inaction or continued use of a product or service. Additionally, covered entities cannot trick people into giving consent through misleading statements or manipulative design. This includes deceptive interfaces meant to confuse users or limit their choices.
Transfer under the ADPPA
Most data protection regulations include a definition for the sale of personal data or personal information. While the ADPPA did not define sale, it instead defined “transfer” as “to disclose, release, disseminate, make available, license, rent, or share covered data orally, in writing, electronically, or by any other means.”
What are consumers’ rights under the American Data Privacy and Protection Act (ADPPA)?
Under the ADPPA, consumers would have had the following rights regarding their personal data.
- Right of awareness: The Commission must publish and maintain a webpage describing the provisions, rights, obligations, and requirements of the ADPPA for individuals, covered entities, and service providers. This information must be:
- Published within 90 days of the law’s enactment
- Updated quarterly as needed
- Available in the ten most commonly used languages in the US
- Right to transparency: Covered entities must provide clear information about how consumer data is collected, used, and shared. This includes which third parties would receive their data and for what purposes.
- Right of access: Consumers can access their covered data (including data collected, processed, or transferred within the past 24 months), categories of third parties and service providers who received the data, and the purpose(s) for transferring the data.
- Right to correction: Consumers can correct any substantial inaccuracies or incomplete information in their covered data and instruct the covered entity to notify all third parties or service providers that have received the data.
- Right to deletion: Consumers can request that their covered data processed by the covered entity be deleted. They can also instruct the covered entity to notify all third parties or service providers that have received the data of the deletion request.
- Right to data portability: Consumers can request their personal data in a structured, machine-readable format that enables them to transfer it to another service or organization.
- Right to opt out: Consumers can opt out of the transfer of their personal data to third parties and its use for targeted advertising. Businesses are required to provide a clear and accessible mechanism for exercise of this right.
- Private right of action: Consumers can sue companies directly for certain violations of the act, with some limitations and procedural requirements. (California is the only state to provide this right as of early 2025.)
What are privacy requirements under the American Data Privacy and Protection Act (ADPPA)?
The ADPPA would have required organizations to meet certain obligations when handling individuals’ covered data. Here are the key privacy requirements under the bill.
Consent
Organizations must obtain clear, explicit consent through easily understood standalone disclosures. Consent requests must be accessible, available in all service languages, and give equal prominence to accept and decline options. Organizations must provide mechanisms to withdraw consent that are as simple as giving it.
Organizations must avoid using misleading statements or manipulative designs, and must obtain new consent for different data uses or significant privacy policy changes. While the ADPPA works alongside the Children’s Online Privacy Protection Act (COPPA)’s parental consent requirements for children under 13, it adds its own protections for minors up to age 17.
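For teams implementing these consent flows, the requirements translate into a few concrete properties: consent is recorded only after an affirmative act, it is tied to a specific disclosed purpose, and withdrawing it must be as simple as giving it. The TypeScript sketch below shows one possible way to model such a consent record; the type names, purposes, and fields are illustrative assumptions, not anything defined in the bill.

```typescript
// Illustrative sketch only: names, purposes, and fields are assumptions,
// not definitions taken from the ADPPA text.
type ConsentPurpose = "targeted_advertising" | "sensitive_data_processing";

interface ConsentRecord {
  userId: string;
  purpose: ConsentPurpose;
  granted: boolean;          // true only after an affirmative act by the user
  disclosureVersion: string; // which standalone disclosure the user was shown
  timestamp: string;         // ISO 8601 time the choice was made
}

const consentLog: ConsentRecord[] = [];

// Record a choice. Inaction or continued use of the service never calls this,
// so consent is never inferred from silence.
function recordChoice(
  userId: string,
  purpose: ConsentPurpose,
  granted: boolean,
  disclosureVersion: string
): void {
  consentLog.push({
    userId,
    purpose,
    granted,
    disclosureVersion,
    timestamp: new Date().toISOString(),
  });
}

// Withdrawal reuses the same one-step path as granting consent,
// keeping it as easy to withdraw as it was to give.
function withdrawConsent(userId: string, purpose: ConsentPurpose, disclosureVersion: string): void {
  recordChoice(userId, purpose, false, disclosureVersion);
}

// The most recent record for a purpose reflects the user's current choice;
// no record at all means no consent.
function hasConsent(userId: string, purpose: ConsentPurpose): boolean {
  const latest = [...consentLog]
    .reverse()
    .find((r) => r.userId === userId && r.purpose === purpose);
  return latest?.granted ?? false;
}
```

A production system would persist these records server-side and surface them when responding to the access, correction, and deletion rights described above, but the underlying idea is the same: an auditable trail of explicit choices per purpose.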
Privacy policy
Organizations must maintain clear, accessible privacy policies that detail their data collection practices, transfer arrangements, retention periods, and rights granted to individuals. These policies must specify whether data goes to countries like China, Russia, Iran, or North Korea, which could present a security risk, and they must be available in all languages where services are offered. When making material changes, organizations must notify affected individuals in advance and give them a chance to opt out.
Data minimization
Organizations can only collect and process data that is reasonably necessary to provide requested services or for specific allowed purposes. These allowed purposes include activities like completing transactions, maintaining services, protecting against security threats, meeting legal obligations, and preventing harm or if there is a risk of death, among others. Collected data must also be proportionate to these activities.
Privacy by design
Privacy by design is a default requirement under the ADPPA. Organizations must implement reasonable privacy practices that consider the organization’s size, data sensitivity, available technology, and implementation costs. They must align with federal laws and regulations and regularly assess risks in their products and services, paying special attention to protecting minors’ privacy and implementing appropriate safeguards.
Data security
Organizations must establish, implement, and maintain appropriate security measures, including vulnerability assessments, preventive actions, employee training, and incident response plans. They must implement clear data disposal procedures and match their security measures to their data handling practices.
Privacy and data security officers
Organizations with more than 15 employees must appoint both a privacy officer and data security officer, who must be two distinct individuals. These officers are responsible for implementing privacy programs and maintaining ongoing ADPPA compliance.
Privacy impact assessments
Organizations other than small businesses must conduct regular privacy assessments that evaluate the benefits and risks of their data practices. These assessments must be documented and maintained, and must consider factors like data sensitivity and potential privacy impacts. Large data holders are subject to additional assessment requirements, described below.
Loyalty with respect to pricing
Organizations cannot discriminate against individuals who exercise their privacy rights. While they can adjust prices based on necessary financial information and offer voluntary loyalty programs, they cannot retaliate through changes in pricing or service quality, for example, when an individual requests their data or declines to consent to certain data processing.
Special requirements for large data holders
In addition to their general obligations, large data holders would have had unique responsibilities under the proposed law.

Privacy policy
Large data holders would have been required to maintain and publish 10-year archives of their privacy policies on their websites. They would need to keep a public log documenting significant privacy policy changes and their impact. Additionally, they would need to provide a short-form notice (under 500 words) highlighting unexpected practices and sensitive data handling.
Privacy and data security officers
At least one of the appointed officers would have been designated as a privacy protection officer who reports directly to the highest official at the organization. This officer, either directly or through supervised designees, would have been required to do the following:
- Establish processes to review and update privacy and security policies, practices, and procedures
- Conduct biennial comprehensive audits to ensure compliance with the proposed law and make them accessible to the Commission upon request
- Develop employee training programs about ADPPA compliance
- Maintain detailed records of all material privacy and security practices
- Serve as the point of contact for enforcement authorities
Privacy impact assessments
While all organizations other than small businesses would be required to conduct privacy impact assessments under the proposed law, large data holders would have had additional requirements.
- Timing: While other organizations must conduct assessments within one year of the ADPPA’s enactment, large data holders would have been required to do so within one year of either becoming a large data holder or the law’s enactment, whichever came first.
- Scope: Both must consider nature and volume of data and privacy risks, but large data holders would need to specifically assess “potential adverse consequences” in addition to “substantial privacy risks.”
- Approval: Large data holders’ assessments would need to be approved by their privacy protection officer, while other entities would have no specific approval requirement.
- Technology review: Large data holders would need to include reviews of security technologies (like blockchain and distributed ledger); this review would be optional for other entities.
- Documentation: While both would need to maintain written assessments until the next assessment, large data holders’ assessments would also need to be accessible to their privacy protection officer.
Metrics reporting
Large data holders would be required to compile and disclose annual metrics related to verified access, deletion, and opt-out requests. These metrics would need to be included in their privacy policy or published on their website.
Executive certification
An executive officer would have been required to annually certify to the FTC that the large data holder has internal controls and a reporting structure in place to achieve compliance with the proposed law.
Algorithm impact assessments
Large data holders using covered algorithms that could pose a consequential risk of harm would be required to conduct an annual impact assessment of these algorithms. This requirement would be in addition to privacy impact assessments and would need to begin no later than two years after the Act’s enactment.
American Data Privacy and Protection Act (ADPPA) enforcement and penalties for noncompliance
The ADPPA would have established a multi-layered enforcement approach that set it apart from other US privacy laws.
- Federal Trade Commission: The FTC would serve as the primary enforcer, treating violations as unfair or deceptive practices under the Federal Trade Commission Act. The proposed law required the FTC to create a dedicated Bureau of Privacy for enforcement.
- State Attorneys General: State Attorneys General and State Privacy Authorities could bring civil actions on behalf of their residents if they believed violations had affected their state’s interest.
- California Privacy Protection Agency (CPPA): The CPPA, established under the California Privacy Rights Act, would have special enforcement authority. The CPPA could enforce the ADPPA in California in the same manner as it enforces California’s privacy laws.
Starting two years after the law took effect, individuals would have gained a private right of action, or the right to sue for violations. However, before filing a lawsuit, they would need to notify both the Commission and their state Attorney General.
The ADPPA itself did not establish specific penalties for violations. Instead, violations of the ADPPA or its regulations would be treated as violations of the Federal Trade Commission Act, subject to the same penalties, privileges, and immunities provided under that law.
The American Data Privacy and Protection Act (ADPPA) compared to other data privacy regulations
As privacy regulations continue to evolve worldwide, it’s helpful to understand how the ADPPA would compare with other comprehensive data privacy laws.
The EU’s GDPR has set the global standard for data protection since 2018. In the US, the CCPA (as amended by the CPRA) established the first comprehensive state-level privacy law and has influenced subsequent state legislation. Below, we’ll look at how the ADPPA compares with these regulations.
The ADPPA vs the GDPR
There are many similarities between the proposed US federal privacy law and the EU’s data protection regulation. Both require organizations to implement privacy and security measures, provide individuals with rights over their personal data (including access, deletion, and correction), and mandate clear privacy policies that detail their data processing activities. Both also emphasize data minimization principles and purpose limitation.
However, there are also several important differences between the two.
Aspect | ADPPA | GDPR |
---|---|---|
Territorial scope | Would have applied to individuals residing in the US. | Applies to EU residents and any organization processing their data, regardless of location. |
Consent | Not a standalone legal basis; required only for specific activities like targeted advertising and processing sensitive data. | One of six legal bases for processing; can be a primary justification. |
Government entities | Excluded federal, state, tribal, territorial and local government entities. | Applies to public bodies and authorities. |
Privacy officers | Required “privacy and security officers” for covered entities with more than 15 employees, with stricter rules for large data holders. | Requires a Data Protection Officer (DPO) for public authorities or entities engaged in large-scale data processing. |
Data transfers | No adequacy requirements; focus on transfers to specific countries (China, Russia, Iran, North Korea). | Detailed adequacy requirements and transfer mechanisms. |
Children’s data | Extended protections to minors up to age 17. | Focuses on children under 16 (can be lowered to 13 by member states). |
Penalties | Violations would have been treated as violations of the Federal Trade Commission Act. | Imposes fines up to 4% of annual global turnover or €20 million, whichever is higher. |
The ADPPA vs the CCPA/CPRA
There are many similarities between the proposed US federal privacy law and California’s existing privacy framework. Both include comprehensive transparency requirements, including privacy notices in multiple languages and accessibility for people with disabilities. They also share similar approaches to prohibiting manipulative design practices and requirements for regular security and privacy assessments.
However, there are also differences between the ADPPA and CCPA/CPRA.
Aspect | ADPPA | CCPA/CPRA |
---|---|---|
Covered entities | Would have applied to organizations under jurisdiction of the Federal Trade Commission, including nonprofits and common carriers; excluded government agencies. | Applies only to for-profit businesses meeting any of these specific thresholds: gross annual revenue of over USD 26,625,000; receive, buy, sell, or share personal information of 100,000 or more consumers or households; or earn more than half of their annual revenue from the sale of consumers’ personal information |
Private right of action | Broader right to sue for various violations. | Limited to data breaches only. |
Data minimization | Required data collection and processing to be limited to what is reasonably necessary and proportionate. | Similar requirement, but the CPRA allows broader processing for “compatible” purposes. |
Algorithmic impact assessments | Required large data holders to conduct annual assessments focusing on algorithmic risks, bias, and discrimination. | Requires risk assessments weighing benefits and risks of data practices, with no explicit focus on bias. |
Executive accountability | Required executive certification of compliance. | No executive certification requirement. |
Enforcement | Would have been enforced by the Federal Trade Commission, State Attorneys General, and the California Privacy Protection Agency (CPPA). | CPPA and local authorities within California. |
Consent management and the American Data Privacy and Protection Act (ADPPA)
The ADPPA would have required organizations to obtain affirmative express consent for certain data processing activities through clear, conspicuous standalone disclosures. These consent requests would need to be easily understood, equally prominent for either accepting or declining, and available in all languages where services are offered. Organizations would also need to provide simple mechanisms for withdrawing consent that would be as easy to use as giving consent was initially. The bill also required organizations to honor opt-out requests for practices like targeted advertising and certain data transfers. These opt-out mechanisms would need to be accessible and easy to use, with clear instructions for exercising these rights.
Organizations would need to clearly disclose not only the types of data they collect but also the parties with whom this information is shared. Consumers would also need to be informed about their data rights and how to act on them, such as opting out of processing, through straightforward explanations and guidance.
To support transparency, organizations would also be required to maintain privacy pages that are regularly updated to reflect their data collection, use, and sharing practices. These pages would help provide consumers with access to the latest information about how their data is handled. Additionally, organizations would have been able to use banners or buttons on websites and apps to inform consumers about data collection and provide them with an option to opt out.
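To illustrate how such an opt-out mechanism can actually control data flows on a website or in an app, the hedged TypeScript sketch below gates the loading of an advertising tag on a stored opt-out preference. The storage key and script URL are hypothetical placeholders, not a real vendor integration or the Usercentrics API.

```typescript
// Illustrative sketch only: the storage key and script URL are hypothetical.
const OPT_OUT_KEY = "targeted_ads_opt_out";

// Has the user opted out of transfers of their data for targeted advertising?
function hasOptedOut(): boolean {
  return window.localStorage.getItem(OPT_OUT_KEY) === "true";
}

// Load the advertising tag only for users who have not opted out,
// so no tracking script or cookies are set against their preference.
function loadAdTagIfPermitted(): void {
  if (hasOptedOut()) {
    return;
  }
  const script = document.createElement("script");
  script.src = "https://ads.example.com/tag.js"; // hypothetical vendor URL
  script.async = true;
  document.head.appendChild(script);
}

// Wired to the site's clearly labeled opt-out control.
function handleOptOutClick(): void {
  window.localStorage.setItem(OPT_OUT_KEY, "true");
  // A production setup would also propagate the preference to any
  // downstream vendors that already received the user's data.
}
```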
Though the ADPPA was not enacted, the US does have an increasing number of state-level data privacy laws. A consent management platform (CMP) like the Usercentrics CMP for website consent management or app consent management can help organizations streamline compliance with the many existing privacy laws in the US and beyond. The CMP securely maintains records of consent, automates opt-out processes, and enables consistent application of privacy preferences across an organization’s digital properties. It also helps to automate the detection and blocking of cookies and other tracking technologies that are in use on websites and apps.
Usercentrics does not provide legal advice, and information is provided for educational purposes only. We always recommend engaging qualified legal counsel or privacy specialists regarding data privacy and protection issues and operations.
How far can companies go to get a user’s consent? When does inconvenience or questionable user experience tip over into legally noncompliant manipulation? These continue to be important questions across the data privacy landscape, especially with mobile apps, an area where regulatory scrutiny and enforcement have been ramping up.
French social networking app BeReal requests users’ consent to use their data for targeted advertising, which is very common. However, how they go about presenting (and re-presenting) that request has led to a complaint against them relating to their GDPR compliance. Let’s look at what BeReal is doing to get user consent, what the complaint is, and the legal basis for it.
BeReal’s consent request: A false sense of choice?
According to noyb’s complaint, BeReal introduced a new consent banner feature for European users in July 2024. The contention is that this banner requested user consent for use of their data for targeted advertising, which is not unusual or problematic in itself. However, the question is whether the banner provides users with real consent choice or not.
Based on the description from the complaint, BeReal designed their banner to be displayed to users when they open the app. If a user accepts the terms — giving consent for data use for targeted advertising — then they never see the banner again. However, if a user declines consent, the banner allegedly reappears every day when users attempt to post on the app. As the app requires users to snap photos multiple times a day, seeing a banner display every time one tries to do so could be understandably frustrating.
In addition to resulting in an annoying user experience, this alleged action is also potentially a GDPR violation. It’s questionable if user consent under these described conditions is actually freely given.
The GDPR does require organizations to ask users for consent again if, for example, there have been changes in their data processing operations, like they want to collect new data, or want to use data for a new purpose.
It’s also recommended that organizations refresh user consent data from time to time, even though the GDPR doesn’t specify an exact time frame, as some other laws and guidelines do. For example, a company could ask users for consent for specific data uses every 12 months, either to ensure consent is still current, or to see if users who previously declined have changed their minds.
The noyb complaint against BeReal
In December 2024, privacy advocacy group noyb (the European Center for Digital Rights) filed a complaint against BeReal with French data protection authority Commission Nationale de l’Informatique et des Libertés (CNIL), arguing that the company’s alleged repeated banner displays for non-consenting users are a form of “nudging” or use of dark patterns.
The CNIL is one of the EU data protection authorities that has previously announced increased enforcement of data privacy for mobile apps, and released guidelines for better privacy protection for mobile apps in September 2024.
While regulators have increasingly taken a dim view of various design manipulations to obtain users’ consent, like hiding the “reject” option, noyb argues BeReal’s actions are a new dark pattern trend: “annoying people into consent”. Simply put, they contend that BeReal does not take no for an answer, meaning consent obtained through this repeated tactic is not freely given, and thus is a clear violation of the GDPR’s requirements.
The noyb legal team has requested that the CNIL order BeReal to delete the personal data of affected users, modify its consent practices to be GDPR-compliant, and impose an administrative fine as a deterrent to other companies that may consider similar tactics.
European regulators take a dim view of manipulations to obtain user consent
Whether it’s making users go hunting to find the “reject” button (or removing it entirely), or wearing them down with constant banner displays until they give in and consent to the requested data use, the European Data Protection Board (EDPB) has seen and addressed similar issues before.
It’s generally understood that users are likely to give in over time out of fatigue or frustration and consent to the requested data use. Companies get what they want, but not in a way that is voluntary or a good user experience. The EDPB has emphasized that in addition to being specific, informed, and unambiguous, consent must be freely given. Persistent prompts can be a form of coercion, and thus consent received that way may not be legally valid (Art. 4 GDPR).
As technologies change over time, the ways in which dark patterns can be deployed to manipulate users into giving consent are likely to further evolve and become more sophisticated.
A fine balance: Data monetization and privacy compliance
It is a common challenge for companies to try to find ways to increase consent rates for access to user data to drive monetization strategies via their websites, apps, and other connected platforms. Cases like the one against BeReal could set the tone for regulators’ increasingly stringent expectations for online platforms’ data operations, and the company could serve as a cautionary tale for others considering questionable tactics where user privacy is concerned.
As more individuals around the world are protected by more data privacy laws, what data companies are allowed to access, and under what circumstances, is becoming more strictly controlled. This creates a growing challenge for companies that need data for advertising, analytics, personalization, and other uses to grow their businesses.
Fortunately, there is a way to strike a balance between data privacy and data-driven business: clear, user-friendly consent management, a shift toward zero- and first-party data, and Privacy-Led Marketing that employs preference management and other strategies to foster engagement and long-term customer satisfaction and loyalty.
How Usercentrics helps
Good consent practices should make the user experience better, not more frustrating. Usercentrics App CMP helps your company deliver, building trust with users and providing a smooth, friendly user experience for consent management. You can obtain higher consent rates while achieving and maintaining privacy compliance.
Setup is simple and straightforward for technical and non-technical teams, and the App Scanner automates integration of your vendors, SSPs, and SDKs. We provide over 2,200 pre-built legal templates so you can offer clear, comprehensive consent choices to your users.
With extensive customization, you can make sure your banners fit your app or game’s design and branding and provide key information, enabling valid user consent without getting in their way or causing frustration. And you also get our expert guidance and detailed documentation every step of the way.
Oregon was the twelfth state in the United States to pass comprehensive data privacy legislation with SB 619. Governor Tina Kotek signed the bill into law on July 18, 2023, and the Oregon Consumer Privacy Act (OCPA) came into effect for most organizations on July 1, 2024. Nonprofits have an extra year to prepare, so their compliance is required as of July 1, 2025.
In this article, we’ll look at the Oregon Consumer Privacy Act’s requirements, who they apply to, and what businesses can do to achieve compliance.
What is the Oregon Consumer Privacy Act (OCPA)?
The Oregon Consumer Privacy Act protects the privacy and personal data of over 4.2 million Oregon residents. The law establishes rules for any individual or entity that conducts business in Oregon, or provides goods and services to its residents, and processes their personal data. Affected residents are known as “consumers” under the law.
The OCPA protects Oregon residents’ personal data when they act as individuals or in household contexts. It does not cover personal data collected in a work context. This means information about individuals acting in their professional roles, rather than as consumers, is not covered under this law.
Consistent with the other US state-level data privacy laws, the OCPA requires businesses to inform residents about how their personal data is collected and used. This notification — usually included in a website’s privacy policy — must cover key details such as:
- What data is collected
- How the data is used
- Whether the data is shared and with whom
- Information about consumers’ rights
The Oregon privacy law uses an opt-out consent model, which means that in most cases, organizations can collect consumers’ personal data without prior consent. However, they must make it possible for consumers to opt out of the sale of their personal data and its use in targeted advertising or profiling. The law also requires businesses to implement reasonable security measures to protect the personal data they handle.
Who must comply with the Oregon Consumer Privacy Act (OCPA)?
Similar to many other US state-level data privacy laws, the OCPA sets thresholds for determining which organizations must comply with its requirements. However, unlike some other laws, it does not contain a revenue-only threshold.
To fall under the OCPA’s scope, during a calendar year an organization must control or process the personal data of:
- 100,000 consumers, not including consumers only completing payment transactions
or
- 25,000 consumers if 25 percent or more of the organization’s annual gross revenue comes from selling personal data
Exemptions to OCPA compliance
The OCPA is different from some other data privacy laws because many of its exemptions focus on the types of data being processed and what processing activities are being conducted, rather than just on the organizations themselves.
For example, instead of exempting healthcare entities under the Health Insurance Portability and Accountability Act (HIPAA), the OCPA exempts protected health information handled in compliance with HIPAA. This means protected health information is outside of the OCPA’s scope, but other data that a healthcare organization handles could still fall under the law. Organizations that may be exempt from compliance with other state-level consumer privacy laws should consult a qualified legal professional to determine if they are required to comply with the OCPA.
Exempted organizations and their services or activities include:
- Governmental agencies
- Consumer reporting agencies
- Financial institutions regulated by the Bank Act and their affiliates or subsidiaries, provided they focus exclusively on financial activities
- Insurance companies
- Nonprofit organizations established to detect and prevent insurance fraud
- Press, wire, or other information services (and the non-commercial activities of media entities)
Personal data collected, processed, sold, or disclosed under the following federal laws is also exempt from the OCPA’s scope:
- Health Insurance Portability and Accountability Act (HIPAA)
- Gramm-Leach-Bliley Act (GLBA)
- Health Care Quality Improvement Act
- Fair Credit Reporting Act (FCRA)
- Driver’s Privacy Protection Act
- Family Educational Rights and Privacy Act (FERPA)
- Airline Deregulation Act
Definitions in the Oregon Consumer Privacy Act (OCPA)
This Oregon data privacy law defines several key terms related to the data it protects and relevant data processing activities.
What is personal data under the OCPA?
The Oregon privacy law protects consumers’ personal data, which it defines as “data, derived data or any unique identifier that is linked to or is reasonably linkable to a consumer or to a device that identifies, is linked to or is reasonably linkable to one or more consumers in a household.”
The law specifically excludes personal data that is:
- Deidentified
- Made legally available through government records or widely distributed media
- Made public by the consumer
The law does not specifically list what constitutes personal data. Common types of personal data that businesses collect include a consumer’s name, phone number, email address, Social Security Number, or driver’s license number.
It should be noted that personal data (also called personal information under some state privacy laws) and personally identifiable information are not always the same thing, and distinctions between the two are often made in data privacy laws.
What is sensitive data under the OCPA?
Sensitive data is personal data that requires special handling because it could cause harm or embarrassment if misused or unlawfully accessed. It refers to personal data that would reveal an individual’s:
- Racial or ethnic background
- National origin
- Religious beliefs
- Mental or physical condition or diagnosis
- Genetic or biometric data
- Sexual orientation
- Status as transgender or non-binary
- Status as a victim of crime
- Citizenship or immigration status
- Precise present or past geolocation (within 1,750 feet or 533.4 meters)
All personal data belonging to children is also considered sensitive data under the OCPA.
Oregon’s law is the first US privacy law to include status as transgender or non-binary, or status as a victim of crime, as sensitive data. The definition of biometric data excludes facial geometry or mapping unless it is done for the purpose of identifying an individual.
An exception to the law’s definition of sensitive data includes “the content of communications or any data generated by or connected to advanced utility metering infrastructure systems or equipment for use by a utility.” In other words, the law does not consider sensitive information to include communications content, like that in emails or messages, or data generated by smart utility meters and related systems used by utilities.
What is consent under the OCPA?
Like many other data privacy laws, the Oregon data privacy law follows the European Union’s General Data Protection Regulation (GDPR) regarding the definition of valid consent. Under the OCPA, consent is “an affirmative act by means of which a consumer clearly and conspicuously communicates the consumer’s freely given, specific, informed and unambiguous assent to another person’s act or practice…”
The definition also includes conditions for valid consent:
- the consumer’s inaction does not constitute consent
- the user interface used to request consent must not attempt to obscure, subvert, or impair the consumer’s choice
These conditions are highly relevant to online consumers and reflect that the use of manipulative dark patterns is increasingly frowned upon by data protection authorities, and increasingly prohibited. The Oregon Department of Justice (DOJ) website also clarifies that the use of dark patterns may be considered a deceptive business practice under Oregon’s Unlawful Trade Practices Act.
What is processing under the OCPA?
Processing under the OCPA means any action or set of actions performed on personal data, whether manually or automatically. This includes activities like collecting, using, storing, disclosing, analyzing, deleting, or modifying the data.
Who is a controller under the OCPA?
The OCPA uses the term “controller” to describe businesses or entities that decide how and why personal data is processed. While the law uses the word “person,” it applies broadly to both individuals and organizations.
The OCPA definition of controller is “a person that, alone or jointly with another person, determines the purposes and means for processing personal data.” In simpler terms, a controller is anyone who makes the key decisions about why personal data is collected and how it will be used.
Who is a processor under the OCPA?
The OCPA defines a processor as “a person that processes personal data on behalf of a controller.” Like the controller, while the law references a person, it typically refers to businesses or organizations that handle data for a controller. Processors are often third parties that follow the controller’s instructions for handling personal data. These third parties can include advertising partners, payment processors, or fulfillment companies, for example. Their role is to carry out specific tasks without deciding how or why the data is processed.
What is profiling under the OCPA?
Profiling is increasingly becoming a standard inclusion in data privacy laws, particularly as it can relate to “automated decision-making” or the use of AI technologies. The Oregon privacy law defines profiling as “an automated processing of personal data for the purpose of evaluating, analyzing or predicting an identified or identifiable consumer’s economic circumstances, health, personal preferences, interests, reliability, behavior, location or movements.”
What is targeted advertising under the OCPA?
Targeted advertising may involve emerging technologies like AI tools. It is also becoming a standard inclusion in data privacy laws. The OCPA defines targeted advertising as advertising that is “selected for display to a consumer on the basis of personal data obtained from the consumer’s activities over time and across one or more unaffiliated websites or online applications and is used to predict the consumer’s preferences or interests.” In simpler terms, targeted advertising refers to ads shown to a consumer based on their interests, which are determined by personal data that is collected over time from different websites and apps.
However, some types of ads are excluded from this definition, such as those that are:
- Based on activities within a controller’s own websites or online apps
- Based on the context of a consumer’s current search query, visit to a specific website, or app use
- Shown in response to a consumer’s request for information or feedback
The definition also excludes processing of personal data solely to measure or report an ad’s frequency, performance, or reach.
What is a sale under the OCPA?
The OCPA defines sale as “the exchange of personal data for monetary or other valuable consideration by the controller with a third party.” This means a sale doesn’t have to involve money. Any exchange of data for something of value, even if it’s non-monetary, qualifies as a sale under the law.
The Oregon privacy law does not consider the following disclosures of personal data to be a “sale”:
- Disclosures to a processor
- Disclosures to an affiliate or a third party to help the controller provide a product or service requested by the consumer
- Disclosures or transfers of personal data as part of a merger, acquisition, bankruptcy, or similar transaction in which a third party takes control of the controller’s assets, including personal data
- Disclosures of personal data that occur because the consumer:
- directs the controller to disclose the data
- intentionally discloses the data while directing the controller to interact with a third party
- intentionally discloses the data to the public, such as through mass media, without restricting the audience
Consumers’ rights under the Oregon Consumer Privacy Act (OCPA)
The Oregon privacy law grants consumers a range of rights over their personal data, comparable to other US state-level privacy laws.
- Right to access: consumers can request confirmation of whether their personal data is being processed and the categories of personal data being processed, gain access to the data, and receive a list of the specific third parties it has been shared with (other than natural persons), all subject to some exceptions.
- Right to correction: consumers can ask controllers to correct inaccurate or outdated information they have provided.
- Right to deletion: consumers can request the deletion of their personal data held by a controller, with some exceptions.
- Right to portability: consumers can obtain a copy of the personal data they have provided to a controller, in a readily usable format, with some exceptions.
- Right to opt out: consumers can opt out of the sale of their personal data, targeted advertising, or profiling used for decisions with legal or similarly significant effects.
Consumers can designate an authorized agent to opt out of personal data processing on their behalf. The OCPA also introduces a requirement for controllers to recognize universal opt-out signals, further simplifying the opt-out process.
This Oregon data privacy law stands out by giving consumers the right to request a specific list of third parties that have received their personal data. Unlike many other privacy laws, this one requires controllers to maintain detailed records of the exact entities they share data with, rather than just general categories of recipients.
Children’s personal data has special protections under the OCPA. Parents or legal guardians can exercise rights for children under the age of 13, whose data is classified as sensitive personal data and subject to stricter rules. For minors between 13 and 15, opt-in consent is required for specific processing activities, including the use of their data for targeted advertising or profiling. “Opt-in” means that explicit consent is required before the data can be used for these purposes.
Consumers can make one free rights request every 12 months, to which an organization has 45 days to respond. Organizations can extend that period by another 45 days if reasonably necessary. They can also deny consumer requests for a number of reasons, including cases in which the consumer’s identity cannot reasonably be verified, or if the consumer has made too many requests within a 12-month period.
Oregon’s privacy law does not include private right of action, so consumers cannot sue data controllers for violations. California remains the only state that allows this provision.
What are the privacy requirements under the Oregon Consumer Privacy Act (OCPA)?
Controllers must meet the following OCPA requirements to protect the personal data they collect from consumers.
Privacy notice and transparency under the OCPA
The Oregon privacy law requires controllers to be transparent about their data handling practices. Controllers must provide a clear, easily accessible, and meaningful privacy notice for consumers whose personal data they may process. The privacy notice, also known as the privacy policy, must include the following:
- Purpose(s) for processing personal data
- Categories of personal data processed, including the categories of sensitive data
- Categories of personal data shared with third parties, including categories of sensitive data
- Categories of third parties with which the controller shares personal data and how each third party may use the data
- How consumers can exercise their rights, including:
- How to opt out of processing for targeted advertising or profiling
- How to submit a consumer rights request
- How to appeal a controller’s denial of a rights-related request
- The identity of the controller, including any business name the controller uses or has registered in Oregon
- At least one actively monitored online contact method, such as an email address, for consumers to directly contact the organization
- A “clear and conspicuous description” for any processing of personal data for the purpose of targeted advertising or profiling “in furtherance of decisions that produce legal effects or effects of similar significance”
According to the Oregon DOJ website, the third-party categories requirement must strike a particular balance. It should offer consumers meaningful insights into the relevant types of businesses or processing activities, without making the privacy notice overly complex. Acceptable examples include “analytics companies,” “third-party advertisers,” and “payment processors,” among others.
The privacy notice or policy must be easy for consumers to access. It is typically linked in the website footer for visibility and accessibility from any page.
Data minimization and purpose limitation under the OCPA
The OCPA requires controllers to limit the personal data they collect to only what is “adequate, relevant, and reasonably necessary” for the purposes stated in the privacy notice. If the purposes for processing change, controllers must notify consumers and, where applicable, obtain their consent.
Data security under the OCPA
The Oregon data privacy law requires controllers to establish, implement, and maintain reasonable safeguards for protecting “the confidentiality, integrity and accessibility” of the personal data under their control. The data security measures also apply to deidentified data.
Oregon’s existing laws about privacy practices remain in effect as well. These laws include requirements for reasonable administrative, technical, and physical safeguards for data storage and handling, IoT device security features, and truth in privacy and consumer protection notices.
Data protection assessments (DPA) under the OCPA
Controllers must perform data protection assessments (DPA), also known as data protection impact assessments, for processing activities that present “a heightened risk of harm to a consumer.” These activities include:
- Processing for the purposes of targeted advertising
- Processing sensitive data
- The sale of personal data
- Processing for the purposes of profiling if there is a reasonably foreseeable risk to the consumer of:
- Unfair or deceptive treatment
- Financial, physical, or reputational injury
- Intrusion into a consumer’s private affairs
- Other substantial injury
The Attorney General may also require a data controller to conduct a DPA or share the results of one in the course of an investigation.
Consent requirements under the OCPA
The OCPA primarily uses an opt-out consent model. This means that in most cases controllers are not required to obtain consent from consumers before collecting or processing their personal data. However, there are specific cases where consent is required:
- Processing sensitive data requires explicit consent from consumers.
- For children’s data, the OCPA follows the federal Children’s Online Privacy Protection Act (COPPA) and requires consent from a parent or legal guardian before processing the personal data of any child under 13.
- Controllers must obtain explicit consent to use the personal data of minors between the ages of 13 and 15 for targeted ads, profiling, or sale.
- Controllers must obtain consent to use personal data for purposes other than those originally disclosed in the privacy notice.
To help consumers make informed decisions about their consent, controllers must clearly disclose details about the personal data being collected, the purposes for which it is processed, who it is shared with, and how consumers can exercise their rights. Controllers must also provide clear, accessible information on how consumers can opt out of data processing.
Consumers must be able to revoke consent at any time, as easily as they gave it. Once consent is revoked, data processing must stop no later than 15 days after the controller receives the revocation.
Nondiscrimination under the OCPA
The OCPA prohibits controllers from discriminating against consumers who exercise their rights under the law. This includes actions such as:
- Denying goods or services
- Charging different prices or rates than those available to other consumers
- Providing a different level of quality or selection of goods or services to the consumer
For example, if a consumer opts out of data processing on a website, that individual cannot be blocked from accessing that website or its functions.
Some website features and functions do not work without certain cookies or trackers being activated, so if a consumer does not opt in to their use because they collect personal data, the site may not work as intended. This is not considered discriminatory.
This Oregon privacy law permits website operators and other controllers to offer voluntary incentives for consumers’ participation in activities where personal data is collected. These may include newsletter signups, surveys, and loyalty programs. Offers must be proportionate and reasonable to the request as well as the type and amount of data collected. This way, they will not look like bribes or payments for consent, which data protection authorities frown upon.
Third-party contracts under the OCPA
Before starting any data processing activities, controllers must enter into legally binding contracts with third-party processors. These contracts govern how processors handle personal data on behalf of the controller, and must include the following provisions:
- The processor must ensure that all individuals handling personal data are bound by a duty of confidentiality
- The contract must provide clear instructions for data processing, detailing:
  - The nature and purpose of processing
  - The types of data being processed
  - The duration of the processing
  - The rights and obligations of both the controller and the processor
- The processor must delete or return the personal data at the controller’s direction or after the services have ended, unless legal obligations require the data to be retained
- Upon request, the processor must provide the controller with all necessary information to verify compliance with contractual obligations
- If the processor hires subcontractors, they must have contracts in place requiring the subcontractors to meet the processor’s obligations
- The contract must allow the controller or their designee to conduct assessments of the processor’s policies and technical measures to ensure compliance
These contracts are known as data processing agreements under some data protection regulations like the GDPR.
Universal opt-out mechanism under the OCPA
As of January 1, 2026, organizations subject to the OCPA must recognize and honor universal opt-out mechanisms. Also called global opt-out signals, these include tools like the Global Privacy Control.
This mechanism enables a consumer to set their data processing preferences once and have those preferences automatically communicated to any website or platform that detects the signal. Preferences are typically set via a web browser plugin.
While this requirement is not yet standard across all US or global data privacy laws, it is becoming more common in newer legislation. Other states that require controllers to recognize global opt-out signals include California, Minnesota, Nebraska, Texas, and Delaware.
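To illustrate how a universal opt-out signal can be honored in practice, the sketch below checks for the Global Privacy Control property that supporting browsers expose on the `navigator` object and treats it as an opt-out. This is a minimal, hypothetical example: the `applyOptOut` function stands in for whatever your site or CMP actually does with the preference, and is not part of any specific law or product.

```typescript
// Minimal sketch: detecting the Global Privacy Control (GPC) signal.
// Supporting browsers expose it as navigator.globalPrivacyControl, and it
// is also sent to servers as the Sec-GPC request header.

// Hypothetical placeholder for your own opt-out handling, e.g. telling a
// CMP to record an opt-out of sale, targeted advertising, and profiling.
function applyOptOut(reason: string): void {
  console.log(`Recording opt-out of sale/targeted advertising: ${reason}`);
}

// The property is not implemented in every browser, so read it defensively.
const gpcEnabled =
  (navigator as Navigator & { globalPrivacyControl?: boolean })
    .globalPrivacyControl === true;

if (gpcEnabled) {
  // Treat the universal opt-out signal like a manually submitted opt-out.
  applyOptOut("Global Privacy Control signal detected");
}
```

Because the signal is set once in the browser, the same check applies to every page view, which is what allows preferences to be communicated automatically rather than through repeated banner interactions.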
How to comply with the Oregon Consumer Privacy Act (OCPA)
Below is a non-exhaustive checklist to help your business and website address key OCPA requirements. For advice specific to your organization, consulting a qualified legal professional is strongly recommended.
- Provide a clear and accessible privacy notice detailing data processing purposes, shared data categories, third-party recipients, and consumer rights.
- Maintain a specific list of third parties with whom you share consumers’ personal data.
- Limit data collection to what is necessary for the specified purposes, and notify consumers if those purposes change.
- Obtain consent from consumers if you plan to process their data for purposes other than those that have been communicated to them.
- Implement reasonable safeguards to protect the confidentiality, integrity, and accessibility of personal and deidentified data.
- Conduct data protection assessments for processing activities with heightened risks, such as targeted advertising, activities involving sensitive data, or profiling.
- Implement a mechanism for consumers to exercise their rights, and communicate this mechanism to consumers.
- Obtain explicit consent for processing sensitive data, children’s data, or for purposes not initially disclosed.
- Provide consumers with a user-friendly method to revoke consent.
- Once consumers withdraw consent, stop all data processing related to that consent within the required 15-day period.
- Provide a simple and clear method for consumers to opt out of data processing activities.
- Avoid discriminatory practices against consumers exercising their rights, while offering reasonable incentives for data-related activities.
- Include confidentiality, compliance obligations, and terms for data return or deletion in binding contracts with processors.
- Comply with global opt-out signals like the Global Privacy Control by January 1, 2026.
Enforcement of the Oregon Consumer Privacy Act (OCPA)
The Oregon Attorney General’s office is the enforcement authority for the OCPA. Consumers can file complaints with the Attorney General regarding data processing practices or the handling of their requests. The Attorney General’s office must notify an organization of any complaint against it and of any investigation that is launched. During investigations, the Attorney General can request that controllers submit data protection assessments and other relevant information. Enforcement actions must be initiated within five years of the last violation.
Controllers have the right to have an attorney present during investigative interviews and can refuse to answer questions. The Attorney General cannot bring in external experts for interviews or share investigation documents with non-employees.
Until January 1, 2026, controllers have a 30-day cure period during which they can fix OCPA violations. If the issue is not resolved within this time, the Attorney General may pursue civil penalties. After the right to cure sunsets on January 1, 2026, offering an opportunity to cure will be at the Attorney General’s discretion.
Fines and penalties for noncompliance under the OCPA
The Attorney General can seek civil penalties up to USD 7,500 per violation. Additional actions may include seeking court orders to stop unlawful practices, requiring restitution for affected consumers, or reclaiming profits obtained through violations.
If the Attorney General succeeds, the court may require the violating party to cover legal costs, including attorney’s fees, expert witness fees, and investigation expenses. However, if the court determines that the Attorney General pursued a claim without a reasonable basis, the defendants may be entitled to recover their attorney’s fees.
How does the Oregon Consumer Privacy Act (OCPA) affect businesses?
The OCPA introduces privacy law requirements that are similar to other state data protection laws. These include obligations around notifying consumers about data practices, granting them access to their data, limiting data use to specific purposes, and implementing reasonable security measures.
One notable distinction is that the law sets different compliance timelines based on an organization’s legal status. The effective date for commercial entities is July 1, 2024, while nonprofit organizations are given an additional year and must comply by July 1, 2025.
Since the compliance deadline for commercial entities has already passed, businesses that fall under the OCPA’s scope should ensure they meet its requirements as soon as possible to avoid penalties. Nonprofits, though they have more time, should actively prepare for compliance.
Businesses covered by federal laws like HIPAA and the GLBA, which may exempt them from other state data privacy laws, should confirm with a qualified legal professional whether they need to comply with the OCPA.
The Oregon Consumer Privacy Act (OCPA) and consent management
Oregon’s law is based on an opt-out consent model. In other words, consent does not need to be obtained before collecting or processing personal data unless it is sensitive or belongs to a child.
Controllers do need to inform consumers about what data is collected and used and for what purposes, as well as with whom it is shared, and whether it will be sold or used for targeted advertising or profiling.
Consumers must also be informed of their rights regarding data processing and how to exercise them. This includes the ability for consumers to opt out of processing of their data or change their previous consent preferences. Typically, this information is presented on a privacy page, which must be kept up to date.
As of January 1, 2026, organizations must also recognize and respect consumers’ consent preferences as expressed via a universal opt-out signal.
Websites and apps can use a banner to inform consumers about data collection and enable them to opt out. This is typically done using a link or button. A consent management platform (CMP) like the Usercentrics CMP for website consent management or app consent management also helps to automate the detection of cookies and other tracking technologies that are in use on websites and apps.
A CMP can streamline sharing information about data categories and the specific services in use by the controller and/or processor(s), as well as third parties with whom data is shared.
The United States still only has a patchwork of state-level privacy laws rather than a single federal law. As a result, many companies doing business across the country, or foreign organizations doing business in the US, may need to comply with a variety of state-level data protection laws.
A CMP can make this easier by enabling banner customization and geotargeting. Websites can display the data processing information, consent details, and choices required by specific regulations based on each user’s location. Geotargeting can also improve clarity and user experience by presenting this information in the user’s preferred language.
Usercentrics does not provide legal advice, and information is provided for educational purposes only. We always recommend engaging qualified legal counsel or a privacy specialist regarding data privacy and protection issues and operations.
Microsoft Universal Event Tracking (UET) with Consent Mode helps businesses responsibly manage data while optimizing digital advertising efforts. UET is a tracking tool from Microsoft Advertising that collects user behavior data to help businesses measure conversions, optimize ad performance, and build remarketing strategies.
Consent Mode works alongside UET. It’s a feature that adjusts how data is collected based on user consent preferences. This functionality is increasingly important as businesses address global privacy regulations like the GDPR and CCPA.
For companies using Microsoft Ads, understanding and implementing these tools helps them prioritize user privacy, build trust, and achieve better marketing outcomes while respecting data privacy standards.
What is Microsoft UET Consent Mode?
Microsoft UET Consent Mode is a feature designed to help businesses respect user privacy while maintaining effective advertising strategies. It works alongside Microsoft Universal Event Tracking (UET) by dynamically adjusting how data is collected based on user consent.
When visitors interact with your website, Consent Mode determines whether tracking is activated or limited, depending on their preferences. For instance, if a user opts out of tracking, Consent Mode restricts data collection. This function aligns the tracking process with privacy preferences and applicable regulations.
Consent Mode supports businesses as they balance privacy expectations with effective campaign management. It also helps businesses align their data practices with Microsoft’s advertising policies and regional privacy laws to create a more transparent and user-focused approach to data management.
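As a rough sketch of how this works in code, the example below assumes the Microsoft UET tag is already installed on the page and uses the consent commands accepted by the UET tag queue (`uetq`): a default state set before tracking fires, and an update once the visitor makes a choice. Tag IDs, consent banner wiring, and the exact rollout for your account may differ, so treat this as illustrative rather than a definitive implementation.

```typescript
// Illustrative sketch of Microsoft UET Consent Mode (assumes the UET tag
// snippet with your tag ID is already loaded on the page).

declare global {
  interface Window {
    uetq?: unknown[];
  }
}

// 1. Set a default consent state of "denied" before any tracking fires,
//    so UET limits data collection until the visitor has made a choice.
window.uetq = window.uetq || [];
window.uetq.push("consent", "default", { ad_storage: "denied" });

// 2. When the visitor grants consent (for example via your consent banner
//    or CMP callback), update the state so UET can resume full tracking.
export function onConsentGranted(): void {
  (window.uetq = window.uetq || []).push("consent", "update", {
    ad_storage: "granted",
  });
}

// 3. If consent is later withdrawn, push another update to deny tracking.
export function onConsentWithdrawn(): void {
  (window.uetq = window.uetq || []).push("consent", "update", {
    ad_storage: "denied",
  });
}
```

Setting the default state before any tracking runs matters because it determines how UET behaves for visitors who never interact with the consent banner at all.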
Why businesses need Microsoft UET Consent Mode
The role of UET in advertising
Microsoft Universal Event Tracking (UET) offers businesses the tools they need to optimize advertising strategies. With a simple tag integrated into a business’s website, UET helps advertisers monitor essential user actions like purchases, form submissions, and page views. This data is invaluable for building remarketing audiences, tracking conversions, and making data-backed decisions that improve ad performance.
However, effectively collecting and utilizing this data requires alignment with user consent preferences. Without proper consent, businesses risk operating outside privacy regulations, and could face penalties or restrictions. By integrating UET with Consent Mode, businesses can respect user choices while continuing to access the insights needed to run impactful advertising campaigns.
Challenges in advertising compliance
In today’s digital age, businesses must carefully balance data-driven advertising with growing privacy expectations. Regulations like the General Data Protection Regulation (GDPR), the California Consumer Privacy Act (CCPA), and the California Privacy Rights Act (CPRA) have set clear rules about how user data can be collected, stored, and used. Non-compliance can lead to significant consequences, such as hefty fines, restricted access to ad platforms, or even account suspension.
Beyond financial and operational risks, non-compliance can damage a company’s reputation. When businesses fail to address privacy concerns, they risk losing customer trust—a resource that is difficult to rebuild. As users become more aware of how their data is used, businesses that fail to adopt transparent practices may struggle to retain their audience.
Enforcement of Microsoft UET Consent Mode
Microsoft Advertising requires customers to obtain explicit consent and provide consent signals by May 5, 2025.
Providing consent signals enables Microsoft Ads customers to comply with the requirements of privacy laws like the GDPR, where violations can result in hefty fines and other penalties.
Obtaining explicit consent also demonstrates respect for users’ privacy and rights, building user trust. Consumers increasingly indicate concerns over access to and use of their data online.
Consent also benefits advertising performance as part of your Privacy-Led Marketing strategy, enabling you to keep generating valuable campaign insights for effective targeting and conversion tracking.
Benefits of using Microsoft UET Consent Mode
By integrating Microsoft UET Consent Mode, companies can address user expectations, improve data accuracy, and create a more transparent relationship with their audience. Let’s take a closer look at the benefits of using Microsoft UET Consent Mode.
Supporting privacy regulations
Privacy laws such as the GDPR, CCPA, and the ePrivacy Directive require businesses to handle user data responsibly. Microsoft UET Consent Mode adjusts data collection practices based on user preferences, helping companies better align with these requirements. By respecting user choices, businesses can reduce the risks associated with non-compliance.
Accurate data collection
Data accuracy is a key component of any successful advertising strategy. With Consent Mode, businesses only collect insights from users who agree to data tracking. This focus helps prevent skewed data caused by collecting information from users who have not consented. These insights are therefore more reliable and actionable.
Optimized ad campaigns
Consent Mode enables businesses to continue leveraging tools like remarketing and conversion tracking while honoring user privacy preferences. This functionality helps advertisers maintain the effectiveness of their campaigns by focusing on audiences who have opted into tracking. As a result, companies can make data-driven decisions without compromising privacy.
Building trust through transparency
Demonstrating respect for user privacy goes beyond privacy compliance — it also fosters trust. Transparency about how data is collected and used enables businesses to strengthen their relationships with customers. A privacy-first approach can set companies apart in a competitive advertising environment by showing users that their choices and rights are valued.
Why use Usercentrics Web CMP with Microsoft UET Consent Mode
Usercentrics Web CMP provides businesses with a practical solution for integrating Microsoft UET with Consent Mode. By leveraging Usercentrics Web CMP’s unique features, companies can manage user consent effectively while maintaining a seamless advertising strategy.
Streamlined implementation
Usercentrics Web CMP simplifies the process of integrating Microsoft Consent Mode. With automated configuration, businesses can set up their systems quickly and focus on optimizing their campaigns without the complexities of manual implementation.
Seamless compatibility
As one of the first consent management platforms to offer automated support for Microsoft Consent Mode, Usercentrics Web CMP is designed for smooth integration with Microsoft UET. This compatibility reduces technical challenges and supports reliable functionality.
Customizable consent banners
The CMP enables businesses to design consent banners that align with their branding, creating a consistent user experience. Clear, branded messaging helps communicate data collection practices effectively while maintaining professionalism.
Privacy-focused data management
Usercentrics Web CMP provides a centralized platform for managing user consent across different regions and regulations. Businesses can easily adapt to global privacy requirements and organize their data collection practices efficiently, all in one place.
How to set up Microsoft UET with Consent Mode using Usercentrics Web CMP
Usercentrics Web CMP simplifies the process of setting up Microsoft UET with Consent Mode. As the first platform to offer automated implementation of Microsoft Consent Mode, Usercentrics Web CMP enables companies to focus on their marketing efforts while managing user consent effectively.
To integrate Microsoft UET with Consent Mode using Usercentrics Web CMP, refer to the support article for a detailed, step-by-step walkthrough.
Adapting to Privacy-Led Marketing with Microsoft UET Consent Mode
Microsoft UET with Consent Mode, supported by Usercentrics Web CMP, provides businesses with a practical approach to balancing effective advertising with user privacy. With this solution, companies can streamline consent management, enhance their advertising strategies, and adapt to ever-changing privacy expectations.
Respecting user choices isn’t just about privacy compliance—it’s an opportunity to build trust and demonstrate a commitment to transparency. Businesses that embrace Privacy-Led Marketing position themselves as trustworthy partners in a competitive digital marketplace.
Adopting Privacy-Led Marketing does more than support long-term customer relationships. It also enables companies to responsibly leverage valuable insights to optimize their campaigns. Microsoft UET with Consent Mode and Usercentrics Web CMP together create a strong foundation for businesses to effectively navigate the intersection of privacy and performance.
The Gramm-Leach-Bliley Act (GLBA), enacted in 1999, sets standards for protecting consumer data in the United States’ financial industry. Amid growing concerns about how institutions collect, use, and share sensitive personal information, the Act was passed as part of sweeping reforms to modernize the financial services sector.
The GLBA was among the first US data privacy laws to impose specific data privacy and security requirements on businesses. Its aim is to give consumers more control over their personal information while requiring institutions to adopt robust data protection measures.
Although the GLBA predates the current wave of state-level privacy laws and federal privacy legislation, its requirements continue to shape how financial institutions approach consumer data protection. Its principles have influenced many subsequent regulations, and remain central to compliance efforts in the financial industry.
The GLBA is also usually explicitly referenced in state-level US data privacy legislation passed to date. Those laws recognize that the federal GLBA is robust in the protections it provides and the responsibilities it assigns, and that it takes precedence.
What is the Gramm-Leach-Bliley Act (GLBA)?
The GLBA is a US federal law that addresses data security and data privacy practices in the US financial industry. It mandates that businesses that handle individual financial information, like banks, insurers, and loan providers, protect that data, inform customers of privacy practices, and limit data sharing.
GLBA summary
The GLBA was created to address concerns about data security and privacy within the financial sector. The regulation aims to protect consumers’ financial information and prevent sensitive data exposure by requiring that organizations follow responsible practices when handling data.
Any business that’s “significantly engaged” in financial activities and handles consumer financial data is required to follow the rules set out by the GLBA.
This definition includes financial institutions in the traditional sense — like banks, credit unions, and insurance companies — as well as businesses that are not usually recognized in this category — such as loan brokers, debt collectors, mortgage lenders, financial advisors, and tax preparers.
The GLBA requires these institutions to adhere to the following rules, which are aimed at maintaining transparency and accountability, while mitigating risks associated with data misuse:
- Financial Privacy Rule: Financial institutions must provide clear privacy notices that detail how personal information is collected, used, and shared. They must also give consumers the opportunity to opt out of certain data-sharing practices with unaffiliated third parties.
- Safeguards Rule: Businesses that handle consumer financial data must develop, implement, and maintain robust data security programs to protect customer information from unauthorized access or breaches.
- Pretexting Rule: This provision, which has been designed to counter criminal activity like fraud and identity theft, makes it illegal for anyone to obtain, disclose, or attempt to obtain or disclose a financial institution’s customer information under false pretenses.
GLBA updates
On May 13, 2024, an amendment to the Federal Trade Commission’s (FTC) Standards for Safeguarding Consumer Information (“Safeguards Rule”) came into effect. This update introduced more stringent requirements for security practices and data breach notifications.
Before the amendment, the GLBA simply required financial institutions to “develop, implement, and maintain a comprehensive security program” that “contains administrative, technical, and physical safeguards” that were appropriate considering the size and complexity of the entity.
The updated rule includes more detailed requirements for these systems, outlining nine elements a business’s information security program must include.
Most significantly, it introduced a notification requirement. Financial institutions must now notify the FTC of any security event involving unauthorized access to customer information if 500 or more individuals are affected. Prior to the new rule being adopted in 2023, there was no notification requirement. A prior proposal set the threshold at 1,000 individuals, but it was amended to 500. Notification of such a breach must be sent to the FTC as soon as possible, and no later than 30 days after discovery.
This amendment created a very low threshold for mandatory breach notifications, bringing GLBA requirements in line with international regulations like the General Data Protection Regulation (GDPR).
GLBA definitions
Below, we’ll cover the definitions of certain concepts within the GLBA to provide clarity on how the Act may apply to your business.
Financial institution under GLBA
A financial institution under the GLBA is defined as “any institution the business of which is engaging in activities that are financial in nature or incidental to such financial activities.” In other words, any company that offers financial products or services to individuals, such as loans, financial or investment advice, or insurance.
The Act states that this definition can include “banks, securities brokers and dealers, insurance underwriters and agents, finance companies, mortgage bankers, and travel agents.”
If your company is significantly engaged in providing financial products or services to consumers, it’s likely subject to GLBA regulations and must adhere to its requirements for protecting customer information.
Financial service under GLBA
According to the GLBA, a financial service “includes, among other things, a financial institution’s evaluation or brokerage of information that the institution collects in connection with a request or an application from a consumer for a financial product or service.”
This broad definition subjects a wide range of activities related to managing and handling money to the Act’s privacy and security requirements. These activities include:
- lending, exchanging, or transferring money
- investing for others
- safeguarding money or securities
- providing financial or investment advice
- insurance underwriting
Services like issuing credit cards, managing investment portfolios, offering insurance policies, and facilitating payment processing (credit card companies and processors like PayPal, Square, Stripe, etc.) are all considered financial services under the GLBA.
Consumer and customer under GLBA
According to the GLBA, all customers are consumers, but not all consumers are customers. A consumer is “an individual who obtains, from a financial institution, financial products or services,” while a customer is someone who has an ongoing relationship with a financial institution.
For instance, someone who takes out a mortgage loan from a bank would be a customer because the financing and servicing of that loan requires an ongoing relationship. However, if the same person were simply using one of that bank’s ATMs to withdraw cash, they would just be considered a consumer.
This distinction is important because customers typically have more privacy rights under the GLBA than consumers do.
Nonpublic personal information under GLBA
Nonpublic personal information (NPI) refers to the personal details of consumers. This personally identifiable information is usually obtained by the institution as the result of transactions or services performed for the consumer.
NPI can include information that “a consumer provides to a financial institution to obtain a financial product or service from the institution; results from a transaction between the consumer and the institution involving a financial product or service; or a financial institution otherwise obtains about a consumer in connection with providing a financial product or service.”
Data like Social Security numbers, account balances, payment histories, and any information derived from consumer reports falls into this category. However, information that’s publicly and lawfully available, like data from public records, is not considered NPI.
Nonaffiliated third party under GLBA
A nonaffiliated third party is any entity that is not an affiliate of the financial institution. The GLBA defines an affiliate of a financial institution as “any company that controls, is controlled by, or is under common control with the financial institution.”
In other words, a nonaffiliated third party is an entity that doesn’t control, isn’t controlled by, and isn’t under common control with the institution.
Nonaffiliated third parties are external companies or individuals with whom a financial institution may share consumers’ NPI, provided that consumers are given proper notice and the opportunity to opt out of such sharing. However, there are certain circumstances in which sharing is permitted without an opt-out option.
Opt-out right and exceptions under GLBA
The GLBA gives consumers the right to opt out of allowing financial institutions to share their NPI with nonaffiliated third parties. This means that, before sharing such information, institutions must provide a clear notice and the option for consumers to decline.
The GLBA states that “consumers must be given a reasonable opportunity and a reasonable means to opt out.” It also clarifies that “what constitutes a reasonable opportunity to opt out depends on the circumstances surrounding the consumer’s transaction, but a consumer must be provided a reasonable amount of time to exercise the opt out right.”
There are, however, instances in which opt-out rights do not apply. For example, when NPI is shared with service providers performing essential tasks on behalf of the institution, where the institution is legally compelled to share this information, like reporting suspicious activities under anti-fraud regulations, or if it is shared as part of a transaction requested by the consumer.
Who must comply with the GLBA
The GLBA’s scope extends beyond traditional banks to include many other types of organizations. Let’s explore exactly who must comply and the applicable exceptions.
GLBA applies to
We’ve established that the GLBA’s broad definition of financial institutions means that it applies to a variety of entities. Here are some of the most common ones.
- Banks: Institutions like commercial banks, savings associations, and credit unions that manage deposits, provide loans, and offer payment services.
- Insurance companies: Companies that provide insurance coverage, and also commonly provide diversified offerings with other financial products and services.
- Payday lenders: Businesses providing short-term, high-interest loans typically meant to cover expenses until the borrower’s next paycheck.
- Mortgage brokers: Companies that act as intermediaries between borrowers and lenders, helping individuals secure home loans or refinancing options.
- Non-bank lenders: Organizations offering loans without traditional banking structures, such as auto loan providers or personal loan companies.
- Debt collectors: Entities that recover unpaid debts on behalf of creditors. Examples include collections agencies and legal recovery firms.
- Personal property or real estate appraisers: Professionals or companies that determine the value of assets like homes, cars, or commercial property.
- Professional tax preparers: Individuals or firms that provide tax advice and tax filing assistance.
- Financial advisors and planners: Professionals who offer guidance on investments, retirement plans, estate planning, or wealth management.
It’s important to remember that the GLBA’s application depends on the nature of the relationship between an individual and a financial institution. In other words, it depends on whether that individual is a customer or a consumer.
When the individual is a customer with an ongoing relationship, e.g. a bank account holder or mortgage client, more comprehensive privacy protections apply. Conversely, a consumer who interacts with the institution for a one-time transaction, like cashing a check, may have fewer rights.
GLBA exceptions
Sections 13, 14, and 15 of the GLBA outline cases in which financial institutions aren’t required to provide a privacy notice or opt-out option when sharing NPI. These exceptions cover cases in which the disclosure of NPI is limited.
- Section 13: “to a nonaffiliated third party to perform services for the financial institution or to function on its behalf, including marketing the institution’s own products or services or those offered jointly by the institution and another financial institution.” This exception is only permitted if the financial institution provides an initial notice of these arrangements and the third party signs a confidentiality contract that states they won’t disclose or use the information for anything other than the specified purposes.
- Section 14: “as necessary to effect, administer, or enforce a transaction that a consumer requests or authorizes, or under certain other circumstances relating to existing relationships with customers.”
- Section 15: “for specified other disclosures that a financial institution normally makes.” These other disclosures can include efforts to prevent fraud or to comply with legal requirements by disclosing information to regulators.
Consumer rights under the GLBA
Consumers have the right to opt out of having their NPI shared with certain nonaffiliated third parties.
When a financial institution intends to share a consumer’s NPI with one of these third parties for purposes not explicitly exempt under the law, it must first provide a clear privacy notice that outlines the types of information collected, how that information will be shared, and the consumer’s ability to opt out.
Then, the consumer must be given a reasonable means and timeframe to exercise their opt-out right.
It’s important to note that the Act makes a distinction between consumer and customer rights. For customers, the GLBA states that they “are entitled to initial and annual privacy notices regardless of the information disclosure practices of their financial institution unless an exception to the annual privacy notice requirement applies.”
So, in addition to being able to opt out of NPI disclosure, customers also have a right to receive privacy notices that outline the financial institution’s ongoing use of their data.
What are financial institutions obliged to do under the GLBA?
Under the GLBA, financial institutions must take measures to protect customer data and provide privacy and opt-out notices.
Privacy notices under the GLBA
Financial institutions must provide clear and concise privacy notices to customers that explain how their NPI is collected, used, and shared.
They must provide these notices at the start of the customer relationship and annually thereafter. These notices must also be easily accessible, written in plain language, and displayed in a manner that enables consumers to review them before making decisions about their data.
Opt-out notices under the GLBA
Opt-out notices are required when a financial institution plans to share NPI with nonaffiliated third parties. These notices must clearly inform consumers of their right to opt out, outline the methods available to do so, like forms, online options, or toll-free numbers, and allow a reasonable timeframe for response.
Safeguarding NPI under the GLBA
Financial institutions must protect consumer data from unauthorized access, misuse, and breaches. This includes creating a comprehensive security program that includes administrative, technical, and physical safeguards.
The FTC’s Safeguards Rule now requires that financial institutions create a written information security plan (WISP) that outlines their strategy for securely handling consumer data and protecting against potential threats and breaches. According to the FTC, if your business meets the definition of a financial institution, your plan “must be appropriate to the size and complexity of your business, the nature and scope of your activities, and the sensitivity of the information at issue.”
Additionally, financial institutions must assess risks, regularly monitor systems, and train employees to promote the confidentiality, integrity, and security of customer information.
Enforcement of the Gramm-Leach-Bliley Act
In the absence of a comprehensive, non-sectoral federal privacy law (important federal laws like HIPAA, COPPA, and FERPA are largely sector-specific), the GLBA operates alongside numerous state-level data privacy laws. These state laws often include enforcement exemptions for institutions covered under the GLBA, since it is a federal law that supersedes state regulations.
Enforcement authority
Enforcement of the GLBA is shared between federal and state agencies. The entity responsible for ensuring compliance with the law depends on the type of financial institution in question. Enforcement authorities include:
- The FTC: Oversees non-bank financial institutions, such as mortgage brokers, payday lenders, and tax preparers.
- Federal banking regulators: These include the Federal Deposit Insurance Corporation (FDIC), the Federal Reserve, and the Office of the Comptroller of the Currency (OCC), which enforce GLBA compliance for banks and similar entities.
- State insurance commissioners: Responsible for ensuring that insurance providers comply with the Act at a state level.
Damages and fines
Financial institutions may face civil fines of up to USD 100,000 per violation. Responsible individuals, e.g. corporate officers or directors, can incur personal fines up to USD 10,000. They may also face criminal penalties for intentional violations, including imprisonment for up to five years.
Beyond legal consequences, noncompliance can result in reputational damage, loss of consumer trust, and increased scrutiny from regulatory bodies. These can have lasting effects on an institution’s success, making compliance with the GLBA crucial for applicable organizations.
Consent management and the Gramm-Leach-Bliley Act
The GLBA requires financial institutions to provide privacy notices explaining how customer data is collected, used, and shared, along with the option for consumers to opt out of sharing their NPI with nonaffiliated third parties.
Managing these processes manually can be taxing and time-consuming. Fortunately, there are specialized platforms that help simplify compliance efforts. Usercentrics products, for example, automate privacy notices, track opt-out preferences, and keep consumers informed of their rights.
By centralizing consent management, the platform helps to simplify adherence to GLBA requirements, fosters transparency, and strengthens customer trust — all while reducing the administrative burden of compliance on your business.
Navigating GLBA compliance
Financial institutions need to safeguard consumer NPI, provide clear privacy notices, and offer opt-out options for data sharing to meet the GLBA’s requirements.
Achieving compliance with this comprehensive regulation and other data privacy laws means implementing robust security programs, conducting regular risk assessments, and creating transparency in your data handling practices.
Usercentrics simplifies privacy compliance by aligning your data handling practices with the requirements of the various data privacy laws applicable to your business. We help manage consumer consent, generate privacy notices, and more, so you can stay legally compliant while building trust and increasing transparency with your customers.
Usercentrics does not provide legal advice, and information is provided for educational purposes only. We always recommend engaging qualified legal counsel or privacy specialists regarding data privacy and protection issues and operations.
Québec has long had privacy laws in place to protect personal information, including:
- The Act Respecting the Protection of Personal Information in the Private Sector 1994 (The Private Sector Act), which regulates how businesses collect and handle personal information
- The Act Respecting Access to Documents Held by Public Bodies and the Protection of Personal Information 1982 (The Public Sector Act), which applies to public organizations
While these laws established foundational protections, they were implemented before the rise of digital platforms, big data, and AI-driven decision-making. The laws became outdated as technology advanced and data collection grew more complex.
In response to these challenges, Québec Law 25 was passed in 2021. It significantly amends, updates, and modernizes these existing laws. It also aligns Québec’s privacy framework with international standards, such as the European Union’s General Data Protection Regulation (GDPR), and strengthens protections for individuals while holding organizations more accountable.
What is Québec Law 25 and who does it apply to?
Québec Law 25, or “Act to modernize legislative provisions as regards the protection of personal information,” was introduced in the Québec National Assembly in June 2020 as Bill 64. When the Bill was passed into law in September 2021, it became known as Law 25.
This new law gives individuals more control over their personal information and introduces stricter rules and stronger accountability for organizations that handle it.
Law 25 applies to any enterprise that collects, uses, or processes the personal information of individuals residing in Québec, even if the enterprise itself is located outside the province. It is the most stringent provincial privacy regulation in Canada, and includes stipulations not reflected in the federal Personal Information Protection and Electronic Documents Act (PIPEDA), though that law, passed in 2000, has been amended a number of times.
The law uses the definition of enterprise outlined in Article 1525 of the Civil Code of Québec, which is “[t]he carrying on by one or more persons of an organized economic activity, whether or not it is commercial in nature, consisting of producing, administering or alienating property, or providing a service, constitutes the operation of an enterprise.”
This means that Law 25 applies to public bodies, private organizations, nonprofits, and even individuals acting in a professional capacity or carrying out an organized economic activity who collect the personal information of Québec residents.
What is personal information under Québec Law 25?
Personal information is defined similarly under both The Private Sector Act and The Public Sector Act as any information that relates to or concerns a natural person and directly or indirectly makes it possible to identify that person.
Personal information is considered sensitive if it is inherently private, such as medical or biometric data, or if the way it is used or shared creates a heightened expectation of privacy.
Québec Law 25 does not cover personal information collected, stored, shared, or used for journalism, historical research, or genealogy that is used to provide legitimate information to the public.
Key provisions under Québec Law 25
Québec Law 25 has introduced several new provisions that bring Québec’s privacy regulation closer to global data protection regulations.
Enhanced consent requirements
Québec Law 25 outlines clear rules about how consent must be handled to give individuals control over their personal information. Personal information can only be used to serve the purpose for which it was collected unless the individual gives explicit consent for a different purpose.
However, there are a few exceptions. Personal information can be used for another purpose without new consent, but only:
- if the new purpose is consistent with the original purpose
- if the use clearly benefits the individual
- if it is necessary to provide or deliver a product or service the individual requested
- if it is needed for studies, research, or statistics, as long as the information is de-identified
- if it’s required to prevent and detect fraud or improve security measures
- by public bodies if it is necessary to enforce a Québec law, even if the law doesn’t explicitly provide for this use
Consent must always be clear, freely given, informed, and specific. Enterprises must request consent using simple and unambiguous language, and must make a separate request for each purpose. If enterprises request consent in writing, it should be separate from other information given to the individual, such as terms and conditions.
When it comes to sensitive personal information, consent must always be explicit.
Individuals also have the right to withdraw their consent at any time, and the enterprise must stop using the individual’s personal information once consent is withdrawn.
For personal information belonging to minors under 14 years old, a parent, guardian, or tutor must give consent. Minors aged 14 or older can give their own consent or allow their parent, guardian, or tutor to provide it.
New and expanded individual rights
Québec Law 25 has strengthened individuals’ rights relating to their personal information.
- Right to privacy by default: Enterprises that offer technological products or services, such as apps, software, or online platforms, must collect as little personal information as possible without requiring users to adjust settings to protect their privacy. However, this does not apply to privacy settings for browser cookies.
- Right to know: Individuals can request to know why their personal information is collected, how it will be used, and any third parties with whom it will be shared.
- Right to access: Individuals can request a copy of the personal information that an enterprise holds about them.
- Right to erasure: Individuals can request that their personal information be deleted when it is no longer needed for the purpose for which it was collected, or when the enterprise has handled the information in a way that violates the law.
- Right to correction: Individuals can ask to have incomplete or inaccurate personal information corrected. They can also request that personal information that is collected, communicated, or kept contrary to law be rectified.
- Right to data portability: Enterprises must provide individuals with their personal information in a structured, commonly used technology format upon request. This practice enables individuals to transfer their data to another service provider.
- Right to transparency in automated decision-making: Enterprises must disclose when automated systems make decisions that affect individuals.
Privacy impact assessments
Québec Law 25 requires enterprises to conduct a privacy impact assessment (PIA) or data protection impact assessment (DPIA) in certain situations, including:
- when acquiring, developing, or overhauling systems or projects involving personal information
- before transferring personal information outside Québec
- before sharing personal information without consent for study, research, or statistical purposes
Data breach notification
When an enterprise suspects there has been a data breach involving personal information — known under the law as a confidentiality incident — it must take reasonable measures to reduce the risk of injury and prevent similar incidents in the future.
The following are considered confidentiality incidents under the law:
- unauthorized access to personal information
- unauthorized use of personal information
- unauthorized sharing or disclosure of personal information
- loss of personal information or any other failure to properly protect it
Québec Law 25 also requires that, if the breach presents a “risk of serious injury,” an enterprise must inform Québec’s privacy regulator, known as the Commission d’accès à l’information du Québec (CAI), and affected individuals about the breach. An exception may be made if notifying individuals would interfere with a legal investigation.
Enterprises must maintain a register of data breaches and make it available to the Commission upon request.
Privacy policy requirements
Québec Law 25 requires enterprises that must comply with the law to publish a privacy policy explaining their data practices. Privacy policies must be written in simple language that is easy for individuals to understand.
The privacy policy should include:
- what personal information is being collected and through which means
- purpose(s) for collection
- individuals’ rights, especially the rights of access and correction
- right to withdraw consent
- how the information will be used, stored, and shared
- for how long the information will be retained
- who will have access to the information, including third parties, if any
- details and contact information of the DPO
- details on any automated decision-making processes, if applicable, including profiling
Appointment of a privacy officer
Québec Law 25 automatically appoints the “person exercising the highest authority” in an enterprise as the person in charge of protecting personal information. This role is similar to that of a Data Protection Officer (DPO) under the GDPR. An enterprise does have the option to appoint another individual as DPO and can assign some or all of the statutory responsibilities in writing.
A private organization may appoint any person as DPO or privacy officer, regardless of whether they are an employee. In the case of public bodies, the appointed DPO may be one of the following:
- a member of the public body
- a member of its board of directors
- a member of its management personnel
Public bodies are also required to inform the Commission in writing about the title, contact information, and start date of the appointed DPO.
Enterprises must publish the title and contact information of the DPO on their website. If the enterprise doesn’t have a website, it must make this information available by “any other appropriate means.”
When did Québec Law 25 come into effect?
The Québec privacy law was implemented in stages to give enterprises time to comply with its requirements:
- September 22, 2022: The date the first phase of the law came into effect. This included the appointment of a privacy officer and data breach reporting requirements.
- September 22, 2023: The date that most of the law’s provisions became effective, including consent requirements, transparency in privacy policies, data protection impact assessments, and individuals’ rights under the law.
- September 22, 2024: The date that the final provision, the right to data portability, came into effect.
With all provisions of Québec Law 25 now fully operational, enterprises must align their privacy practices with the law to avoid penalties and maintain trust with Québec residents.
Québec Law 25 enforcement and penalties
Québec Law 25 is enforced by the Commission d’accès à l’information du Québec (CAI), which has the authority to monitor compliance, conduct investigations, and impose penalties for violations.
Noncompliance can lead to substantial financial penalties:
- Administrative monetary penalties: Up to CAD 10 million or 2 percent of global turnover for the preceding fiscal year, whichever is higher. However, for individuals, this penalty is capped at CAD 50,000.
- Penal provisions: For severe violations, fines can reach up to CAD 25 million or 4 percent of global turnover for the preceding fiscal year, whichever is higher. For individuals, this penalty is capped at CAD 100,000.
Additionally, individuals who believe their privacy rights have been violated can seek damages of at least CAD 1,000 and may also pursue collective action against violators.
Beyond financial penalties, noncompliance can lead to reputational damage, which can erode customer trust and harm long-term business relationships.
Steps for compliance with Québec Law 25
To meet the requirements of Québec Law 25, organizations must take proactive measures to responsibly manage personal information and protect individual privacy.
- Assign a privacy officer to oversee compliance, implement privacy policies, and manage privacy practices. Having a dedicated person who is responsible for meeting the law’s requirements can streamline the process.
- Update consent mechanisms to obtain explicit, informed consent, and implement processes that make it easy for individuals to withdraw consent at any time.
- Conduct privacy impact assessments as required to identify and address privacy risks early.
- Update privacy policies to clearly explain how personal information is collected, used, stored, and shared, using transparent and simple language.
- Establish procedures that enable individuals to exercise their rights, such as the right to access their personal information, request corrections, or delete their data.
- Develop an incident response plan to detect, address, and report data breaches promptly, including steps to minimize harm and notify affected parties as required.
- Strengthen data security by adopting safeguards such as encryption, access controls, and data minimization to protect personal information from unauthorized access or misuse.
- Implement processes for data portability, enabling individuals to receive their personal information in a structured, commonly used format for transfer to another service provider.
We strongly recommend consulting a qualified legal expert who can give advice for achieving compliance with Québec Law 25 that is specific to your enterprise’s data privacy practices.
Québec Law 25 compliance with Usercentrics
Using a consent management platform (CMP) like Usercentrics CMP can help enterprises meet the Québec privacy law’s consent requirements by enabling them to collect explicit, informed consent from individuals. CMPs streamline this process by clearly presenting consent requests that are specific to their purpose, as required by the law.
Usercentrics CMP also enables users to withdraw their consent easily, enabling enterprises to meet Québec Law 25’s requirements for consent withdrawal.
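As a purely hypothetical sketch of the underlying idea, the example below models purpose-specific consent with easy withdrawal, which is the behavior Law 25 expects regardless of the tooling used. The `ConsentStore` class, the purposes listed, and `loadMarketingTags` are illustrative only and do not represent the Usercentrics CMP API.

```typescript
// Hypothetical sketch: purpose-specific consent with revocation, reflecting
// Québec Law 25's requirement that consent be requested separately for each
// purpose and be withdrawable at any time. Not a real CMP API.

type Purpose = "analytics" | "marketing" | "personalization";

class ConsentStore {
  private granted = new Set<Purpose>();

  // Record explicit consent for a single, clearly described purpose.
  grant(purpose: Purpose): void {
    this.granted.add(purpose);
  }

  // Withdrawal must be as easy as granting; processing tied to this
  // purpose must stop once consent is withdrawn.
  withdraw(purpose: Purpose): void {
    this.granted.delete(purpose);
  }

  isGranted(purpose: Purpose): boolean {
    return this.granted.has(purpose);
  }
}

const consent = new ConsentStore();
consent.grant("marketing");

// Only run marketing tags if consent for that specific purpose exists.
if (consent.isGranted("marketing")) {
  // loadMarketingTags(); // hypothetical loader, shown for illustration
}

consent.withdraw("marketing"); // later: the user revokes; processing must stop
```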
Differences between Québec Law 25 and PIPEDA
The Personal Information Protection and Electronic Documents Act (PIPEDA) is Canada’s federal privacy law that governs how organizations handle personal information for commercial activities. It sets baseline privacy standards across the country, while provinces like Québec can enact their own laws, such as Law 25, to impose additional requirements.
Usercentrics does not provide legal advice, and information is provided for educational purposes only. We always recommend engaging qualified legal counsel or privacy specialists regarding data privacy and protection issues and operations.
In November 2024, Australia’s Parliament passed the Online Safety Amendment (Social Media Minimum Age) Bill 2024. This new law has grabbed international attention as a serious attempt to address harm to children resulting from the use of social media platforms. We look at the Amendment’s scope, which social media platforms are affected, how companies need to comply, and what the potential penalties are.
What is the Australian Online Safety Amendment?
The Online Safety Amendment amends the Online Safety Act 2021. It’s designed to create specific requirements for children’s access to social media platforms, the most notable being a ban for children in Australia under age 16 from holding accounts on these platforms. Companies operating affected social media platforms will be expected to introduce and enforce age gating to prevent children from using the platforms.
A number of international laws already restrict access to social platforms for children under age 13 and require age verification, though such checks are often circumvented. Thirteen is also the age at which many data privacy laws stop treating individuals as children, or move them into a middle tier covering ages 13 to 16. Typically from that age, where consent is required, it must be obtained directly from the individual rather than from a parent or guardian.
The Online Safety Amendment dovetails with broader data privacy law in that it includes specific privacy protections, including limits on children’s use of the platforms and on the retention of personal data collected about them. Noncompliance penalties are also substantial.
Why was the Australian Online Safety Amendment introduced?
The bill was introduced to address ongoing concerns about the impacts of children’s access to social media platforms. The effects of exposure to social media at a critical developmental stage are not yet fully understood, so the full potential for harm is not yet known. Studies have already shown negative impacts on children’s and teens’ mental health, and there have been criminal cases involving predators accessing and manipulating children through social platforms.
Children’s activities online also aren’t always well monitored or carefully limited, and the prevalence of Wi-Fi and mobile devices makes access to social platforms ever easier.
What has the reaction been to the Australian social media ban for children?
The law has been controversial in some circles. Not unexpectedly, there have been mixed reviews, including whether the law goes too far, not far enough, or misses the mark in its intent. Critics make a variety of claims, including:
- the law may introduce new risks and cause more harm
- the law’s scope and exclusions are insufficient or incorrectly targeted
- children’s autonomy is compromised
- children are digitally savvy and will easily find ways around the ban
- opportunities for learning and growth will be stifled
- significant burdens will be levied on social media platforms to create and manage age restrictions
Perhaps ironically, it has been noted that requiring social media platforms to collect and use potentially sensitive personal information from children to verify age and enforce the law’s requirements may create greater risk to children’s privacy and safety, as well as compliance challenges under other privacy laws.
To accompany the new law and its requirements, Australia’s eSafety Commissioner has provided content and services for educators, parents, and others, targeting the topic of children’s online safety, and how to support children’s safe activities online. This touches on an important point: that it will take a variety of measures, from legal to educational to parental, to safely manage children’s use of digital social spaces.
Who has to comply with the Australian Social Media Minimum Age law?
Certain social media platforms, noted as “age-restricted social media platforms” are required to self-regulate under the law. They must take “reasonable steps” to prevent Australian children under age 16 (“age-restricted users”) from creating or using accounts or other profiles where potential harms are considered likely to occur.
Children under the age of 13 are required to be explicitly excluded in the platforms’ terms of service to remove any ambiguity about at what age it is appropriate to start using social media.
The eSafety Commissioner will be responsible for writing guidelines on the “reasonable steps” that the affected age-restricted social media platforms are required to take. The new law does not include in its text specifics like what age estimation or verification technology may be used or what the reasonable steps guidelines will include.
The law’s text does not explicitly reference any current social platforms, as popular ones tend to change over time. However, in the explanatory memorandum, the government noted that the law is intended to apply to companies like Snapchat and Facebook (parent company Meta), rather than companies offering services like messaging, online gaming, or services primarily aimed at education or health support, with Google Classroom or YouTube given as examples.
However, such distinctions can be tricky, as a number of social media platforms that would likely be included do also enable functions like messaging and gaming, for example. There are legislative rules that can set out additional coverage conditions or specific electronic services that the law includes or exempts.
Businesses have one year from the passage of the Social Media Minimum Age bill to comply, so enforcement will likely begin as of or after November 2025.
What measures do companies need to take to comply with the Online Safety Amendment?
Social media platforms that meet the Amendment’s inclusion requirements will need to implement or bolster functions on their platforms to verify user age and prevent children under 16 from creating or maintaining accounts. Presumably, the platforms will also need to purge existing accounts belonging to children. The law does not specify what technology should be used or how age should be verified, as this changes over time.
The definition of a user on these social media platforms involves being an account holder who is logged in, so children who are not logged in to accounts can continue to access content or products on these platforms where available. Because platforms cannot access nearly as much user data from those who are not logged in, many significantly limit functionality for individuals who are not logged-in account holders.
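To make the account gating described above concrete, here’s a minimal, hypothetical sketch of the kind of check a platform might run once it has an age-assured birthdate for a user. The `User` type, field names, and the assumption that only users in Australia are in scope are illustrative choices, not drawn from the law’s text or any platform’s actual implementation.

```python
from dataclasses import dataclass
from datetime import date

MINIMUM_ACCOUNT_AGE = 16  # "age-restricted users" are Australians under 16


@dataclass
class User:
    birthdate: date   # assumed to come from an age assurance step, not self-declaration alone
    country: str      # e.g. "AU"
    has_account: bool


def age_on(birthdate: date, on: date) -> int:
    """Whole years of age on the given date."""
    years = on.year - birthdate.year
    if (on.month, on.day) < (birthdate.month, birthdate.day):
        years -= 1
    return years


def may_hold_account(user: User, today: date) -> bool:
    """True if the user may create or keep an account under the under-16 ban.

    Only users in Australia are age-restricted; this sketch treats everyone
    else as out of scope.
    """
    if user.country != "AU":
        return True
    return age_on(user.birthdate, today) >= MINIMUM_ACCOUNT_AGE


# Example: a 15-year-old Australian account holder would need their account removed.
teen = User(birthdate=date(2010, 5, 1), country="AU", has_account=True)
print(may_hold_account(teen, today=date(2025, 11, 30)))  # False
```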
Other existing privacy law requirements dovetail with the Online Safety Amendment’s requirements, particularly the Privacy Act 1988. For example, covered platforms can only use collected personal data for the purpose of compliance unless another use is explicitly permitted under the Privacy Act or informed and voluntary user consent is obtained. This information must then be destroyed after its use for that specific purpose.
What other legal actions address children’s use of social media?
Around the world there are a number of laws that address children’s privacy and online activities, though they are broader and don’t explicitly target social media use — some of them predate the relevant platforms’ existence.
Additionally, broader regional data privacy laws, like the Privacy Act 1988, are relevant, and they all include specific and stringent requirements for accessing and handling children’s data, as well as consent requirements.
In the United Kingdom there is the Online Safety Act, which has a section dedicated to “Age-appropriate experiences for children online.”
In the EU, there is the Digital Services Act (DSA), which covers a wide range of digital intermediary services. It’s aimed at “very large online platforms”, aka VLOPs, and very large online search engines, or VLOSEs. The list of designated VLOPs does include social media platforms. The DSA imposes strict requirements to address the risks that their operation poses to consumers, and it also aims to protect and enhance individuals’ rights, particularly relating to data privacy, including those of minors.
In the United States, the Children’s Online Privacy Protection Act (COPPA) has been in place since 2000, though it has been revised several times by the Federal Trade Commission, and aims to protect children under age 13 and their personal information. COPPA is broader, however, and not focused only on social media platforms, though they are covered under its requirements.
There are efforts to introduce substantial legislative updates, referred to as “COPPA 2.0”, which would further modernize the law, including raising the compliance age from 13 to 16 to protect more children. It would also include more stringent requirements for operators of social platforms if there are reasonable expectations that children under 16 use the platform.
At present, compliance is only required if there are known children under 13 using the services. Insisting that they don’t know for sure if children use the platforms is a common excuse to avoid compliance requirements, though children’s presence on social media platforms is widely known.
A number of social media platforms have been charged with COPPA violations, including Epic Games, which makes the popular video game Fortnite, and the video app TikTok (parent company ByteDance). Interestingly, in early November 2024, the Canadian government ordered that TikTok’s Canadian operations be shut down due to security risks, which the company is appealing. The order will not likely affect consumer use of the app, however.
What are the penalties for violating the Australian Online Safety Amendment?
The Privacy Act applies to compliance and penalties as well, as violations of the Amendment will be considered “an interference with the privacy of the individual” for the purposes of the Privacy Act. The Information Commissioner will manage enforcement of the Social Media Minimum Age law, and noncompliance fines will be up to 30,000 “penalty units”, which as of the end of 2024 equals AUD 9.5 million.
A penalty unit is a standardized way to calculate fines: the fine is the current value of a single penalty unit, which is set under federal legislation and regularly updated to reflect inflation, multiplied by the number of penalty units assigned to the offence.
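As a purely illustrative sketch of that arithmetic, the example below uses an assumed value for a single penalty unit; the legislated value changes over time, so the current figure should always be confirmed rather than taken from this example.

```python
# Illustrative only: the value of a penalty unit changes over time, so the
# current legislated value should be confirmed before relying on any figure.
PENALTY_UNIT_VALUE_AUD = 313.00   # assumed value of a single penalty unit
MAX_PENALTY_UNITS = 30_000        # maximum number of units cited above for violations


def max_fine_aud(unit_value: float = PENALTY_UNIT_VALUE_AUD,
                 units: int = MAX_PENALTY_UNITS) -> float:
    """Maximum fine = value of one penalty unit multiplied by the units assigned."""
    return unit_value * units


print(f"Maximum fine: AUD {max_fine_aud():,.0f}")  # AUD 9,390,000 with these inputs
```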
The Information Commissioner will also hold additional powers for information gathering and the ability to notify a social media platform and publicly release information if it’s determined the platform has violated the law.
Independent review of the law is required within two years of it coming into effect, so by November 2026.
The future of online data privacy on social media platforms
Australia’s Online Safety Amendment has significant implications for data privacy and children’s autonomy as governments, educators, and parents — in that country and around the world — struggle to balance children’s use of social media to enable connection, education, and entertainment while keeping them safe from misinformation and abuse.
The Social Media Minimum Age law places strict requirements on relevant platforms to implement age verification, prevent and remove account-holding by children, and also ensure the security of sensitive information required to do these verifications. The penalties for failing to adequately achieve this ban are steep, and compliance won’t be easy given how fast technologies change and how savvy many children are online. The amendment may well require its own amendments in a relatively short period of time.
There will be a lot of attention over the next two years on how this law rolls out and what works and doesn’t to fulfill requirements. The required report after the first two years should also prove illuminating, and provide guidance for other countries considering similar measures, or looking to update existing data privacy legislation to better protect children.
Companies implementing best practices for data privacy compliance and protection of users of websites, apps, social media platforms, and more should ensure they are well versed in relevant (and overlapping) laws, including specific requirements for special groups like children.
They should consult qualified legal counsel about obligations, and IT specialists about the latest technologies to meet their needs. They should also invest in well integrated tools, like a consent management platform, to collect valid consent for data use where relevant and inform users about data handling and their rights.
2024 saw the number of new data privacy regulations continue to grow, especially in the United States. It also saw the effects of laws passed earlier as they came into force and enforcement began, like with the Digital Markets Act (DMA). But perhaps the biggest impact of data privacy in 2024 was how quickly and deeply it’s become embedded in business operations.
Companies that may not have paid a lot of attention to regulations have rapidly changed course as data privacy requirements have been handed down by companies like Google and Facebook. The idea of “noncompliance” stopped being complicated yet nebulous and became “your advertising revenue is at risk.”
We expect this trend of data privacy becoming a core part of doing business to continue to grow through 2025 and beyond. More of the DMA’s gatekeepers and other companies are likely to ramp up data privacy and consent requirements throughout their platform ecosystems and require compliance from their millions of partners and customers. Let’s not forget that data privacy demands from the public continue to grow as well.
We also expect to see more laws that include or dovetail with data privacy as they regulate other areas of technology and its effect on business and society. AI is the biggest one that comes to mind here, particularly with the EU AI Act having been adopted in March 2024. Similarly, data privacy in marketing will continue to influence initiatives across operations and digital channels. Stay tuned to Usercentrics for more about harnessing Privacy-Led Marketing.
Let’s peer into the future and look at how the data privacy landscape is likely to continue to evolve in the coming year, where the best opportunities for your company may lie, and what challenges you should plan for now.
2025 in global data privacy regulation
For the last several years, change has been the only constant in data privacy regulation around the world. Gartner predicted that 75 percent of the world’s population would be protected by data privacy law by the end of 2024. Were they right?
According to the International Association of Privacy Professionals (IAPP), as of March 2024, data privacy coverage was already close to 80 percent. So the prediction had been exceeded even before we were halfway through the year.
Data privacy regulation in the United States
The United States passed a record number of state-level data privacy regulations in 2024, with Kentucky, Maine, Maryland, Minnesota, Nebraska, New Hampshire, New Jersey, Rhode Island, and Vermont coming on board to bring the number of state-level US data privacy laws to 21. By contrast, six states passed laws in 2023, which was a record at the time.
The privacy laws in Florida, Montana, Oregon, and Texas went into effect in 2024. The privacy laws in Delaware, Iowa, Maryland, Minnesota, Nebraska, New Hampshire, New Jersey, and Tennessee go into effect in 2025.
Since the majority of US states still don’t have data privacy regulations, more of these laws are likely to be proposed, debated, and (at least sometimes) passed. It will be interesting to see if certain states that have wrangled with privacy legislation repeatedly, like Washington, will make further progress in that direction.
April 2024 saw the release of a discussion draft of the American Privacy Rights Act (APRA), the latest federal legislation in the US to address data privacy. It made some advances during the year, with new sections added addressing children’s data privacy (“COPPA 2.0”), privacy by design, obligations for data brokers, and other provisions. However, the legislation has not yet been passed, and with the coming change in government in January 2025, the future of APRA is unclear.
Data privacy regulation in Europe
The European Union continues to be at the forefront of data privacy regulation and working to keep large tech platforms in check. Two recent regulations, particularly, will continue to shape the tech landscape for some time.
The Digital Markets Act (DMA) and its evolution
With the Digital Markets Act in effect, the first six designated gatekeepers (Alphabet, Amazon, Apple, ByteDance, Meta, and Microsoft) had to comply as of March 2024. Booking.com was designated in May, and had to comply by November.
There is a good chance that additional gatekeepers will be designated in 2025, and that some current ones that have been dragging their metaphorical feet will start to accept the DMA’s requirements. We can expect to see the gatekeepers roll out new policies and requirements for their millions of customers in 2025 to help ensure privacy compliance across their platforms’ ecosystems.
More stringent consent requirements are also being accompanied by expanded consumer rights, including functions like data portability, which will further enhance competitive pressures on companies to be transparent, privacy-compliant, and price competitive while delivering great customer experiences.
The AI Act and its implementation
While the entirety of the AI Act will not be in effect until 2026, some key sections are already in effect in 2024, or coming shortly, so we can expect to see their influence. These include the ban on prohibited AI systems in EU countries and the rules for general purpose AI systems.
Given that training large language models (LLMs) requires an almost endless supply of data, and organizations aren’t always up front about getting consent for it, it’s safe to say that there will continue to be clashes over the technology’s needs and data privacy rights.
Data privacy around the world
There was plenty in the news involving data privacy around the world in 2024, and the laws and lawsuits reported on will continue to make headlines and shape the future of privacy in 2025.
AI, privacy, and consent
There have been complaints reported and lawsuits filed throughout 2024 regarding data scraping and processing without consent. Canadian news publishers and the Canadian Legal Information Institute most recently joined the fray. We don’t expect these issues to be resolved any time soon, though there should be some influential case law resulting once these cases have made their way through the courts. (Unlikely that all of them will be resolved by settlements.) The litigation may have significant implications for the future of these AI companies as well, and not just for their products.
Social media and data privacy
As noted, laws that dovetail with data privacy are also becoming increasingly notable. One recent interesting development is Australia passing a ban on social media for children under 16. In addition to mental health concerns, some social media platforms — including portfolio companies of Alphabet, Meta, and TikTok parent company ByteDance — have run afoul of data privacy regulators, with penalties for collecting children’s data without consent, among other issues. It will be very interesting to see how this ban rolls out, how it’s enforced, and if it serves as inspiration elsewhere for comparable legislation.
The latest generation of data privacy laws and regulatory updates
The UK adopted its own customized version of the General Data Protection Regulation (GDPR), the UK GDPR, upon leaving the EU. It has recently published draft legislation for the UK Data (Use and Access) Bill, which is meant to further modernize the UK GDPR and reform the way data is used to benefit the economy. We will see if the law does get passed and what its practical effects may be.
Among other recent laws and updates whose effects we are likely to see in 2025, in September 2024 Vietnam issued the first draft of its Personal Data Protection Law (PDPL) for public consultation.
Malaysia passed significant updates to its Personal Data Protection Act (PDPA) via the Personal Data Protection (Amendment) Act. The PDPA was first passed in 2010, so it was due for updates, and companies doing business in the country can expect the new guidelines to be enforced.
Also, the two-year grace period on Law No. 27 in Indonesia’s Personal Data Protection law (PDP Law) ended in October 2024, so we can expect enforcement to ramp up there as well.
Asia already has considerable coverage with data privacy regulation, as countries like China, Japan, South Korea, and India all have privacy laws in effect as well.
The future of privacy compliance and consent and preference management
Just as the regulation of data privacy is reaching an inflection point of maturity and becoming mainstream, so are solutions for privacy compliance, consent, and preference management.
Integrated solutions for compliance requirements and user experience
Companies that are embracing Privacy-Led Marketing in their growth strategy want solutions that can meet several needs, support growth, and seamlessly integrate into their martech stack. Simply offering a cookie compliance solution will no longer be enough.
Managing data privacy will require solutions that enable companies to obtain valid consent — for requirements across international jurisdictions — and signal it to ad platforms and other important tools and services. In addition to consent, companies need to centralize privacy user experience to provide customers with clear ways to express their preferences and set permissions in a way that respects privacy and enables organizations to deliver great experiences with customized communications, offers, and more.
Customer-centric data strategies
It may take some time for third-party cookie use and third-party data to go away entirely, but zero- and first-party data is the future, along with making customers so happy they never want to leave your company, rather than trying to collect every bit of data possible and preventing them from taking their business elsewhere.
We may see more strategies like Meta’s “pay or ok” attempt where users can pay a subscription fee to avoid having their personal data used for personalized ads, but given EU regulators’ response to the scheme, similar tactics are likely to have an uphill battle, at least in the EU.
Delivering peace of mind while companies stay focused on their core business
SMBs, particularly, also have a lot to do with limited resources, in addition to focusing on growing their core business. We can expect to see further deep integration of privacy compliance tools and services. These solutions will automate not only obtaining and signaling consent to third-party services, but also notifying users about data processing services in use and data handling, e.g. via the privacy policy, responding to data subject access requests (DSAR), and other functions.
Further to international compliance requirements, as companies grow they are going to need data privacy solutions that scale with them and enable them to easily handle the complexities of complying with multiple privacy laws and other relevant international and/or industry-specific policies and frameworks.
Frameworks like the IAB’s Global Privacy Platform (GPP) are one way of achieving this, enabling organizations to select relevant regional privacy signals to include depending on their business needs.
Usercentrics in 2025
Our keyword to encapsulate data privacy for 2024 was “acceleration”. For 2025 it’s “maturity.” Data privacy laws and other regulations that include data privacy (like AI). Companies’ needs for solutions that enable multi-jurisdictional compliance and data management. The widespread embrace of data privacy as a key part of doing business, and strategizing Privacy-Led Marketing for sustainable growth and better customer relationships. The financial and operational risks of noncompliance moving beyond regulatory penalties to revenues from digital advertising, customer retention, and beyond.
The Usercentrics team is on it. We’ll continue to bring you easy to use, flexible, reliable solutions to manage consent, user preferences, and permissions, and enable you to maintain privacy compliance and be transparent with your audience as your company grows. With world-class support at every step, of course. Plus we have a few other things up our sleeves. (Like this.) Stay tuned! Here’s to the Privacy-Led Marketing era. We can’t wait to help your company thrive.
The Video Privacy Protection Act (VPPA) is a federal privacy law in the United States designed to protect individuals’ privacy regarding their video rental and viewing histories. The VPPA limits the unauthorized sharing of video rental and purchase records. It was passed in 1988 after the public disclosure of Supreme Court nominee Robert Bork’s video rental records raised concerns about the lack of safeguards for personal information.
At the time of the Act’s enactment, video viewing was an offline activity. People would visit rental stores, borrow a tape, and return it after watching. Today, streaming services and social media platforms mean that watching videos is a largely digital activity. In 2023, global revenue from online video streaming reached an estimated USD 288 billion, with the US holding the largest share of that market.
Still, the VPPA has remained largely unchanged since its enactment, apart from a 2013 amendment. However, recent legal challenges to digital video data collection have led courts to reinterpret how the law applies to today’s video viewing habits.
In this article, we’ll examine what the VPPA law means for video platforms, the legal challenges associated with the law, and what companies can do to enable compliance while respecting users’ privacy.
Scope of the Video Privacy Protection Act (VPPA)
The primary purpose of the Video Privacy Protection Act (VPPA) is to prevent the unauthorized disclosure of personally identifiable information (PII) related to video rentals or purchases. PII under the law “includes information which identifies a person as having requested or obtained specific video materials or services from a video tape service provider.”
The law applies to video tape service providers, which are entities involved in the rental, sale, or delivery of prerecorded video materials. Courts have interpreted this definition to include video streaming platforms like Hulu and Netflix, which have widely replaced physical video tape service providers.
The VPPA protects the personal information of consumers. The law defines consumers as “any renter, purchaser, or subscriber of goods or services from a video tape service provider.”
Video tape service providers are prohibited from knowingly disclosing PII linking a consumer to specific video materials, except in the following cases:
- direct disclosure to the consumer
- to a third party with informed, written consent provided by the consumer
- for legal purposes, such as in response to a valid warrant, subpoena, or court order
- limited marketing disclosures, but only if:
- consumers are given a clear opportunity to opt out, and
- the shared data includes only names and addresses and not specific video titles, unless it is for direct marketing to the customer
- as part of standard business operations, such as processing payments
- under a court order, if a court determines the information is necessary and cannot be obtained through other means, and the consumer is given the opportunity to contest the claim
The 2013 amendment expanded the conditions for obtaining consent, including through electronic means using the Internet; a sketch of how these conditions might be recorded in practice follows the list below. This consent must:
- be distinct and separate from other legal or financial agreements
- let consumers provide consent either at the time of disclosure or in advance for up to two years, with the option to revoke it sooner
- offer a clear and conspicuous way for consumers to withdraw their consent at any time, whether for specific instances or entirely
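Here is that minimal, hypothetical sketch: a consent record that can be checked against the three conditions above. The class name, field names, and the 730-day window are illustrative choices for this example, not taken from the statute or any particular system.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

MAX_ADVANCE_CONSENT = timedelta(days=730)  # consent may be given up to two years in advance


@dataclass
class VppaConsentRecord:
    consumer_id: str
    granted_at: datetime        # timezone-aware UTC timestamps assumed throughout
    obtained_separately: bool   # consent must be distinct from other legal/financial terms
    valid_until: datetime       # a single disclosure or a period of up to two years
    revoked_at: Optional[datetime] = None

    def is_valid(self, at: Optional[datetime] = None) -> bool:
        """True if consent was properly obtained, is within its window, and is not revoked."""
        at = at or datetime.now(timezone.utc)
        if not self.obtained_separately:
            return False
        if self.valid_until > self.granted_at + MAX_ADVANCE_CONSENT:
            return False  # advance consent cannot run longer than two years
        if self.revoked_at is not None and self.revoked_at <= at:
            return False
        return self.granted_at <= at <= self.valid_until

    def revoke(self, at: Optional[datetime] = None) -> None:
        """Consumers may withdraw their consent at any time."""
        self.revoked_at = at or datetime.now(timezone.utc)


# Example: consent granted for one year, then revoked partway through.
record = VppaConsentRecord(
    consumer_id="c-42",
    granted_at=datetime(2025, 1, 1, tzinfo=timezone.utc),
    obtained_separately=True,
    valid_until=datetime(2026, 1, 1, tzinfo=timezone.utc),
)
print(record.is_valid(datetime(2025, 6, 1, tzinfo=timezone.utc)))  # True
record.revoke(datetime(2025, 7, 1, tzinfo=timezone.utc))
print(record.is_valid(datetime(2025, 8, 1, tzinfo=timezone.utc)))  # False
```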
Tracking technologies and Video Privacy Protection Act (VPPA) claims
Tracking technologies like pixels are central to many claims alleging violations of the VPPA. Pixels are small pieces of code embedded on websites to monitor user activities, including interactions with online video content. These technologies can collect and transmit data, such as the titles of videos someone viewed, along with other information that may identify individuals. This combination of data may meet the VPPA’s definition of personally identifiable information (PII).
VPPA claims often arise when companies use tracking pixels on websites with video content and transmit information about users’ video viewing activity to third parties without requesting affirmative consent. Courts have debated what constitutes a knowing disclosure under the VPPA, but installing tracking pixels that collect and share video data has been found sufficient to potentially establish knowledge in some cases.
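As an illustration of how this plays out technically, here’s a hedged sketch of gating a video-viewing pixel behind affirmative consent before any identifier-plus-title payload leaves the site. The `send_to_third_party` function is a placeholder for this example, not a real vendor API.

```python
from typing import Mapping


def send_to_third_party(payload: Mapping[str, str]) -> None:
    """Placeholder for whatever the pixel would actually transmit to a third party."""
    print("sent:", payload)


def fire_video_pixel(user_id: str, video_title: str, has_vppa_consent: bool) -> None:
    """Only transmit data tying an identifiable user to a specific title with consent.

    The combination of an identifier and a specific video title is the kind of
    data that courts have treated as potentially constituting PII under the VPPA.
    """
    if not has_vppa_consent:
        return  # suppress the pixel entirely rather than send a partial payload
    send_to_third_party({"user_id": user_id, "video_title": video_title})


fire_video_pixel("u-123", "Example Documentary", has_vppa_consent=False)  # nothing sent
fire_video_pixel("u-123", "Example Documentary", has_vppa_consent=True)   # payload sent
```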
Lawsuits under the Video Privacy Protection Act (VPPA)
Many legal claims under the VPPA focus on one or more of three critical questions:
- Does the party broadcasting videos qualify as a video tape service provider?
- Is the individual claiming their rights were violated considered a consumer?
- Does the disclosed information qualify as PII?
Below, we’ll look at how courts have considered these questions and interpreted the law in the context of digital video consumption.
Does the party broadcasting video qualify as a video tape service provider?
Who is considered a video tape service provider under the law may depend on multiple factors. Courts have established that online streaming services qualify, but some rulings have considered other factors, which we’ll outline below, to decide whether a business meets the law’s definition.
Live streaming
The VPPA law defines a video tape service provider as a person engaged in the business of the rental, sale, or delivery of “prerecorded video cassette tapes or similar audiovisual materials.” In 2022, a court ruled that companies do not qualify as video tape service providers for live video broadcasts, as live streaming does not involve prerecorded content.
However, if a company streams prerecorded content, it may qualify as a video tape service provider in relevant claims.
“Similar audio visual materials”
The definition of a video tape service provider in the digital age includes more than just video platforms that broadcast movies and TV shows. In a 2023 case, a court ruled that a gaming and entertainment website offering prerecorded streaming video content fell within the scope of the VPPA definition of a video tape service provider.
Focus of work
Another 2023 ruling found that the VPPA does not apply to every company that happens to deliver audiovisual materials “ancillary to its business.” Under this decision, a video tape service provider’s primary business must involve providing audiovisual materials. Businesses using video content only as part of their marketing strategy would not qualify as a video tape service provider under this reading of the law.
Is the individual claiming rights violations considered a consumer?
Online video services frequently operate on a subscription-based business model. Many legal challenges under the VPPA focus on whether an individual qualifies as a “subscriber of goods or services from a video tape service provider.”
Type of service subscribed to
Courts have varied in their opinions on whether being a consumer depends on subscribing to videos specifically. In a 2023 ruling, a court held that subscribing to a newsletter that encourages recipients to view videos, but is not a condition to accessing them, does not qualify an individual as a subscriber of video services under the VPPA.
By contrast, a 2024 ruling took a broader approach, finding that the term “subscriber of goods or services” is not limited to audiovisual goods or services. The Second Circuit Court of Appeals determined that subscribing to an online newsletter provided by a video tape service provider qualifies an individual as a consumer. This decision expanded the definition to recognize individuals who subscribe to any service offered by a video tape service provider as consumers.
Payment
Courts have generally agreed that providing payment to a video tape service provider is not necessary for an individual to be considered a subscriber. However, other factors play a role in establishing this status.
A 2015 ruling held that being a subscriber requires an “ongoing commitment or relationship.” The court found that merely downloading a free mobile app and watching videos without registering, providing personal information, or signing up for services does not meet this standard.
However, in a 2016 case, the First Circuit Court of Appeals determined that providing personal information to download a free app — such as an Android ID and GPS location — did qualify the individual as a subscriber. Similarly, in the 2024 ruling above, the Second Circuit found that providing an email address, IP address, and device cookies for newsletter access constituted a meaningful exchange of personal information, qualifying the individual as a subscriber.
Does the disclosed information qualify as PII?
Courts have broadly interpreted PII to include traditional identifiers like names, phone numbers, and addresses, as well as digital data that can reasonably identify a person in the context of video consumption.
In the 2016 ruling referenced above, the First Circuit noted that “[m]any types of information other than a name can easily identify a person.” The court held that GPS coordinates and device identifier information can be linked to a specific person, and therefore qualified as PII under the VPPA.
Just two months later, the Third Circuit Court of Appeals ruled more narrowly, stating that the law’s prohibition on disclosing PII applies only to information that would enable an ordinary person to identify a specific individual’s video-watching behavior. The Third Circuit held that digital identifiers like IP addresses, browser fingerprints, and unique device IDs do not qualify as PII because, on their own, they are not enough for an ordinary person to identify an individual.
These conflicting rulings highlight the ongoing debate about what constitutes PII, especially as digital technologies continue to evolve.
Consumers’ rights under the Video Privacy Protection Act (VPPA)
Although not explicitly framed as consumer rights under the law, the VPPA does grant consumers several rights to protect their information.
- Protection against unauthorized disclosure: Consumers’ PII related to video rentals, purchases, or viewing history cannot be disclosed without consent or other valid legal basis.
- Right to consent: Consumers must provide informed, written consent before a video tape service provider can disclose their PII. This consent must be distinct and separate from other agreements and can be given for a set period (up to two years) or revoked at any time.
- Right to opt out: Consumers must be given a clear and conspicuous opportunity to opt out of the disclosure of their PII.
- Right to notice in legal proceedings: If PII is to be disclosed under a court order, consumers must be notified of the proceeding and given an opportunity to appear and contest the disclosure.
- Right to private action: Consumers can file civil proceedings against video tape service providers for violations of the VPPA.
Penalties under the Video Privacy Protection Act (VPPA)
The VPPA law allows individuals affected by violations to file civil proceedings. Remedies available under the law include damages up to USD 2,500 per violation.
Courts may also award punitive damages to penalize particularly egregious or intentional misconduct. Additionally, plaintiffs can recover reasonable attorneys’ fees and litigation costs. Courts may also grant appropriate preliminary or equitable relief.
The VPPA statute of limitations requires that any lawsuit be filed within two years from the date of the violation, or two years from when it was discovered.
Compliance with the Video Privacy Protection Act (VPPA)
Businesses that act as video tape service providers under the VPPA can take several steps to meet their legal obligations.
1. Conduct a data privacy audit
A data privacy audit can help businesses understand what personal data they collect, process, and store, and whether these practices comply with the VPPA. The audit should include assessing the use of tracking technologies like pixels and cookies to confirm whether they are correctly set up and classified.
2. Obtain informed, specific user consent
The VPPA requires businesses to obtain users’ informed, written consent before sharing PII. Implementing a consent management platform (CMP) like Usercentrics CMP can make it easier to collect, manage, and store consent from users.
VPPA compliance also requires businesses to provide clear and easy-to-find options for consumers to opt out of data sharing, which a CMP can also facilitate. Under the 2013 amendment, consent given in advance is valid for no more than two years, so businesses must have a process for renewing consent before it expires.
3. Implement transparent communication practices
Businesses should help consumers understand how their data is used so they can make an informed decision about whether to consent to its disclosure. Cookie banners used to obtain consent should contain simple, jargon-free language to explain the purpose of cookies. They should clearly indicate if third-party cookies are used and identify the parties with whom personal information is shared.
Businesses should include a direct link to a detailed privacy policy, both in the cookie banner and in another conspicuous location on their website or mobile app. Privacy policies must explain how PII is collected, used, and shared, along with clear instructions on how consumers can opt out of PII disclosures.
4. Consult qualified legal counsel
Legal experts can help businesses achieve VPPA compliance and offer tailored advice based on specific business operations. Counsel can also help businesses keep up with current litigation to understand how courts are interpreting the VPPA, which is critical as the law continues to face new challenges and evolving definitions.
Usercentrics does not provide legal advice, and information is provided for educational purposes only. We always recommend engaging qualified legal counsel or privacy specialists regarding data privacy and protection issues and operations.
The EU Cyber Resilience Act (CRA) has been in the works for several years, but has now been adopted by EU regulators. It enters into force 10 December 2024, though its provisions will be rolled out over the next several years. We look at what the CRA is, who it affects, and what it means for businesses in EU markets.
What is the Cyber Resilience Act (CRA)?
The EU Cyber Resilience Act aims to bring greater security to software and hardware that includes digital elements, as well as the networks to which these products connect. Focused around cybersecurity and reducing vulnerabilities, the law covers products that can connect to the internet, whether wired or wireless, like laptops, mobile phones, routers, mobile apps, video games, desktop applications, and more.
The CRA enters into force 10 December 2024, though requirements are being rolled out gradually. Organizations have 21 months from the law coming into effect to start meeting reporting obligations, and by late 2027 all remaining provisions will be in effect (36 months from December 2024).
Broader scope of EU cybersecurity initiatives
The CRA is part of the larger EU Cybersecurity Strategy, particularly the Directive on measures for a high common level of cybersecurity across the European Union, known as the NIS2 Directive. The Strategy aims to “build resilience to cyber threats and ensure citizens and businesses benefit from trustworthy digital technologies.” It also aims to address the cross-border nature of cybersecurity threats to help ensure products sold across the EU meet adequate and consistent standards.
With the ever-growing number of connected products in consumers’ lives and used for business operations, the need for security and vigilance in manufacturing and consumer goods is only likely to grow. The law also intends to ensure consumers receive adequate information about the security and vulnerabilities of products they purchase so they can make informed decisions at home and at work.
Who and what does the Cyber Resilience Act apply to?
The CRA applies to manufacturers, retailers, and importers of products — both hardware and software — if they have digital components. This does include consent management platforms.
Under the law, included products will have to comply with specific requirements throughout the full product lifecycle, from the design phase to when they’re in consumers’ hands. Design, development, and production will need to ensure adequate levels of cybersecurity based on risk levels and factors. It’s a bit like the concept of privacy by design, but even more security-focused and codified into law.
How can companies comply with the Cyber Resilience Act?
Companies required to comply will have responsibilities for bringing products to market that do not have any known vulnerabilities that can be exploited, and that are configured in a way that is “secure by default”. Products will also need to bear the CE mark to show compliance.
Additionally, companies will need to implement various other security measures, including:
- control mechanisms like authentication and identity/access management
- high level encryption (both in transit and at rest)
- mechanisms to enable resilience against denial-of-service (DoS) attacks
Handling vulnerabilities under the CRA
Manufacturers face specific requirements for handling vulnerabilities, including identifying and documenting the components their products contain and any vulnerabilities, as well as creating a software bill of materials that lists at least the top-level dependencies in a common, machine-readable format (where relevant).
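For illustration, here’s a minimal sketch of generating a machine-readable bill of materials for top-level dependencies. The field names and product details are invented for this example; in practice, manufacturers would more likely use an established SBOM format such as CycloneDX or SPDX.

```python
import json
from datetime import datetime, timezone


def build_sbom(product: str, version: str, top_level_deps: dict[str, str]) -> str:
    """Build a minimal, machine-readable bill of materials as JSON.

    Field names here are invented for illustration; an established SBOM format
    such as CycloneDX or SPDX would typically be used instead.
    """
    sbom = {
        "product": product,
        "version": version,
        "generated": datetime.now(timezone.utc).isoformat(),
        "components": [
            {"name": name, "version": ver}
            for name, ver in sorted(top_level_deps.items())
        ],
    }
    return json.dumps(sbom, indent=2)


# Example with made-up product and dependency names.
print(build_sbom("example-router-firmware", "2.4.1",
                 {"openssl": "3.0.13", "busybox": "1.36.1"}))
```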
Any discovered vulnerabilities will have to be addressed through subsequent security updates that will have to meet a number of requirements:
- delivered without delay
- provided free of charge
- including advisory messages for users, with information like necessary actions
- implementation of a vulnerability disclosure policy
- public disclosure of repaired vulnerabilities, with:
- description of the vulnerabilities
- information to identify the product affected
- severity and impact of the vulnerabilities
- information to help users remediate the vulnerabilities
Reporting requirements under the CRA
In the event of a severe cybersecurity incident or exploited vulnerability, the manufacturer will have to report the issue by electronic notification to the European Union Agency for Cybersecurity and the competent computer security incident response team within 24 hours (a number of factors will be used to determine who makes up this team). Follow-up notices are also usually required within 72 hours and 14 days. Timely notification of product end users is also required.
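As a simple illustration of that timeline, the sketch below computes the notification deadlines from the moment an incident or exploited vulnerability is detected. The stage labels are ours for readability, not terms quoted from the CRA’s text.

```python
from datetime import datetime, timedelta, timezone


def notification_deadlines(detected_at: datetime) -> dict[str, datetime]:
    """Deadlines counted from when the incident or exploited vulnerability is detected."""
    return {
        "initial notification (24 hours)": detected_at + timedelta(hours=24),
        "follow-up notice (72 hours)": detected_at + timedelta(hours=72),
        "final report (14 days)": detected_at + timedelta(days=14),
    }


# Example with an arbitrary detection timestamp.
detected = datetime(2027, 3, 1, 9, 30, tzinfo=timezone.utc)
for stage, due in notification_deadlines(detected).items():
    print(f"{stage}: {due.isoformat()}")
```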
Consent requirements under the CRA
The CRA is focused on cybersecurity, so it does not focus on the end user or on consent and its management the way the GDPR does, for example. However, like data privacy laws, it requires transparency and notification of important information, including reporting to authorities as required, and to end users in the event of a security incident.
Manufacturers’ provision of clear information on cybersecurity measures and potential vulnerabilities in their products will enable informed decision-making by consumers. This is also a goal of data privacy laws like the GDPR.
Additionally, the regulation is quite clear on products’ need for security to prevent unauthorized access and to protect potentially sensitive personal data, also goals of privacy regulations.
Critical products and special requirements under the CRA
Hardware and software products with digital elements face different requirements under the CRA depending on factors like use. For example, some products are considered critical because under the NIS2 Directive essential entities critically rely on them.
Cybersecurity incidents or vulnerability exploitation involving these products could seriously disrupt crucial supply chains or networks, pose a risk to the safety, security, or health of users, and/or undermine the cybersecurity of other networks, products, or services. The European Commission will maintain the list of critical products. Examples include “Hardware Devices with Security Boxes” and smartcards.
Products considered critical will have to obtain a European cybersecurity certificate at the required assurance level (e.g. “substantial”) under an accepted European cybersecurity certification scheme where possible. There is also a list of “important” products that will need to meet conformity assessment requirements, though these are not classed as critical. These include VPNs, operating systems, identity management systems, routers, interconnected wearables, and more. The European Commission will also maintain this list.
What are the exclusions to Cyber Resilience Act compliance?
Certain products that are already covered by other product safety regulations are excluded from the scope of the CRA. These include motor vehicles, civil aviation vehicles, medical devices, products for national security or defense, etc. Hardware without digital elements would not be included, nor would products that can’t be connected to the internet or other network, or that can’t be exploited through cyber attack (e.g. it holds no data).
What are the penalties for noncompliance under the Cyber Resilience Act?
Failure to maintain adequate security standards, fix vulnerabilities, notify relevant authorities and parties about security incidents, or otherwise violating the CRA can result in fines up to EUR 15 million or 2.5 percent of global annual turnover for the preceding year, whichever is higher. These penalties are even higher than the first tier of penalties for GDPR violations.
Usercentrics and the Cyber Resilience Act
The CRA will apply to our products by 2027. However, Usercentrics takes security and data protection as seriously as we do valid consent under international privacy laws — today and every day. We are always evaluating our practices, from design to development to implementation and maintenance, and will continue to upgrade our products and systems to keep them, our company, and our partners and customers as safe as possible.