In addition to being convenient and efficient, single sign-on is meant to be a more secure way to handle user logins. Instead of having to set up and remember separate usernames and passwords for every site, individuals can log in with account credentials they already have. Accounts from large tech platforms, such as Google or Facebook, are popular options.

But that added security and convenience can result in a privacy violation if the requirements of data privacy laws, like the European Union’s General Data Protection Regulation (GDPR), are not met. Where is the data that’s collected stored? And who may have access to it?

This type of issue arose when visitors to a website managed by the European Commission were able to log in using social platform credentials, resulting in an infringement of EU citizens’ privacy rights.

We look at what happened and how, how the Commission was penalized, and what companies can learn in order to employ single sign-on compliantly.

Conference on the Future of Europe website login and the complaint

The Conference on the Future of Europe ran in 2021 and 2022, and visitors to the website could register for the various related events there. The European Commission (the Commission) managed the conference website. 

One of the login options for visitors interested in registering for conference events was single sign-on using social platform login credentials. Specifically, there was a “Sign in with Facebook” link on the login web page.

However, as Meta Platforms, Facebook’s parent company, is located in the United States, if an EU resident used this login method, it created the conditions for that user’s personal data to potentially be transferred to the United States without the individual’s knowledge or consent. 

Who was affected by the GDPR violation?

An individual residing in Germany logged in to the conference website and registered for the “GoGreen” event using his Facebook account credentials. According to the individual, in doing so his personal data, including IP address plus browser and device information, was collected and transferred to the US.

Amazon Web Services operated the Amazon CloudFront content delivery network used by the conference website, which is how his personal data was transferred. Amazon is also based in the United States.

The individual who made the complaint maintained that the data transfers created a risk of his data being accessed by US security and intelligence services. An additional claim was that neither the Commission nor conference organizers indicated that appropriate measures were in place to prevent or justify those data transfers if visitors used that sign-in method.

How did the European Commission violate the GDPR?

The Court of Justice of the European Union (CJEU) found that the “Sign in with Facebook” link on the conference website created the conditions for transferring the complainant’s personal data to Facebook, which, as noted, is based in the US. As the European Commission managed the conference website, it was responsible for the data transfer and contravened its own rules.

At the time the transfer occurred (the conference ran in 2021 and 2022), the US was not considered adequate for ensuring data protections for the personal data of EU residents. The EU-U.S. Privacy Shield framework had been struck down in 2020 and the EU-U.S. Data Privacy Framework, which introduced a new adequacy agreement between the two regions, was not enacted until 2023.

Additionally, the Commission was found not to have demonstrated nor claimed that an appropriate safeguard for personal data transfers was in place for personal data obtained and transferred via the login using Facebook account credentials, i.e. a standard contractual clause or data protection clause. Facebook’s platform entirely governed the terms and conditions of displaying — and as a result, logging in with — the “Sign in with Facebook” link.

The CJEU found that the Commission did not comply with the requirements of EU law for data transfers to a third country by “an EU institution, body, office or agency” (Chapter V GDPR).

How was the complaint resolved?

The complainant was awarded EUR 400 by the CJEU, to be paid by the European Commission, as compensation for non-material damage experienced due to the data transfers.

The complainant also sought several other methods of redress, all of which the CJEU dismissed. The Court found that in one instance the data was transferred to a server in Germany rather than to the United States, as Amazon Web Services is required to ensure that data remains in Europe in transit and at rest.

In another instance, the Court found that the complainant himself was responsible for the redirection of the data to US-based servers via the Amazon CloudFront routing mechanism: a technical adjustment, such as using a VPN, made him appear to be located in the US at the time.

How can companies operating in digital spaces protect their operations?

Single sign-on options using popular tech platforms are convenient. But companies that process, or may process, personal data from EU residents need to be aware of how the login process works, what personal data is collected, where it may be transferred to and stored, and who may have access to it. Users whose personal data may be processed also need to be informed and enabled to exercise their rights under relevant laws.

Facebook and Google are two popular platforms whose account credentials are used for single sign-on. Both are US-based companies, though they do have EU-based servers and data centers, as necessitated by certain legal requirements.

If providing such login options is necessary on your website, put in place the required agreements and/or contractual clauses for adequate data protection, and make sure users are adequately informed and their privacy rights, including consent or opt-out, are maintained.

This also goes for other third-party services that process users’ personal data, which many companies use on their websites for advertising, analytics, ecommerce fulfillment, and other functions. Under the GDPR and other data privacy laws, controllers are responsible for the privacy compliance and data security of third-party processors working for them.

Obtain informed and explicit consent from website visitors and others whose personal data is collected and processed for various purposes, so that they know about the data processing and third parties that may have access to their data, and can exercise their rights and consent choices. 

A consent management platform would have enabled the Commission to notify users about personal data collection and transfer, obtain their consent, and offer another login option to those who declined.
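As an illustrative sketch of that idea, the logic for gating a third-party SSO option behind consent might look like the following. The function and option names are invented for illustration, not a real CMP API:

```typescript
// Illustrative sketch, not a real CMP API: decide which login options to
// show based on the user's consent choice for third-party SSO.
type ConsentChoice = "granted" | "denied" | "unset";

function availableLoginOptions(ssoConsent: ConsentChoice): string[] {
  // A first-party email login involves no third-party data transfer,
  // so it is always available.
  const options = ["email"];
  // Third-party SSO is only surfaced once the user has explicitly
  // consented to the associated data processing and possible transfers.
  if (ssoConsent === "granted") {
    options.push("facebook-sso");
  }
  return options;
}
```

With this shape, a user who declines, or has not yet chosen, still has a working login path, while the third-party option only appears after an explicit opt-in.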

Usercentrics does not provide legal advice, and information is provided for educational purposes only. We always recommend engaging qualified legal counsel or privacy specialists regarding data privacy and protection issues and operations.

The General Data Protection Regulation (GDPR) sets strict standards for how organizations must handle personal data collected from individuals in the European Union (EU) and European Economic Area (EEA). This comprehensive data protection regulation applies to all organizations that collect or process this data — regardless of where the organization is located — if they offer goods or services to EU/EEA residents or monitor their behavior.

Among its many requirements, the GDPR places specific legal obligations on how organizations may handle special categories of personal data or sensitive personal data. These data categories receive additional protections due to their potential impact on an individual’s rights and freedoms if they are misused.

In this article, we’ll look at what constitutes sensitive personal data under the GDPR, what additional protections it receives, and the steps organizations can take to achieve compliance with the GDPR’s requirements.

What is sensitive personal data under the GDPR?

Sensitive personal data includes specific categories of data that require heightened protection under the GDPR, because their misuse could significantly impact an individual’s fundamental rights and freedoms.

Under Art. 9 GDPR, sensitive personal data is personal data revealing or concerning:

- racial or ethnic origin
- political opinions
- religious or philosophical beliefs
- trade union membership
- genetic data
- biometric data processed to uniquely identify a natural person
- health data
- a person’s sex life or sexual orientation

Recital 51 GDPR elaborates that the processing of photographs is not automatically considered processing of sensitive personal data. Photographs fall under the definition of biometric data only when processed through specific technical means that allow the unique identification or authentication of a natural person.

By default, the processing of sensitive personal data is prohibited under the GDPR. Organizations must meet specific conditions to lawfully handle such information.

This higher standard of protection reflects the potential risks associated with the misuse of sensitive personal data, which could lead to discrimination, privacy violations, or other forms of harm.

What is the difference between personal data and sensitive personal data?

Under the GDPR, personal data includes any information that can identify a natural person — known as a data subject under the regulation — either directly or indirectly. This may include details such as an individual’s name, phone number, email address, physical address, ID numbers, and even IP address and information collected via browser cookies.

While all personal data requires protection, sensitive personal data faces stricter processing requirements and heightened protection standards. Organizations must meet specific conditions before they can collect or process it.

The distinction lies in both the nature of the data and its potential impact if misused. Regular personal data helps identify an individual, while sensitive personal data can reveal intimate details about a person’s life, beliefs, health, financial status, or characteristics that could lead to discrimination or other serious consequences if compromised.

Conditions required for processing GDPR sensitive personal data

Under the GDPR, processing sensitive personal data is prohibited by default. However, Art. 9 GDPR outlines specific conditions under which processing is allowed.

- Explicit consent given by the data subject, with the right to withdraw
- Required for legal obligations in employment or social protection
- Necessary to protect life when consent cannot be given
- Processed by nonprofits for members, without external disclosure
- Made publicly available by the data subject
- Required for legal proceedings or judicial actions
- Necessary for substantial public interest under the law
- Needed for medical care, diagnosis, or health system management
- Used for disease control or medical safety, with confidentiality safeguards
- Required for archiving, scientific research, or statistical purposes

The GDPR authorizes EU member states to implement additional rules or restrictions for processing genetic, biometric, or healthcare data. They may establish stricter standards or safeguards beyond the regulation’s requirements.

Art. 4 GDPR defines consent as “any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her.”

Although the GDPR does not separately define explicit consent, it does require a clear and unambiguous action from users to express their acceptance of data processing. In other words, users must take deliberate steps to consent to their personal data being collected. Pre-ticked boxes, inactivity, or implied consent through continued use of a service do not meet GDPR requirements for explicit consent.
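A minimal sketch of these criteria as a validation check follows; the interface, field names, and action names are invented for illustration:

```typescript
// Minimal sketch of the explicit-consent criteria described above.
// The interface and action names are invented for illustration.
interface ConsentSignal {
  action: "checkbox_ticked" | "button_clicked" | "preticked_box" | "inactivity";
  informed: boolean; // the user saw specific information about the processing
  specific: boolean; // the consent is tied to a named purpose
}

function isExplicitConsent(signal: ConsentSignal): boolean {
  // Pre-ticked boxes and inactivity are not clear affirmative actions.
  const affirmative =
    signal.action === "checkbox_ticked" || signal.action === "button_clicked";
  return affirmative && signal.informed && signal.specific;
}
```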

Common examples of explicit consent mechanisms include:

- ticking an unchecked opt-in checkbox
- clicking a clearly labeled “I agree” button
- confirming consent via a double opt-in email
- providing a written or signed statement

Additional compliance requirements for processing sensitive personal data under the GDPR

Organizations processing personal data under the GDPR must follow several core obligations. These include maintaining records of processing activities, providing transparent information on data practices, and adhering to principles such as data minimization and purpose limitation. However, processing sensitive personal data requires additional safeguards due to the potential risks involved.

Data Protection Officer (DPO)

Organizations with core activities that involve large-scale processing of sensitive personal data must appoint a Data Protection Officer (DPO) under Art. 37 GDPR. The DPO may be an employee of the organization or an outside consultant.

Among other responsibilities, the DPO monitors GDPR compliance, advises on data protection obligations, and acts as a point of contact for regulatory authorities.

Data Protection Impact Assessment (DPIA)

Art. 35 GDPR requires a Data Protection Impact Assessment (DPIA) for processing operations that are likely to result in high risks to individuals’ rights and freedoms. A DPIA is particularly important when processing sensitive data on a large scale. This assessment helps organizations identify and minimize data protection risks before beginning processing activities.

Restrictions on automated processing and profiling

Art. 22 GDPR prohibits automated decision-making, including profiling, based on sensitive personal data unless one of the following applies:

- the data subject has given explicit consent (Art. 9(2)(a) GDPR)
- the processing is necessary for reasons of substantial public interest, on the basis of EU or member state law (Art. 9(2)(g) GDPR)

If automated processing of sensitive personal data is permitted under these conditions, organizations must implement safeguards to protect individuals’ rights and freedoms.

Penalties for noncompliance with the GDPR

GDPR penalties are substantial. There are two tiers of fines, based on the severity of the infringement and whether it is a repeat offense.

For severe infringements, organizations face fines up to:

- EUR 20 million or 4 percent of the organization’s total worldwide annual turnover from the preceding financial year, whichever is higher

Less severe violations can result in fines up to:

- EUR 10 million or 2 percent of the organization’s total worldwide annual turnover from the preceding financial year, whichever is higher

While violations involving sensitive personal data are often categorized as severe, supervisory authorities will consider the specific circumstances of each case when determining penalties.

Practical steps for organizations to protect GDPR sensitive personal data

Organizations handling sensitive personal data must take proactive measures to meet GDPR requirements and protect data subjects’ rights.

Conduct data mapping

Organizations should identify and document all instances in which sensitive personal data is collected, processed, stored, or shared. This includes tracking data flows across internal systems and third-party services. A thorough data inventory helps organizations assess risks, implement appropriate safeguards, and respond to data subject requests efficiently.

Develop internal policies

Establish clear internal policies and procedures to guide employees through the proper handling of sensitive personal data. These policies should cover, among other things, data access controls, storage limitations, security protocols, and breach response procedures, as well as specific procedures for data collection, storage, processing, and deletion. Organizations should conduct regular training programs to help employees understand their responsibilities and recognize potential compliance risks.

The GDPR requires businesses to obtain explicit consent before processing sensitive personal data. Consent management platforms (CMPs) like Usercentrics CMP provide transparent mechanisms for users to grant or withdraw explicit consent, which enables organizations to be transparent about their data practices and maintain detailed records of consent choices.
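A hedged sketch of what such a detailed consent record might look like follows; the field names are illustrative, not Usercentrics’ actual data model:

```typescript
// Hedged sketch of an auditable consent record; field names are
// illustrative, not Usercentrics' actual data model.
interface ConsentRecord {
  userId: string;
  purpose: string;
  status: "granted" | "withdrawn";
  timestamp: string; // ISO 8601: when this status took effect
}

function withdrawConsent(record: ConsentRecord, at: Date): ConsentRecord {
  // Withdrawal produces a new record rather than overwriting history,
  // so the organization can demonstrate the full consent lifecycle.
  return { ...record, status: "withdrawn", timestamp: at.toISOString() };
}
```

Keeping withdrawal as a new record rather than a deletion preserves the evidence trail the GDPR’s accountability principle calls for.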

Manage third-party relationships

Many businesses rely on third-party vendors to process sensitive personal data, so it’s essential that these partners meet GDPR standards. Organizations should implement comprehensive data processing agreements (DPAs) that define each party’s responsibilities, outline security requirements, and specify how data will be handled, stored, and deleted. Businesses should also conduct due diligence on vendors to confirm their compliance practices before engaging in data processing activities. 

Perform regular audits

Conducting periodic reviews of data processing activities helps businesses identify compliance gaps and address risks before they become violations. Review consent management practices, security controls, and third-party agreements on a regular basis to maintain GDPR compliance and respond effectively to regulatory scrutiny.

Checklist for GDPR sensitive personal data handling compliance

Below is a non-exhaustive checklist to help your organization handle sensitive personal data in compliance with the GDPR. This checklist includes general data processing requirements as well as additional safeguards specific to sensitive personal data. 

- Obtain explicit consent before processing sensitive data
- Make consent withdrawal simple and accessible
- Stop processing data if consent is withdrawn
- Implement strong security measures to protect sensitive data
- Document all processing activities with their purpose, legal basis, and retention periods
- Create clear privacy policies about data usage and users’ rights
- Review and update data protection policies often
- Train employees on GDPR requirements and data handling rules
- Set up data breach detection and reporting systems
- Conduct Data Protection Impact Assessments (DPIAs) for high-risk data processing activities
- Assess whether you need a Data Protection Officer
- Review third-party processor compliance regularly

For advice specific to your organization, we strongly recommend consulting a qualified legal professional or data privacy expert.


Google Analytics is a powerful tool for understanding website performance, user behavior, and traffic patterns. However, its compliance with the General Data Protection Regulation (GDPR) has been a subject of concern and controversy, particularly in the European Union (EU). The data protection authorities of several EU countries have weighed in on privacy compliance issues with Google Analytics, with similar complaints focusing on insufficient protections and data transfer practices.

In this article, we’ll examine the timeline of EU-US data transfers and the law, the relationship between Google Analytics and data privacy, and whether Google’s popular service is — or can be — GDPR-compliant.

Google Analytics and data transfers between the EU and US

One of the key compliance issues with Google Analytics is its storage of user data, including EU residents’ personal information, on US-based servers. Because Google is a US-owned company, the data it collects is subject to US surveillance laws, potentially creating conflicts with EU privacy rights.

The EU-US Privacy Shield was invalidated in 2020 by the Schrems II ruling. Updated Standard Contractual Clauses (SCCs) for EU to US data transfers were adopted in June 2021 and phased in from late September 2021. SCCs were viewed as a somewhat adequate safeguard only if additional measures, like encryption or anonymization, were in place to make the data inaccessible to US authorities.

A wave of rulings against Google Analytics after the invalidation of the Privacy Shield

The Schrems II ruling sparked a series of legal complaints and decisions by European data protection authorities (DPAs), which declared the use of Google Analytics noncompliant with the GDPR.

A week before the Austrian data protection authority’s January 2022 ruling that a website’s use of Google Analytics violated the GDPR, the European Data Protection Supervisor (EDPS) sanctioned the European Parliament for using Google Analytics on its COVID testing sites due to insufficient data protections. This is viewed as one of the earliest post-Schrems II rulings, and it set the tone for additional legal complaints.

The EU-U.S. Data Privacy Framework

On July 10, 2023, the European Commission adopted its adequacy decision for the EU-U.S. Data Privacy Framework, which covers data transfers among the EU, European Economic Area (EEA) and the US in compliance with the GDPR.

The framework received some criticism from experts and stakeholders. Some privacy watchdogs, including the European Data Protection Board (EDPB), pointed out striking similarities between the new and the previous agreements, raising doubts about its efficacy in protecting EU residents’ data.

As of early 2025, the EU-U.S. Data Privacy Framework and adequacy for EU/U.S. data transfers are in jeopardy. President Trump fired all of the Democratic party members of the Privacy and Civil Liberties Oversight Board (PCLOB). As a result, the number of PCLOB board members is below the threshold that enables the PCLOB to operate as an oversight body for the EU-U.S. Data Privacy Framework.

This action will likely undermine the legal validity of the Framework for EU authorities, particularly the courts. The EU Commission could withdraw its adequacy decision for the EU-U.S. Data Privacy Framework, which would invalidate it. The Court of Justice of the EU (CJEU) could also overturn the Commission’s adequacy decision following a legal challenge. The latter is how the Framework’s predecessors were struck down, e.g. the Privacy Shield via the Schrems II ruling.

Should the EU-U.S. Data Privacy Framework be struck down, it could have significant effects on data transfers, cloud storage, and the function of platforms based outside of the EU, like those from Google, including Analytics. At the very least, Google may be required to make further changes to the function of tools like Google Analytics, along with related data storage, to meet European privacy standards.

Is Google Analytics 4 GDPR-compliant?

Google Analytics 4 introduces several significant changes compared to Universal Analytics. The new version adopts an event-based measurement model, in contrast to the session-based data model of Universal Analytics. This shift enables Google Analytics 4 to capture more granular user interactions, better reflecting the customer journey across devices and platforms. Website owners can turn off granular location and device data collection to stop it from gathering data such as city, latitude, and longitude. Website owners also have the option to delete user data upon request.

Another notable feature is that Google Analytics 4 does not log or store IP addresses from EU-based users. According to Google, this is part of Google Analytics 4’s EU-focused data and privacy measures. This potentially addresses one of the key privacy concerns raised by the Data Protection Authorities, which found that anonymizing IP addresses was not an adequate level of protection.

The EU-U.S. Data Privacy Framework alone doesn’t make Google Analytics 4 GDPR-compliant. The framework can make data transfers to the US compliant, if they are with a certified US company, but the onus is on website owners to ensure that the data was collected in compliance with the legal requirements of the GDPR in the first place.

How to make Google Analytics GDPR compliant

1. Obtain explicit consent before activating cookies

All Google Analytics cookies should be set up and controlled so they only activate after users have granted explicit consent. Users should also have granular control so that they can choose to allow cookies for one purpose while rejecting cookies for another.

A consent management platform (CMP) like Usercentrics can block the activation of services until user consent has been obtained. If consent is never granted, Google Analytics cannot transfer user data, because it never collects it in the first place.
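A simplified sketch of that blocking behavior follows. The loader is passed in and simulated here; in a browser, a real CMP would control the injection of the script tag itself:

```typescript
// Sketch: a CMP-style gate that only injects a tracking script once consent
// exists. The loader is simulated; in a browser the CMP would create the
// script element itself.
type Loader = (src: string) => void;

function gateScript(src: string, hasConsent: boolean, load: Loader): boolean {
  if (!hasConsent) {
    // Without consent the script never runs, so no data is collected and
    // nothing can be transferred anywhere.
    return false;
  }
  load(src);
  return true;
}
```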

2. Implement Google Consent Mode

Google Consent Mode allows websites to dynamically adjust the behavior of Google tags based on the user’s consent choices regarding cookies. This feature ensures that measurement tools, such as Google Analytics, are only used for specific purposes if the user has given their consent, even though the tags are loaded onto the webpage before the cookie consent banner appears. By implementing Google Consent Mode, websites can modify the behavior of Google tags after the user allows or rejects cookies so that no data is collected without consent.
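The standard Consent Mode pattern sets default consent states before any Google tags fire, then updates them once the user chooses. The sketch below mirrors that pattern with a local dataLayer and gtag shim so it runs outside a browser; on a real page these would be window.dataLayer and the standard gtag snippet:

```typescript
// Sketch of the Google Consent Mode pattern: consent defaults are set before
// any Google tags fire, then updated after the user makes a choice. The
// dataLayer and gtag shim are local so the example runs outside a browser.
const dataLayer: unknown[][] = [];
function gtag(...args: unknown[]): void {
  dataLayer.push(args);
}

// Defaults must be pushed before any Google tags load, so tags start in a
// denied state and collect no consent-dependent data.
gtag("consent", "default", {
  analytics_storage: "denied",
  ad_storage: "denied",
});

// Later, once the user accepts analytics in the consent banner:
gtag("consent", "update", { analytics_storage: "granted" });
```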


3. Provide transparent privacy and cookie policies

Website operators must provide clear, transparent data processing information for users on the website. This information is included in the privacy policy. Information related specifically to cookies should be provided in the cookie policy, with details of the Google Analytics cookies and other tracking technologies used on the site, including the data collected, the provider, duration, and purpose. The cookie policy is often a separate document, but it can be a section within the broader privacy policy.

The GDPR requires user consent to be informed, which is what the privacy policy is intended to enable. To help craft a GDPR-compliant privacy policy, extensive information on the requirements can be found in Articles 12, 13 and 14 GDPR.

4. Enter into a Data Processing Agreement with Google

A data processing agreement (DPA) is a legally binding contract and a crucial component of GDPR compliance. The DPA covers important aspects such as confidentiality, security measures and compliance, data subjects’ rights, and the security of processing. It helps to ensure that both parties understand their responsibilities and take appropriate measures to protect personal data. Google provides step-by-step instructions on how to accept its DPA.

Can server-side tracking make Google Analytics more privacy-friendly?

Server-side tracking allows for the removal or anonymization of personally identifiable information (PII) before it reaches Google’s servers. This approach can improve data accuracy by circumventing client-side blockers, and it offers a way to better align with data protection regulations like the GDPR. By routing data through your own server first, you gain more control over what is eventually sent to Google Analytics.
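A sketch of that approach: anonymize the IP address and drop direct identifiers before a hit is forwarded to the analytics backend. The Hit shape and its fields are illustrative:

```typescript
// Sketch of server-side anonymization before forwarding to an analytics
// backend; the Hit shape and fields are illustrative.
interface Hit {
  ip: string;
  userAgent: string;
  page: string;
  email?: string; // direct identifier that should never reach the vendor
}

function anonymizeHit(hit: Hit): Hit {
  // Zero the final IPv4 octet so the address no longer pinpoints one host.
  const out: Hit = { ...hit, ip: hit.ip.replace(/\.\d+$/, ".0") };
  // Remove direct identifiers entirely before the hit leaves your server.
  delete out.email;
  return out;
}
```

Because the transformation runs on your own server, the vendor only ever receives the reduced record, which is the control the paragraph above describes.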

Impact of the Digital Markets Act on Google Analytics 4


The implementation of the Digital Markets Act (DMA) has had some impact on Google Analytics 4, affecting functions, data collection practices, and privacy policies. Website owners who use the platform have been encouraged to take the following steps for ongoing compliance.

  1. Audit your privacy policy, cookies policy and data practices.
  2. Conduct a data privacy audit to check compliance with GDPR, and take any corrective steps if necessary.
  3. Install a CMP that enables GDPR compliance to obtain valid user consent per the regulation’s requirements.
  4. Seek advice from qualified legal counsel and/or a privacy expert, like a Data Protection Officer, on measures required specific to your business.

Learn more about DMA compliance.

How to use Google Analytics 4 and achieve GDPR compliance with Usercentrics CMP

To meet the conditions of Art. 7 GDPR for valid user consent, website operators must obtain explicit end-user consent for all Google Analytics cookies set by the website, before those cookies are activated and in operation. Usercentrics’ DPS Scanner helps identify and communicate to users all cookies and tracking services in use on websites to help ensure full consent coverage.

Next steps with Google Analytics and Usercentrics

Google Analytics helps companies pursue growth and revenue goals, so understandably, businesses are caught between not wanting to give that up and not wanting to risk GDPR violation penalties or the ire of their users over lax privacy or data protection.

The Usercentrics team closely monitors regulatory changes and legal rulings, makes updates to our services and posts recommendations and guidance as appropriate. 

However, website operators should always get relevant legal advice from qualified counsel regarding data privacy, particularly in jurisdictions relevant to them. This includes circumstances where there could be data transfers outside of the EU to countries without adequacy agreements for data privacy protection.

As the regulatory landscape and privacy compliance requirements for companies are complex and ever-changing, we’re here to help.

The Interactive Advertising Bureau (IAB) Tech Lab launched the Global Privacy Platform (GPP) in 2022. The GPP is a project of the IAB Tech Lab’s Global Privacy Working Group and part of its portfolio of solutions, and it is the result of significant collaboration among industry stakeholders, including leading tech companies and experts around the world.

In line with aspects of the evolution of data privacy, the GPP enables streamlined transmission of signals from websites and apps to ad tech vendors and advertisers. This includes consent, preferences, permissions, and other relevant and often legally required information that affects data handling tools and processing. We look at how this tool can benefit publishers as data privacy compliance requirements expand and evolve, especially across digital marketing platforms.

What is the Global Privacy Platform (GPP)?

The GPP provides a framework for publishers that works similarly to the TCF or Google Consent Mode. Where Consent Mode signals consent information to Google services’ tags to control use of cookies and trackers, the GPP is a protocol that enables simple and automated communication of users’ consent and preference choices via a signal to third parties like ad tech vendors. 

The GPP enables advertisers, publishers, and technology vendors in the digital advertising industry to adapt to regulatory demands over time and across markets. It employs a GPP String, which encapsulates and encodes transparency details and consumer choices (like granular consent) as applicable to each region, helping enable compliance with privacy requirements by jurisdiction.

How does the GPP signal work?

Digital property owners, like companies running websites or apps, are responsible for generating, transmitting, and documenting the GPP String and the information it sends. This enables data integrity and contributes to compliance.

Usercentrics CMP generates and manages the GPP String in an HTTP-transferable and encoded format. Ad tech vendors receive user choice information for consent and preferences, and can decode the GPP String to determine compliance requirements and status for each user. 

The format’s flexibility enables granular regulatory coverage, e.g. state-specific strings for US states where data privacy laws are in effect. The GPP covers 15 states as of early 2025, and five more are expected to gain coverage this year. Country and regional strings, like those for the US and EU, are also supported, as are non-geographic signals like those from Global Privacy Control, which are browser-based; to date, only some laws require recognizing them.
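Structurally, a GPP String consists of a header section followed by one encoded section per applicable jurisdiction, delimited by "~". A minimal sketch of splitting a string into those parts follows; the payload values in the test are invented placeholders, and decoding the sections themselves requires the full IAB Tech Lab specification:

```typescript
// Sketch of GPP String structure: a header section plus one encoded section
// per applicable jurisdiction, delimited by "~". Decoding the sections
// themselves requires the full IAB Tech Lab specification.
function splitGppString(gpp: string): { header: string; sections: string[] } {
  const parts = gpp.split("~");
  if (parts.length < 2) {
    throw new Error("expected a header and at least one jurisdiction section");
  }
  return { header: parts[0], sections: parts.slice(1) };
}
```

A vendor receiving the string would read the header to learn which sections are present, then decode only the sections for the jurisdictions it must honor.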

The GPP is designed to evolve as the data privacy and regulatory landscape does, not requiring significant redevelopment when requirements change. The IAB Tech Lab’s Global Privacy Working Group handles the ongoing work of the GPP’s technical specification.

Why do publishers and others in ad tech need the GPP?

The majority of the world’s population is now covered by at least one data privacy law. Some regions, like the European Union, have multiple laws that intersect in various ways. Additionally, these regulations affect major tech platforms, which are adopting more stringent requirements for their customers to enable privacy-compliant ecosystems. This has significant effects on digital advertising, as major players like Google and Facebook adapt their operations and requirements. 

Additionally, in the United States, there isn’t one federal law to comply with. To date the data privacy laws are state-level, so a company could have to comply with one or ten or more as the regulatory landscape continues to evolve. However, many of these US regulations are fairly similar, which does support the “US National” signaling approach. Companies need tools, like a consent management platform and the Global Privacy Platform, designed to evolve with changes and expansion in regulations.

The GPP is designed for flexibility and scalability. It supports all current privacy signals and will be able to support future ones as new laws are passed and existing ones evolve. The architecture is designed to grow with companies’ operations, enabling publishers to better respect users’ privacy choices and more effectively signal them to vendors and partners.

Does the GPP affect the TCF?

The GPP isn’t the IAB’s first framework for publishers and ad tech. The Transparency & Consent Framework (TCF) was launched in 2018, the same year the EU’s General Data Protection Regulation (GDPR) came into effect. As of 2024, the TCF is at version 2.2.

The GPP is designed to better meet the needs of publishers that need to signal consent across multiple jurisdictions, as many companies doing business around the world — or across the United States — need to do.

The plan is to ensure that updates made to the TCF over time are also reflected in the GPP, giving companies the best tools to achieve and maintain compliance with their digital advertising operations. Eventually, the goal is for the Global Privacy Platform — as the name suggests — to be the single framework for consent and preference signaling.

In Europe and the UK, Google will continue to use the TCF and will not be accepting the GPP signal. Using Ad Manager will still require the use of a certified consent management platform integrated with the TCF. TCF strings sent through the GPP won’t be accepted.
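In practical terms, a publisher’s signaling logic has to branch on region when Google ads are involved. A hypothetical routing helper sketching that rule (the function name and region codes are ours, not an official Google or IAB API):

```python
def required_framework(region: str, serving_google_ads: bool) -> str:
    # Google Ad Manager in the EU/EEA/UK requires TCF v2.2 via a
    # certified CMP, and Google does not accept TCF strings carried
    # inside the GPP. Elsewhere, the GPP signal applies (e.g. the
    # US state-specific or US National sections).
    if serving_google_ads and region in {"EU", "EEA", "UK"}:
        return "TCF v2.2"
    return "GPP"
```

A real implementation would come from CMP configuration rather than hand-written dispatch, but the branching logic is the same.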

What is the multi-state privacy agreement (MSPA) and how does the GPP affect it?

The Multi-State Privacy Agreement (MSPA) is an industry-centric contractual framework for companies doing business in the US, which covers 19 states as of early 2025. It’s meant to “aid advertisers, publishers, agencies, and ad tech intermediaries in complying with five state privacy laws.” The IAB Tech Lab is prioritizing updates to MSPA/US National before providing further state-specific strings, though that’s expected later in 2025. 

The MSPA evolved from the IAB’s Limited Service Provider Agreement (LSPA), which dates from 2020 and initially focused on CCPA/CPRA compliance. Its evolution has focused on legal standards, protecting consumers’ privacy rights, and working with the GPP (including the specific privacy strings for each state). The MSPA is also designed for flexibility and scalability as US data privacy challenges become more complex.

The Global Privacy Platform currently supports various privacy signals around the world, both for the IAB’s own frameworks and for external ones. Some US state-level data privacy laws require recognizing a universal opt-out mechanism like Global Privacy Control, but not all of them do.
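Global Privacy Control itself is straightforward to detect: per the GPC specification, participating browsers attach a `Sec-GPC: 1` request header (and expose `navigator.globalPrivacyControl` in JavaScript). A minimal server-side sketch:

```python
def gpc_opt_out(headers: dict) -> bool:
    # Per the GPC specification, only the exact value "1" in the
    # Sec-GPC request header constitutes an opt-out signal;
    # absence of the header, or any other value, means no signal.
    return headers.get("Sec-GPC", "").strip() == "1"
```

Whether that signal must be honored as a legally binding opt-out depends on the applicable state law, which is exactly the kind of variation the GPP’s state-specific strings are designed to carry downstream.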

GPP and international privacy laws

The Global Privacy Platform was designed to address the increasing complexity of data privacy regulation and requirements. Many companies do business across international jurisdictions and have many partners and vendors that they work with. This is only going to increase.

GPP and the GDPR

Europe has led the way in modern data privacy with the GDPR, TCF, and other relevant regulations and frameworks. It was the IAB Europe that brought the TCF to the market, and the GPP supports the EU TCF v2 signal. As noted, Google does not currently support the TCF via the GPP, so until industry adoption changes, this implementation isn’t recommended.

One of the main goals of the TCF was to help organizations meet GDPR compliance requirements, and the GPP is meant to extend this mandate.

GPP and PIPEDA

In Canada, data privacy is governed by the Personal Information Protection and Electronic Documents Act (PIPEDA), which has been in effect since 2000, and a lot has changed since then. There are a number of requirements in PIPEDA and Quebec’s Law 25 that the GPP helps address, and the Platform already supports the CA TCF signal. Here are some of the benefits.

GPP and US privacy laws

The patchwork of data privacy laws and requirements in the US was a major factor in building out the Global Privacy Platform. As of the end of 2024, 21 data privacy laws have been passed by US state-level governments, which can introduce a lot of complexity into doing business. 

The IAB Tech Lab created the US Privacy Specifications, which have been used to support the CCPA Compliance Framework. However, a lot more laws have been passed since the CCPA came into effect. As of 2023, the US Privacy Specifications are not being updated, and have been replaced by state-specific privacy strings available via the GPP.

However, IAB MSPA US National also provides a national approach to compliance with state-level privacy laws by applying the highest standard among them.

Additionally, the GPP is designed to evolve and scale with further data privacy regulatory requirements in the US, and to enable companies to manage consent and preferences across vendor relationships in a streamlined way. This will also be relevant as more and more platforms evolve their data privacy requirements.

How Usercentrics supports the Global Privacy Platform

Usercentrics currently supports the GPP and is working toward additional regulatory coverage. Direct support from the Consent Management Platform’s Admin Interface is also being developed, along with further enhancements. 

The Usercentrics CMP integrates with the GPP and generates the necessary GPP string to signal consent information.

Companies serving Google ads in the EU, EEA, or UK also continue to need a Google-certified CMP like Usercentrics CMP, which comes with the TCF v2.2 integrated, since, as noted, Google will continue to only support this format and is not accepting TCF strings sent through the GPP.

As complexity and requirements for data privacy continue to evolve, and as individuals become more invested in their privacy and choice, it’s never been more important to invest in reliable, scalable tools to obtain, manage, and signal valid consent — in every region where you do business. It’s becoming a key competitive advantage to grow trust and revenue.

As more and more digital platforms adapt to regulatory requirements as well, your company’s international advertising operations will increasingly depend on how well you’ve implemented consent and preference management with tools like Usercentrics CMP and the Global Privacy Platform. The era of Privacy-Led Marketing is here, and Usercentrics has the tools to help you embrace it and grow with confidence.

In 2019, New York’s data breach laws underwent significant changes when the SHIELD Act was signed into law. The regulation has continued to evolve, with new amendments in December 2024. This article outlines the SHIELD Act’s requirements for businesses that protect and handle New York state residents’ private information, from security safeguards to breach notifications.

What is the New York SHIELD Act?

The New York Stop Hacks and Improve Electronic Data Security Act (New York SHIELD Act) established data breach notification and security requirements for businesses that handle the private information of New York state residents. The law updated the state’s 2005 Information Security Breach and Notification Act with expanded definitions and additional safeguards for data protection.

The New York SHIELD Act introduced several requirements to protect New York residents’ data. These include:

The law also increased penalties for noncompliance with its data security and breach notification requirements.

The New York SHIELD Act was implemented in two phases: 

Who does the New York SHIELD Act apply to?

The New York SHIELD Act applies to any person or business that owns or licenses computerized data containing the private information of New York state residents. It applies regardless of whether the business itself is located in New York. This scope marked a significant expansion from the previous 2005 law, which only applied to businesses operating within New York state. The law’s extraterritorial reach means that organizations worldwide must comply with its requirements if they possess private information of New York residents, even if they conduct no business operations within the state.

What is a security breach under the New York SHIELD law?

The New York SHIELD Act expanded the definition of a security breach beyond the 2005 law’s limited scope. The previous law only considered unauthorized acquisition of computerized data as a security breach. The New York SHIELD Act includes the following actions that compromise the security, confidentiality, or integrity of private information:

The law provides specific criteria to determine unauthorized access by examining whether an unauthorized person viewed, communicated with, used, or altered the private information.

What is private information under the New York SHIELD Act?

The New York SHIELD law defines two types of information: personal and private.

Personal information includes any details that could identify a specific person, such as their name or phone number.

Under the 2005 law, private information was defined as personal information concerning a natural person combined with one or more of the following: 

The New York SHIELD Act expands this definition of private information to include additional elements:

The law specifically states that publicly available information is not considered private information.

This definition is set to expand once again. On December 21, 2024, Governor Kathy Hochul signed two bills that strengthened New York’s data breach notification laws. Under one of the amendments, effective March 21, 2025, private information will include:

What are the data security requirements under the New York SHIELD Act?

This New York data security law requires any person or business that maintains private information to implement reasonable safeguards for its protection. There are three categories of safeguards required: administrative, technical, and physical.

Administrative safeguards include:

Technical safeguards include:

Physical safeguards include:

Businesses are deemed compliant with these safety requirements if they are subject to and compliant with certain federal laws, such as the Gramm-Leach-Bliley Act (GLBA), the Health Insurance Portability and Accountability Act (HIPAA), and the Health Information Technology for Economic and Clinical Health Act (HITECH).

What are the data breach notification requirements under the New York SHIELD law?

The New York SHIELD Act sets specific requirements for how and when businesses must notify individuals and authorities about data breaches involving private information.

The law previously required businesses that discover a security breach of computer data systems containing private information to notify affected consumers “in the most expedient time possible and without unreasonable delay.” The December 2024 amendment added a specific timeline to this requirement. Businesses now have a maximum of 30 days in which to notify affected New York state residents of data breaches. The 30-day time limit came into effect immediately upon the bill being signed.

The New York SHIELD Act also previously required businesses to notify three state agencies about security breaches: 

The December 2024 amendment added a fourth state agency to be notified, with immediate effect: the New York State Department of Financial Services. 

These notices must include information about the timing, content, distribution of notices, and approximate number of affected persons, as well as a copy of the template of the notice sent to affected persons. If more than 5,000 New York state residents are affected and notified, businesses must also notify consumer reporting agencies about the timing, content, distribution of notices, and approximate number of affected persons.
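The notification duties above reduce to a simple routing rule. This sketch mirrors the article’s description (the recipient labels are illustrative, not statutory terms):

```python
def notification_recipients(affected_ny_residents: int) -> list:
    # Always notify affected individuals (within 30 days under the
    # December 2024 amendment) and the four state agencies; add
    # consumer reporting agencies once more than 5,000 New York
    # residents are affected and notified.
    recipients = [
        "affected individuals",
        "state agencies (4)",
    ]
    if affected_ny_residents > 5000:
        recipients.append("consumer reporting agencies")
    return recipients
```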

The law introduced specific restrictions on methods for notifying affected consumers. Email notifications are not permitted if the compromised information includes an email address along with a password or security question and answer that could allow access to the online account.

All notifications must provide contact information for the person or business notifying affected persons as well as telephone numbers and websites for relevant state and federal agencies that offer guidance on security breach response and identity theft prevention.

Enforcement of the New York SHIELD Act and penalties for noncompliance

The New York Attorney General has the authority to enforce the New York SHIELD Act, with the power to pursue injunctive relief, restitution, and penalties against businesses that violate the law.

The law establishes different levels of penalties based on the nature and severity of the violations. When businesses fail to provide proper breach notifications, but their actions are not reckless or intentional, courts may require them to pay damages that cover the actual costs or losses experienced by affected persons.

More severe penalties apply to knowing and/or reckless violations of notification requirements. In these cases, courts can impose penalties of up to USD 5,000 or USD 20 per instance of failed notification, whichever amount is greater. These penalties are capped at USD 250,000.

Businesses that fail to implement reasonable safeguards as required by the law face separate penalties. Courts can impose fines of up to USD 5,000 for each violation of these security requirements.
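The penalty arithmetic for knowing or reckless notification failures works out as follows (a sketch of the figures as stated above, not legal advice):

```python
def notification_penalty_usd(failed_notifications: int) -> int:
    # Greater of USD 5,000 or USD 20 per instance of failed
    # notification, capped at USD 250,000.
    return min(max(5000, 20 * failed_notifications), 250000)
```

For example, 1,000 missed notifications would yield USD 20,000, while 20,000 missed notifications would hit the USD 250,000 cap.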

Impact of the New York SHIELD Act on businesses

The New York SHIELD law imposes significant obligations for any organization handling New York residents’ private information, regardless of location. Businesses must implement comprehensive data security programs with specific safeguards, meet strict breach notification deadlines, and prepare for expanded data protection requirements.

Key impacts include:

New York SHIELD Act Compliance Checklist


Below is a non-exhaustive checklist to help your business comply with the New York SHIELD Act. For advice specific to your organization, it’s strongly recommended to consult a qualified legal professional.

The Norwegian Electronic Communications Act (E-com Act / Ekomloven) has been updated, effective January 1, 2025. This follows the Norwegian Parliament (Stortinget) adopting a proposal submitted by the Norwegian Ministry of Digitalisation and Public Governance in November 2024. Previously, Norway’s cookie use and consent requirements were notably more lax than European standards.

The revision better aligns Norwegian regulation of cookie use with the GDPR and ePrivacy Directive (though Norway is not an EU Member State). It introduces stricter standards for obtaining and managing user consent for use of cookies and other tracking technologies.

Norway also has the Personal Data Act to protect data and privacy when data processing occurs, with oversight and enforcement by the Norwegian Data Protection Authority (Datatilsynet).

The updated guidelines affect all businesses operating websites or applications that have or target Norwegian users: both Norway-based businesses and international companies with platforms, products, or services used by Norwegians.

Specific platforms and parameters that will be affected:

The E-com Act’s consent requirements are now aligned with the stricter consent standards of the GDPR. As in Art. 4(11) GDPR, consent must be “freely given, informed, specific, and unambiguous.” 

Active, explicit consent is mandatory, so users must perform a specific action to indicate that they are giving consent. Construing an ignored consent banner as consent is not allowed, nor are passive mechanisms like pre-checked boxes or reliance on browser settings. Previously, some such passive approaches were acceptable under the law.

Businesses must also enable users to modify or withdraw previously granted consent at any time. The tools to do so must also be user-friendly to be compliant with the law’s requirements.

Only cookies or tracking technologies classified as “strictly necessary” can be used to collect data without obtaining user consent. What qualifies has been refined and now only includes cookies required for the basic operation of a website or app, e.g. shopping cart functionality or maintaining an active login session.

Analytics, marketing, and user preference cookies are not strictly necessary and do require valid user consent prior to being activated. 

The law does not set a maximum lifespan for cookies, but it does require transparency from businesses about the cookies and trackers in use, including what data is collected and for what purposes, how long it will be retained, which parties it may be shared with, and what users’ rights are and how they can exercise them.

Additionally, companies that meet the law’s criteria must deploy a cookie consent banner that meets the new guidelines’ requirements. There must be mechanisms that equally enable users to consent to cookie use or decline it, to manage consent at a granular level, and to easily modify or withdraw it.

Companies must provide information about cookie use and consent in an easily accessible way on their website or app, including the E-com Act’s rules for cookie use, and details about which cookies or other tracking technologies are in use, what data is processed and why, and the processor’s identity.

Websites must remain accessible to users who refuse cookies, so cookie walls are not allowed, though it is acceptable for some functionality to be reduced slightly if a user declines cookies.

Companies also need to document and securely store users’ consent information over time, and be able to provide it in the event of a data request or audit.
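A minimal sketch of what such a consent record might capture (all field names are hypothetical; the Act prescribes the documentation duty, not a schema, and a CMP normally handles this storage):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    # Illustrative fields for the documentation duty described
    # above: who consented, to which cookie categories, when, and
    # via which banner version, so the record can be produced in
    # the event of a data request or audit.
    user_id: str
    granted_categories: list
    declined_categories: list
    banner_version: str
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

    def withdraw(self, category: str) -> None:
        # Users must be able to modify or withdraw consent at any
        # time; record the change rather than erase the history.
        if category in self.granted_categories:
            self.granted_categories.remove(category)
            self.declined_categories.append(category)
```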

Businesses that are already GDPR-compliant are well positioned to comply with the Norwegian cookie guidelines as well.

Companies operating websites and apps that do not comply with the new guidelines risk daily fines and government orders to improve their compliance activities. Fines can be up to 5 percent of the business’s total sales revenue for the preceding year, depending on how long the violation has been going on and how serious it is.

Compliance is overseen by the Norwegian Communications Authority (NKOM) and Norwegian Data Protection Authority (Datatilsynet).

Usercentrics has been enabling ongoing data privacy compliance since the GDPR was implemented. In addition to helping companies to meet their legal obligations, Usercentrics Web CMP and Usercentrics App CMP enable you to deliver better transparency and great user experience. 

Collect, securely store, and document valid user consent that meets Norwegian, EU, and/or international regulatory requirements while building trust with your users, helping you get the data you need and grow engagement and revenue.

Setup is designed for ease of use for technical and non-technical teams. Use one of our high quality pre-built templates, or fully customize your consent banner to match your brand. 

Our powerful scanning technology detects and automatically categorizes the cookies and tracking technologies you’re using, and we provide over 2,200 legal templates for data processing services in use, saving you time and resources during implementation and maintenance. A/B testing and in-depth analytics help you understand user interactions and consent choices so you can optimize your banner for higher consent rates.

Plus, you always get our expert guidance and detailed documentation every step of the way, so you can stay focused on your core business and harness the competitive advantage of Privacy-Led Marketing.

The United States does not have a comprehensive federal data privacy law that governs how businesses access or use individuals’ personal information. Instead, privacy protections and regulation are currently left to individual states. California led the way in 2020 with the California Consumer Privacy Act (CCPA), later strengthened by the California Privacy Rights Act (CPRA). As of January 2025, 20 states have passed similar laws. The variations in consumers’ rights, companies’ responsibilities, and other factors make compliance challenging for businesses operating in multiple states.

The American Data Privacy and Protection Act (ADPPA) sought to simplify privacy compliance by establishing a comprehensive federal privacy standard. The ADPPA emerged in June 2022 when Representative Frank Pallone introduced HR 8152 to the House of Representatives. The bill gained strong bipartisan support in the House Energy and Commerce Committee, passing with a 53-2 vote in July 2022. It also received amendments in December 2022. However, the bill did not progress any further.

As proposed, the ADPPA would have preempted most state-level privacy laws, replacing the current multi-state compliance burden with a single federal standard.

In this article, we’ll examine who the ADPPA would have applied to, its obligations for businesses, and the rights it would have granted US residents.

What is the American Data Privacy and Protection Act (ADPPA)? 

The American Data Privacy and Protection Act (ADPPA) was a proposed federal bill that would have set consistent rules for how organizations handle personal data across the United States. It aimed to protect individuals’ privacy with comprehensive safeguards while requiring organizations to meet strict standards for handling personal data.

Under the ADPPA, an individual is defined as “a natural person residing in the United States.” Organizations that collect, use, or share individuals’ personal data would have been responsible for protecting it, including measures to prevent unauthorized access or misuse. By balancing individual rights and business responsibilities, the ADPPA sought to create a clear and enforceable framework for privacy nationwide.

What data would have been protected under the American Data Privacy and Protection Act (ADPPA)?

The ADPPA aimed to protect the personal information of US residents, which it refers to as covered data. Covered data is broadly defined as “information that identifies or is linked, or reasonably linkable, alone or in combination with other information, to an individual or a device that identifies or is linked or reasonably linkable to an individual.” In other words, any data that would either identify or could be traced to a person or to a device that is linked to an individual. This includes data that may be derived from other information and unique persistent identifiers, such as those used to track devices or users across platforms.

The definition excludes:

Sensitive covered data under the ADPPA

The ADPPA, like other data protection regulations, would have required stronger safeguards for sensitive covered data that could harm individuals if it was misused or unlawfully accessed. The bill’s definition of sensitive covered data is extensive, going beyond many US state-level data privacy laws.

Protected categories of data include, among other things:

Who would the American Data Privacy and Protection Act (ADPPA) have applied to?

The ADPPA would have applied to a broad range of entities that handle covered data.

Covered entity under the ADPPA

A covered entity is “any entity or any person, other than an individual acting in a non-commercial context, that alone or jointly with others determines the purposes and means of collecting, processing, or transferring covered data.” This definition matches similar terms like “controller” in US state privacy laws and the European Union’s General Data Protection Regulation (GDPR). To qualify as a covered entity under the ADPPA, the organization would have had to be in one of three categories:

Although the bill did not explicitly address international jurisdiction, its reach could have extended beyond US borders. Foreign companies would have needed to comply if they handle US residents’ data for commercial purposes and meet the FTC Act’s jurisdictional requirements, such as conducting business activities in the US or causing foreseeable injury within the US. This type of extraterritorial scope is common among a number of other international data privacy laws.

Service provider under the ADPPA

A service provider was defined as a person or entity that engages in either of the following:

OR

This role mirrors what other data protection laws call a processor, including most state privacy laws and the GDPR.

Large data holders under the ADPPA

Large data holders were not considered a third type of organization. Both covered entities and service providers could have qualified as large data holders if, in the most recent calendar year, they had gross annual revenues of USD 250 million or more, and collected, processed, or transferred: 

Large data holders would have faced additional requirements under the ADPPA.

Third-party collecting entity under the ADPPA

The ADPPA introduced the concept of a third-party collecting entity, which refers to a covered entity that primarily earns its revenue by processing or transferring personal data it did not collect directly from the individuals to whom the data relates. In other contexts, they are often referred to as data brokers.

However, the definition excluded certain activities and entities:

An entity is considered to derive its principal source of revenue from data processing or transfer if, in the previous 12 months, either:

or

Third-party collecting entities that process data from more than 5,000 individuals or devices in a calendar year would have had to register with the Federal Trade Commission by January 31 of the following year. Registration would require a fee of USD 100 and basic information about the organization, including its name, contact details, the types of data it handles, and a link to a website where individuals can exercise their privacy rights.
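The registration trigger and deadline described above reduce to a small rule. A sketch, using the threshold, fee, and date as stated in the article:

```python
from datetime import date

FTC_REGISTRATION_FEE_USD = 100  # flat registration fee as described

def registration_due(individuals_processed: int, year: int):
    # Processing data from more than 5,000 individuals or devices
    # in a calendar year would have triggered registration with the
    # FTC by January 31 of the following year; below the threshold,
    # no registration is due (returns None).
    if individuals_processed > 5000:
        return date(year + 1, 1, 31)
    return None
```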

Exemptions under the ADPPA

While the ADPPA potentially would have had a wide reach, certain exemptions would have applied.

Definitions in the American Data Privacy and Protection Act (ADPPA)

Like other data protection laws, the ADPPA defined several terms that are important for businesses to know. While many — like “collect” or “process” — can be found in other regulations, there are also some that are unique to the ADPPA. We look at some of these key terms below.

Knowledge under the ADPPA

“Knowledge” refers to whether a business is aware that an individual is a minor. The level of awareness required depends on the type and size of the business.

Some states — like Minnesota and Nebraska — define “known child” but do not adjust the criteria for what counts as knowledge based on the size or revenue of the business handling the data. Instead, they apply the same standard to all companies, regardless of their scale.

Affirmative express consent under the ADPPA

The ADPPA uses the term “affirmative express consent,” which refers to “an affirmative act by an individual that clearly communicates the individual’s freely given, specific, and unambiguous authorization” for a business to perform an action, such as collecting or using their personal data. Consent for data collection would have to be obtained after the covered entity provides clear information about how it will use the data.

Like the GDPR and other data privacy regulations, consent would have needed to be freely given, informed, specific, and unambiguous.

Under this definition, consent cannot be inferred from an individual’s inaction or continued use of a product or service. Additionally, covered entities cannot trick people into giving consent through misleading statements or manipulative design. This includes deceptive interfaces meant to confuse users or limit their choices. 

Transfer under the ADPPA

Most data protection regulations include a definition for the sale of personal data or personal information. While the ADPPA did not define sale, it instead defined “transfer” as “to disclose, release, disseminate, make available, license, rent, or share covered data orally, in writing, electronically, or by any other means.”

What are consumers’ rights under the American Data Privacy and Protection Act (ADPPA)?

Under the ADPPA, consumers would have had the following rights regarding their personal data.

What are privacy requirements under the American Data Privacy and Protection Act (ADPPA)?

The ADPPA would have required organizations to meet certain obligations when handling individuals’ covered data. Here are the key privacy requirements under the bill.

Consent

Organizations must obtain clear, explicit consent through easily understood standalone disclosures. Consent requests must be accessible, available in all service languages, and give equal prominence to accept and decline options. Organizations must provide mechanisms to withdraw consent that are as simple as giving it. 

Organizations must avoid using misleading statements or manipulative designs, and must obtain new consent for different data uses or significant privacy policy changes. While the ADPPA works alongside the parental consent requirements of the Children’s Online Privacy Protection Act (COPPA) for children under 13, it adds its own protections for minors up to age 17.

Privacy policy

Organizations must maintain clear, accessible privacy policies that detail their data collection practices, transfer arrangements, retention periods, and rights granted to individuals. These policies must specify whether data goes to countries like China, Russia, Iran, or North Korea, which could present a security risk, and they must be available in all languages where services are offered. When making material changes, organizations must notify affected individuals in advance and give them a chance to opt out.

Data minimization

Organizations can only collect and process data that is reasonably necessary to provide requested services or for specific allowed purposes. These allowed purposes include activities like completing transactions, maintaining services, protecting against security threats, meeting legal obligations, and preventing harm, including where there is a risk of death, among others. Collected data must also be proportionate to these activities. 

Privacy by design

Privacy by design is a default requirement under the ADPPA. Organizations must implement reasonable privacy practices that consider the organization’s size, data sensitivity, available technology, and implementation costs. They must align with federal laws and regulations and regularly assess risks in their products and services, paying special attention to protecting minors’ privacy and implementing appropriate safeguards.

Data security

Organizations must establish, implement, and maintain appropriate security measures, including vulnerability assessments, preventive actions, employee training, and incident response plans. They must implement clear data disposal procedures and match their security measures to their data handling practices.

Privacy and data security officers

Organizations with more than 15 employees must appoint both a privacy officer and data security officer, who must be two distinct individuals. These officers are responsible for implementing privacy programs and maintaining ongoing ADPPA compliance.

Privacy impact assessments

Organizations — excluding large data holders and small businesses — must conduct regular privacy assessments that evaluate the benefits and risks of their data practices. These assessments must be documented and maintained, and consider factors like data sensitivity and potential privacy impacts.

Loyalty with respect to pricing

Organizations cannot discriminate against individuals who exercise their privacy rights. While they can adjust prices based on necessary financial information and offer voluntary loyalty programs, they cannot retaliate through changes in pricing or service quality, e.g. if an individual exercises their rights and requests their data or does not consent to certain data processing.

Special requirements for large data holders

In addition to their general obligations, large data holders would have had unique responsibilities under the proposed law.

Privacy policy

Large data holders would have been required to maintain and publish 10-year archives of their privacy policies on their websites. They would need to keep a public log documenting significant privacy policy changes and their impact. Additionally, they would need to provide a short-form notice (under 500 words) highlighting unexpected practices and sensitive data handling.

Privacy and data security officers

At least one of the appointed officers would have been designated as a privacy protection officer who reports directly to the highest official at the organization. This officer, either directly or through supervised designees, would have been required to do the following:

Privacy impact assessments

While all organizations other than small businesses would be required to conduct privacy impact assessments under the proposed law, large data holders would have had additional requirements. 

Metrics reporting

Large data holders would be required to compile and disclose annual metrics related to verified access, deletion, and opt-out requests. These metrics would need to be included in their privacy policy or published on their website.

Executive certification

An executive officer would have been required to annually certify to the FTC that the large data holder has internal controls and a reporting structure in place to achieve compliance with the proposed law.

Algorithm impact assessments

Large data holders using covered algorithms that could pose a consequential risk of harm would be required to conduct an annual impact assessment of these algorithms. This requirement would be in addition to privacy impact assessments and would need to begin no later than two years after the Act’s enactment.

American Data Privacy and Protection Act (ADPPA) enforcement and penalties for noncompliance

The ADPPA would have established a multi-layered enforcement approach that set it apart from other US privacy laws.

Starting two years after the law would have taken effect, individuals would gain a private right of action, or the right to sue for violations. However, before filing a lawsuit, they would need to notify both the Federal Trade Commission and their state Attorney General.

The ADPPA itself did not establish specific penalties for violations. Instead, violations of the ADPPA or its regulations would be treated as violations of the Federal Trade Commission Act, subject to the same penalties, privileges, and immunities provided under that law.

The American Data Privacy and Protection Act (ADPPA) compared to other data privacy regulations

As privacy regulations continue to evolve worldwide, it’s helpful to understand how the ADPPA would compare with other comprehensive data privacy laws.

The EU’s GDPR has set the global standard for data protection since 2018. In the US, the CCPA (as amended by the CPRA) established the first comprehensive state-level privacy law and has influenced subsequent state legislation. Below, we’ll look at how the ADPPA compares with these regulations.

The ADPPA vs the GDPR

There are many similarities between the proposed US federal privacy law and the EU’s data protection regulation. Both require organizations to implement privacy and security measures, provide individuals with rights over their personal data (including access, deletion, and correction), and mandate clear privacy policies that detail their data processing activities. Both also emphasize data minimization principles and purpose limitation.

However, there are also several important differences between the two. 

Aspect | ADPPA | GDPR
Territorial scope | Would have applied to individuals residing in the US. | Applies to EU residents and any organization processing their data, regardless of location.
Consent | Not a standalone legal basis; required only for specific activities like targeted advertising and processing sensitive data. | One of six legal bases for processing; can be a primary justification.
Government entities | Excluded federal, state, tribal, territorial, and local government entities. | Applies to public bodies and authorities.
Privacy officers | Required “privacy and security officers” for covered entities with more than 15 employees, with stricter rules for large data holders. | Requires a Data Protection Officer (DPO) for public authorities or entities engaged in large-scale data processing.
Data transfers | No adequacy requirements; focus on transfers to specific countries (China, Russia, Iran, North Korea). | Detailed adequacy requirements and transfer mechanisms.
Children’s data | Extended protections to minors up to age 17. | Focuses on children under 16 (can be lowered to 13 by member states).
Penalties | Violations would have been treated as violations of the Federal Trade Commission Act. | Imposes fines up to 4% of annual global turnover or EUR 20 million, whichever is higher.

The ADPPA vs the CCPA/CPRA

There are many similarities between the proposed US federal privacy law and California’s existing privacy framework. Both include comprehensive transparency requirements, including privacy notices in multiple languages and accessibility for people with disabilities. They also share similar approaches to prohibiting manipulative design practices and requirements for regular security and privacy assessments.

However, there are also differences between the ADPPA and CCPA/CPRA.

Aspect | ADPPA | CCPA/CPRA
Covered entities | Would have applied to organizations under the jurisdiction of the Federal Trade Commission, including nonprofits and common carriers; excluded government agencies. | Applies only to for-profit businesses meeting any of these specific thresholds: gross annual revenue of over USD 26,625,000; receiving, buying, selling, or sharing the personal information of 100,000 or more consumers or households; or earning more than half of their annual revenue from the sale of consumers’ personal information.
Private right of action | Broader right to sue for various violations. | Limited to data breaches only.
Data minimization | Required data collection and processing to be limited to what is reasonably necessary and proportionate. | Similar requirement, but the CPRA allows broader processing for “compatible” purposes.
Algorithmic impact assessments | Required large data holders to conduct annual assessments focusing on algorithmic risks, bias, and discrimination. | Requires risk assessments weighing benefits and risks of data practices, with no explicit focus on bias.
Executive accountability | Required executive certification of compliance. | No executive certification requirement.
Enforcement | Would have been enforced by the Federal Trade Commission, state Attorneys General, and the California Privacy Protection Agency (CPPA). | Enforced by the CPPA and local authorities within California.

The ADPPA would have required organizations to obtain affirmative express consent for certain data processing activities through clear, conspicuous standalone disclosures. These consent requests would need to be easily understood, equally prominent for either accepting or declining, and available in all languages where services are offered. Organizations would also need to provide simple mechanisms for withdrawing consent that would be as easy to use as giving consent was initially. The bill also required organizations to honor opt-out requests for practices like targeted advertising and certain data transfers. These opt-out mechanisms would need to be accessible and easy to use, with clear instructions for exercising these rights.

Organizations would need to clearly disclose not only the types of data they collect but also the parties with whom this information is shared. Consumers would also need to be informed about their data rights and how to act on them, such as opting out of processing, through straightforward explanations and guidance. 

To support transparency, organizations would also be required to maintain privacy pages that are regularly updated to reflect their data collection, use, and sharing practices. These pages would help provide consumers with access to the latest information about how their data is handled. Additionally, organizations would have been able to use banners or buttons on websites and apps to inform consumers about data collection and provide them with an option to opt out.

Though the ADPPA was not enacted, the US does have an increasing number of state-level data privacy laws. A consent management platform (CMP) like the Usercentrics CMP for website consent management or app consent management can help organizations streamline compliance with the many existing privacy laws in the US and beyond. The CMP securely maintains records of consent, automates opt-out processes, and enables consistent application of privacy preferences across an organization’s digital properties. It also helps to automate the detection and blocking of cookies and other tracking technologies that are in use on websites and apps.

Usercentrics does not provide legal advice, and information is provided for educational purposes only. We always recommend engaging qualified legal counsel or privacy specialists regarding data privacy and protection issues and operations.

How far can companies go to get a user’s consent? When does inconvenience or questionable user experience tip over into legally noncompliant manipulation? These continue to be important questions across the data privacy landscape, especially with mobile apps, an area where regulatory scrutiny and enforcement have been ramping up.

French social networking app BeReal requests users’ consent to use their data for targeted advertising, which is very common. However, how they go about presenting (and re-presenting) that request has led to a complaint against them relating to their GDPR compliance. Let’s look at what BeReal is doing to get user consent, what the complaint is, and the legal basis for it.

According to noyb’s complaint, BeReal introduced a new consent banner feature for European users in July 2024. The contention is that this banner requested user consent for use of their data for targeted advertising, which is not unusual or problematic in itself. However, the question is whether the banner provides users with real consent choice or not.

Based on the description in the complaint, BeReal designed the banner to be displayed to users when they open the app. If a user accepts the terms — giving consent for data use for targeted advertising — they never see the banner again. However, if a user declines consent, the banner allegedly reappears every day when the user attempts to post on the app. As the app prompts users to snap and post a photo each day, seeing the banner reappear every time they try to do so could be understandably frustrating.

In addition to resulting in an annoying user experience, this alleged action is also potentially a GDPR violation. It’s questionable if user consent under these described conditions is actually freely given.

The GDPR does require organizations to ask users for consent again if, for example, there have been changes in their data processing operations, like they want to collect new data, or want to use data for a new purpose. 

It’s also recommended that organizations refresh user consent data from time to time, even though the GDPR doesn’t specify an exact time frame, as some other laws and guidelines do. For example, a company could ask users for consent for specific data uses every 12 months, either to ensure consent is still current, or to see if users who previously declined have changed their minds. 

The noyb complaint against BeReal

In December 2024, privacy advocacy group noyb (the European Center for Digital Rights) filed a complaint against BeReal with the French data protection authority, the Commission Nationale de l’Informatique et des Libertés (CNIL), arguing that the company’s alleged repeated banner displays for non-consenting users are a form of “nudging” or use of dark patterns.

The CNIL is one of the EU data protection authorities that has previously announced increased enforcement of data privacy for mobile apps, and released guidelines for better privacy protection for mobile apps in September 2024.

While regulators have increasingly taken a dim view of various design manipulations to obtain users’ consent, like hiding the “reject” option, noyb argues BeReal’s actions are a new dark pattern trend: “annoying people into consent”. Simply put, they contend that BeReal does not take no for an answer, meaning consent obtained through this repeated tactic is not freely given, and thus is a clear violation of the GDPR’s requirements.

The noyb legal team has requested that the CNIL order BeReal to delete the personal data of affected users, modify its consent practices to be GDPR-compliant, and impose an administrative fine as a deterrent to other companies that may consider similar tactics.

Whether it’s making users go hunting to find the “reject” button (or removing it entirely), or wearing them down with constant banner displays until they give in and consent to the requested data use, the European Data Protection Board (EDPB) has seen and addressed similar issues before.

It’s generally understood that users are likely to give in over time out of fatigue or frustration and consent to the requested data use. Companies get what they want, but not in a way that is voluntary or a good user experience. The EDPB has emphasized that in addition to being specific, informed, and unambiguous, consent must be freely given. Persistent prompts can be a form of coercion, and thus consent received that way may not be legally valid (Art. 4 GDPR).

As technologies change over time, the ways in which dark patterns can be deployed to manipulate users into giving consent are likely to further evolve and become more sophisticated.

A fine balance: Data monetization and privacy compliance

It is a common challenge for companies to try to find ways to increase consent rates for access to user data to drive monetization strategies via their websites, apps, and other connected platforms. Cases like the one against BeReal could potentially set the tone for regulators’ increasingly stringent expectations for online platforms’ data operations, and the company could serve as a cautionary tale for others considering questionable tactics where user privacy is concerned.

As more individuals around the world are protected by data privacy laws, what data companies are allowed to access, and under what circumstances, is becoming more strictly controlled. This creates a growing challenge for companies that need data for advertising, analytics, personalization, and other uses to grow their businesses.

Fortunately, there is a way to strike a balance between data privacy and data-driven business: clear, user-friendly consent management, a shift toward zero- and first-party data, and embracing Privacy-Led Marketing by employing preference management and other strategies that foster engagement and long-term customer satisfaction and loyalty.

How Usercentrics helps

Good consent practices require making user experience better, not more frustrating. Usercentrics App CMP helps your company deliver, building trust with users and providing a smooth, friendly user experience for consent management. You can obtain higher consent rates while achieving and maintaining privacy compliance.

Setup is simple and straightforward for technical and non-technical teams alike, and the App Scanner automates integration of your vendors, SSPs, and SDKs. We provide over 2,200 pre-built legal templates so you can provide clear, comprehensive consent choices to your users.

With extensive customization, you can make sure your banners fit your app or game’s design and branding and provide key information, enabling valid user consent without getting in their way or causing frustration. And you also get our expert guidance and detailed documentation every step of the way.

Oregon was the twelfth state in the United States to pass comprehensive data privacy legislation with SB 619. Governor Tina Kotek signed the bill into law on July 18, 2023, and the Oregon Consumer Privacy Act (OCPA) came into effect for most organizations on July 1, 2024. Nonprofits have an extra year to prepare, so their compliance is required as of July 1, 2025.

In this article, we’ll look at the Oregon Consumer Privacy Act’s requirements, who they apply to, and what businesses can do to achieve compliance.

What is the Oregon Consumer Privacy Act (OCPA)?

The Oregon Consumer Privacy Act protects the privacy and personal data of over 4.2 million Oregon residents. The law establishes rules for any individual or entity conducting business in Oregon or those providing goods and services to its residents and processing their personal data. Affected residents are known as “consumers” under the law.

The OCPA protects Oregon residents’ personal data when they act as individuals or in household contexts. It does not cover personal data collected in a work context. This means information about individuals acting in their professional roles, rather than as consumers, is not covered under this law.

Consistent with the other US state-level data privacy laws, the OCPA requires businesses to inform residents about how their personal data is collected and used. This notification — usually included in a website’s privacy policy — must cover key details such as:

The Oregon privacy law uses an opt-out consent model, which means that in most cases, organizations can collect consumers’ personal data without prior consent. However, they must make it possible for consumers to opt out of the sale of their personal data and its use in targeted advertising or profiling. The law also requires businesses to implement reasonable security measures to protect the personal data they handle.

Who must comply with the Oregon Consumer Privacy Act (OCPA)?

Similar to many other US state-level data privacy laws, the OCPA establishes thresholds that determine which organizations must comply with its requirements. However, unlike some other laws, it does not contain a revenue-only threshold.

To fall under the OCPA’s scope, during a calendar year an organization must control or process the personal data of:

or

Exemptions to OCPA compliance

The OCPA is different from some other data privacy laws because many of its exemptions focus on the types of data being processed and what processing activities are being conducted, rather than just on the organizations themselves.

For example, instead of exempting healthcare entities under the Health Insurance Portability and Accountability Act (HIPAA), the OCPA exempts protected health information handled in compliance with HIPAA. This means protected health information is outside of the OCPA’s scope, but other data that a healthcare organization handles could still fall under the law. Organizations that may be exempt from compliance with other state-level consumer privacy laws should consult a qualified legal professional to determine if they are required to comply with the OCPA.

Exempted organizations and their services or activities include:

Personal data collected, processed, sold, or disclosed under the following federal laws is also exempt from the OCPA’s scope:

Definitions in the Oregon Consumer Privacy Act (OCPA)

This Oregon data privacy law defines several key terms related to the data it protects and relevant data processing activities.

What is personal data under the OCPA?

The Oregon privacy law protects consumers’ personal data, which it defines as “data, derived data or any unique identifier that is linked to or is reasonably linkable to a consumer or to a device that identifies, is linked to or is reasonably linkable to one or more consumers in a household.”

The law specifically excludes personal data that is:

The law does not specifically list what constitutes personal data. Common types of personal data that businesses collect include a consumer’s name, phone number, email address, Social Security Number, or driver’s license number.

It should be noted that personal data (also called personal information under some state privacy laws) and personally identifiable information are not always the same thing, and distinctions between the two are often made in data privacy laws.

What is sensitive data under the OCPA?

Sensitive data is personal data that requires special handling because it could cause harm or embarrassment if misused or unlawfully accessed. It refers to personal data that would reveal an individual’s:

All personal data belonging to children is also considered sensitive data under the OCPA.

Oregon’s law is the first US privacy law to include transgender or non-binary gender expression, or status as a victim of crime, as sensitive data. The definition of biometric data excludes facial geometry or mapping unless it is done for the purpose of identifying an individual.

An exception to the law’s definition of sensitive data includes “the content of communications or any data generated by or connected to advanced utility metering infrastructure systems or equipment for use by a utility.” In other words, the law does not consider sensitive information to include communications content, like that in emails or messages, or data generated by smart utility meters and related systems used by utilities.

Like many other data privacy laws, the Oregon data privacy law follows the European Union’s General Data Protection Regulation (GDPR) regarding the definition of valid consent. Under the OCPA, consent is “an affirmative act by means of which a consumer clearly and conspicuously communicates the consumer’s freely given, specific, informed and unambiguous assent to another person’s act or practice…”

The definition also includes conditions for valid consent:

These conditions are highly relevant to online consumers and reflect that the use of manipulative dark patterns is increasingly frowned upon by data protection authorities, and increasingly prohibited. The Oregon Department of Justice (DOJ) website also clarifies that the use of dark patterns may be considered a deceptive business practice under Oregon’s Unlawful Trade Practices Act.

What is processing under the OCPA?

Processing under the OCPA means any action or set of actions performed on personal data, whether manually or automatically. This includes activities like collecting, using, storing, disclosing, analyzing, deleting, or modifying the data.

Who is a controller under the OCPA?

The OCPA uses the term “controller” to describe businesses or entities that decide how and why personal data is processed. While the law uses the word “person,” it applies broadly to both individuals and organizations.

The OCPA definition of controller is “a person that, alone or jointly with another person, determines the purposes and means for processing personal data.” In simpler terms, a controller is anyone who makes the key decisions about why personal data is collected and how it will be used.

Who is a processor under the OCPA?

The OCPA defines a processor as “a person that processes personal data on behalf of a controller.” Like the controller, while the law references a person, it typically refers to businesses or organizations that handle data for a controller. Processors are often third parties that follow the controller’s instructions for handling personal data. These third parties can include advertising partners, payment processors, or fulfillment companies, for example. Their role is to carry out specific tasks without deciding how or why the data is processed.

What is profiling under the OCPA?

Profiling is increasingly becoming a standard inclusion in data privacy laws, particularly as it can relate to “automated decision-making” or the use of AI technologies. The Oregon privacy law defines profiling as “an automated processing of personal data for the purpose of evaluating, analyzing or predicting an identified or identifiable consumer’s economic circumstances, health, personal preferences, interests, reliability, behavior, location or movements.”

What is targeted advertising under the OCPA?

Targeted advertising may involve emerging technologies like AI tools. It is also becoming a standard inclusion in data privacy laws. The OCPA defines targeted advertising as advertising that is “selected for display to a consumer on the basis of personal data obtained from the consumer’s activities over time and across one or more unaffiliated websites or online applications and is used to predict the consumer’s preferences or interests.” In simpler terms, targeted advertising refers to ads shown to a consumer based on their interests, which are determined by personal data that is collected over time from different websites and apps.

However, some types of ads are excluded from this definition, such as those that are:

The definition also excludes processing of personal data solely to measure or report an ad’s frequency, performance, or reach.

What is a sale under the OCPA?

The OCPA defines sale as “the exchange of personal data for monetary or other valuable consideration by the controller with a third party.” This means a sale doesn’t have to involve money. Any exchange of data for something of value, even if it’s non-monetary, qualifies as a sale under the law.

The Oregon privacy law does not consider the following disclosures of personal data to be a “sale”:

Consumers’ rights under the Oregon Consumer Privacy Act (OCPA)

The Oregon privacy law grants consumers a range of rights over their personal data, comparable to other US state-level privacy laws.

Consumers can designate an authorized agent to opt out of personal data processing on their behalf. The OCPA also introduces a requirement for controllers to recognize universal opt-out signals, further simplifying the opt-out process.

This Oregon data privacy law stands out by giving consumers the right to request a specific list of third parties that have received their personal data. Unlike many other privacy laws, this one requires controllers to maintain detailed records of the exact entities they share data with, rather than just general categories of recipients.

Children’s personal data has special protections under the OCPA. Parents or legal guardians can exercise rights for children under the age of 13, whose data is classified as sensitive personal data and subject to stricter rules. For minors between 13 and 15, opt-in consent is required for specific processing activities, including use of their data for targeted advertising or profiling. “Opt-in” means that explicit consent is required before the data can be used for these purposes.

Consumers can make one free rights request every 12 months, to which an organization has 45 days to respond. Organizations can extend that period by another 45 days if reasonably necessary. Organizations can also deny consumer requests for a number of reasons. These include cases in which the consumer’s identity cannot reasonably be verified, or if the consumer has made too many requests within a 12-month period.
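As a purely illustrative sketch of the timeline arithmetic described above (the function name is hypothetical, and actual deadlines depend on legal interpretation and the facts of a given request), the 45-day response window and the optional 45-day extension can be computed like this:

```python
from datetime import date, timedelta

def ocpa_response_deadlines(received: date) -> tuple[date, date]:
    # Initial deadline: 45 days from receipt of the rights request.
    initial = received + timedelta(days=45)
    # If reasonably necessary, the organization may extend by another 45 days.
    extended = initial + timedelta(days=45)
    return initial, extended

# Example: a request received on January 2, 2025
initial, extended = ocpa_response_deadlines(date(2025, 1, 2))
print(initial, extended)  # 2025-02-16 2025-04-02
```

A calculation like this is only a scheduling aid; organizations should track whether an extension was actually invoked and communicated to the consumer.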

Oregon’s privacy law does not include a private right of action, so consumers cannot sue data controllers for violations. California remains the only state whose privacy law includes this provision.

What are the privacy requirements under the Oregon Consumer Privacy Act (OCPA)?

Controllers must meet the following OCPA requirements to protect the personal data they collect from consumers.

Privacy notice and transparency under the OCPA

The Oregon privacy law requires controllers to be transparent about their data handling practices. Controllers must provide a clear, easily accessible, and meaningful privacy notice for consumers whose personal data they may process. The privacy notice, also known as the privacy policy, must include the following:

According to the Oregon DOJ website, the third-party categories requirement must strike a particular balance. It should offer consumers meaningful insights into the relevant types of businesses or processing activities, without making the privacy notice overly complex. Acceptable examples include “analytics companies,” “third-party advertisers,” and “payment processors,” among others.

The privacy notice or policy must be easy for consumers to access. It is typically linked in the website footer for visibility and accessibility from any page.

Data minimization and purpose limitation under the OCPA

The OCPA requires controllers to limit the personal data they collect to only what is “adequate, relevant, and reasonably necessary” for the purposes stated in the privacy notice. If the purposes for processing change, controllers must notify consumers and, where applicable, obtain their consent.

Data security under the OCPA

The Oregon data privacy law requires controllers to establish, implement, and maintain reasonable safeguards for protecting “the confidentiality, integrity and accessibility” of the personal data under their control. The data security measures also apply to deidentified data.

Oregon’s existing laws about privacy practices remain in effect as well. These laws include requirements for reasonable administrative, technical, and physical safeguards for data storage and handling, IoT device security features, and truth in privacy and consumer protection notices.

Data protection assessments (DPA) under the OCPA

Controllers must perform data protection assessments (DPA), also known as data protection impact assessments, for processing activities that present “a heightened risk of harm to a consumer.” These activities include:

The Attorney General may also require a data controller to conduct a DPA or share the results of one in the course of an investigation.

Consent under the OCPA

The OCPA primarily uses an opt-out consent model. This means that in most cases controllers are not required to obtain consent from consumers before collecting or processing their personal data. However, there are specific cases where consent is required:

To help consumers to make informed decisions about their consent, controllers must clearly disclose details about the personal data being collected, the purposes for which it is processed, who it is shared with, and how consumers can exercise their rights. Controllers must also provide clear, accessible information on how consumers can opt out of data processing.

Consumers must be able to revoke consent at any time, as easily as they gave it. Data processing must stop after consent has been revoked, and no later than 15 days after receiving the revocation.

Nondiscrimination under the OCPA

The OCPA prohibits controllers from discriminating against consumers who exercise their rights under the law. This includes actions such as:

For example, if a consumer opts out of data processing on a website, that individual cannot be blocked from accessing that website or its functions.

Some website features and functions do not work without certain cookies or trackers being activated, so if a consumer does not opt in to their use because they collect personal data, the site may not work as intended. This is not considered discriminatory.

This Oregon privacy law permits website operators and other controllers to offer voluntary incentives for consumers’ participation in activities where personal data is collected. These may include newsletter signups, surveys, and loyalty programs. Offers must be proportionate and reasonable to the request as well as the type and amount of data collected. This way, they will not look like bribes or payments for consent, which data protection authorities frown upon.

Third party contracts under the OCPA

Before starting any data processing activities, controllers must enter into legally binding contracts with third-party processors. These contracts govern how processors handle personal data on behalf of the controller, and must include the following provisions:

These contracts are known as data processing agreements under some data protection regulations like the GDPR.

Universal opt-out mechanism under the OCPA

As of January 1, 2026, organizations subject to the OCPA must comply with a universal opt-out mechanism. Also called a global opt-out signal, it includes tools like the Global Privacy Control.

This mechanism enables a consumer to set their data processing preferences once and have those preferences automatically communicated to any website or platform that detects the signal. Preferences are typically set via a web browser plugin.

While this requirement is not yet standard across all US or global data privacy laws, it is becoming more common in newer legislation. Other states that require controllers to recognize global opt-out signals include California, Minnesota, Nebraska, Texas, and Delaware.
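As an illustration of how a site might detect this signal client-side: browsers and extensions that implement Global Privacy Control expose the preference as `navigator.globalPrivacyControl` (the same signal reaches servers as a `Sec-GPC: 1` request header). A minimal sketch follows; the `userHasOptedOut` helper name is ours, and passing the navigator object as a parameter just keeps the sketch self-contained:

```javascript
// Sketch: honor the Global Privacy Control (GPC) signal before firing
// optional trackers. GPC-capable browsers expose the user's preference
// as navigator.globalPrivacyControl.
function userHasOptedOut(nav) {
  // The property is undefined in browsers without GPC support, so treat
  // anything other than an explicit true as "no opt-out signal".
  return nav.globalPrivacyControl === true;
}

// Browser usage:
// if (userHasOptedOut(navigator)) {
//   // suppress targeted-advertising and sale-related processing
// }
```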

How to comply with the Oregon Consumer Privacy Act (OCPA)

Below is a non-exhaustive checklist to help your business and website address key OCPA requirements. For advice specific to your organization, consulting a qualified legal professional is strongly recommended.

Enforcement of the Oregon Consumer Privacy Act (OCPA)

The Oregon Attorney General’s office is the enforcement authority for the OCPA. Consumers can file complaints with the Attorney General regarding data processing practices or the handling of their requests. The Attorney General’s office must notify an organization of any complaint against it and of the launch of any investigation. During investigations, the Attorney General can request controllers to submit data protection assessments and other relevant information. Enforcement actions must be initiated within five years of the last violation.

Controllers have the right to have an attorney present during investigative interviews and can refuse to answer questions. The Attorney General cannot bring in external experts for interviews or share investigation documents with non-employees.

Until January 1, 2026, controllers have a 30-day cure period during which they can fix OCPA violations; if the issue is not resolved within this time, the Attorney General may pursue civil penalties. After the right to cure sunsets on January 1, 2026, any opportunity to cure will be at the discretion of the Attorney General.

Fines and penalties for noncompliance under the OCPA

The Attorney General can seek civil penalties up to USD 7,500 per violation. Additional actions may include seeking court orders to stop unlawful practices, requiring restitution for affected consumers, or reclaiming profits obtained through violations.

If the Attorney General succeeds, the court may require the violating party to cover legal costs, including attorney’s fees, expert witness fees, and investigation expenses. However, if the court determines that the Attorney General pursued a claim without a reasonable basis, the defendants may be entitled to recover their attorney’s fees.

How does the Oregon Consumer Privacy Act (OCPA) affect businesses?

The OCPA introduces privacy law requirements that are similar to other state data protection laws. These include obligations around notifying consumers about data practices, granting them access to their data, limiting data use to specific purposes, and implementing reasonable security measures.

One notable distinction is that the law sets different compliance timelines based on an organization’s legal status. The effective date for commercial entities is July 1, 2024, while nonprofit organizations are given an additional year and must comply by July 1, 2025.

Since the compliance deadline for commercial entities has already passed, businesses that fall under the OCPA’s scope should ensure they meet its requirements as soon as possible to avoid penalties. Nonprofits, though they have more time, should actively prepare for compliance.

Businesses covered by federal laws like HIPAA and the GLBA, which may exempt them from other state data privacy laws, should confirm with a qualified legal professional whether they need to comply with the OCPA.

Oregon’s law is based on an opt-out consent model. In other words, consent does not need to be obtained before collecting or processing personal data unless it is sensitive or belongs to a child.

Controllers do need to inform consumers about what data is collected and used and for what purposes, as well as with whom it is shared, and whether it is to be sold or used for targeted advertising or profiling.

Consumers must also be informed of their rights regarding data processing and how to exercise them. This includes the ability for consumers to opt out of processing of their data or change their previous consent preferences. Typically, this information is presented on a privacy page, which must be kept up to date.

As of January 1, 2026, organizations must also recognize and respect consumers’ consent preferences as expressed via a universal opt-out signal.

Websites and apps can use a banner to inform consumers about data collection and enable them to opt out. This is typically done using a link or button. A consent management platform (CMP) like the Usercentrics CMP for website consent management or app consent management also helps to automate the detection of cookies and other tracking technologies that are in use on websites and apps.

A CMP can streamline sharing information about data categories and the specific services in use by the controller and/or processor(s), as well as third parties with whom data is shared.

The United States still only has a patchwork of state-level privacy laws rather than a single federal law. As a result, many companies doing business across the country, or foreign organizations doing business in the US, may need to comply with a variety of state-level data protection laws.

A CMP can make this easier by enabling banner customization and geotargeting. Websites can display data processing and consent information, along with the choices required by specific regulations, based on the user’s location. Geotargeting can also improve clarity and user experience by presenting this information in the user’s preferred language.

Usercentrics does not provide legal advice, and information is provided for educational purposes only. We always recommend engaging qualified legal counsel or a privacy specialist regarding data privacy and protection issues and operations.

Microsoft Universal Event Tracking (UET) with Consent Mode helps businesses responsibly manage data while optimizing digital advertising efforts. UET is a tracking tool from Microsoft Advertising that collects user behavior data to help businesses measure conversions, optimize ad performance, and build remarketing strategies.

Consent Mode works alongside UET. It’s a feature that adjusts how data is collected based on user consent preferences. This functionality is increasingly important as businesses address global privacy regulations like the GDPR and CCPA.

For companies using Microsoft Ads, understanding and implementing these tools helps them prioritize user privacy, build trust, and achieve better marketing outcomes while respecting data privacy standards.

Microsoft UET Consent Mode is a feature designed to help businesses respect user privacy while maintaining effective advertising strategies. It works alongside Microsoft Universal Event Tracking (UET) by dynamically adjusting how data is collected based on user consent.

When visitors interact with your website, Consent Mode determines whether tracking is activated or limited, depending on their preferences. For instance, if a user opts out of tracking, Consent Mode restricts data collection. This function aligns the tracking process with privacy preferences and applicable regulations.

Consent Mode supports businesses as they balance privacy expectations with effective campaign management. It also helps businesses align their data practices with Microsoft’s advertising policies and regional privacy laws to create a more transparent and user-focused approach to data management.
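In practice, Microsoft UET works through a command queue (`window.uetq`), and Consent Mode is driven by pushing `'consent'` commands onto it with a default state and later updates. The sketch below is ours: the helper names are invented, and the queue is passed in as a parameter so the example stays self-contained (in the browser it would be `window.uetq`):

```javascript
// Sketch: drive Microsoft UET Consent Mode via the UET command queue.
// Command-queue pushes append each argument; the UET tag script drains
// the queue once it loads.
function setDefaultConsent(uetq) {
  // Before the user has made a choice, default ad storage to denied
  uetq.push('consent', 'default', { ad_storage: 'denied' });
}

function applyBannerChoice(uetq, granted) {
  // Update the consent state to reflect the user's banner decision
  uetq.push('consent', 'update', { ad_storage: granted ? 'granted' : 'denied' });
}

// Browser usage:
// window.uetq = window.uetq || [];
// setDefaultConsent(window.uetq);
// ...later, from the consent banner's callback:
// applyBannerChoice(window.uetq, userAccepted);
```

Defaulting to `denied` until a choice is recorded is what keeps tracking restricted for users who never interact with the banner.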

The role of UET in advertising

Microsoft Universal Event Tracking (UET) offers businesses the tools they need to optimize advertising strategies. With a simple tag integrated into a business’s website, UET helps advertisers monitor essential user actions like purchases, form submissions, and page views. This data is invaluable for building remarketing audiences, tracking conversions, and making data-backed decisions that improve ad performance.

However, effectively collecting and utilizing this data requires alignment with user consent preferences. Without proper consent, businesses risk operating outside privacy regulations, and could face penalties or restrictions. By integrating UET with Consent Mode, businesses can respect user choices while continuing to access the insights needed to run impactful advertising campaigns.

Challenges in advertising compliance

In today’s digital age, businesses must carefully balance data-driven advertising with growing privacy expectations. Regulations like the General Data Protection Regulation (GDPR), the California Consumer Privacy Act (CCPA), and the California Privacy Rights Act (CPRA) have set clear rules about how user data can be collected, stored, and used. Non-compliance can lead to significant consequences, such as hefty fines, restricted access to ad platforms, or even account suspension.

Beyond financial and operational risks, non-compliance can damage a company’s reputation. When businesses fail to address privacy concerns, they risk losing customer trust—a resource that is difficult to rebuild. As users become more aware of how their data is used, businesses that fail to adopt transparent practices may struggle to retain their audience.

Enforcement of Microsoft UET Consent Mode

Microsoft Advertising requires that customers explicitly obtain and provide consent signals as of May 5, 2025.

Providing consent signals enables Microsoft Ads customers to comply with the requirements of privacy laws like the GDPR, where violations can result in hefty fines and other penalties.

Obtaining explicit consent also demonstrates respect for users’ privacy and rights, building user trust. Consumers increasingly indicate concerns over access to and use of their data online.

Consent also benefits advertising performance as part of your Privacy-Led Marketing strategy, enabling you to continue generating valuable insights into campaigns for effective targeting and conversion tracking.

By integrating Microsoft UET Consent Mode, companies can address user expectations, improve data accuracy, and create a more transparent relationship with their audience. Let’s take a closer look at the benefits of using Microsoft UET Consent Mode. 

Supporting privacy regulations

Privacy laws such as the GDPR, CCPA, and the ePrivacy Directive require businesses to handle user data responsibly. Microsoft UET Consent Mode adjusts data collection practices based on user preferences, helping companies better align with these requirements. By respecting user choices, businesses can reduce the risks associated with non-compliance.

Accurate data collection

Data accuracy is a key component of any successful advertising strategy. With Consent Mode, businesses only collect insights from users who agree to data tracking. This focus helps prevent skewed data caused by collecting information from users who have not consented. These insights are therefore more reliable and actionable.

Optimized ad campaigns

Consent Mode enables businesses to continue leveraging tools like remarketing and conversion tracking while honoring user privacy preferences. This functionality helps advertisers maintain the effectiveness of their campaigns by focusing on audiences who have opted into tracking. As a result, companies can make data-driven decisions without compromising privacy.

Building trust through transparency

Demonstrating respect for user privacy goes beyond privacy compliance — it also fosters trust. Transparency about how data is collected and used enables businesses to strengthen their relationships with customers. A privacy-first approach can set companies apart in a competitive advertising environment by showing users that their choices and rights are valued.

Usercentrics Web CMP provides businesses with a practical solution for integrating Microsoft UET with Consent Mode. By leveraging Usercentrics Web CMP’s unique features, companies can manage user consent effectively while maintaining a seamless advertising strategy.

Streamlined implementation

Usercentrics Web CMP simplifies the process of integrating Microsoft Consent Mode. With automated configuration, businesses can set up their systems quickly and focus on optimizing their campaigns without the complexities of manual implementation.

Seamless compatibility

As one of the first consent management platforms to offer automated support for Microsoft Consent Mode, Usercentrics Web CMP is designed for smooth integration with Microsoft UET. This compatibility reduces technical challenges and supports reliable functionality.

Customizable consent banners

The CMP enables businesses to design consent banners that align with their branding, creating a consistent user experience. Clear, branded messaging helps communicate data collection practices effectively while maintaining professionalism.

Privacy-focused data management

Usercentrics Web CMP provides a centralized platform for managing user consent across different regions and regulations. Businesses can easily adapt to global privacy requirements and organize their data collection practices efficiently, all in one place.

Usercentrics Web CMP simplifies the process of setting up Microsoft UET with Consent Mode. As the first platform to offer automated implementation of Microsoft Consent Mode, Usercentrics Web CMP enables companies to focus on their marketing efforts while managing user consent effectively.

To integrate Microsoft UET with Consent Mode using Usercentrics Web CMP, follow these steps:

For a detailed walkthrough, refer to the support article.

Microsoft UET with Consent Mode, supported by Usercentrics Web CMP, provides businesses with a practical approach to balancing effective advertising with user privacy. With this solution, companies can streamline consent management, enhance their advertising strategies, and adapt to ever-changing privacy expectations.

Respecting user choices isn’t just about privacy compliance—it’s an opportunity to build trust and demonstrate a commitment to transparency. Businesses that embrace Privacy-Led Marketing position themselves as trustworthy partners in a competitive digital marketplace.

Adopting Privacy-Led Marketing does more than support long-term customer relationships. It also enables companies to responsibly leverage valuable insights to optimize their campaigns. Microsoft UET with Consent Mode and Usercentrics Web CMP together create a strong foundation for businesses to effectively navigate the intersection of privacy and performance.