The United States does not have a comprehensive federal data privacy law that governs how businesses access or use individuals’ personal information. Instead, privacy protection and regulation are currently left to individual states. California led the way in 2020 with the California Consumer Privacy Act (CCPA), later strengthened by the California Privacy Rights Act (CPRA). As of January 2025, 20 states have passed similar laws. The variances in consumers’ rights, companies’ responsibilities, and other factors make compliance challenging for businesses operating in multiple states.
The American Data Privacy and Protection Act (ADPPA) sought to simplify privacy compliance by establishing a comprehensive federal privacy standard. The ADPPA emerged in June 2022 when Representative Frank Pallone introduced HR 8152 to the House of Representatives. The bill gained strong bipartisan support in the House Energy and Commerce Committee, passing with a 53-2 vote in July 2022. It was amended in December 2022 but did not progress any further.
As proposed, the ADPPA would have preempted most state-level privacy laws, replacing the current multi-state compliance burden with a single federal standard.
In this article, we’ll examine who the ADPPA would have applied to, its obligations for businesses, and the rights it would have granted US residents.
What is the American Data Privacy and Protection Act (ADPPA)?
The American Data Privacy and Protection Act (ADPPA) was a proposed federal bill that would have set consistent rules for how organizations handle personal data across the United States. It aimed to protect individuals’ privacy with comprehensive safeguards while requiring organizations to meet strict standards for handling personal data.
Under the ADPPA, an individual is defined as “a natural person residing in the United States.” Organizations that collect, use, or share individuals’ personal data would have been responsible for protecting it, including measures to prevent unauthorized access or misuse. By balancing individual rights and business responsibilities, the ADPPA sought to create a clear and enforceable framework for privacy nationwide.
What data would have been protected under the American Data Privacy and Protection Act (ADPPA)?
The ADPPA aimed to protect the personal information of US residents, which it referred to as covered data. Covered data is broadly defined as “information that identifies or is linked, or reasonably linkable, alone or in combination with other information, to an individual or a device that identifies or is linked or reasonably linkable to an individual.” In other words, covered data is any information that identifies a person, or that could be traced to a person or to a device linked to an individual. This includes data derived from other information as well as unique persistent identifiers, such as those used to track devices or users across platforms.
The definition excludes:
- Deidentified data
- Employee data
- Publicly available information
- Inferences made exclusively from multiple separate sources of publicly available information, so long as they don’t reveal private or sensitive details about a specific person
Sensitive covered data under the ADPPA
The ADPPA, like other data protection regulations, would have required stronger safeguards for sensitive covered data that could harm individuals if it was misused or unlawfully accessed. The bill’s definition of sensitive covered data is extensive, going beyond many US state-level data privacy laws.
Protected categories of data include, among other things:
- Personal identifiers, including government-issued IDs like Social Security numbers and driver’s licenses, except when legally required for public display.
- Health information, including details about past, present, or future physical and mental health conditions, treatments, disabilities, and diagnoses.
- Financial data, such as account numbers, debit and credit card numbers, income, and balance information. The last four digits of payment cards are excluded.
- Private communications, such as emails, texts, calls, direct messages, voicemails, and their metadata. This does not apply if the device is employer-provided and individuals are given clear notice of monitoring.
- Behavioral data, including sexual behavior information when collected against reasonable expectations, video content selections, and online activity tracking across websites.
- Personal records, such as private calendars, address books, photos, and recordings, except on employer-provided devices with notice.
- Demographic details, including race, color, ethnicity, religion, and union membership.
- Biological identifiers, including biometric and genetic information, as well as precise geolocation data and information about minors.
- Security credentials, such as login details and security or access codes for an account or device.
Who would the American Data Privacy and Protection Act (ADPPA) have applied to?
The ADPPA would have applied to a broad range of entities that handle covered data.
Covered entity under the ADPPA
A covered entity is “any entity or any person, other than an individual acting in a non-commercial context, that alone or jointly with others determines the purposes and means of collecting, processing, or transferring covered data.” This definition corresponds to the term “controller” used in US state privacy laws and the European Union’s General Data Protection Regulation (GDPR). To qualify as a covered entity under the ADPPA, an organization would have had to fall into one of three categories:
- Businesses regulated by the Federal Trade Commission Act (FTC Act)
- Telecommunications carriers
- Nonprofits
Although the bill did not explicitly address international jurisdiction, its reach could have extended beyond US borders. Foreign companies would have needed to comply if they handled US residents’ data for commercial purposes and met the FTC Act’s jurisdictional requirements, such as conducting business activities in the US or causing foreseeable injury within the US. This type of extraterritorial scope is common among other international data privacy laws.
Service provider under the ADPPA
A service provider was defined as a person or entity that does either of the following:
- Collects, processes, or transfers covered data on behalf of a covered entity or government body
OR
- Receives covered data from or on behalf of a covered entity or government body
This role mirrors what other data protection laws call a processor, including most state privacy laws and the GDPR.
Large data holders under the ADPPA
Large data holders were not considered a third type of organization. Both covered entities and service providers could have qualified as large data holders if, in the most recent calendar year, they had gross annual revenues of USD 250 million or more, and collected, processed, or transferred:
- Covered data of more than 5,000,000 individuals or devices, excluding data used solely for payment processing
- Sensitive covered data from more than 200,000 individuals or devices
Large data holders would have faced additional requirements under the ADPPA.
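To make these thresholds concrete, here is a minimal self-check sketch. The type and function names are hypothetical, and the assumption that the revenue test and both data-volume tests apply together is drawn from the criteria above, not from the bill’s exact text.

```typescript
// Illustrative self-check against the large data holder thresholds described above.
// Names and structure are hypothetical, not taken from the bill.
interface DataFootprint {
  grossAnnualRevenueUsd: number;    // most recent calendar year
  coveredDataIndividuals: number;   // individuals or devices, excluding payment-only data
  sensitiveDataIndividuals: number; // individuals or devices with sensitive covered data
}

function isLargeDataHolder(org: DataFootprint): boolean {
  // Assumes the revenue and data-volume tests are applied together.
  return (
    org.grossAnnualRevenueUsd >= 250_000_000 &&
    org.coveredDataIndividuals > 5_000_000 &&
    org.sensitiveDataIndividuals > 200_000
  );
}

// Example: USD 300M revenue and 6M individuals' covered data, but only
// 150,000 individuals' sensitive data, would not have qualified.
console.log(isLargeDataHolder({
  grossAnnualRevenueUsd: 300_000_000,
  coveredDataIndividuals: 6_000_000,
  sensitiveDataIndividuals: 150_000,
})); // false
```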
Third-party collecting entity under the ADPPA
The ADPPA introduced the concept of a third-party collecting entity, which refers to a covered entity that primarily earns its revenue by processing or transferring personal data it did not collect directly from the individuals to whom the data relates. In other contexts, they are often referred to as data brokers.
However, the definition excluded certain activities and entities:
- A business would not be considered a third-party collecting entity if it processed employee data received from another company, but only for the purpose of providing benefits to those employees
- A service provider would also not be classified as a third-party collecting entity under this definition
An entity is considered to derive its principal source of revenue from data processing or transfer if, in the previous 12 months, either:
- More than 50 percent of its total revenue came from these activities
or
- The entity processed or transferred the data of more than 5 million individuals that it did not collect directly
Third-party collecting entities that process data from more than 5,000 individuals or devices in a calendar year would have had to register with the Federal Trade Commission by January 31 of the following year. Registration would require a fee of USD 100 and basic information about the organization, including its name, contact details, the types of data it handles, and a link to a website where individuals can exercise their privacy rights.
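As a rough illustration of the tests just described, the sketch below encodes the principal-revenue test and the registration trigger. All names are hypothetical, and a real determination would turn on legal analysis rather than arithmetic.

```typescript
// Hypothetical encoding of the "principal source of revenue" test described above.
interface RevenueProfile {
  totalRevenueUsd: number;        // previous 12 months
  dataActivityRevenueUsd: number; // revenue from processing/transferring data not collected directly
  indirectIndividuals: number;    // individuals whose data was not collected directly
}

function isThirdPartyCollectingEntity(p: RevenueProfile): boolean {
  const revenueShare = p.dataActivityRevenueUsd / p.totalRevenueUsd;
  // Either branch of the test suffices.
  return revenueShare > 0.5 || p.indirectIndividuals > 5_000_000;
}

function mustRegisterWithFtc(indirectIndividualsThisYear: number): boolean {
  // Registration (USD 100 fee) would have been due by January 31 of the following year.
  return indirectIndividualsThisYear > 5_000;
}
```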
Exemptions under the ADPPA
While the ADPPA potentially would have had a wide reach, certain exemptions would have applied.
- Small businesses: Organizations with annual revenue under USD 41 million or that process data for fewer than 50,000 individuals would have been exempt from some provisions.
- Government entities: The ADPPA would not have applied to government bodies or their service providers handling covered data. It also excluded congressionally designated nonprofits that support victims and families with issues involving missing and exploited children.
- Organizations subject to other federal laws: Organizations already complying with certain existing privacy laws, including the Health Insurance Portability and Accountability Act (HIPAA), the Gramm-Leach-Bliley Act (GLBA), and the Family Educational Rights and Privacy Act (FERPA), among others, would have been deemed compliant with similar ADPPA requirements for the specific data covered by those laws. However, they would still have been required to comply with Section 208 of the ADPPA, which contains provisions for data security and protection of covered data.
Definitions in the American Data Privacy and Protection Act (ADPPA)
Like other data protection laws, the ADPPA defined several terms that are important for businesses to know. While many — like “collect” or “process” — can be found in other regulations, there are also some that are unique to the ADPPA. We look at some of these key terms below.
Knowledge under the ADPPA
“Knowledge” refers to whether a business is aware that an individual is a minor. The level of awareness required depends on the type and size of the business.
- High-impact social media companies: These are large platforms primarily known for user-generated content, with at least USD 3 billion in annual revenue and 300 million monthly active users for at least 3 months in the preceding year. They would have been considered to have knowledge if they knew or should have known that a user was a minor. This is the strictest standard.
- Large data holders: These are organizations with significant data operations that do not qualify as high-impact social media companies. They would have been considered to have knowledge if they knew or willfully disregarded evidence that a user was a minor.
- Other covered entities or service providers: Those that do not fall into the above categories would have been held to an actual-knowledge standard, meaning they must actually have known that the user was a minor.
Some states — like Minnesota and Nebraska — define “known child” but do not adjust the criteria for what counts as knowledge based on the size or revenue of the business handling the data. Instead, they apply the same standard to all companies, regardless of their scale.
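The tiered standard can be summarized in code. The sketch below is illustrative only: the revenue and user thresholds come from the description above, and the classification logic is simplified (for instance, it assumes the monthly active user count was sustained for the required 3 months).

```typescript
// Illustrative mapping of the tiered "knowledge" standards described above.
type EntityTier = 'highImpactSocialMedia' | 'largeDataHolder' | 'other';

function classifyTier(
  annualRevenueUsd: number,
  monthlyActiveUsers: number,
  isLargeDataHolder: boolean
): EntityTier {
  // Simplified: also assumes the platform is primarily known for user-generated content.
  if (annualRevenueUsd >= 3_000_000_000 && monthlyActiveUsers >= 300_000_000) {
    return 'highImpactSocialMedia';
  }
  return isLargeDataHolder ? 'largeDataHolder' : 'other';
}

function knowledgeStandard(tier: EntityTier): string {
  switch (tier) {
    case 'highImpactSocialMedia':
      return 'knew or should have known the user was a minor'; // strictest
    case 'largeDataHolder':
      return 'knew or willfully disregarded evidence the user was a minor';
    case 'other':
      return 'actual knowledge that the user was a minor';
  }
}
```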
Affirmative express consent under the ADPPA
The ADPPA uses the term “affirmative express consent,” which refers to “an affirmative act by an individual that clearly communicates the individual’s freely given, specific, and unambiguous authorization” for a business to perform an action, such as collecting or using their personal data. Consent for data collection would have to be obtained after the covered entity provides clear information about how it will use the data.
Like the GDPR and other data privacy regulations, consent would have needed to be freely given, informed, specific, and unambiguous.
Under this definition, consent cannot be inferred from an individual’s inaction or continued use of a product or service. Additionally, covered entities cannot trick people into giving consent through misleading statements or manipulative design. This includes deceptive interfaces meant to confuse users or limit their choices.
Transfer under the ADPPA
Most data protection regulations include a definition for the sale of personal data or personal information. The ADPPA did not define sale; instead, it defined “transfer” as “to disclose, release, disseminate, make available, license, rent, or share covered data orally, in writing, electronically, or by any other means.”
What are consumers’ rights under the American Data Privacy and Protection Act (ADPPA)?
Under the ADPPA, consumers would have had the following rights regarding their personal data.
- Right of awareness: The Federal Trade Commission (the Commission) must publish and maintain a webpage describing the provisions, rights, obligations, and requirements of the ADPPA for individuals, covered entities, and service providers. This information must be:
- Published within 90 days of the law’s enactment
- Updated quarterly as needed
- Available in the ten most commonly used languages in the US
- Right to transparency: Covered entities must provide clear information about how consumer data is collected, used, and shared. This includes which third parties would receive their data and for what purposes.
- Right of access: Consumers can access their covered data (including data collected, processed, or transferred within the past 24 months), categories of third parties and service providers who received the data, and the purpose(s) for transferring the data.
- Right to correction: Consumers can correct any substantial inaccuracies or incomplete information in their covered data and instruct the covered entity to notify all third parties or service providers that have received the data.
- Right to deletion: Consumers can request that their covered data processed by the covered entity be deleted. They can also instruct the covered entity to notify all third parties or service providers that have received the data of the deletion request.
- Right to data portability: Consumers can request their personal data in a structured, machine-readable format that enables them to transfer it to another service or organization (see the sketch after this list).
- Right to opt out: Consumers can opt out of the transfer of their personal data to third parties and its use for targeted advertising. Businesses are required to provide a clear and accessible mechanism for exercising this right.
- Private right of action: Consumers can sue companies directly for certain violations of the act, with some limitations and procedural requirements. (California is the only state to provide this right as of early 2025.)
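To illustrate the access and portability rights in practice, a covered entity might have returned a consumer’s covered data in a structured, machine-readable format such as JSON. The bill did not prescribe a format; the shape below is a hypothetical sketch.

```typescript
// Hypothetical shape of an access/portability response. The ADPPA did not
// prescribe a format, only that exported data be structured and machine-readable.
interface AccessResponse {
  coveredData: Record<string, unknown>; // data collected, processed, or transferred in the past 24 months
  thirdPartyCategories: string[];       // categories of recipients of the data
  transferPurposes: string[];           // purposes for transferring the data
}

// Stubbed example; a real implementation would query the entity's data stores.
const response: AccessResponse = {
  coveredData: { email: 'user@example.com', signupDate: '2023-04-01' },
  thirdPartyCategories: ['analytics providers', 'advertising partners'],
  transferPurposes: ['service analytics', 'targeted advertising'],
};

// Serializing the same structure yields a portable, machine-readable export.
console.log(JSON.stringify(response, null, 2));
```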
What are privacy requirements under the American Data Privacy and Protection Act (ADPPA)?
The ADPPA would have required organizations to meet certain obligations when handling individuals’ covered data. Here are the key privacy requirements under the bill.
Consent
Organizations must obtain clear, explicit consent through easily understood standalone disclosures. Consent requests must be accessible, available in all service languages, and give equal prominence to accept and decline options. Organizations must provide mechanisms to withdraw consent that are as simple as giving it.
Organizations must avoid using misleading statements or manipulative designs, and must obtain new consent for different data uses or significant privacy policy changes. While the ADPPA would have worked alongside the Children’s Online Privacy Protection Act (COPPA)’s parental consent requirements for children under 13, it added its own protections for minors up to age 17.
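As a sketch of what these consent requirements could look like in practice, the configuration below models a standalone consent request with equally prominent accept and decline options and a withdrawal path. The shape is hypothetical and not tied to any real consent management API.

```typescript
// Sketch of a consent request configuration reflecting the requirements above:
// standalone disclosure, equal prominence for accept/decline, easy withdrawal.
interface ConsentRequest {
  purpose: string;        // the specific data use being consented to
  disclosure: string;     // clear, easily understood standalone explanation
  languages: string[];    // all languages in which the service is offered
  acceptLabel: string;
  declineLabel: string;   // must be as prominent as acceptLabel
  withdrawalPath: string; // withdrawing must be as simple as consenting
}

const targetedAdsConsent: ConsentRequest = {
  purpose: 'targeted-advertising',
  disclosure: 'We would like to use your browsing activity to personalize ads.',
  languages: ['en', 'es'],
  acceptLabel: 'Accept',
  declineLabel: 'Decline',
  withdrawalPath: '/privacy/withdraw-consent',
};
```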
Privacy policy
Organizations must maintain clear, accessible privacy policies that detail their data collection practices, transfer arrangements, retention periods, and rights granted to individuals. These policies must specify whether data goes to countries like China, Russia, Iran, or North Korea, which could present a security risk, and they must be available in all languages where services are offered. When making material changes, organizations must notify affected individuals in advance and give them a chance to opt out.
Data minimization
Organizations can only collect and process data that is reasonably necessary to provide requested services or for specific allowed purposes. These allowed purposes include activities like completing transactions, maintaining services, protecting against security threats, meeting legal obligations, and preventing harm, including where there is a risk of death, among others. Collected data must also be proportionate to these activities.
Privacy by design
Privacy by design is a default requirement under the ADPPA. Organizations must implement reasonable privacy practices that consider the organization’s size, data sensitivity, available technology, and implementation costs. They must align their practices with federal laws and regulations, regularly assess risks in their products and services, pay special attention to protecting minors’ privacy, and implement appropriate safeguards.
Data security
Organizations must establish, implement, and maintain appropriate security measures, including vulnerability assessments, preventive actions, employee training, and incident response plans. They must implement clear data disposal procedures and match their security measures to their data handling practices.
Privacy and data security officers
Organizations with more than 15 employees must appoint both a privacy officer and data security officer, who must be two distinct individuals. These officers are responsible for implementing privacy programs and maintaining ongoing ADPPA compliance.
Privacy impact assessments
Organizations other than large data holders and small businesses must conduct regular privacy assessments that evaluate the benefits and risks of their data practices. These assessments must be documented and maintained, and must consider factors like data sensitivity and potential privacy impacts. (Large data holders would have faced enhanced assessment requirements, covered below.)
Loyalty with respect to pricing
Organizations cannot discriminate against individuals who exercise their privacy rights. While they can adjust prices based on necessary financial information and offer voluntary loyalty programs, they cannot retaliate through changes in pricing or service quality, for example, if an individual exercises their rights and requests their data or declines consent to certain data processing.
Special requirements for large data holders
In addition to their general obligations, large data holders would have had unique responsibilities under the proposed law.
Privacy policy
Large data holders would have been required to maintain and publish 10-year archives of their privacy policies on their websites. They would need to keep a public log documenting significant privacy policy changes and their impact. Additionally, they would need to provide a short-form notice (under 500 words) highlighting unexpected practices and sensitive data handling.
Privacy and data security officers
At least one of the appointed officers would have been designated as a privacy protection officer who reports directly to the highest official at the organization. This officer, either directly or through supervised designees, would have been required to do the following:
- Establish processes to review and update privacy and security policies, practices, and procedures
- Conduct biennial comprehensive audits to ensure compliance with the proposed law and make them accessible to the Commission upon request
- Develop employee training programs about ADPPA compliance
- Maintain detailed records of all material privacy and security practices
- Serve as the point of contact for enforcement authorities
Privacy impact assessments
While all organizations other than small businesses would be required to conduct privacy impact assessments under the proposed law, large data holders would have had additional requirements.
- Timing: While other organizations must conduct assessments within one year of the ADPPA’s enactment, large data holders would have been required to do so within one year of either becoming a large data holder or the law’s enactment, whichever came first.
- Scope: Both must consider nature and volume of data and privacy risks, but large data holders would need to specifically assess “potential adverse consequences” in addition to “substantial privacy risks.”
- Approval: Large data holders’ assessments would need to be approved by their privacy protection officer, while other entities would have no specific approval requirement.
- Technology review: Large data holders would need to include reviews of security technologies (like blockchain and distributed ledger technologies); this review would be optional for other entities.
- Documentation: While both would need to maintain written assessments until the next assessment, large data holders’ assessments would also need to be accessible to their privacy protection officer.
Metrics reporting
Large data holders would be required to compile and disclose annual metrics related to verified access, deletion, and opt-out requests. These metrics would need to be included in their privacy policy or published on their website.
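A minimal sketch of how such metrics might be compiled from a request log follows. The field names are hypothetical; the bill’s actual reporting categories would govern.

```typescript
// Illustrative aggregation of the annual metrics described above.
interface RequestLogEntry {
  type: 'access' | 'deletion' | 'optOut';
  verified: boolean;
}

function annualMetrics(log: RequestLogEntry[]): { access: number; deletion: number; optOut: number } {
  const counts = { access: 0, deletion: 0, optOut: 0 };
  for (const entry of log) {
    if (entry.verified) counts[entry.type] += 1; // only verified requests are counted
  }
  return counts;
}

// Example: two verified access requests and one unverified deletion request.
console.log(annualMetrics([
  { type: 'access', verified: true },
  { type: 'access', verified: true },
  { type: 'deletion', verified: false },
])); // { access: 2, deletion: 0, optOut: 0 }
```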
Executive certification
An executive officer would have been required to annually certify to the FTC that the large data holder has internal controls and a reporting structure in place to achieve compliance with the proposed law.
Algorithm impact assessments
Large data holders using covered algorithms that could pose a consequential risk of harm would be required to conduct an annual impact assessment of these algorithms. This requirement would be in addition to privacy impact assessments and would need to begin no later than two years after the Act’s enactment.
American Data Privacy and Protection Act (ADPPA) enforcement and penalties for noncompliance
The ADPPA would have established a multi-layered enforcement approach that set it apart from other US privacy laws.
- Federal Trade Commission: The FTC would serve as the primary enforcer, treating violations as unfair or deceptive practices under the Federal Trade Commission Act. The proposed law would have required the FTC to create a dedicated Bureau of Privacy for enforcement.
- State Attorneys General: State Attorneys General and State Privacy Authorities could bring civil actions on behalf of their residents if they believed violations had affected their state’s interest.
- California Privacy Protection Agency (CPPA): The CPPA, established under the California Privacy Rights Act, would have had special enforcement authority. The CPPA could enforce the ADPPA in California in the same manner as it enforces California’s privacy laws.
Starting two years after the law would have taken effect, individuals would have gained a private right of action, or the right to sue for violations. However, before filing a lawsuit, they would need to notify both the Commission and their state Attorney General.
The ADPPA itself did not establish specific penalties for violations. Instead, violations of the ADPPA or its regulations would be treated as violations of the Federal Trade Commission Act, subject to the same penalties, privileges, and immunities provided under that law.
The American Data Privacy and Protection Act (ADPPA) compared to other data privacy regulations
As privacy regulations continue to evolve worldwide, it’s helpful to understand how the ADPPA would compare with other comprehensive data privacy laws.
The EU’s GDPR has set the global standard for data protection since 2018. In the US, the CCPA (as amended by the CPRA) established the first comprehensive state-level privacy law and has influenced subsequent state legislation. Below, we’ll look at how the ADPPA compares with these regulations.
The ADPPA vs the GDPR
There are many similarities between the proposed US federal privacy law and the EU’s data protection regulation. Both require organizations to implement privacy and security measures, provide individuals with rights over their personal data (including access, deletion, and correction), and mandate clear privacy policies that detail their data processing activities. Both also emphasize data minimization principles and purpose limitation.
However, there are also several important differences between the two.
| Aspect | ADPPA | GDPR |
| --- | --- | --- |
| Territorial scope | Would have applied to individuals residing in the US. | Applies to EU residents and any organization processing their data, regardless of location. |
| Consent | Not a standalone legal basis; required only for specific activities like targeted advertising and processing sensitive data. | One of six legal bases for processing; can be a primary justification. |
| Government entities | Excluded federal, state, tribal, territorial, and local government entities. | Applies to public bodies and authorities. |
| Privacy officers | Required “privacy and security officers” for covered entities with more than 15 employees, with stricter rules for large data holders. | Requires a Data Protection Officer (DPO) for public authorities or entities engaged in large-scale data processing. |
| Data transfers | No adequacy requirements; focus on transfers to specific countries (China, Russia, Iran, North Korea). | Detailed adequacy requirements and transfer mechanisms. |
| Children’s data | Extended protections to minors up to age 17. | Focuses on children under 16 (age can be lowered to 13 by member states). |
| Penalties | Violations would have been treated as violations of the Federal Trade Commission Act. | Imposes fines up to 4% of annual global turnover or EUR 20 million, whichever is higher. |
The ADPPA vs the CCPA/CPRA
There are many similarities between the proposed US federal privacy law and California’s existing privacy framework. Both include comprehensive transparency requirements, including privacy notices in multiple languages and accessibility for people with disabilities. They also share similar approaches to prohibiting manipulative design practices and requirements for regular security and privacy assessments.
However, there are also differences between the ADPPA and CCPA/CPRA.
| Aspect | ADPPA | CCPA/CPRA |
| --- | --- | --- |
| Covered entities | Would have applied to organizations under the jurisdiction of the Federal Trade Commission, including nonprofits and common carriers; excluded government agencies. | Applies only to for-profit businesses meeting any of these thresholds: gross annual revenue of over USD 26,625,000; receiving, buying, selling, or sharing personal information of 100,000 or more consumers or households; or earning more than half of annual revenue from the sale of consumers’ personal information. |
| Private right of action | Broader right to sue for various violations. | Limited to data breaches only. |
| Data minimization | Required data collection and processing to be limited to what is reasonably necessary and proportionate. | Similar requirement, but the CPRA allows broader processing for “compatible” purposes. |
| Algorithmic impact assessments | Required large data holders to conduct annual assessments focusing on algorithmic risks, bias, and discrimination. | Requires risk assessments weighing benefits and risks of data practices, with no explicit focus on bias. |
| Executive accountability | Required executive certification of compliance. | No executive certification requirement. |
| Enforcement | Would have been enforced by the Federal Trade Commission, state Attorneys General, and the California Privacy Protection Agency (CPPA). | CPPA and local authorities within California. |
Consent management and the American Data Privacy and Protection Act (ADPPA)
The ADPPA would have required organizations to obtain affirmative express consent for certain data processing activities through clear, conspicuous standalone disclosures. These consent requests would need to be easily understood, equally prominent for either accepting or declining, and available in all languages where services are offered. Organizations would also need to provide simple mechanisms for withdrawing consent that would be as easy to use as giving consent was initially. The bill also required organizations to honor opt-out requests for practices like targeted advertising and certain data transfers. These opt-out mechanisms would need to be accessible and easy to use, with clear instructions for exercising these rights.
Organizations would need to clearly disclose not only the types of data they collect but also the parties with whom this information is shared. Consumers would also need to be informed about their data rights and how to act on them, such as opting out of processing, through straightforward explanations and guidance.
To support transparency, organizations would also be required to maintain privacy pages that are regularly updated to reflect their data collection, use, and sharing practices. These pages would help provide consumers with access to the latest information about how their data is handled. Additionally, organizations would have been able to use banners or buttons on websites and apps to inform consumers about data collection and provide them with an option to opt out.
Though the ADPPA was not enacted, the US does have an increasing number of state-level data privacy laws. A consent management platform (CMP) like the Usercentrics CMP for website consent management or app consent management can help organizations streamline compliance with the many existing privacy laws in the US and beyond. The CMP securely maintains records of consent, automates opt-out processes, and enables consistent application of privacy preferences across an organization’s digital properties. It also helps to automate the detection and blocking of cookies and other tracking technologies that are in use on websites and apps.
Usercentrics does not provide legal advice, and information is provided for educational purposes only. We always recommend engaging qualified legal counsel or privacy specialists regarding data privacy and protection issues and operations.
Protecting personal data is more critical than ever. As organizations handle vast amounts of information, understanding the distinctions between various data types — such as Personally Identifiable Information (PII), Personal Information (PI), and sensitive data — becomes essential.
These classifications play a significant role in data privacy and security, helping companies determine compliance requirements with global privacy regulations while safeguarding individual privacy.
By differentiating among these types of data, organizations and website owners can implement appropriate security measures and build trust with their customers.
Understanding various data types
Understanding the nuances among different data types is essential for effective data privacy and security management. Distinguishing among Personally Identifiable Information (PII), Personal Information (PI), and sensitive data enables companies to safeguard individuals’ privacy and comply with relevant regulations.
Before we delve into the specifics of each data type, here’s a brief overview of PII vs PI vs sensitive data:
- PII: This includes any information that can identify an individual, like names, Social Security numbers, or email addresses.
- PI: This broader category covers any information related to a person, even if it doesn’t identify them on its own, such as a common name or web browsing activity.
- Sensitive data: This subset of PI requires extra protection due to its potential for harm if exposed, like medical records, sexual orientation, or financial information.
Recognizing these data types is essential for regulatory compliance, as laws like the General Data Protection Regulation (GDPR) and the California Privacy Rights Act (CPRA) have specific requirements for handling personal data.
Accurate classification supports compliance and enhances risk management by enabling organizations to implement tailored security measures that mitigate the risk of data breaches and data exposures. Moreover, a deep understanding of data types strengthens user trust, as companies that implement smart data collection strategies and prioritize data protection foster stronger, more reliable relationships with their customers.
What you need to know about Personally Identifiable Information (PII)
What is PII?
Personally Identifiable Information (PII) refers to any data that can be used to identify a specific individual. This includes information that can directly identify a person or can be used in combination with other data to identify someone.
This definition is widely used by privacy professionals and aligns with interpretations from organizations like the National Institute of Standards and Technology (NIST) in the United States. We specify this because there is not a single, global definition of Personally Identifiable Information or what types of information it encompasses. As a result, specific definitions of PII can differ across organizations and borders. Different regulations also use different language and have different levels of detail in describing these categories.
What are the different types of PII?
There are two main types of PII:
- Direct identifiers: Information that can immediately identify an individual, such as full name, Social Security number, or passport number.
- Indirect identifiers: Data that, when combined with other information, can lead to the identification of an individual, like date of birth, place of work, or job title.
Additionally, PII can be classified as sensitive or non-sensitive, depending on the potential harm that could result from its disclosure or misuse.
Sensitive PII refers to information that, if disclosed or breached, could result in substantial harm, embarrassment, inconvenience, or unfairness to an individual. This type of PII requires stricter protection measures due to its potential for misuse. Many data privacy laws specifically address sensitive data and apply additional restrictions and protection requirements to it.
Non-sensitive PII, on the other hand, is information that can be transmitted in an unencrypted form without resulting in harm to the individual. While it still requires protection, the security measures may not be as stringent as those for sensitive PII.
Examples of PII
PII encompasses a wide range of data points that can be used to identify an individual, so it’s important to understand specific examples in each category. Doing so enables your company to implement appropriate security measures and to factor PII into its data strategy for marketing and other operations.
Sensitive PII includes information that, if disclosed, could lead to significant harm or privacy violations. Examples of sensitive PII are:
- Social Security number
- driver’s license number
- financial account numbers (e.g., bank account, credit card)
- passport number
- biometric data (fingerprints, retinal scans)
- medical records
- genetic information
On the other hand, non-sensitive PII refers to information that is less likely to cause harm if disclosed but still requires protection. Examples of non-sensitive PII include:
- full name
- email address
- phone number
- physical address
- IP address
- date of birth
- place of birth
- race or ethnicity
- educational records
- employment information
It’s important to note that even non-sensitive PII can pose privacy risks when combined with other data. Therefore, it’s recommended that companies aim to protect all types of PII data that they collect and handle.
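One practical way to act on these distinctions is a field-level classification map that drives handling rules. The sketch below is illustrative: the category assignments follow the examples above and would need review against the laws and definitions that apply to your data.

```typescript
// A minimal, illustrative field-classification map for PII handling.
type PiiClass = 'sensitive' | 'nonSensitive';

const fieldClassification: Record<string, PiiClass> = {
  ssn: 'sensitive',
  driversLicense: 'sensitive',
  creditCardNumber: 'sensitive',
  medicalRecordId: 'sensitive',
  fullName: 'nonSensitive',
  email: 'nonSensitive',
  ipAddress: 'nonSensitive',
};

function requiresStrictControls(field: string): boolean {
  // Default to strict handling for unclassified fields, since even
  // non-sensitive PII can pose risks when combined with other data.
  return (fieldClassification[field] ?? 'sensitive') === 'sensitive';
}

console.log(requiresStrictControls('ssn'));      // true
console.log(requiresStrictControls('email'));    // false
console.log(requiresStrictControls('unknown'));  // true (safe default)
```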
PII under GDPR
While the term “Personally Identifiable Information” is not explicitly used in the GDPR, the regulation encompasses this concept within its broader definition of “personal data.”
However, there are some key differences in how PII is treated under the GDPR compared to other data privacy laws:
- Expanded scope: The GDPR takes a more expansive view of what constitutes identifiable information. It includes data that might not traditionally be considered PII in other contexts, such as IP addresses, cookie identifiers, and device IDs.
- Context-dependent approach: Under the GDPR, whether information is classified as personal data (and thus protected) depends on the context and the potential to identify an individual, rather than fitting into specific predefined categories of PII.
- Pseudonymized data: The GDPR introduces pseudonymization, a process that changes personal data so it can’t be linked to a specific individual without additional information. While pseudonymized data is still classified as personal data under GDPR, it is subject to slightly relaxed requirements.
- Data minimization principle: The GDPR emphasizes the importance of data minimization, which aligns with but goes beyond traditional PII protection practices. Organizations are required to collect and process only the personal data that is necessary for the specific purpose they have declared.
- Risk-based approach: The GDPR requires companies to evaluate the risk of processing personal data, including what is traditionally considered PII. This assessment determines the necessary security measures and safeguards.
The key takeaway brands should understand is that the GDPR offers a detailed framework for protecting personal data, covering more types of identifiable information than traditional PII definitions. Companies need to understand these distinctions to achieve compliance and protect individuals’ privacy.
PII compliance best practices
To effectively protect PII data and enable compliance with relevant regulations, organizations can implement best practices tailored to their specific data handling processes. Doing so not only helps mitigate risks associated with data breaches but also fosters trust among customers and stakeholders.
Here are some key best practices for PII compliance:
- Conduct regular data audits to identify and classify PII.
- Use encryption and access controls to protect sensitive information (see the encryption sketch after this list).
- Develop and enforce clear policies for how PII is collected, processed, and stored.
- Train employees regularly on data protection and privacy best practices.
- Apply data minimization techniques to collect only necessary information.
- Implement secure methods for disposing of PII when it is no longer needed.
- Keep privacy policies updated and obtain user consent for data collection and processing.
- Perform periodic risk assessments and vulnerability scans to identify and address security weaknesses.
- Have an incident response plan ready to manage potential data breaches effectively.
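As an example of the encryption item above, here is a minimal sketch of encrypting a PII value at rest using Node’s built-in crypto module with AES-256-GCM. Key management is simplified to a single in-memory key, which a real deployment would replace with a key management service and rotation policy.

```typescript
// Minimal sketch of encrypting a PII value at rest with AES-256-GCM.
// Key management (KMS, rotation) is out of scope here and matters
// at least as much as the choice of cipher.
import { randomBytes, createCipheriv, createDecipheriv } from 'node:crypto';

const key = randomBytes(32); // in practice, load from a key management service

function encryptPii(plaintext: string): { iv: string; tag: string; data: string } {
  const iv = randomBytes(12); // fresh nonce per encryption
  const cipher = createCipheriv('aes-256-gcm', key, iv);
  const data = Buffer.concat([cipher.update(plaintext, 'utf8'), cipher.final()]);
  return {
    iv: iv.toString('base64'),
    tag: cipher.getAuthTag().toString('base64'), // authentication tag for integrity
    data: data.toString('base64'),
  };
}

function decryptPii(box: { iv: string; tag: string; data: string }): string {
  const decipher = createDecipheriv('aes-256-gcm', key, Buffer.from(box.iv, 'base64'));
  decipher.setAuthTag(Buffer.from(box.tag, 'base64'));
  return Buffer.concat([
    decipher.update(Buffer.from(box.data, 'base64')),
    decipher.final(),
  ]).toString('utf8');
}

console.log(decryptPii(encryptPii('123-45-6789'))); // round-trips the value
```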
PII violation and its consequences
Violations of PII protection can have serious consequences for both individuals and organizations. For individuals, a violation can lead to identity theft, financial fraud, and reputational damage, causing emotional and financial stress.
For organizations, the risks are significant. Non-compliance can result in hefty legal penalties, such as fines of up to EUR 20 million or 4 percent of global annual revenue under regulations like the GDPR. Companies may also face reputational damage, loss of customer trust, and reduced revenue. Organizations could also experience operational disruptions and increased costs from addressing data breaches, including legal fees, reporting requirements to data protection authorities, and the need to implement stronger security measures.
What you need to know about PI (personal information)
What is personal data?
Personal data is any information that relates to an identified or identifiable individual, and it encompasses a broader range of data points than PII. It includes both direct identifiers (like names and Social Security numbers) and indirect identifiers (like location data and online IDs) that can identify someone when combined with other information.
In short, all PII is personal data, but not all personal data is considered PII.
Personal data is a key concept in data protection laws, including the GDPR and the California Consumer Privacy Act (CCPA).
Personal information examples
Personal information can include a variety of data types, both objective and subjective:
Objective data types are factual, measurable, and verifiable. This includes:
- full name
- date of birth
- Social Security number
- phone number
- email address
- IP address
- financial information (e.g., bank account numbers, credit card details)
- biometric data (e.g., fingerprints, facial recognition data)
Subjective data types are based on personal opinions, interpretations, or evaluations. This involves:
- Performance reviews
- Customer feedback
- Personal preferences
- Medical symptoms described by a patient
- Personality assessments
Both objective and subjective data can be considered personal information if they can be linked to an identifiable individual.
It’s important to note that jurisdictions treat publicly available information differently. Under the CCPA, publicly available information is generally excluded from the definition of personal information, whereas under the GDPR it can still qualify as personal data.
Personal data under the GDPR
The GDPR defines personal data in Article 4(1) as, “‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.”
This definition encompasses a broad scope and includes both direct identifiers (like names) and indirect identifiers (like location data). Given this definition, here are the key features of personal data as defined under the GDPR:
- Direct and indirect identifiers: Both are considered personal data, emphasizing the need to understand the context of information to identify individuals.
- Data collection context: The specifics of how and why data is collected and processed determine if it qualifies as personal data.
- Pseudonymized data: Even if data is pseudonymized, it is still classified as personal data if it can be re-identified. In contrast, anonymized data, where the possibility of re-identification has been eliminated, falls outside the scope of the GDPR.
- Applicability: The GDPR covers both automated and manual processing of personal data.
- Special categories: The regulation also includes sensitive data such as racial or ethnic origin, political opinions, religious beliefs, and health information.
PI compliance and best practices
To achieve and maintain compliance with data protection regulations and safeguard people’s personal information, companies can adopt the following best practices.
- Conduct regular data audits: Identify and classify all personal information within your company.
- Implement data minimization: Collect and retain only the personal data necessary for specific and legitimate purposes. Regularly delete unnecessary data.
- Manage consent and preferences: Use a consent management platform (CMP) to clearly explain how you’ll use personal information. Provide easy-to-use opt-in and opt-out options, allowing people to control their data preferences. A CMP can help automate this process, making it easier to comply with regulations and manage user choices across your digital properties.
- Check partners’ data collection: Make sure any third parties you work with protect personal information properly. Be transparent about your data-selling practices, and confirm that all partners have strong safeguards, as you could still be held responsible for how they handle data on your behalf.
- Train your team: Regularly educate all employees about the importance of protecting personal information and how to do it.
- Handle requests efficiently: Set up a system to quickly respond when people ask to see, change, or delete their personal information, depending on their particular rights.
- Assign responsibility: If required by law or as a best practice, designate a Data Protection Officer to oversee data protection compliance.
By implementing these best practices, companies can better protect personal information, build trust with their customers, and reduce the risk of data breaches and penalties.
What you need to know about sensitive information
What is sensitive data?
Sensitive data is confidential information that requires protection from unauthorized access or disclosure. If this data is compromised, it could lead to harm, discrimination, or other negative consequences for the affected individual or organization. Sensitive information spans a broad range of data, including certain kinds of PII as well as financial records, health data, and proprietary business details.
Examples of sensitive information
Sensitive information comes in various forms, and understanding these categories is essential for effective data protection. Common examples of sensitive personal data include:
- Personal data: Full names, home addresses, phone numbers, Social Security numbers, driver’s license numbers
- Financial information: Bank account numbers, credit card details, payment information
- Health data: Medical records, health insurance information, protected health information (PHI)
- Employee data: Payroll information, performance reviews, background checks
- Intellectual property: Trade secrets, proprietary code, product specifications
- Access credentials: Usernames, passwords, PINs, biometric data
- Industry-specific data: Retail sales figures, legal case information, research data
- Identity data: Political affiliation, religious beliefs, sexual orientation or gender identity
How GDPR treats sensitive data
Under the GDPR, sensitive personal data, also known as special categories of data, includes information about a person’s race, political beliefs, religion, union membership, genetic and biometric data, health, and sexual orientation.
Processing this type of data is generally only allowed if specific conditions are met. For instance, individuals must give explicit consent for their sensitive data to be used. It can also be processed if necessary for employment, legal claims, public interest, healthcare, or research.
How to safeguard sensitive data
Organizations must take extra precautions to protect sensitive data. Here are some recommendations for safeguarding sensitive information.
- Implement data classification: Categorize data based on sensitivity levels to minimize processing and apply appropriate security measures.
- Limit access: Restrict access to sensitive data on a need-to-know basis and implement strong authentication methods.
- Use encryption: Encrypt sensitive data both at rest and in transit to prevent unauthorized access.
- Conduct regular audits: Perform security assessments to identify vulnerabilities, identify processes or data that are no longer needed, and maintain compliance with data protection regulations.
- Train employees: Educate staff on an ongoing basis about data security best practices and the importance of protecting sensitive information.
- Implement security technologies: Utilize firewalls, intrusion detection systems, and data loss prevention tools to safeguard sensitive data.
- Develop incident response plans: Create and maintain policies and procedures for responding to data breaches or unauthorized access attempts and communicating with authorities and affected data subjects.
By following these practices, companies can significantly reduce the risk of sensitive data exposure and maintain compliance with relevant data protection regulations.
PII vs. PI vs. sensitive data comparison

| Data type | What it covers | Examples | Protection priority |
| --- | --- | --- | --- |
| PII | Information that can identify a specific individual, directly or in combination with other data | Full name, Social Security number, email address | Protect all PII; sensitive PII needs the strictest controls |
| PI | Any information relating to a person, even if it doesn’t identify them on its own; includes all PII | Browsing activity, location data, personal preferences | Baseline safeguards under laws like the GDPR and CCPA |
| Sensitive data | A subset of personal data that could cause harm or discrimination if exposed | Health records, financial details, biometric data | Strictest safeguards; processing often requires explicit consent |
Know your data types to better comply with global privacy laws
Safeguarding personal data — whether it falls under PII, PI, or sensitive data — is a fundamental responsibility of any organization. Each data type requires specific protection strategies, from encryption to strict access controls, to prevent unauthorized access and potential breaches.
Understanding the nuances between these data categories not only ensures compliance with global privacy laws but also fortifies the trust between your company and your customers. As the regulatory landscape continues to evolve, maintaining a proactive approach to data protection will be key to securing both sensitive information and organizational reputation.
Oregon was the twelfth state in the United States to pass comprehensive data privacy legislation with SB 619. Governor Tina Kotek signed the bill into law on July 18, 2023, and the Oregon Consumer Privacy Act (OCPA) came into effect for most organizations on July 1, 2024. Nonprofits have an extra year to prepare, so their compliance is required as of July 1, 2025.
In this article, we’ll look at the Oregon Consumer Privacy Act’s requirements, who they apply to, and what businesses can do to achieve compliance.
What is the Oregon Consumer Privacy Act (OCPA)?
The Oregon Consumer Privacy Act protects the privacy and personal data of over 4.2 million Oregon residents. The law establishes rules for any individual or entity conducting business in Oregon or those providing goods and services to its residents and processing their personal data. Affected residents are known as “consumers” under the law.
The OCPA protects Oregon residents’ personal data when they act as individuals or in household contexts. It does not cover personal data collected in a work context. This means information about individuals acting in their professional roles, rather than as consumers, is not covered under this law.
Consistent with the other US state-level data privacy laws, the OCPA requires businesses to inform residents about how their personal data is collected and used. This notification — usually included in a website’s privacy policy — must cover key details such as:
- What data is collected
- How the data is used
- Whether the data is shared and with whom
- Information about consumers’ rights
The Oregon privacy law uses an opt-out consent model, which means that in most cases, organizations can collect consumers’ personal data without prior consent. However, they must make it possible for consumers to opt out of the sale of their personal data and its use in targeted advertising or profiling. The law also requires businesses to implement reasonable security measures to protect the personal data they handle.
Who must comply with the Oregon Consumer Privacy Act (OCPA)?
Similar to many other US state-level data privacy laws, the OCPA establishes thresholds that determine which organizations must comply with its requirements. However, unlike some other laws, it does not contain a revenue-only threshold. (A short code sketch of the applicability test follows the list below.)
To fall under the OCPA’s scope, during a calendar year an organization must control or process the personal data of:
- 100,000 consumers, not including consumers whose data is controlled or processed solely to complete a payment transaction
or
- 25,000 consumers, if 25 percent or more of the organization’s annual gross revenue comes from selling personal data
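Here is the promised sketch of the applicability test. The structure and names are illustrative, and a real determination would require legal review of an organization’s specific data practices.

```typescript
// Illustrative OCPA applicability check based on the thresholds above.
interface OregonFootprint {
  consumersProcessed: number;        // excluding payment-transaction-only consumers
  consumersProcessedAny: number;     // total consumers whose data is controlled or processed
  revenueShareFromDataSales: number; // fraction of annual gross revenue, 0..1
}

function ocpaApplies(org: OregonFootprint): boolean {
  const volumeThreshold = org.consumersProcessed >= 100_000;
  const salesThreshold =
    org.consumersProcessedAny >= 25_000 && org.revenueShareFromDataSales >= 0.25;
  return volumeThreshold || salesThreshold;
}

// Example: 30,000 consumers with 40% of revenue from data sales triggers the law.
console.log(ocpaApplies({
  consumersProcessed: 30_000,
  consumersProcessedAny: 30_000,
  revenueShareFromDataSales: 0.4,
})); // true
```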
Exemptions to OCPA compliance
The OCPA is different from some other data privacy laws because many of its exemptions focus on the types of data being processed and what processing activities are being conducted, rather than just on the organizations themselves.
For example, instead of exempting healthcare entities under the Health Insurance Portability and Accountability Act (HIPAA), the OCPA exempts protected health information handled in compliance with HIPAA. This means protected health information is outside of the OCPA’s scope, but other data that a healthcare organization handles could still fall under the law. Organizations that may be exempt from compliance with other state-level consumer privacy laws should consult a qualified legal professional to determine if they are required to comply with the OCPA.
Exempted organizations and their services or activities include:
- Governmental agencies
- Consumer reporting agencies
- Financial institutions regulated by the Bank Act and their affiliates or subsidiaries, provided they focus exclusively on financial activities
- Insurance companies
- Nonprofit organizations established to detect and prevent insurance fraud
- Press, wire, or other information services (and the non-commercial activities of media entities)
Personal data collected, processed, sold, or disclosed under the following federal laws is also exempt from the OCPA’s scope:
- Health Insurance Portability and Accountability Act (HIPAA)
- Gramm-Leach-Bliley Act (GLBA)
- Health Care Quality Improvement Act
- Fair Credit Reporting Act (FCRA)
- Driver’s Privacy Protection Act
- Family Educational Rights and Privacy Act (FERPA)
- Airline Deregulation Act
Definitions in the Oregon Consumer Privacy Act (OCPA)
This Oregon data privacy law defines several key terms related to the data it protects and relevant data processing activities.
What is personal data under the OCPA?
The Oregon privacy law protects consumers’ personal data, which it defines as “data, derived data or any unique identifier that is linked to or is reasonably linkable to a consumer or to a device that identifies, is linked to or is reasonably linkable to one or more consumers in a household.”
The law specifically excludes personal data that is:
- deidentified
- made legally available through government records or widely distributed media
- made public by the consumer
The law does not specifically list what constitutes personal data. Common types of personal data that businesses collect include a consumer’s name, phone number, email address, Social Security Number, or driver’s license number.
It should be noted that personal data (also called personal information under some state privacy laws) and personally identifiable information are not always the same thing, and distinctions between the two are often made in data privacy laws.
What is sensitive data under the OCPA?
Sensitive data is personal data that requires special handling because it could cause harm or embarrassment if misused or unlawfully accessed. It refers to personal data that would reveal an individual’s:
- Racial or ethnic background
- National origin
- Religious beliefs
- Mental or physical condition or diagnosis
- Genetic or biometric data
- Sexual orientation
- Status as transgender or non-binary
- Status as a victim of crime
- Citizenship or immigration status
- Precise present or past geolocation (within 1,750 feet or 533.4 meters)
All personal data belonging to children is also considered sensitive data under the OCPA.
Oregon’s law was the first US privacy law to include transgender or non-binary status, or status as a victim of crime, as sensitive data. The definition of biometric data excludes facial geometry or mapping unless it is done for the purpose of identifying an individual.
An exception to the law’s definition of sensitive data includes “the content of communications or any data generated by or connected to advanced utility metering infrastructure systems or equipment for use by a utility.” In other words, the law does not consider sensitive information to include communications content, like that in emails or messages, or data generated by smart utility meters and related systems used by utilities.
What is consent under the OCPA?
Like many other data privacy laws, the Oregon data privacy law follows the European Union’s General Data Protection Regulation (GDPR) regarding the definition of valid consent. Under the OCPA, consent is “an affirmative act by means of which a consumer clearly and conspicuously communicates the consumer’s freely given, specific, informed and unambiguous assent to another person’s act or practice…”
The definition also includes conditions for valid consent:
- the consumer’s inaction does not constitute consent
- the user interface used to request consent must not attempt to obscure, subvert, or impair the consumer’s choice
These conditions are highly relevant to online consumers and reflect that the use of manipulative dark patterns is increasingly frowned upon by data protection authorities, and increasingly prohibited. The Oregon Department of Justice (DOJ) website also clarifies that the use of dark patterns may be considered a deceptive business practice under Oregon’s Unlawful Trade Practices Act. The sketch below illustrates how a website might apply these conditions when recording consent.
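This is a minimal, hypothetical sketch: it treats only an explicit affirmative click as valid consent and ignores dismissals, scrolling, and other inaction, consistent with the conditions above. The event shape and function names are illustrative.

```typescript
// Sketch of recording consent only on an explicit user action.
// Inaction (dismissing, scrolling, continuing to browse) never counts.
interface ConsentEvent {
  action: 'clickedAccept' | 'clickedDecline' | 'dismissed' | 'scrolled';
  purpose: string;   // e.g., 'sale-of-personal-data', 'targeted-advertising'
  timestamp: string; // ISO 8601
}

function recordConsent(event: ConsentEvent): boolean {
  // Only a clear affirmative act establishes consent under the definition above.
  if (event.action !== 'clickedAccept') return false;
  // Persist the consent record for audit purposes (stubbed here).
  console.log(`Consent recorded for ${event.purpose} at ${event.timestamp}`);
  return true;
}

// Dismissing the banner does not create a consent record.
console.log(recordConsent({
  action: 'dismissed',
  purpose: 'targeted-advertising',
  timestamp: new Date().toISOString(),
})); // false
```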
What is processing under the OCPA?
Processing under the OCPA means any action or set of actions performed on personal data, whether manually or automatically. This includes activities like collecting, using, storing, disclosing, analyzing, deleting, or modifying the data.
Who is a controller under the OCPA?
The OCPA uses the term “controller” to describe businesses or entities that decide how and why personal data is processed. While the law uses the word “person,” it applies broadly to both individuals and organizations.
The OCPA definition of controller is “a person that, alone or jointly with another person, determines the purposes and means for processing personal data.” In simpler terms, a controller is anyone who makes the key decisions about why personal data is collected and how it will be used.
Who is a processor under the OCPA?
The OCPA defines a processor as “a person that processes personal data on behalf of a controller.” Like the controller, while the law references a person, it typically refers to businesses or organizations that handle data for a controller. Processors are often third parties that follow the controller’s instructions for handling personal data. These third parties can include advertising partners, payment processors, or fulfillment companies, for example. Their role is to carry out specific tasks without deciding how or why the data is processed.
What is profiling under the OCPA?
Profiling is increasingly becoming a standard inclusion in data privacy laws, particularly as it can relate to “automated decision-making” or the use of AI technologies. The Oregon privacy law defines profiling as “an automated processing of personal data for the purpose of evaluating, analyzing or predicting an identified or identifiable consumer’s economic circumstances, health, personal preferences, interests, reliability, behavior, location or movements.”
What is targeted advertising under the OCPA?
Targeted advertising may involve emerging technologies like AI tools. It is also becoming a standard inclusion in data privacy laws. The OCPA defines targeted advertising as advertising that is “selected for display to a consumer on the basis of personal data obtained from the consumer’s activities over time and across one or more unaffiliated websites or online applications and is used to predict the consumer’s preferences or interests.” In simpler terms, targeted advertising refers to ads shown to a consumer based on their interests, which are determined by personal data that is collected over time from different websites and apps.
However, some types of ads are excluded from this definition, such as those that are:
- Based on activities within a controller’s own websites or online apps
- Based on the context of a consumer’s current search query, visit to a specific website, or app use
- Shown in response to a consumer’s request for information or feedback
The definition also excludes processing of personal data solely to measure or report an ad’s frequency, performance, or reach.
What is a sale under the OCPA?
The OCPA defines sale as “the exchange of personal data for monetary or other valuable consideration by the controller with a third party.” This means a sale doesn’t have to involve money. Any exchange of data for something of value, even if it’s non-monetary, qualifies as a sale under the law.
The Oregon privacy law does not consider the following disclosures of personal data to be a “sale”:
- Disclosures to a processor
- Disclosures to an affiliate or a third party to help the controller provide a product or service requested by the consumer
- Disclosures or transfers of personal data as part of a merger, acquisition, bankruptcy, or similar transaction in which a third party takes control of the controller’s assets, including personal data
- Disclosures of personal data that occur because the consumer:
- directs the controller to disclose the data
- intentionally discloses the data while directing the controller to interact with a third party
- intentionally discloses the data to the public, such as through mass media, without restricting the audience
Consumers’ rights under the Oregon Consumer Privacy Act (OCPA)
The Oregon privacy law grants consumers a range of rights over their personal data, comparable to other US state-level privacy laws.
- Right to access: consumers can request confirmation of whether their personal data is being processed and the categories of personal data being processed, gain access to the data, and receive a list of the specific third parties it has been shared with (other than natural persons), all subject to some exceptions.
- Right to correction: consumers can ask controllers to correct inaccurate or outdated information they have provided.
- Right to deletion: consumers can request the deletion of their personal data held by a controller, with some exceptions.
- Right to portability: consumers can obtain a copy of the personal data they have provided to a controller, in a readily usable format, with some exceptions.
- Right to opt out: consumers can opt out of the sale of their personal data, targeted advertising, or profiling used for decisions with legal or similarly significant effects.
Consumers can designate an authorized agent to opt out of personal data processing on their behalf. The OCPA also introduces a requirement for controllers to recognize universal opt-out signals, further simplifying the opt-out process.
This Oregon data privacy law stands out by giving consumers the right to request a specific list of third parties that have received their personal data. Unlike many other privacy laws, this one requires controllers to maintain detailed records of the exact entities they share data with, rather than just general categories of recipients.
Children’s personal data has special protections under the OCPA. Parents or legal guardians can exercise rights for children under the age of 13, whose data is classified as sensitive personal data and subject to stricter rules. For minors between 13 and 15, opt-in consent is required for specific processing activities, including use of their data for targeted advertising or profiling. “Opt-in” means that explicit consent is required before the data can be used for these purposes.
Consumers can make one free rights request every 12 months, to which an organization has 45 days to respond. Organizations can extend that period by another 45 days if reasonably necessary, as the sketch below illustrates. Organizations can deny consumer requests for a number of reasons, including cases in which the consumer’s identity cannot reasonably be verified, or if the consumer has made too many requests within a 12-month period.
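To make those timelines concrete, here is a minimal TypeScript sketch of the 45-day response window and the optional 45-day extension. The function and its date handling are our own illustration, not anything prescribed by the OCPA’s text.

```typescript
// Illustrative only: computing OCPA rights-request response deadlines.
// The OCPA gives organizations 45 days to respond, extendable by another
// 45 days where reasonably necessary.

const DAY_MS = 24 * 60 * 60 * 1000;

function responseDeadline(received: Date, extended = false): Date {
  const days = extended ? 90 : 45; // 45 days, plus an optional 45-day extension
  return new Date(received.getTime() + days * DAY_MS);
}

// A request received on March 1, 2025 must be answered by April 15, 2025,
// or by May 30, 2025 if the extension is invoked.
console.log(responseDeadline(new Date("2025-03-01")).toDateString());
console.log(responseDeadline(new Date("2025-03-01"), true).toDateString());
```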
Oregon’s privacy law does not include a private right of action, so consumers cannot sue data controllers for violations. California remains the only state whose privacy law includes this provision.
What are the privacy requirements under the Oregon Consumer Privacy Act (OCPA)?
Controllers must meet the following OCPA requirements to protect the personal data they collect from consumers.
Privacy notice and transparency under the OCPA
The Oregon privacy law requires controllers to be transparent about their data handling practices. Controllers must provide a clear, easily accessible, and meaningful privacy notice for consumers whose personal data they may process. The privacy notice, also known as the privacy policy, must include the following:
- Purpose(s) for processing personal data
- Categories of personal data processed, including the categories of sensitive data
- Categories of personal data shared with third parties, including categories of sensitive data
- Categories of third parties with which the controller shares personal data and how each third party may use the data
- How consumers can exercise their rights, including:
- How to opt out of processing for targeted advertising or profiling
- How to submit a consumer rights request
- How to appeal a controller’s denial of a rights-related request
- The identity of the controller, including any business name the controller uses or has registered in Oregon
- At least one actively monitored online contact method, such as an email address, for consumers to directly contact the organization
- A “clear and conspicuous description” for any processing of personal data for the purpose of targeted advertising or profiling “in furtherance of decisions that produce legal effects or effects of similar significance”
According to the Oregon DOJ website, the third-party categories requirement must strike a particular balance. It should offer consumers meaningful insights into the relevant types of businesses or processing activities, without making the privacy notice overly complex. Acceptable examples include “analytics companies,” “third-party advertisers,” and “payment processors,” among others.
The privacy notice or policy must be easy for consumers to access. It is typically linked in the website footer for visibility and accessibility from any page.
Data minimization and purpose limitation under the OCPA
The OCPA requires controllers to limit the personal data they collect to only what is “adequate, relevant, and reasonably necessary” for the purposes stated in the privacy notice. If the purposes for processing change, controllers must notify consumers and, where applicable, obtain their consent.
Data security under the OCPA
The Oregon data privacy law requires controllers to establish, implement, and maintain reasonable safeguards for protecting “the confidentiality, integrity and accessibility” of the personal data under their control. The data security measures also apply to deidentified data.
Oregon’s existing laws about privacy practices remain in effect as well. These laws include requirements for reasonable administrative, technical, and physical safeguards for data storage and handling, IoT device security features, and truth in privacy and consumer protection notices.
Data protection assessments (DPA) under the OCPA
Controllers must perform data protection assessments (DPA), also known as data protection impact assessments, for processing activities that present “a heightened risk of harm to a consumer.” These activities include:
- Processing for the purposes of targeted advertising
- Processing sensitive data
- The sale of personal data
- Processing for the purposes of profiling if there is a reasonably foreseeable risk to the consumer of:
- Unfair or deceptive treatment
- Financial, physical, or reputational injury
- Intrusion into a consumer’s private affairs
- Other substantial injury
The Attorney General may also require a data controller to conduct a DPA or share the results of one in the course of an investigation.
Consent requirements under the OCPA
The OCPA primarily uses an opt-out consent model. This means that in most cases controllers are not required to obtain consent from consumers before collecting or processing their personal data. However, there are specific cases where consent is required:
- Processing sensitive data requires explicit consent from consumers.
- For children’s data, the OCPA follows the federal Children’s Online Privacy Protection Act (COPPA) and requires consent from a parent or legal guardian before processing the personal data of any child under 13.
- Controllers must obtain explicit consent to use the personal data of minors between the ages of 13 and 15 for targeted ads, profiling, or sale.
- Controllers must obtain consent to use personal data for purposes other than those originally disclosed in the privacy notice.
To help consumers make informed decisions about their consent, controllers must clearly disclose details about the personal data being collected, the purposes for which it is processed, who it is shared with, and how consumers can exercise their rights. Controllers must also provide clear, accessible information on how consumers can opt out of data processing.
Consumers must be able to revoke consent at any time, as easily as they gave it. Data processing must stop after consent has been revoked, and no later than 15 days after receiving the revocation.
Nondiscrimination under the OCPA
The OCPA prohibits controllers from discriminating against consumers who exercise their rights under the law. This includes actions such as:
- Denying goods or services
- Charging different prices or rates than those available to other consumers
- Providing a different level of quality or selection of goods or services to the consumer
For example, if a consumer opts out of data processing on a website, that individual cannot be blocked from accessing that website or its functions.
Some website features and functions do not work without certain cookies or trackers being activated, so if a consumer does not opt in to their use because they collect personal data, the site may not work as intended. This is not considered discriminatory.
This Oregon privacy law permits website operators and other controllers to offer voluntary incentives for consumers’ participation in activities where personal data is collected. These may include newsletter signups, surveys, and loyalty programs. Offers must be proportionate and reasonable to the request as well as the type and amount of data collected. This way, they will not look like bribes or payments for consent, which data protection authorities frown upon.
Third-party contracts under the OCPA
Before starting any data processing activities, controllers must enter into legally binding contracts with third-party processors. These contracts govern how processors handle personal data on behalf of the controller, and must include the following provisions:
- The processor must ensure that all individuals handling personal data are bound by a duty of confidentiality
- The contract must provide clear instructions for data processing, detailing:
- The nature and purpose of processing
- The types of data being processed
- The duration of the processing
- The rights and obligations of both the controller and the processor
- The processor must delete or return the personal data at the controller’s direction or after the services have ended, unless legal obligations require the data to be retained
- Upon request, the processor must provide the controller with all necessary information to verify compliance with contractual obligations
- If the processor hires subcontractors, they must have contracts in place requiring the subcontractors to meet the processor’s obligations
- The contract must allow the controller or their designee to conduct assessments of the processor’s policies and technical measures to ensure compliance
These contracts are known as data processing agreements under some data protection regulations like the GDPR.
Universal opt-out mechanism under the OCPA
As of January 1, 2026, organizations subject to the OCPA must comply with a universal opt-out mechanism. Also called a global opt-out signal, it includes tools like the Global Privacy Control.
This mechanism enables a consumer to set their data processing preferences once and have those preferences automatically communicated to any website or platform that detects the signal. Preferences are typically set via a web browser plugin.
While this requirement is not yet standard across all US or global data privacy laws, it is becoming more common in newer legislation. Other states that require controllers to recognize global opt-out signals include California, Minnesota, Nebraska, Texas, and Delaware.
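As an illustration of what honoring such a signal can look like in practice, here is a short TypeScript sketch for the browser. The `navigator.globalPrivacyControl` property comes from the Global Privacy Control specification (supporting browsers also send a `Sec-GPC: 1` request header for server-side detection); the `applyOptOut` function is a hypothetical stand-in for a site’s own opt-out handling.

```typescript
// Minimal sketch: detecting and honoring the Global Privacy Control signal.
// navigator.globalPrivacyControl is defined by the GPC specification;
// applyOptOut() is a hypothetical stand-in for your own opt-out logic.

function gpcSignalDetected(): boolean {
  // Read-only boolean on navigator in supporting browsers; undefined elsewhere.
  const nav = navigator as Navigator & { globalPrivacyControl?: boolean };
  return nav.globalPrivacyControl === true;
}

function applyOptOut(): void {
  // Hypothetical: treat the visitor as opted out of the sale of personal data,
  // targeted advertising, and profiling, e.g. by not loading tracking scripts.
  console.log("GPC signal detected: applying opt-out preferences.");
}

if (gpcSignalDetected()) {
  applyOptOut();
}
```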
How to comply with the Oregon Consumer Privacy Act (OCPA)
Below is a non-exhaustive checklist to help your business and website address key OCPA requirements. For advice specific to your organization, consulting a qualified legal professional is strongly recommended.
- Provide a clear and accessible privacy notice detailing data processing purposes, shared data categories, third-party recipients, and consumer rights.
- Maintain a specific list of third parties with whom you share consumers’ personal data.
- Limit data collection to what is necessary for the specified purposes, and notify consumers if those purposes change.
- Obtain consent from consumers if you plan to process their data for purposes other than those that have been communicated to them.
- Implement reasonable safeguards to protect the confidentiality, integrity, and accessibility of personal and deidentified data.
- Conduct data protection assessments for processing activities with heightened risks, such as targeted advertising, activities involving sensitive data, or profiling.
- Implement a mechanism for consumers to exercise their rights, and communicate this mechanism to consumers.
- Obtain explicit consent for processing sensitive data, children’s data, or for purposes not initially disclosed.
- Provide consumers with a user-friendly method to revoke consent.
- Once consumers withdraw consent, stop all data processing related to that consent within the required 15-day period.
- Provide a simple and clear method for consumers to opt out of data processing activities.
- Avoid discriminatory practices against consumers exercising their rights, while offering reasonable incentives for data-related activities.
- Include confidentiality, compliance obligations, and terms for data return or deletion in binding contracts with processors.
- Comply with global opt-out signals like the Global Privacy Control by January 1, 2026.
Enforcement of the Oregon Consumer Privacy Act (OCPA)
The Oregon Attorney General’s office is the enforcement authority for the OCPA. Consumers can file complaints with the Attorney General regarding data processing practices or the handling of their requests. The Attorney General’s office must notify an organization of any complaint and of any investigation that is launched. During investigations, the Attorney General can request that controllers submit data protection assessments and other relevant information. Enforcement actions must be initiated within five years of the last violation.
Controllers have the right to have an attorney present during investigative interviews and can refuse to answer questions. The Attorney General cannot bring in external experts for interviews or share investigation documents with non-employees.
Until January 1, 2026, controllers have a 30-day cure period during which they can fix OCPA violations. If the issue is not resolved within this time, the Attorney General may pursue civil penalties. After the right to cure sunsets on January 1, 2026, any opportunity to cure will be at the discretion of the Attorney General.
Fines and penalties for noncompliance under the OCPA
The Attorney General can seek civil penalties up to USD 7,500 per violation. Additional actions may include seeking court orders to stop unlawful practices, requiring restitution for affected consumers, or reclaiming profits obtained through violations.
If the Attorney General succeeds, the court may require the violating party to cover legal costs, including attorney’s fees, expert witness fees, and investigation expenses. However, if the court determines that the Attorney General pursued a claim without a reasonable basis, the defendants may be entitled to recover their attorney’s fees.
How does the Oregon Consumer Privacy Act (OCPA) affect businesses?
The OCPA introduces privacy law requirements that are similar to other state data protection laws. These include obligations around notifying consumers about data practices, granting them access to their data, limiting data use to specific purposes, and implementing reasonable security measures.
One notable distinction is that the law sets different compliance timelines based on an organization’s legal status. The effective date for commercial entities is July 1, 2024, while nonprofit organizations are given an additional year and must comply by July 1, 2025.
Since the compliance deadline for commercial entities has already passed, businesses that fall under the OCPA’s scope should ensure they meet its requirements as soon as possible to avoid penalties. Nonprofits, though they have more time, should actively prepare for compliance.
Businesses covered by federal laws like HIPAA and the GLBA, which may exempt them from other state data privacy laws, should confirm with a qualified legal professional whether they need to comply with the OCPA.
The Oregon Consumer Privacy Act (OCPA) and consent management
Oregon’s law is based on an opt-out consent model. In other words, consent does not need to be obtained before collecting or processing personal data unless it is sensitive or belongs to a child.
Controllers do need to inform consumers about what data is collected and used and for what purposes, as well as with whom it is shared, and whether it is to be sold or used for targeted advertising or profiling.
Consumers must also be informed of their rights regarding data processing and how to exercise them. This includes the ability for consumers to opt out of processing of their data or change their previous consent preferences. Typically, this information is presented on a privacy page, which must be kept up to date.
As of January 1, 2026, organizations must also recognize and respect consumers’ consent preferences as expressed via a universal opt-out signal.
Websites and apps can use a banner to inform consumers about data collection and enable them to opt out. This is typically done using a link or button. A consent management platform (CMP) like the Usercentrics CMP for website consent management or app consent management also helps to automate the detection of cookies and other tracking technologies that are in use on websites and apps.
A CMP can streamline sharing information about data categories and the specific services in use by the controller and/or processor(s), as well as third parties with whom data is shared.
The United States still only has a patchwork of state-level privacy laws rather than a single federal law. As a result, many companies doing business across the country, or foreign organizations doing business in the US, may need to comply with a variety of state-level data protection laws.
A CMP can make this easier by enabling banner customization and geotargeting. Websites can display data processing, consent information, and choices for specific regulations based on specific user location. Geotargeting can also improve clarity and user experience by presenting this information in the user’s preferred language.
Usercentrics does not provide legal advice, and information is provided for educational purposes only. We always recommend engaging qualified legal counsel or a privacy specialist regarding data privacy and protection issues and operations.
In the years since the General Data Protection Regulation (GDPR) came into force in 2018, there have been some dramatic headlines about fines levied on companies for violating its requirements.
For the most part these headlines have involved influential tech platforms with potentially billions of users. However, organizations of all sizes have been penalized for not adequately obtaining user consent for processing personal data, not meeting the requirements of their chosen legal basis, experiencing a data breach, or other issues.
App developers and publishers haven’t been in the news as much, even though our research showed that 90 percent of the apps available in the EU that we looked at were not compliant with the GDPR. One exception to the lack of headlines: France’s data protection authority, the CNIL, fined Apple and Voodoo Games in 2023 for using an advertising identifier without users’ consent.
There have been other fines levied by data protection authorities around Europe for apps’ GDPR violations. That regulation and other laws do not distinguish between websites and apps with regards to compliance requirements. We take a look at several examples to explore what happened, what the penalties were, and how to do business compliantly.
What are the requirements for legal data processing under the GDPR?
The GDPR applies to organizations that process the personal data of EU residents, whether or not the company is located in the EU. That processing could be to enable apps to function, to deliver personalized advertising, or to provide analytics data to improve performance, for example.
Companies need to abide by “lawfulness of processing”, i.e. meet the requirements of a relevant legal basis to justify their collection and processing of personal data.
Art. 6 GDPR covers these six legal bases:
- informed consent from the data subject
- performing a contract with the data subject
- compliance with a legal obligation to which the data controller is subject
- protecting the vital interests of the data subject or of another natural person
- in the public interest, or if the data controller is exercising official authority
- legitimate interests pursued by the controller or by a third party
Consent is a common choice of legal basis, though the GDPR requires user consent to be “freely given, specific, informed and unambiguous”. As we will see, this is where a number of companies have violated the law.
Organizations are also required to collect, store, and document users’ consent choices securely, and provide important information to users about data processing, their rights, and other factors.
Meeting these requirements on a website in a way that’s clear, compliant, and user-friendly can be challenging, and managing it in apps on small mobile screens elevates the challenge, especially when companies need to both comply with GDPR requirements and retain access to quality data for advertising, analytics, and other purposes.
Who is responsible for GDPR enforcement?
GDPR enforcement is a collective effort across several authorities within the EU and is mainly in the hands of the national Data Protection Authorities (DPAs) within each EU member state. These supervisory authorities, established under Chapter 6 GDPR, are independent public authorities.
They have the power to handle complaints, investigate compliance, and issue fines or other penalties for established violations. DPAs also issue guidelines and provide resources on GDPR compliance.
These groups work together to ensure that the GDPR’s requirements are consistently applied across the EU, and are supported by the European Data Protection Board (EDPB), which increases collaboration and cooperation among DPAs and advises on key matters of data privacy and protection.
What are the fines and penalties for GDPR violations?
Some data privacy laws around the world provide a “cure period” if an organization has been found to have violated the law. This enables them to correct the issue and ensure it won’t happen again while avoiding fines and other penalties.
The GDPR does not require provision of a cure period, though arrangements are at the discretion of EU member countries’ data protection authorities. GDPR enforcement is handled at a national level, and countries can also add their own specific data privacy and protection requirements.
Art. 83 GDPR covers penalties for violations. These include:
- warnings or reprimands
- temporary or permanent restrictions on data processing
- ordering the erasure of personal data
- suspending international data transfers to third countries
- imposing administrative fines
- imposing criminal penalties
Administrative fines are probably the best-known GDPR penalty and what tends to make the headlines. There are two levels of administrative fines, depending on the severity of the infraction.
Tier one administrative fines
The first tier of GDPR fines is most commonly applied to first-time or less severe infractions. Fines can be up to EUR 10 million or two percent of global annual revenue for the preceding financial year, whichever is higher.
Tier two administrative fines
The second tier of GDPR fines is generally reserved for repeat violators or more severe infractions. Fines can be up to EUR 20 million or four percent of global annual revenue for the preceding financial year, whichever is higher.
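To make the “whichever is higher” arithmetic concrete, here is a small TypeScript sketch of the two fine ceilings. The function names are ours; the amounts and percentages are those of Art. 83 GDPR as described above.

```typescript
// Illustrative only: the two GDPR administrative fine ceilings.
// Each cap is a fixed amount or a share of global annual revenue for the
// preceding financial year, whichever is higher. Figures in EUR.

function tierOneCap(globalAnnualRevenue: number): number {
  return Math.max(10_000_000, 0.02 * globalAnnualRevenue);
}

function tierTwoCap(globalAnnualRevenue: number): number {
  return Math.max(20_000_000, 0.04 * globalAnnualRevenue);
}

// Example: at EUR 2 billion in global annual revenue, the tier-two ceiling
// is EUR 80 million (4 percent of revenue), not the EUR 20 million floor.
console.log(tierTwoCap(2_000_000_000)); // 80000000
```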
The smallest GDPR fines have been “three-digit amounts”. As of early 2025, the largest GDPR fine to date has been levied on Meta, parent company of Facebook, Instagram, and WhatsApp, for EUR 1.2 billion.
GDPR fines for app publishers
Over the past several years, data protection authorities around the world have increasingly turned their attention to mobile app privacy compliance. The California Attorney General announced an increased focus on mobile app compliance in 2023. In September 2024, France’s CNIL published recommendations to enable better privacy compliance in apps, with increased enforcement beginning in 2025.
Let’s look at some notable enforcement actions that European DPAs have levied on prominent mobile apps and platforms.
Norwegian Data Protection Authority Datatilsynet vs. Grindr
Norway’s Datatilsynet fined social networking and online dating app Grindr approximately EUR 6.5 million in 2021 for disclosing user data to third parties for behavioral advertising without a legal basis. The data shared included:
- GPS location
- IP address
- Advertising ID
- Age
- Gender
- Status as a Grindr user
The DPA also considered the fact of being a Grindr user to be sensitive personal information, as it strongly indicates the user’s sexual orientation or preferences, which would merit additional protections under the law.
The Norwegian Consumer Council filed a complaint against Grindr in 2020. The company claimed to have collected valid consent information from users to enable sharing their personal data with advertising partners.
However, the consents were not valid as users did not have consent choices — e.g. to opt out of sharing data with third parties for advertising — and had to accept the privacy policy in its entirety to be able to use the app. Additionally, users were not properly notified about the app’s sharing of personal data. Both of these issues violated the GDPR’s requirements.
Italian Data Protection Authority Garante vs. Clubhouse
In December 2022, Italian DPA Garante fined social audio chat app Clubhouse EUR 2 million for multiple GDPR infractions:
- Lack of transparency about the use of users’ personal data and information about connections among users
- Storage and sharing of user-generated audio without users’ consent
- Indefinite retention periods for recordings
- Not identifying an accurate legal basis prior to profiling users and sharing their account information
Clubhouse was also required to adopt measures to comply with the GDPR, in addition to being prohibited from further processing of personal data for marketing or profiling purposes without obtaining informed and explicit user consent.
Clubhouse is owned by Alpha Exploration, a US company with no EU presence. However, Clubhouse’s services were available to users in the EU, making the app subject to GDPR compliance.
Irish Data Protection Commission vs. WhatsApp
Ireland’s Data Protection Commission fined instant messaging and VoIP service WhatsApp EUR 5.5 million in 2023. As noted earlier, WhatsApp’s parent company is US-based Meta, which also owns Facebook and Instagram, among other platforms and services.
WhatsApp Ireland was given six months from when the decision was handed down to bring their data processing operations into compliance with the GDPR.
In advance of the GDPR coming into effect on May 25, 2018, WhatsApp Ireland updated its Terms of Service, forcing users to click “agree and continue” to accept the new terms to be able to access the app.
Users were forced to accept the terms in whole and consent to processing of their personal data for security and service improvement purposes. They had no granular consent options. Declining the terms prevented users from accessing the app’s services entirely. The initial complaint was filed by a German WhatsApp user.
WhatsApp also didn’t provide users with adequate information about the legal basis for data processing, preventing clear understanding of how their personal data was being used or shared, or for what purposes.
WhatsApp had considered users’ acceptance of the updated Terms of Service to be entering into a contract with the company. Fulfilling a contract is an acceptable legal basis under the GDPR, and the company took the position that processing users’ personal data for delivering its services was necessary to perform that contract.
The complaint, however, argued that by requiring users’ acceptance of the updated Terms of Service, the company was forcing user consent, and thus consent was their legal basis, not contract fulfillment. However, the conditions of the consent invalidated it under the GDPR, as it was not adequately informed or voluntary.
Usercentrics helps you stay GDPR-compliant and grow monetization
Increased GDPR enforcement for apps compliance and ever more savvy users mean that it’s not worth risking trying to get around data privacy requirements. Especially since there are robust, user-friendly tools like Usercentrics App CMP that streamline consent management. Collect consent compliantly on your apps and get the data you need to grow your monetization, without getting in your users’ way.
Usercentrics delivers an SDK that enables fast setup. Access over 2,200 pre-built legal templates for your data processing services, and use the App Scanner to seamlessly detect and integrate your vendors, SSPs, and SDKs. Our expert team is also here for you every step of the way with guidance and detailed documentation.
Learn more about how Usercentrics can help grow your business. Check out our case study with Homa Games and how they achieved a 10% increase in Ad LTV with user consent while achieving and maintaining privacy compliance.
The California Consumer Privacy Act (CCPA), enacted in 2020, was the first and is one of the strictest data privacy laws in the US. Noncompliance can result in hefty fines, which can harm a company’s revenue and reputation.
Since its enactment in 2020, the CCPA has reshaped the way businesses operating in California handle consumer data. Designed to give California residents more control over their personal information, the CCPA has serious consequences for companies that fail to comply. Penalties can range from USD 2,663 to USD 7,988 per violation (adjusted biennially in line with the Consumer Price Index), and if businesses neglect their obligations, fines can add up quickly.
Whether it’s by failing to respond to consumer rights requests or mishandling sensitive data, there are several ways businesses can fall short of compliance. With new requirements introduced under the California Privacy Rights Act (CPRA), which amended the CCPA, diligent CCPA compliance is crucial for any company that collects or shares personal data of California residents.
What is the California Consumer Privacy Act (CCPA)?
The California Consumer Privacy Act (CCPA) is a groundbreaking privacy law that grants California residents several rights and increased control over their personal information.
It applies to businesses that meet certain thresholds, such as gross annual revenues of over USD 26,625,000, handling data for 50,000 or more consumers, households, or devices (updated to 100,000 under the CPRA), or deriving 50 percent or more of annual revenue from selling personal information.
California residents can request detailed information about how their personal data is used and shared. Businesses, in turn, must ensure compliance by implementing policies and procedures to meet these requirements.
Who is subject to CCPA penalties?
CCPA penalties apply to companies headquartered inside or outside California if they process California residents’ data, meet the CCPA thresholds, and have violated the law’s requirements. Additionally, service providers and third-party vendors may face consequences for CCPA noncompliance if their actions contribute to violations by a qualified business.
Organizations of all sizes and industries, from tech giants to small retailers, are subject to the law’s requirements if they process data belonging to California residents. Noncompliance risks are significant, particularly for businesses that rely heavily on consumer data for operations or marketing.
Types of CCPA violations
CCPA violations can take various forms, each with its own set of legal and financial consequences. Common violations include:
- Failure to provide required notices: Businesses must inform consumers about their data collection practices. Failing to provide clear and accessible privacy notices is a common violation.
- Ignoring consumer rights requests: Under the CCPA, consumers have the right to request access to their data, have their data deleted, or opt out of data sales. Failure to honor these requests is a significant violation.
- Inadequate data security: Companies must implement reasonable security measures to protect personal information. A lack of adequate safeguards can lead to data breaches and noncompliance.
- Selling personal information without consent: Selling personal data without obtaining proper consumer consent (like for data belonging to minors) where required is another serious violation.
These violations often stem from issues like poor data governance, outdated systems, or insufficient staff training. Any of these breaches can result in significant penalties — both financial and reputational — which means compliance should be a top priority for businesses.
Penalties for violating the CCPA
The California Attorney General and the California Privacy Protection Agency (CPPA) are responsible for enforcing California’s privacy laws, including the CCPA as amended by the California Privacy Rights Act (CPRA).
Civil penalties for violating the CCPA range from USD 2,663 to USD 7,988 per violation. Each affected person’s data can be considered a separate violation, so fines can escalate quickly. For instance, a violation like a data breach affecting thousands of customers could lead to fines in the millions.
Consumers also have a limited right to take legal action if they are impacted by a violation like a data breach, such as when unencrypted personal data is disclosed. They can sue businesses for:
- Compensation: USD 107 to USD 799 per person, per incident, or reimbursement for actual damages caused.
- Court orders: Injunctions or declaratory relief to prevent further violations or clarify legal obligations.
- Additional remedies: Any other relief the court deems reasonable.
It’s worth noting that the CPRA eliminated the 30-day cure period that previously applied under the CCPA. A cure period, or opportunity to correct a violation without penalties, can still be granted, but only at the authorities’ discretion. This can mean faster enforcement for companies in violation of the law.
How big are CCPA fines for noncompliance?
The financial impact of CCPA fines depends on the scale and nature of the violation. For example, a data breach affecting 1,000 individuals could result in fines exceeding USD 2.5 million, since at the minimum adjusted penalty of USD 2,663 per violation, 1,000 violations already total USD 2,663,000. Similarly, failing to address consumer requests in a timely manner can result in penalties that accumulate rapidly.
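Purely as an illustration of how per-consumer penalties scale, the TypeScript sketch below multiplies the adjusted penalty range cited above by the number of affected consumers. Actual fines are determined by regulators and courts, not by formula.

```typescript
// Illustrative only: potential CCPA fine exposure when each affected
// consumer counts as a separate violation. Uses the adjusted range of
// USD 2,663 to USD 7,988 per violation cited above.

function fineExposure(affectedConsumers: number): { low: number; high: number } {
  const MIN_PER_VIOLATION = 2_663;
  const MAX_PER_VIOLATION = 7_988;
  return {
    low: affectedConsumers * MIN_PER_VIOLATION,
    high: affectedConsumers * MAX_PER_VIOLATION,
  };
}

// Example: a breach affecting 1,000 consumers.
const { low, high } = fineExposure(1_000);
console.log(`USD ${low.toLocaleString()} to USD ${high.toLocaleString()}`);
// USD 2,663,000 to USD 7,988,000
```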
Examples of companies fined under CCPA
Several high-profile cases exemplify the risks of noncompliance with the CCPA.
French beauty retailer Sephora became the first company to face a significant fine under the CCPA in 2022. The company agreed to pay USD 1.2 million to settle allegations that it failed to disclose the sale of consumer data and did not offer a proper mechanism for consumers to opt out of these sales. In addition to the fine, Sephora was required to update its privacy policies and implement stronger data handling practices. This case demonstrates the CCPA’s emphasis on transparency and consumer control over personal information.
Another notable case involved American food ordering and delivery company DoorDash, which faced a fine of USD 375,000 in 2024. The company was found to have violated the CCPA by sharing its customers’ personal information with other businesses as part of a marketing cooperative in exchange for advertising opportunities.
This enforcement action highlighted the CPRA’s key requirement to obtain explicit consent before sharing consumer data. The CCPA enabled consumers to opt out of the sale of their data, and the CPRA added the rights to also opt out of sharing of data, targeted advertising, or profiling.
CPRA penalties: Updates you need to know
The California Privacy Rights Act (CPRA) amended the CCPA, and introduced expanded enforcement mechanisms and higher standards for data protection. A key change is the establishment of the California Privacy Protection Agency (CPPA), which oversees compliance and enforcement. In addition to being able to opt out of sharing, targeted advertising, or profiling, consumers now have the right to limit the use of their sensitive personal information, such as biometric or health data.
The CPRA also imposes stricter requirements for contracts with third-party vendors and increases scrutiny towards businesses that share data. These changes necessitate updates to compliance frameworks to mitigate new risks.
Reasons for penalties under CCPA and CPRA
Penalties under the CCPA and CPRA typically stem from a failure to meet the privacy requirements of these laws. Whether it’s by failing to respond to consumer requests, mishandling sensitive data, or lacking proper security measures, each violation reflects a company’s failure to meet the CCPA/CPRA standards for consumer protection and data privacy.
Below are some of the main reasons businesses are typically penalized under the CCPA and CPRA.
CCPA penalties
Under the CCPA, businesses can face penalties for several key violations, including:
- Failure to honor consumer rights requests: Also called data subject access requests (DSAR), not responding to consumer requests for access to, deletion of, or opting out of data sales is one of the most common infractions.
- Lack of transparency: Businesses must disclose how they collect, use, and sell personal data. Failing to provide clear and accessible privacy notices can result in significant penalties.
- Inadequate data security: Companies are required to implement reasonable measures to protect personal data. Additionally, in the event of a breach, there are requirements for adequate response. Failure to meet these requirements, especially in the event of a data breach, can lead to significant enforcement actions.
CPRA penalties
The CPRA, which came into effect in January 2023, builds on the foundation laid by the CCPA and introduces stricter requirements. Businesses can face penalties for:
- Mishandling sensitive personal information: The CPRA imposes additional safeguards for sensitive data, such as health and financial information. Mishandling or failing to properly protect this data can lead to significant fines.
- Failure to comply with new consumer rights: The CPRA introduced the right to correct inaccurate personal information, to obtain a copy of one’s data (portability), and to access information about automated decision-making and opt out of its use. Not supporting these rights can result in penalties.
- Not meeting vendor contract requirements: The CPRA requires businesses to have more stringent data protection contracts with third-party vendors. Failure to comply with these contract requirements is grounds for enforcement.
These updates reflect a shift toward more rigorous data privacy practices, which, along with a dedicated agency for enforcement, require many businesses to adopt a more proactive approach to compliance.
Comparing CCPA fines to GDPR penalties
While both the CCPA and the General Data Protection Regulation (GDPR) aim to protect consumer privacy, their penalties differ significantly.
The GDPR stipulates fines of up to EUR 20 million or 4 percent of annual global turnover, whichever is higher, which is a more severe financial deterrent than the CCPA’s maximum of USD 7,988 per violation. Additionally, the GDPR applies to organizations worldwide if they process data from EU residents, making its scope broader than the CCPA, which focuses only on California residents.
However, the CCPA’s enforcement has gained momentum, particularly with the CPRA’s enhanced scope and penalties. While the GDPR is often regarded as the gold standard for privacy regulations, the evolving landscape of US privacy laws demonstrates an increasing focus on consumer rights and data accountability.
Strategies to avoid CCPA penalties
Updates to the CCPA have made its requirements stricter. Here are practical tips organizations can use to help maintain compliance and avoid fines.
Implement comprehensive data mapping
Compliance begins with understanding the data your business collects, processes, and shares. Create a detailed inventory of personal information, including where it is stored, how it is used, and who has access. Regularly update this inventory to account for changes in business operations or data practices.
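As a sketch of what one entry in such an inventory might capture, here is a hypothetical TypeScript record type. The shape and field names are illustrative, not a prescribed schema; adapt them to your own systems and records of processing.

```typescript
// Hypothetical shape for one entry in a personal data inventory.

interface DataInventoryEntry {
  category: string;        // e.g. "email address", "purchase history"
  sensitive: boolean;      // sensitive data triggers stricter obligations
  source: string;          // e.g. "checkout form", "mobile app SDK"
  storageLocation: string; // e.g. "CRM (EU region)", "analytics warehouse"
  purposes: string[];      // e.g. ["order fulfillment", "marketing"]
  sharedWith: string[];    // specific third-party recipients
  retentionPeriod: string; // e.g. "24 months after last purchase"
  accessRoles: string[];   // internal roles with access to the data
}

// Example entry:
const newsletterEmail: DataInventoryEntry = {
  category: "email address",
  sensitive: false,
  source: "newsletter signup form",
  storageLocation: "CRM (EU region)",
  purposes: ["newsletter delivery"],
  sharedWith: ["email delivery provider"],
  retentionPeriod: "until consent is withdrawn",
  accessRoles: ["marketing", "support"],
};
```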
Enhance transparency with clear privacy notices
Make privacy policies and notices detailed and easy to understand. Clearly inform consumers about what data you collect, how it is used, what their rights are under the CCPA, and how to exercise them. Transparency builds trust and helps avoid misunderstandings that could lead to violations.
Strengthen data security measures
Implement security protocols to protect personal information from unauthorized access, breaches, and misuse. These protocols may include encryption, access controls, and regular security audits. Also securely delete, anonymize, or return data that is no longer needed. Data can’t be accessed in a breach if a company no longer holds it. Adequate data protection reduces the likelihood of incidents that could result in penalties or lawsuits.
Train employees on compliance requirements
Educate your staff about CCPA regulations and their role in maintaining compliance, using policies and examples relevant to your business for clarity. Regular training sessions can help employees recognize risks and manage consumer requests quickly and appropriately, helping to avoid accidental violations and uphold best practices.
Conduct regular compliance audits
Perform periodic reviews of your compliance efforts to identify and address potential gaps. These audits should evaluate your data practices, security measures, and consumer rights management for alignment with CCPA and CPRA requirements.
How Usercentrics Consent Management Platform can help you avoid CCPA penalties
Managing privacy compliance can be complex, and it’s important to avoid costly CCPA penalties and damage to your company’s operations and reputation. Implementing the right tools to manage consumer consent, track data usage, and maintain transparency is key to avoiding penalties.
A solution like Usercentrics Consent Management Platform (CMP) can help your business meet CCPA requirements. By providing the required notification and opt-out link and streamlining consent collection, companies can more easily navigate compliance challenges.
For companies looking to stay ahead of privacy compliance requirements and reduce the risk of CCPA penalties, using Usercentrics solutions can be an effective step toward meeting legal obligations and protecting customer relationships.
2024 saw the number of new data privacy regulations continue to grow, especially in the United States. It also saw the effects of laws passed earlier as they came into force and enforcement began, like with the Digital Markets Act (DMA). But perhaps the biggest impact of data privacy in 2024 was how quickly and deeply it’s become embedded in business operations.
Companies that may not have paid a lot of attention to regulations have rapidly changed course as data privacy requirements have been handed down by companies like Google and Facebook. The idea of “noncompliance” stopped being complicated yet nebulous and became “your advertising revenue is at risk.”
We expect this trend of data privacy becoming a core part of doing business to continue to grow through 2025 and beyond. More of the DMA’s gatekeepers and other companies are likely to ramp up data privacy and consent requirements throughout their platform ecosystems and require compliance from their millions of partners and customers. Let’s not forget that data privacy demands from the public continue to grow as well.
We also expect to see more laws that include or dovetail with data privacy as they regulate other areas of technology and its effect on business and society. AI is the biggest one that comes to mind here, particularly with the EU AI Act having been adopted in March 2024. Similarly, data privacy in marketing will continue to influence initiatives across operations and digital channels. Stay tuned to Usercentrics for more about harnessing Privacy-Led Marketing.
Let’s peer into the future and look at how the data privacy landscape is likely to continue to evolve in the coming year, where the best opportunities for your company may lie, and what challenges you should plan for now.
2025 in global data privacy regulation
For the last several years, change has been the only constant in data privacy regulation around the world. Gartner predicted that 75 percent of the world’s population would be protected by data privacy law by the end of 2024. Were they right?
According to the International Association of Privacy Professionals (IAPP), as of March 2024, data privacy coverage was already close to 80 percent. So the prediction had been exceeded even before we were halfway through the year.
Data privacy regulation in the United States
The United States passed a record number of state-level data privacy regulations in 2024, with Kentucky, Maine, Maryland, Minnesota, Nebraska, New Hampshire, New Jersey, Rhode Island, and Vermont coming on board to bring the number of state-level US data privacy laws to 21. By contrast, six states passed laws in 2023, which was a record number to date then.
The privacy laws in Florida, Montana, Oregon, and Texas went into effect in 2024. The privacy laws in Delaware, Iowa, Maryland, Minnesota, Nebraska, New Hampshire, New Jersey, and Tennessee go into effect in 2025.
Since the majority of US states still don’t have data privacy regulations, more of these laws are likely to be proposed, debated, and (at least sometimes) passed. It will be interesting to see if certain states that have wrangled with privacy legislation repeatedly, like Washington, will make further progress in that direction.
April 2024 saw the release of a discussion draft of the American Privacy Rights Act (APRA), the latest federal legislation in the US to address data privacy. It made some advances during the year, with new sections added addressing children’s data privacy (“COPPA 2.0”), privacy by design, obligations for data brokers, and other provisions. However, the legislation has not yet been passed, and with the coming change in government in January 2025, the future of the APRA is unclear.
Data privacy regulation in Europe
The European Union continues to be at the forefront of data privacy regulation and working to keep large tech platforms in check. Two recent regulations, particularly, will continue to shape the tech landscape for some time.
The Digital Markets Act (DMA) and its evolution
With the Digital Markets Act in effect, the first six designated gatekeepers (Alphabet, Amazon, Apple, ByteDance, Meta, and Microsoft) had to comply as of March 2024. Booking.com was designated in May, and had to comply by November.
There is a good chance that additional gatekeepers will be designated in 2025, and that some current ones that have been dragging their metaphorical feet will start to accept the DMA’s requirements. We can expect to see the gatekeepers roll out new policies and requirements for their millions of customers in 2025 to help ensure privacy compliance across their platforms’ ecosystems.
More stringent consent requirements are also being accompanied by expanded consumer rights, including functions like data portability, which will further enhance competitive pressures on companies to be transparent, privacy-compliant, and price competitive while delivering great customer experiences.
The AI Act and its implementation
While the entirety of the AI Act will not be in effect until 2026, some key sections took effect in 2024 or are coming shortly, so we can expect to see their influence. These include the ban on prohibited AI systems in EU countries and the rules for general purpose AI systems.
Given that training large language models (LLMs) requires an almost endless supply of data, and organizations aren’t always up front about getting consent for it, it’s safe to say that there will continue to be clashes over the technology’s needs and data privacy rights.
Data privacy around the world
There was plenty in the news involving data privacy around the world in 2024, and the laws and lawsuits reported on will continue to make headlines and shape the future of privacy in 2025.
AI, privacy, and consent
There have been complaints reported and lawsuits filed throughout 2024 regarding data scraping and processing without consent. Canadian news publishers and the Canadian Legal Information Institute most recently joined the fray. We don’t expect these issues to be resolved any time soon, though there should be some influential case law resulting once these cases have made their way through the courts. (Unlikely that all of them will be resolved by settlements.) The litigation may have significant implications for the future of these AI companies as well, and not just for their products.
Social media and data privacy
As noted, laws that dovetail with data privacy are also becoming increasingly notable. One recent interesting development is Australia passing a ban on social media for children under 16. In addition to mental health concerns, some social media platforms — including portfolio companies of Alphabet, Meta, and TikTok parent company ByteDance — have run afoul of data privacy regulators, with penalties for collecting children’s data without consent, among other issues. It will be very interesting to see how this ban rolls out, how it’s enforced, and if it serves as inspiration elsewhere for comparable legislation.
The latest generation of data privacy laws and regulatory updates
The UK adopted its own customized version of the General Data Protection Regulation (GDPR), the UK GDPR, upon leaving the EU. It has recently published draft legislation, the UK Data (Use and Access) Bill, which is meant to further modernize the UK GDPR and reform the way data is used to benefit the economy. We will see if the law gets passed and what its practical effects may be.
Further to recent laws and updates for which we are likely to see the effects in 2025, in September 2024, Vietnam issued the first draft of its Personal Data Protection Law (PDPL) for public consultation.
Malaysia passed significant updates to its Personal Data Protection Act (PDPA) via the Personal Data Protection (Amendment) Act. The PDPA was first passed in 2010, so it was due for updates, and companies doing business in the country can expect the new guidelines to be enforced.
Also, the two-year grace period for Indonesia’s Personal Data Protection Law (PDP Law, Law No. 27) ended in October 2024, so we can expect enforcement to ramp up there as well.
Asia already has considerable coverage with data privacy regulation, as countries like China, Japan, South Korea, and India all have privacy laws in effect as well.
The future of privacy compliance and consent and preference management
Just as the regulation of data privacy is reaching an inflection point of maturity and becoming mainstream, so are solutions for privacy compliance, consent, and preference management.
Integrated solutions for compliance requirements and user experience
Companies that are embracing Privacy-Led Marketing in their growth strategy want solutions that can meet several needs, support growth, and seamlessly integrate into their martech stack. Simply offering a cookie compliance solution will no longer be enough.
Managing data privacy will require solutions that enable companies to obtain valid consent — for requirements across international jurisdictions — and signal it to ad platforms and other important tools and services. In addition to consent, companies need to centralize privacy user experience to provide customers with clear ways to express their preferences and set permissions in a way that respects privacy and enables organizations to deliver great experiences with customized communications, offers, and more.
Customer-centric data strategies
It may take some time for third-party cookie use and third-party data to go away entirely, but zero- and first-party data are the future, along with making customers so happy they never want to leave your company, rather than trying to collect every bit of data possible and preventing them from taking their business elsewhere.
We may see more strategies like Meta’s “pay or ok” attempt, where users can pay a subscription fee to avoid having their personal data used for personalized ads. But given EU regulators’ response to that scheme, similar tactics are likely to face an uphill battle, at least in the EU.
Delivering peace of mind while companies stay focused on their core business
SMBs in particular have a lot to do with limited resources, on top of growing their core business. We can expect to see further deep integration of privacy compliance tools and services. These solutions will automate not only obtaining consent and signaling it to third-party services, but also notifying users about the data processing services in use and how data is handled (e.g. via the privacy policy), responding to data subject access requests (DSARs), and other functions.
As companies grow, they will also need data privacy solutions that scale with them and make it easy to handle the complexities of complying with multiple privacy laws and other relevant international and/or industry-specific policies and frameworks.
Frameworks like the IAB’s Global Privacy Platform (GPP) are one way of achieving this, enabling organizations to select relevant regional privacy signals to include depending on their business needs.
Usercentrics in 2025
Our keyword to encapsulate data privacy for 2024 was “acceleration”. For 2025 it’s “maturity”: maturity of data privacy laws and of other regulations that touch on data privacy (like AI regulation); of companies’ needs for solutions that enable multi-jurisdictional compliance and data management; of the widespread embrace of data privacy as a key part of doing business, with Privacy-Led Marketing strategies for sustainable growth and better customer relationships; and of the financial and operational risks of noncompliance, which now extend beyond regulatory penalties to digital advertising revenue, customer retention, and beyond.
The Usercentrics team is on it. We’ll continue to bring you easy-to-use, flexible, reliable solutions to manage consent, user preferences, and permissions, enabling you to maintain privacy compliance and be transparent with your audience as your company grows. With world-class support at every step, of course. Plus we have a few other things up our sleeves. Stay tuned! Here’s to the Privacy-Led Marketing era. We can’t wait to help your company thrive.
The Video Privacy Protection Act (VPPA) is a federal privacy law in the United States designed to protect individuals’ privacy regarding their video rental and viewing histories. The VPPA limits the unauthorized sharing of video rental and purchase records. It was passed in 1988 after the public disclosure of Supreme Court nominee Robert Bork’s video rental records raised concerns about the lack of safeguards for personal information.
At the time of the Act’s enactment, video viewing was an offline activity. People would visit rental stores, borrow a tape, and return it after watching. Today, streaming services and social media platforms mean that watching videos is a largely digital activity. In 2023, global revenue from online video streaming reached an estimated USD 288 billion, with the US holding the largest share of that market.
Still, the VPPA has remained largely unchanged since its enactment, apart from a 2013 amendment. However, recent legal challenges to digital video data collection have led courts to reinterpret how the law applies to today’s video viewing habits.
In this article, we’ll examine what the VPPA law means for video platforms, the legal challenges associated with the law, and what companies can do to enable compliance while respecting users’ privacy.
Scope of the Video Privacy Protection Act (VPPA)
The primary purpose of the Video Privacy Protection Act (VPPA) is to prevent the unauthorized disclosure of personally identifiable information (PII) related to video rentals or purchases. PII under the law “includes information which identifies a person as having requested or obtained specific video materials or services from a video tape service provider.”
The law applies to video tape service providers, which are entities involved in the rental, sale, or delivery of prerecorded video materials. Courts have interpreted this definition to include video streaming platforms like Hulu and Netflix, which have widely replaced physical video tape service providers.
The VPPA protects the personal information of consumers. The law defines consumers as “any renter, purchaser, or subscriber of goods or services from a video tape service provider.”
Video tape service providers are prohibited from knowingly disclosing PII linking a consumer to specific video materials, except in the following cases:
- direct disclosure to the consumer
- to a third party with informed, written consent provided by the consumer
- for legal purposes, such as in response to a valid warrant, subpoena, or court order
- limited marketing disclosures, but only if:
- consumers are given a clear opportunity to opt out, and
- the shared data includes only names and addresses and does not reveal specific video titles; the general subject matter of videos may be disclosed only for direct marketing of goods and services to the consumer
- as part of standard business operations, such as processing payments
- under a court order, if a court determines the information is necessary and cannot be met through other means, and the consumer is given the opportunity to contest the claim
The 2013 amendment expanded the conditions for obtaining consent, including allowing consent to be obtained electronically over the Internet. This consent (illustrated in the sketch after the list below) must:
- be distinct and separate from other legal or financial agreements
- let consumers provide consent either at the time of disclosure or in advance for up to two years, with the option to revoke it sooner
- offer a clear and conspicuous way for consumers to withdraw their consent at any time, whether for specific instances or entirely
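To make these conditions concrete, here’s a minimal TypeScript sketch of how such a consent record might be modeled. The field names and shape are our own illustration, not taken from the statute or any particular consent management tool.

```typescript
// Illustrative consent record reflecting the amendment's conditions.
interface VppaConsentRecord {
  userId: string;
  grantedAt: Date;
  expiresAt: Date;   // advance consent may run for at most two years
  standalone: true;  // distinct from other legal or financial agreements
  revokedAt?: Date;  // consumers may withdraw at any time
}

const TWO_YEARS_MS = 2 * 365 * 24 * 60 * 60 * 1000;

// Consent is valid only if it hasn't been revoked and hasn't outlived
// either its stated expiry or the two-year ceiling on advance consent.
function isConsentValid(record: VppaConsentRecord, now = new Date()): boolean {
  const advanceCeiling = record.grantedAt.getTime() + TWO_YEARS_MS;
  return (
    record.revokedAt === undefined &&
    now.getTime() < Math.min(record.expiresAt.getTime(), advanceCeiling)
  );
}
```

Modeling revocation as a timestamp rather than deleting the record also preserves an audit trail of when consent was withdrawn.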
Tracking technologies and Video Privacy Protection Act (VPPA) claims
Tracking technologies like pixels are central to many claims alleging violations of the VPPA. Pixels are small pieces of code embedded on websites to monitor user activities, including interactions with online video content. These technologies can collect and transmit data, such as the titles of videos someone viewed, along with other information that may identify individuals. This combination of data may meet the VPPA’s definition of personally identifiable information (PII).
VPPA claims often arise when companies use tracking pixels on websites with video content and transmit information about users’ video viewing activity to third parties without requesting affirmative consent. Courts have debated what constitutes a knowing disclosure under the VPPA, but installing tracking pixels that collect and share video data has been found sufficient to potentially establish knowledge in some cases.
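For illustration, here’s a hedged TypeScript sketch of the kind of request a tracking pixel might send when a user watches a video. The endpoint and field names are invented for this example; real pixels vary by vendor.

```typescript
// Hypothetical payload a third-party tracking pixel might transmit.
interface PixelPayload {
  pageUrl: string;    // the page where the video was watched
  videoTitle: string; // the specific title viewed
  deviceId: string;   // persistent identifier tied to the browser or device
}

function firePixel(payload: PixelPayload): void {
  const params = new URLSearchParams({ ...payload });
  // A 1x1 image request is a common transport for pixel data
  new Image().src = `https://tracker.example.com/pixel?${params.toString()}`;
}

firePixel({
  pageUrl: window.location.href,
  videoTitle: "Example Documentary",
  deviceId: "device-1234",
});
```

In a VPPA claim, the key question would be whether transmitting this combination of a specific title and a persistent identifier to a third party discloses an identifiable person’s video-watching behavior.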
Lawsuits under the Video Privacy Protection Act (VPPA)
Many legal claims under the VPPA focus on one or more of three critical questions:
- Does the party broadcasting videos qualify as a video tape service provider?
- Is the individual claiming their rights were violated considered a consumer?
- Does the disclosed information qualify as PII?
Below, we’ll look at how courts have considered these questions and interpreted the law in the context of digital video consumption.
Does the party broadcasting video qualify as a video tape service provider?
Who is considered a video tape service provider under the law may depend on multiple factors. Courts have established that online streaming services qualify, but some rulings have considered other factors, which we’ll outline below, to decide whether a business meets the law’s definition.
Live streaming
The VPPA defines a video tape service provider as a person engaged in the business of the rental, sale, or delivery of “prerecorded video cassette tapes or similar audio visual materials.” In 2022, a court ruled that companies do not qualify as video tape service providers for live video broadcasts, as live streaming does not involve prerecorded content.
However, if a company streams prerecorded content, it may qualify as a video tape service provider in relevant claims.
“Similar audio visual materials”
The definition of a video tape service provider in the digital age includes more than just video platforms that broadcast movies and TV shows. In a 2023 case, a court ruled that a gaming and entertainment website offering prerecorded streaming video content fell within the scope of the VPPA definition of a video tape service provider.
Focus of work
Another 2023 ruling found that the VPPA does not apply to every company that happens to deliver audiovisual materials “ancillary to its business.” Under this decision, a video tape service provider’s primary business must involve providing audiovisual materials. Businesses using video content only as part of their marketing strategy would not qualify as a video tape service provider under this reading of the law.
Is the individual claiming rights violations considered a consumer?
Online video services frequently operate on a subscription-based business model. Many legal challenges under the VPPA focus on whether an individual qualifies as a “subscriber of goods or services from a video tape service provider.”
Type of service subscribed to
Courts have varied in their opinions on whether being a consumer depends on subscribing to videos specifically. In a 2023 ruling, a court held that subscribing to a newsletter that encourages recipients to view videos, but is not a condition to accessing them, does not qualify an individual as a subscriber of video services under the VPPA.
By contrast, a 2024 ruling took a broader approach, finding that the term “subscriber of goods or services” is not limited to audiovisual goods or services. The Second Circuit Court of Appeals determined that subscribing to an online newsletter provided by a video tape service provider qualifies an individual as a consumer. This decision expanded the definition to recognize individuals who subscribe to any service offered by a video tape service provider as consumers.
Payment
Courts have generally agreed that providing payment to a video tape service provider is not necessary for an individual to be considered a subscriber. However, other factors play a role in establishing this status.
A 2015 ruling held that being a subscriber requires an “ongoing commitment or relationship.” The court found that merely downloading a free mobile app and watching videos without registering, providing personal information, or signing up for services does not meet this standard.
However, in a 2016 case, the First Circuit Court of Appeals determined that providing personal information to download a free app — such as an Android ID and GPS location — did qualify the individual as a subscriber. Similarly, in the 2024 ruling above, the Second Circuit found that providing an email address, IP address, and device cookies for newsletter access constituted a meaningful exchange of personal information, qualifying the individual as a subscriber.
Does the disclosed information qualify as PII?
Courts have broadly interpreted PII to include traditional identifiers like names, phone numbers, and addresses, as well as digital data that can reasonably identify a person in the context of video consumption.
In the 2016 ruling referenced above, the First Circuit noted that “[m]any types of information other than a name can easily identify a person.” The court held that GPS coordinates and device identifier information can be linked to a specific person, and therefore qualified as PII under the VPPA.
Just two months later, the Third Circuit Court of Appeals ruled more narrowly, stating that the law’s prohibition on disclosing PII applies only to information that would enable an ordinary person to identify a specific individual’s video-watching behavior. The Third Circuit held that digital identifiers like IP addresses, browser fingerprints, and unique device IDs do not qualify as PII because, on their own, they are not enough for an ordinary person to identify an individual.
These conflicting rulings highlight the ongoing debate about what constitutes PII, especially as digital technologies continue to evolve.
Consumers’ rights under the Video Privacy Protection Act (VPPA)
Although not explicitly framed as consumer rights under the law, the VPPA does grant consumers several rights to protect their information.
- Protection against unauthorized disclosure: Consumers’ PII related to video rentals, purchases, or viewing history cannot be disclosed without consent or other valid legal basis.
- Right to consent: Consumers must provide informed, written consent before a video tape service provider can disclose their PII. This consent must be distinct and separate from other agreements and can be given for a set period (up to two years) or revoked at any time.
- Right to opt out: Consumers must be given a clear and conspicuous opportunity to opt out of the disclosure of their PII.
- Right to notice in legal proceedings: If PII is to be disclosed under a court order, consumers must be notified of the proceeding and given an opportunity to appear and contest the disclosure.
- Right to private action: Consumers can file civil proceedings against video tape service providers for violations of the VPPA.
Penalties under the Video Privacy Protection Act (VPPA)
The VPPA allows individuals affected by violations to file civil proceedings. Remedies available under the law include actual damages, with liquidated damages of no less than USD 2,500 per violation.
Courts may also award punitive damages to penalize particularly egregious or intentional misconduct. Additionally, plaintiffs can recover reasonable attorneys’ fees and litigation costs. Courts may also grant appropriate preliminary or equitable relief.
The VPPA statute of limitations requires that any lawsuit be filed within two years from the date of the violation, or two years from when it was discovered.
Compliance with the Video Privacy Protection Act (VPPA)
Businesses that act as video tape service providers under the VPPA can take several steps to meet their legal obligations.
1. Conduct a data privacy audit
A data privacy audit can help businesses understand what personal data they collect, process, and store, and whether these practices comply with the VPPA. The audit should include assessing the use of tracking technologies like pixels and cookies to confirm whether they are correctly set up and classified.
2. Obtain informed, specific user consent
The VPPA requires businesses to obtain users’ informed, written consent before sharing PII. Implementing a consent management platform (CMP) like Usercentrics CMP can make it easier to collect, manage, and store consent from users.
VPPA compliance also requires businesses to provide clear and easy-to-find options for consumers to opt out of data sharing, which a CMP can also facilitate. Because the 2013 amendment caps advance consent at two years, businesses must also have a process for renewing consent before it expires.
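Building on the consent record sketched earlier, a simple renewal sweep could flag consents approaching expiry so they can be re-requested in time. The 30-day window and field names here are illustrative assumptions, not requirements of the law.

```typescript
// Flag unexpired, unrevoked consents nearing their expiry date.
interface ConsentExpiry {
  userId: string;
  expiresAt: Date;
  revokedAt?: Date;
}

const THIRTY_DAYS_MS = 30 * 24 * 60 * 60 * 1000;

function needsRenewal(records: ConsentExpiry[], now = new Date()): ConsentExpiry[] {
  return records.filter(
    (r) =>
      r.revokedAt === undefined &&
      r.expiresAt.getTime() - now.getTime() < THIRTY_DAYS_MS
  );
}
```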
3. Implement transparent communication practices
Businesses should help consumers understand how their data is used so they can make an informed decision about whether to consent to its disclosure. Cookie banners used to obtain consent should contain simple, jargon-free language to explain the purpose of cookies. They should clearly indicate if third-party cookies are used and identify the parties with whom personal information is shared.
Businesses should include a direct link to a detailed privacy policy, both in the cookie banner and in another conspicuous location on their website or mobile app. Privacy policies must explain how PII is collected, used, and shared, along with clear instructions on how consumers can opt out of PII disclosures.
4. Consult qualified legal counsel
Legal experts can help businesses achieve VPPA compliance and offer tailored advice based on specific business operations. Counsel can also help businesses keep up with current litigation to understand how courts are interpreting the VPPA, which is critical as the law continues to face new challenges and evolving definitions.
Usercentrics does not provide legal advice, and information is provided for educational purposes only. We always recommend engaging qualified legal counsel or privacy specialists regarding data privacy and protection issues and operations.
Watch our on-demand session to learn essential strategies for staying ahead of US privacy regulations and aligning your business with Privacy-Led Marketing.
Discover key insights into the latest US privacy legislation updates for 2025 and their impact on businesses, marketing strategies, and user behavior. This comprehensive session, featuring Usercentrics experts Tilman Harmeling and Will Newmark, along with Andy Crestodina from Orbit Media, explores how privacy laws are shaping the MarTech landscape and why respecting data privacy enhances brand trust and reputation.
Gain practical knowledge on proactive privacy strategies and actionable steps to ensure compliance while maintaining a competitive edge.
What You’ll Learn
- Key US privacy legislation updates and their implications.
- How privacy trends are reshaping MarTech and influencing consumer behavior.
- Best practices for aligning with Privacy-Led Marketing and staying compliant in 2025.
Who Should Watch?
- Marketing Professionals – Learn how privacy laws reshape data-driven strategies.
- Compliance Officers – Stay ahead of evolving legal requirements and avoid risks.
- Business Owners – Understand how privacy regulations impact operations and reputation.
- Tech Leaders – Gain insights into how privacy trends drive MarTech evolution.
In 2023, an estimated 101 million people in the European Union (EU) aged 16 and older — 27 percent of the population or one in four — were living with a disability. This group often faces significant challenges, including discrimination, poverty, and social exclusion, which can limit their full participation in society.
The European Accessibility Act (EAA) seeks to address these issues by improving access to products and services for people with disabilities and older adults across the EU. In this article, we examine the scope of the EAA, the products and services it covers, which businesses are required to comply, and the potential consequences of noncompliance.
What is the European Accessibility Act (EAA)?
The European Accessibility Act (EAA), passed in 2019, is an EU Directive created to standardize accessibility requirements across member states. Its goal is to improve access to products and services for people with disabilities, benefiting not only the millions of individuals living with disabilities, but also older adults who often face similar barriers. Member states are required to incorporate the EAA’s provisions into their own laws.
The EAA was passed in part to help fulfill the obligations of EU member states under the United Nations Convention on the Rights of Persons with Disabilities (CRPD). It adopts the CRPD’s definition of persons with disabilities: “persons […] who have long-term physical, mental, intellectual or sensory impairments which in interaction with various barriers may hinder their full and effective participation in society on an equal basis with others.”
The EAA is not the first directive to address the issue of accessibility in the EU. Directive (EU) 2016/2102, known as the EU Web Accessibility Directive, focuses on the accessibility of websites and mobile apps of public sector bodies. Unlike the EU Web Accessibility Directive, which applies to public sector bodies only, the EAA applies to the private sector as well, with certain exceptions.
What is the scope of the EAA?
The EAA applies to a wide range of products and services, helping to rectify previously inconsistent accessibility requirements across EU member states. Most people use these products and services in daily life, so making them more accessible helps people with disabilities be independent at home, productive at work, and more.
The table below includes examples of categories and items to which the EAA applies, and shows the breadth of products and services that are required to be accessible to individuals (and, by contrast, how limiting a lack of accessibility can be).
| Industry | Category | Examples |
| --- | --- | --- |
| Consumer technology | Consumer general purpose computer hardware systems and operating systems | Laptops, desktop computers, operating systems |
| Consumer technology | Consumer terminal equipment for electronic communications services | Smartphones and tablets |
| Consumer technology | Consumer terminal equipment for accessing audiovisual media services | Smart TVs, streaming devices, video players, gaming consoles |
| Consumer technology | E-readers | Digital books, devices specifically designed for reading digital books |
| Financial services | Consumer banking services | Online and mobile banking platforms, ATMs, point of sale devices for card transactions |
| Retail and ecommerce | Ecommerce services | Online marketplaces and retailer websites |
| Public transport and travel | Self-service terminals | Payment terminals, ATMs, ticketing machines at stations, check-in kiosks at airports, and information kiosks (not integrated into vehicles, aircraft, or ships) |
| Public transport and travel | Passenger transport services | Online booking websites and apps, mobile apps for real-time travel updates, e-tickets and ticketing services, and self-service terminals in transport hubs (not integrated into vehicles, aircraft, or ships) |
| Media and communications | Electronic communications services | Internet service providers, mobile networks, VoIP services |
| Media and communications | Services providing access to audiovisual media services | Streaming platforms, public broadcasters, video-on-demand services, and satellite TV |
| Emergency and essential services | Emergency communications | Answering of the single European emergency number 112 |
Who must comply with the EAA?
The EAA requires the products and services it covers to be accessible to people with disabilities when sold in the EU. This obligation applies to companies both inside and outside the EU if their products or services are available to EU-based consumers.
The directive identifies five categories, collectively referred to as economic operators, that must meet its requirements.
- Manufacturers: Individuals or businesses that market products under their own name or trademark that they either manufacture or have designed and manufactured.
- Authorized representatives: EU-based individuals or businesses authorized through a written agreement to act on behalf of a manufacturer for specific tasks.
- Importers: Individuals or businesses within the EU responsible for placing products from outside the EU onto the EU market for the first time.
- Distributors: Individuals or businesses, other than the manufacturer or importer, that make products available for distribution, consumption, or use in the EU.
- Service providers: Individuals or businesses offering services to consumers within the EU market.
Microenterprises are partly exempt from EAA compliance. They are defined as businesses with fewer than 10 employees and:
- annual turnover not exceeding EUR 2 million
or
- an annual balance sheet total not exceeding EUR 2 million
Microenterprises providing services are exempt from some accessibility requirements under the EAA. Microenterprises that manufacture products and claim that compliance would lead to significant changes or impose an unreasonable burden do not need to formally document this assessment. However, these manufacturers are still required to consider accessibility principles when designing their products and must still provide relevant facts to authorities, upon request, if they choose to rely on these exceptions.
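Expressed as code, the microenterprise test described above might look like the following sketch. The function name and shape are our own illustration of the thresholds, not an official test.

```typescript
// EAA microenterprise test: fewer than 10 employees AND (annual turnover
// OR annual balance sheet total not exceeding EUR 2 million).
function isMicroenterprise(
  employees: number,
  annualTurnoverEur: number,
  balanceSheetTotalEur: number
): boolean {
  return (
    employees < 10 &&
    (annualTurnoverEur <= 2_000_000 || balanceSheetTotalEur <= 2_000_000)
  );
}

console.log(isMicroenterprise(8, 1_500_000, 3_000_000)); // true
console.log(isMicroenterprise(12, 1_500_000, 1_500_000)); // false: too many employees
```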
When does the EAA go into effect?
The EAA required Member States to adopt its provisions into their national laws by June 28, 2022. These laws must take effect and apply starting June 28, 2025. Because of this implementation date, the EAA is sometimes informally referred to as the “European Accessibility Act 2025.”
Exemptions to EAA effective date
The EAA applies to products placed on the market and services provided to consumers after June 28, 2025, but there are specific exemptions.
The directive provides for a transition period ending on June 28, 2030. During the transition period, service providers may continue using products that were already in use to provide similar services. Additionally, contracts signed before June 28, 2025 can remain in effect without changes, but they must end by June 28, 2030.
Transitional measures also apply to self-service terminals like ticket kiosks and ATMs. EU member states may decide that terminals in use before June 28, 2025 can remain operational until they are no longer economically viable. However, they cannot be used for more than 20 years from the date when they were first used, even if they are still functional.
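As a quick illustration of this 20-year ceiling, the sketch below computes the latest permissible date of use for a terminal from its first-use date; whether the terminal remains economically viable before then would have to be assessed separately.

```typescript
// Latest permitted use: 20 years from first use, regardless of condition.
function latestPermittedUse(firstUsed: Date): Date {
  return new Date(Date.UTC(
    firstUsed.getUTCFullYear() + 20,
    firstUsed.getUTCMonth(),
    firstUsed.getUTCDate()
  ));
}

// A kiosk first used on March 1, 2015 must be retired by March 1, 2035
console.log(latestPermittedUse(new Date("2015-03-01")).toISOString().slice(0, 10));
```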
The following website and mobile app content is also exempt from the EAA:
- pre-recorded time-based media published before June 28, 2025
- office file formats published before June 28, 2025
- online maps and mapping services, if essential information is provided in an accessible digital manner for maps intended for navigational use
- third-party content that is not funded, developed, or controlled by the organization required to comply
- archived content that is not updated or edited after June 28, 2025
Requirements for EAA compliance
The EAA requires products to be designed and manufactured so that they can be used by as many people with disabilities as possible. Products must also include clear, accessible information about how they work and their accessibility features, either on the product itself or in another accessible format, where possible.
Annex I of the directive details specific accessibility requirements, including design features, compatibility with assistive technologies, and accessible information.
Not all of the EAA’s requirements apply universally. The directive allows exceptions in cases where compliance would:
- result in a fundamental alteration to the nature of a product or service
- impose a disproportionate burden on businesses
Larger businesses must formally assess these situations using the criteria outlined in Annex VI. Microenterprises, while exempt from formal assessments, must still provide relevant facts to authorities if requested.
In addition to these requirements, the EAA outlines specific obligations of manufacturers, authorized representatives, importers, distributors, and service providers, which we’ll detail below.
Obligations of manufacturers
Manufacturers must meet several obligations under the EAA to make their products accessible. They must:
- design and manufacture products to meet accessibility requirements
- conduct a conformity assessment to verify that the product meets the relevant accessibility standards, and maintain supporting technical documentation
- prepare an EU Declaration of Conformity, which confirms that the product complies with all applicable EU directives
- affix the CE marking to products, indicating that they meet EU safety, health, environmental, and accessibility requirements
- establish processes to ensure that every product in a production run meets accessibility requirements
Products must include a type, batch, serial number, or other means of identification. If this is not possible because of the product’s size or nature, the information must be on the packaging or in an accompanying document.
The manufacturer’s name, registered trade name, or trademark, along with a contact address, must appear on the product. If these details cannot be placed directly on the product, they must appear on the packaging or an accompanying document. These details should be easy to understand and provide a way for consumers or authorities to reach the manufacturer.
Products must also include instructions and safety information in a language that users in the relevant country can easily understand. Labels and documents should be clear, simple, and accessible.
Obligations of authorized representatives
A manufacturer can appoint an authorized representative to handle specific tasks through a written agreement. The representative is not responsible for the manufacturer’s obligations related to the accessible design and manufacture of products or their technical documentation, but must:
- make the EU Declaration of Conformity and technical documentation available to relevant authorities for five years
- provide necessary information and documentation that demonstrates compliance to authorities upon request
- cooperate with authorities to address any issues or noncompliance with accessibility requirements for products under their mandate
Obligations of importers
Importers must ensure that the products they bring into the EU comply with the EAA. This includes verifying that:
- the manufacturer has conducted a conformity assessment
- the product complies with the accessibility standards outlined in Annex I
- the CE marking is correctly applied
- the products include relevant identification information and the manufacturer’s contact information
The importer’s name, trade name, or trademark and contact address should appear either on the product or its packaging or accompanying documents. Information and safety instructions in a language that is easy for consumers to understand must also accompany the product.
Importers must ensure proper storage and transport conditions to maintain compliance while the product is under their responsibility.
Importers are also responsible for keeping a copy of the EU Declaration of Conformity for five years and making technical documentation available to the relevant authorities upon request. These details should be clear and easy for both consumers and authorities to understand.
Additionally, importers must cooperate with national authorities during inspections and investigations, or when addressing noncompliance issues. They must take necessary measures to correct noncompliance or withdraw the product from the market if corrections cannot be made. They must also notify authorities immediately if they recognize that a product they have placed on the market is noncompliant, and maintain a record of such products and any related complaints.
Obligations of distributors
Distributors must ensure that the products they offer in the EU meet EAA accessibility standards. Specifically, they must confirm that:
- the product displays the CE marking to indicate compliance
- required accompanying documents, user instructions, and safety guidelines are provided in the official language(s) of the country where the product is sold
- the contact information of both the manufacturer and the importer as well as the product’s identification information are included on the product, its packaging, or accompanying documents
Like importers, distributors are also required to ensure proper storage and transport conditions to maintain compliance with accessibility requirements. If a distributor identifies or suspects noncompliance, they must withhold the product from the market until it meets accessibility requirements. In such cases, distributors must notify the manufacturer or importer and inform relevant authorities about the noncompliance and any corrective actions taken.
Distributors must cooperate with authorities during investigations and provide all necessary documentation to demonstrate the product’s compliance upon request. Unlike importers, distributors are not required to keep a formal record of noncompliant products or complaints.
Obligations of service providers
Service providers must design and deliver their services to meet the EAA’s accessibility requirements. They must prepare information that explains how their services meet these requirements, following the guidance in Annex V. This information must be made available in accessible written and spoken formats for people with disabilities, and must be kept available for as long as the service operates.
Service providers must also establish processes to maintain compliance with accessibility standards. These processes should account for any changes to service features, accessibility requirements, or relevant standards and specifications.
If a service is found to be noncompliant, service providers must take corrective measures and immediately notify the relevant authorities. They must provide details of the noncompliance, including the corrective actions taken.
Upon request, service providers must supply authorities with all necessary information to demonstrate compliance and cooperate fully with any actions taken to address noncompliance.
Enforcement of the EAA
Member states are responsible for developing procedures to verify whether products and services meet the European compliance standards. This includes appointing a specialized body to handle the duties of market surveillance or to conduct compliance checks for services.
If market surveillance authorities find a product noncompliant, they must immediately require the economic operator concerned to correct the issue. If the issue remains unresolved, authorities can require the product to be withdrawn from the market.
When market surveillance authorities believe noncompliance extends beyond their country, they must inform the European Commission and other member states of their findings and detail the actions that have been taken to address the issue.
Examples of noncompliance include:
- incorrectly affixing or failing to affix the CE marking
- missing or incorrectly prepared EU Declaration of Conformity
- missing or incomplete technical documentation
- missing, incorrect, or incomplete contact details for the manufacturer or importer
- failing to meet other obligations required of manufacturers or importers under the EAA
Consequences of noncompliance with the EAA
Failing to comply with the EAA can lead to serious consequences for businesses. Authorities may order noncompliant products to be removed from the market, and economic operators may face penalties under the national laws that implement the directive.
Each EU member state is required to set its own rules for penalties, which must be “effective, proportionate and dissuasive.” They should also include measures to address and correct noncompliance.
Penalties may include fines, which vary based on the country, the severity of the violation, the number of persons affected, and how many noncompliant products or services are involved. Some national laws, such as Ireland’s European Union (Accessibility Requirements of Products and Services) Regulations 2023, can also impose prison sentences. In Ireland, businesses face fines of up to EUR 60,000 or imprisonment of up to 18 months for violations.
Beyond legal penalties, noncompliance can harm a company’s reputation, erode user trust, and lead to customer losses. Organizations could also face legal actions from individuals or advocacy groups representing people with disabilities, which can lead to additional legal fees and damages.
Steps to prepare for EAA compliance
With national laws implementing the EAA taking effect on June 28, 2025, organizations offering relevant products and services in the EU must get ready to meet accessibility requirements.
1. Determine if the EAA applies to your business
Start by identifying whether your products or services fall under the EAA’s scope. Review the directive’s requirements and consider if any exemptions apply to your operations.
2. Conduct an accessibility audit
If your business is covered by the EAA, use the EAA’s requirements to evaluate your products, services, and processes for potential accessibility gaps.
3. Consult qualified legal professionals and accessibility experts
Engage legal and accessibility experts to better understand how the EAA applies to your operations. Their expertise can help align your compliance efforts with both legal and technical standards.
4. Document compliance efforts
Maintain clear records of all actions taken to meet the EAA’s requirements, including accessibility audits, testing outcomes, and updates. These records can demonstrate compliance in the event of regulatory inquiries or inspections.
5. Monitor regulatory updates
Stay informed about any changes to accessibility standards or enforcement rules. Keeping up to date with regulatory developments helps keep your compliance measures aligned with current requirements.
Usercentrics does not provide legal advice, and information is provided for educational purposes only. We always recommend engaging qualified legal counsel or privacy specialists regarding data privacy and protection issues and operations, and accessibility specialists regarding accessibility compliance.
In effect since 2018, the General Data Protection Regulation (GDPR) applies to any company handling the personal data of individuals within the European Union, regardless of where the business is based. At its core are seven guiding principles that dictate how organizations collect, store, and use personal data.
Understanding these principles is more than just a legal requirement. It’s a vital step toward building trust with your customers and safeguarding your company from potential regulatory violations and penalties. These principles outline how to approach data processing transparently and ethically, helping you align your business practices with legal obligations and customer expectations.
Let’s explore these 7 data protection principles in detail and look at how they can be practically applied to your operations. By understanding these foundations, you’re one step closer to compliance and strengthening your company’s reputation.
What are the principles of GDPR?
The GDPR incorporates 7 principles, as outlined in Article 5 of the regulation. These form the backbone of compliant data protection practices. They serve as a set of rules for how organizations should handle and process personal data ethically, transparently, and securely.
The 7 principles of the GDPR are:
- Lawfulness, fairness, and transparency
- Purpose limitation
- Data minimization
- Accuracy
- Storage limitation
- Integrity and confidentiality
- Accountability
These principles act as a framework, guiding businesses in everything from collecting various types of consent to implementing security measures.
These 7 core principles of the GDPR are not optional guidelines. Instead, they are legally binding standards for any company that handles personal data, whether it’s an ecommerce business, healthcare provider, or multinational corporation.
Why are GDPR principles important?
The importance of these data protection principles extends far beyond avoiding fines, though the financial penalties for noncompliance can be significant, reaching up to EUR 20 million or 4 percent of a business’s annual global turnover, whichever is higher. Adhering to these principles is also critical for building customer trust, avoiding reputational damage, and meeting the comparable requirements increasingly imposed by major tech platforms, such as those for advertising.
Consumers are also more aware of their privacy rights than ever. Businesses that demonstrate transparency and accountability in their data practices are more likely to increase engagement and gain loyalty. In addition, adhering to the principles of the GDPR provides operational clarity, helping companies streamline processes and reduce inefficiencies. Clear, well-defined policies reduce confusion about data handling, helping employees and systems work in harmony to meet compliance standards.
On a broader scale, these principles help create a culture of accountability. When companies consistently prioritize data protection, they encourage responsible behavior and set a high standard for others in their industries. This proactive approach not only safeguards individual rights, but also positions businesses as leaders in privacy and ethics.
The 7 GDPR principles your company needs to know about (with examples)
1. Lawfulness, fairness, and transparency
This principle forms the bedrock of GDPR compliance. It requires that all processing of personal data must be lawful, fair, and transparent.
Let’s take a closer look at what that means in practice. Lawfulness means having a valid legal basis for collecting and using data. The GDPR provides six legal bases, and a company (the data controller) must establish and document at least one of them to justify collecting personal data (see the sketch at the end of this section):
- informed consent from the data subject
- performance of a contract with the data subject
- compliance with a legal obligation to which the data controller is subject
- protecting the vital interests of the data subject or of another natural person
- in the public interest, or if the data controller is exercising official authority
- legitimate interests pursued by the controller or by a third party
Fairness requires that data is not used in a way that is misleading or harmful to the individual. Transparency refers to providing clear, accessible information about how data will be used and secured, and about data subjects’ rights and how they can exercise them.
For instance, when an ecommerce site collects an email address for marketing purposes, it must also provide information to the user. It should include details like how frequently emails will be sent and what topics or information they will contain. Without clarity, users may feel misled, which damages trust and can breach GDPR principles.
Your company’s privacy policy should clearly explain how customer data is used, including for delivery and marketing emails. You also need to provide an easy opt-out option.
It’s worth noting that transparency goes hand in hand with effective communication. Privacy policies should be written in plain language and made easily accessible. This helps individuals understand their rights and the organization’s data practices.
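For illustration, the six legal bases listed above can be modeled as a simple type, with each processing activity recording the documented basis it relies on. This is a sketch of the documentation idea only; the names are paraphrased from Article 6 and the structure is our own.

```typescript
// One documented Article 6 legal basis per processing activity.
type LegalBasis =
  | "consent"
  | "contract"
  | "legal_obligation"
  | "vital_interests"
  | "public_interest"
  | "legitimate_interests";

interface ProcessingActivity {
  purpose: string;
  dataCategories: string[];
  legalBasis: LegalBasis; // must be documented before processing begins
}

const marketingEmails: ProcessingActivity = {
  purpose: "Send marketing emails",
  dataCategories: ["email address"],
  legalBasis: "consent",
};
```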
2. Purpose limitation
Purpose limitation requires that personal data be collected for a specific, legitimate, and communicated purpose and not used for anything else. Businesses must clearly define why they are collecting data and stick to those boundaries. Repurposing data for different uses, or collecting new types of data for an existing use — without explicit consent — violates this core GDPR principle.
Imagine a fitness app that collects user data to track exercise habits. If the company later uses this data to target ads for unrelated products without user consent, it breaches this GDPR principle. Purpose limitation prevents such misuse, and gives individuals control over how their data is handled.
Businesses must document the intended use of data at the point of collection. A consent management platform helps with this, enabling companies to granularly list out the data processing services in use, along with what data they collect and for what purposes. Regular audits can also help ensure that data usage aligns with declared purposes. If new purposes arise, obtain additional consent before proceeding.
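As a hedged illustration of purpose limitation in code, the sketch below rejects processing for any purpose that wasn’t declared at collection. The structure is our own and not a specific CMP’s API.

```typescript
// Data may only be processed for purposes declared at collection time.
interface CollectedData {
  subjectId: string;
  declaredPurposes: Set<string>;
}

function canProcess(data: CollectedData, purpose: string): boolean {
  return data.declaredPurposes.has(purpose);
}

const record: CollectedData = {
  subjectId: "user-42",
  declaredPurposes: new Set(["exercise tracking"]),
};

console.log(canProcess(record, "exercise tracking")); // true
console.log(canProcess(record, "ad targeting"));      // false: obtain new consent first
```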
3. Data minimization
Data minimization dictates that businesses should only collect the data that is strictly necessary for their declared purpose. Over-collection not only increases the risk in the event of a data breach or other compliance violation, but also complicates compliance efforts and can raise concerns from customers about the actual need for the data. Collecting excessive or irrelevant information is strongly discouraged under the GDPR.
For example, a job application form should only ask for the details necessary to assess a candidate’s qualifications. Questions about personal hobbies, family status, or unrelated credentials could violate this principle. By minimizing data collection, businesses reduce exposure to risks while streamlining their operations.
Data minimization also involves periodically reviewing stored information to ensure relevance. Outdated or unnecessary data should be securely deleted or anonymized. This not only enhances compliance but also improves data management efficiency.
4. Accuracy
The accuracy principle requires that personal data be accurate and, where necessary, kept up to date. Companies must take every reasonable step to erase or rectify inaccurate personal data without delay, particularly when requested by the data subject. Inaccurate information can lead to poor decision-making and, in some cases, harm to the individual whose data is being processed.
For instance, a delivery service relying on outdated customer addresses may waste resources and inconvenience customers. Instead, the delivery service could offer customers the option to update their delivery preferences and personal information via their online account. By establishing mechanisms for regular updates and corrections, businesses can maintain accuracy and efficiency.
Enabling individuals to easily update their information or easily make requests for it to be done is key for privacy compliance. Whether through self-service portals or responsive customer support, maintaining data accuracy benefits both the business and its customers.
5. Storage limitation
Storage limitation requires organizations to retain personal data for only as long as is necessary for the specified purpose. Keeping data indefinitely increases security risks and can negatively impact customer (or former customer) experience.
For example, a subscription service might delete user accounts after two years of inactivity, since their information is no longer necessary. They may also anonymize purchase data after five years so it can be used for long-term trend analysis, but can no longer be linked to individual customers. Businesses should maintain clearly defined retention schedules, and should have policies in place for secure deletion or anonymization.
It’s important to balance operational needs with privacy obligations. Regular audits can help identify data that is no longer needed, which reduces risks and supports compliance.
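As a rough illustration of such a retention schedule, the sketch below sweeps stored records using the timeframes from the subscription example above. The timeframes and the deleteRecord/anonymize callbacks are assumptions for illustration, not fixed GDPR requirements.

```typescript
// Periodic retention sweep: delete after 2 years of inactivity,
// anonymize records older than 5 years for trend analysis.
const YEAR_MS = 365 * 24 * 60 * 60 * 1000;

interface StoredRecord {
  id: string;
  lastActive: Date;
  createdAt: Date;
}

function retentionSweep(
  records: StoredRecord[],
  deleteRecord: (id: string) => void,
  anonymize: (id: string) => void,
  now = new Date()
): void {
  for (const r of records) {
    if (now.getTime() - r.lastActive.getTime() > 2 * YEAR_MS) {
      deleteRecord(r.id); // no longer necessary for the original purpose
    } else if (now.getTime() - r.createdAt.getTime() > 5 * YEAR_MS) {
      anonymize(r.id); // keep aggregate value, unlink from the individual
    }
  }
}
```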
6. Integrity and confidentiality (security)
This principle of integrity and confidentiality seeks to protect data against unauthorized access, loss, or destruction. Businesses must implement robust security measures, including encryption, firewalls, and access controls.
Consider a healthcare provider storing sensitive patient records. Without proper encryption and restricted access, these records could be exposed in a data breach, causing significant harm to individuals, triggering major investigations by authorities, and damaging the organization’s reputation.
Implementing measures like strong encryption for data both in transit and at rest is recommended. Use access controls to ensure that only authorized personnel can access customer data, and only within the bounds of their job responsibilities, and conduct regular security audits.
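For example, here is a minimal sketch of at-rest encryption using Node.js’s built-in crypto module with AES-256-GCM. Key management (secure storage, rotation, a KMS) is out of scope here, so treat this as illustrative rather than a complete security implementation.

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

// In practice, fetch the key from a key management service, never hardcode it.
const key = randomBytes(32);

function encrypt(plaintext: string): { iv: Buffer; tag: Buffer; data: Buffer } {
  const iv = randomBytes(12); // unique IV per message
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return { iv, tag: cipher.getAuthTag(), data }; // tag authenticates the ciphertext
}

function decrypt(box: { iv: Buffer; tag: Buffer; data: Buffer }): string {
  const decipher = createDecipheriv("aes-256-gcm", key, box.iv);
  decipher.setAuthTag(box.tag); // decryption fails if the data was tampered with
  return Buffer.concat([decipher.update(box.data), decipher.final()]).toString("utf8");
}

console.log(decrypt(encrypt("patient record #123"))); // "patient record #123"
```

AES-GCM is a common choice here because it provides both confidentiality and integrity, which maps directly onto this principle’s name.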
Training employees in data protection practices is equally important, and needs to be done regularly to build strong habits. Human error is a leading cause of data breaches and sensitive data exposure. Raise awareness about security protocols to help mitigate this risk. Regularly updating systems and conducting security assessments will further enhance compliance.
7. Accountability
Accountability requires organizations to take responsibility for their data practices and demonstrate compliance with the core GDPR principles. This involves documenting data processing activities, conducting regular audits, and appointing a Data Protection Officer (DPO) if necessary.
For instance, a marketing agency might maintain detailed records of how client data is processed, in addition to conducting regular data protection impact assessments and creating a comprehensive data protection policy. These records provide a clear trail of accountability, which supports transparency and aids compliance efforts.
Accountability also means staying informed about evolving data protection regulations, frameworks, and requirements, and adapting practices accordingly. Demonstrating a proactive approach to compliance builds trust with customers and partners and strengthens your organization’s reputation.
Follow the 7 principles for GDPR compliance
The seven principles of the GDPR provide a clear roadmap for responsible data management. Following them is not only required to do business in the EU; it also provides a valuable framework for businesses anywhere in the world that want to take strong measures for data privacy and protection. Adhering to these principles helps businesses comply with legal requirements, build stronger relationships with customers and partners, foster transparency, and reduce risks.
Integrating these principles into your operations supports secure and ethical handling of personal data, and sets your company apart as a leader in data privacy and protection.