In the European Union (EU) and European Economic Area (EEA), the General Data Protection Regulation (GDPR) has been in effect since May 2018. Its goal is to protect EU residents’ privacy and personal data and give them control over how that data is used.
Since its implementation, the GDPR has become the world’s most influential data privacy law, impacting legislation in other countries and significantly affecting how companies do business in Europe.
One of the most newsworthy aspects of GDPR enforcement is the fines levied against companies found to have violated the law.
Organizations of any size can be fined for violations, but the news stories that make headlines often involve tech giants with global reach and billions of users. Fines for misusing personal data in those cases have risen into the billions.
Who is responsible for GDPR compliance?
There are several levels of responsibility for General Data Protection Regulation compliance, and an entity’s degree of responsibility depends on several factors.
These include the type of data processing involved and whether the entity collects personal data and uses it for its own stated purposes, or processes it as a third party on another organization’s behalf.
At a more granular level, there are often privacy experts within organizations who are responsible for data privacy operations. In some cases, appointing a Data Protection Officer is a legal requirement.
Data controllers and data processors
Data controllers and data processors are people or organizations actually collecting and processing the personal data of EU residents. This processing can include using, sharing, or selling data.
Those entities have day-to-day responsibility for data privacy and security. They must have a valid legal basis for collecting data, use that data per GDPR guidelines and only for the purpose(s) they communicate, maintain reasonable security, and inform data subjects about their rights and the use of their data.
Data controllers’ responsibilities require them to:
- Securely maintain records of consent preferences
- Maintain data accuracy
- Respond to data-related requests, including requests for correction or deletion (with exceptions)
- Implement and maintain reasonable organizational and technical measures for data protection
Data processors, on the other hand, typically work for data controllers. An example would be a third-party vendor handling advertising or communications for a company. Data processors’ responsibilities include:
- Implementing appropriate technical and organizational measures to protect data
- Notifying the data controller of any data breaches
- Keeping records of their processing activities
- Complying with opt-out or data deletion requests after processing has started
While both controllers and processors have responsibilities under the GDPR, ultimately, data security and privacy compliance responsibilities belong to the controller.
Data protection authorities
Each EU member state has its own authoritative body to investigate alleged violations and enforce compliance with the GDPR. These independent public agencies are known as data protection authorities (DPAs). These organizations also enforce other local or regional privacy-related laws.
Read more about who is responsible for GDPR compliance within your company.
What is considered a violation under the GDPR?
A violation of the GDPR occurs when a data controller or processor fails to meet one or more of the regulation’s requirements. Violations range from administrative oversights to serious breaches of data protection principles. Examples include:
- Failing to obtain valid consent before collecting or processing personal data
- Not notifying data protection authorities and affected individuals of a data breach within the required timeframe
- Collecting or using personal data for purposes not disclosed to the user, including if the original purposes change
- Failing to implement appropriate security measures to protect data
- Not providing users with access to their personal data or the ability to delete it
Even well-meaning companies can be fined if they neglect basic privacy practices or are unaware of their compliance obligations. These kinds of oversights — intentional or not — are exactly what regulators look for when deciding whether a fine is warranted.
Small oversights in privacy practices can trigger scrutiny, especially if they are repeated or ongoing. This is why it’s crucial to understand your organization’s responsibilities, especially when violations lead to financial penalties and reputational damage.
What are the criteria for imposing GDPR fines?
When a breach is identified, data protection authorities evaluate certain criteria to determine the appropriate fine. These include:
- Nature, gravity, and duration of the infringement
- Whether the violation was intentional, negligent, or repeated
- Any action taken by the organization to mitigate the damage
- Degree of cooperation with authorities during investigations
- Categories of personal data affected
- Any previous infringements by the organization
- How the supervisory authority became aware of the infringement
- Whether the company followed approved codes of conduct or certification mechanisms
This framework seeks to make fines proportionate to the offense and to account for each case’s unique circumstances.
What are fines and penalties under GDPR?
If an organization that processes personal data belonging to EU residents is found to have violated the GDPR, there are several types of potential penalties, outlined in Art. 83 GDPR.
Depending on the severity of the violation, data protection authorities can impose administrative fines up to the maximum penalty for a GDPR breach. Beyond fines, a DPA can:
- Issue warnings or reprimands
- Temporarily or permanently restrict data processing
- Order the erasure of personal data
- Suspend international data transfers to third countries
In addition, EU member states may provide for other penalties under national law, which can include criminal penalties.
Administrative fines are probably the most well-known penalty of the GDPR. There are two levels of administrative fines, depending on the severity of the infraction.
Tier one administrative fines
First-tier GDPR fines are generally for first-time or less severe infractions. They can be up to EUR 10 million per infraction or two percent of global annual revenue for the preceding financial year, whichever is greater.
Tier two administrative fines
Second-tier GDPR fines are generally for repeat violators or more severe infractions. They can be up to EUR 20 million per infraction or four percent of global annual revenue for the preceding financial year, whichever is higher. These maximum GDPR fines are high because they are reserved for serious or repeat offenses.
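To make the “whichever is higher” calculation concrete, here is a minimal sketch in TypeScript. The tier thresholds follow the amounts above; the company and its revenue figure are hypothetical, and actual fines are set by supervisory authorities within these caps.

```typescript
// Maximum possible administrative fine under the two GDPR tiers (in EUR).
// The cap is the greater of the fixed amount and the revenue percentage.
function maxGdprFine(annualGlobalRevenue: number, tier: 1 | 2): number {
  const fixedCap = tier === 1 ? 10_000_000 : 20_000_000;
  const revenueCap = annualGlobalRevenue * (tier === 1 ? 0.02 : 0.04);
  return Math.max(fixedCap, revenueCap);
}

// Hypothetical company with EUR 2 billion in global annual revenue:
console.log(maxGdprFine(2_000_000_000, 1)); // 40,000,000 (2% exceeds EUR 10M)
console.log(maxGdprFine(2_000_000_000, 2)); // 80,000,000 (4% exceeds EUR 20M)
```

For large companies, the revenue-based cap quickly dwarfs the fixed amounts, which is how fines in the hundreds of millions or billions become possible.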
Who can be fined under the GDPR?
Any organization that processes the data of EU residents and fails to comply with GDPR requirements can be fined, whether or not the entity is also located in the EU.
This includes data controllers and processors or joint controllers, applicable when two or more entities jointly determine the purposes and means of processing personal data.
While violations tend to affect commercial entities, other types of organizations can be fined for data privacy violations under the GDPR as well. This includes nonprofit organizations and charities. Few are exempt from GDPR penalties.
Enforcement action against smaller entities is also more common than many people think; the misperception persists largely because only the massive fines levied against big tech companies tend to garner headlines.
However, even a fine far smaller than those record-setting penalties can be a substantial financial hit for a small business.
Can data processors be fined under GDPR?
In short, yes. Data processors process personal data on behalf of and under the instruction and authority of data controllers, but are not immune from penalties.
GDPR compliance failures for data processors could include not implementing appropriate security measures, processing data for purposes not stated or for which there is not a valid legal basis, or failing to work with the data controller to fulfill obligations under the GDPR.
Can employees be fined under the GDPR?
Generally, employees of organizations would not be fined under the GDPR, as responsibility tends to fall on the company (controller) or the data processor(s), not individuals.
Employees certainly play a role in GDPR compliance and can be partly responsible for a violation, like a data breach. Where a deliberate or reckless action results in a GDPR violation, an employee could be subject to disciplinary action by their employer and could be penalized under other relevant laws.
Organizations are expected to provide employees with appropriate training and guidelines for data security and handling, and companies should have clear, accessible policies in place around data access, security, and related concerns.
Can individuals be fined under GDPR?
Private persons cannot be fined under the GDPR, but can be held liable for actions or negligence regarding data protection. Many countries have additional data privacy and security laws, and individuals involved in a data breach, for example, could face criminal or civil legal consequences.
How many companies have been fined for GDPR?
Organizations have submitted hundreds of thousands of breach notifications under GDPR rules. Enforcement activity has been increasing each year since the law came into effect in 2018.
According to the GDPR Enforcement Tracker, authorities continue to issue GDPR fines at a steady rate. More than 2,200 fines have been recorded, and the total number of fines is growing.
Spain has issued the most GDPR fines to date, with at least 899 fines totaling over EUR 82 million. However, Ireland leads in the total value of fines, having imposed approximately EUR 3.5 billion in penalties across about 25 cases, mostly targeting major technology companies with EU headquarters located there.
What is the biggest GDPR fine to date?
To date, the largest GDPR fine was issued on May 22, 2023. Ireland’s Data Protection Commission issued a record GDPR fine of EUR 1.2 billion (USD 1.3 billion) to Meta (Meta Platforms, Inc.), parent company of Facebook, Instagram, WhatsApp, Threads, and other services. This fine exceeds the previous record GDPR fine, issued to Amazon Europe in 2021, by EUR 454 million.
Meta was also ordered to stop transferring data from Facebook users in Europe to the United States.
The reason for the ruling was that Meta’s transfers of Facebook users’ data to the US violated the GDPR’s international data transfer guidelines.
The US and EU were without an adequate agreement for data transfers for a couple of years following the court ruling invalidating the EU-US Privacy Shield. However, a new agreement was finalized, and the EU-U.S. Data Privacy Framework came into effect on July 10, 2023.
There are new concerns in light of changes made by the current US government administration, however, which once again put adequacy agreements between the EU and US into question.
What happens when the GDPR is breached?
When a GDPR breach occurs, the affected organization must act quickly. Under the regulation, any personal data breach that may pose a risk to individuals’ rights and freedoms must be reported to the relevant data protection authority within 72 hours of the organization becoming aware of it.
In some cases, the organization must also inform affected individuals without undue delay.
The breach response process typically includes:
- Investigating the cause and scope of the breach
- Notifying the appropriate authorities and individuals if necessary
- Taking steps to contain and mitigate the breach’s impact
- Documenting all details of the incident and the response
The supervisory authority may launch an investigation. If the organization is found to have failed in its data protection duties, fines or corrective measures may follow.
These can include warnings, orders to change data processing practices, temporary data restrictions, or fines up to the maximum financial penalty for a GDPR breach.
Beyond financial penalties, a breach can have serious reputational, operational, and legal consequences. A swift, transparent, and effective response can help minimize damage and maintain trust.
UK GDPR fines and penalties
GDPR enforcement doesn’t stop at EU borders. Post-Brexit, the UK enforces its own version of the regulation with similar consequences.
Upon leaving the European Union on January 31, 2020, the United Kingdom adopted a near-identical version of the GDPR, commonly referred to as the UK GDPR.
Fines and penalties for noncompliance remain aligned with the original EU regulation. UK GDPR enforcement is the responsibility of the Information Commissioner’s Office (ICO).
As with the EU GDPR, there are two tiers of fines.
Tier one administrative fines
First-tier UK GDPR fines are for first-time or less severe infractions. They can be up to GBP 8.7 million or two percent of global annual revenue for the preceding financial year, whichever is greater.
Tier two administrative fines
Second-tier UK GDPR fines are for repeat violators or more severe infractions. They can be up to GBP 17.5 million or four percent of global annual revenue for the preceding financial year, whichever is greater.
How to avoid GDPR fines
Whether you’re processing data belonging to residents in the EU or the UK, the most effective strategy is the same: avoid fines by prioritizing compliance from the start. Your company must understand its responsibilities to achieve and maintain compliance with the law’s requirements.
To get ahead of GDPR compliance, implement data protection and privacy best practices. In addition, consider regularly consulting with a privacy expert like a Data Protection Officer (required under the GDPR in many cases) or qualified legal counsel.
Some compliance actions are required in certain countries, but are just recommendations elsewhere. It is important to verify which requirements are applicable to your business.
There are a number of recommendations for organizations to achieve and maintain GDPR compliance and avoid fines:
- Conduct regular data audits to fully understand data collection and processing activities
- Conduct data protection impact assessments (DPIA)
- Implement data protection policies and procedures
- Train employees on GDPR compliance and data security practices
- Appoint a qualified and well-informed DPO when required, which can be an internal or external hire, as long as they have sufficient GDPR expertise
- Work with trusted third-party vendors and service providers that are GDPR-compliant, and implement contracts prior to starting data processing operations
- Use a comprehensive consent management solution to collect and store valid user consent on websites, apps, connected TV, etc.
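Because several of these recommendations hinge on being able to prove valid consent, it can help to picture what a stored consent record might contain. The following TypeScript sketch is illustrative only; the field names are assumptions, not a prescribed schema.

```typescript
// Illustrative shape of a stored consent record. The GDPR requires
// demonstrable consent, not this exact structure; all names are hypothetical.
interface ConsentRecord {
  subjectId: string;      // user ID or pseudonymous identifier
  timestamp: string;      // ISO 8601 time consent was given or withdrawn
  purposes: Array<{
    purpose: string;      // e.g. "analytics", "marketing"
    granted: boolean;     // true = consent given, false = refused/withdrawn
  }>;
  policyVersion: string;  // version of the privacy policy shown to the user
  mechanism: string;      // e.g. "cookie banner", "signup form"
}
```

A consent management solution typically creates and stores records like these automatically, so they can be produced if a data protection authority asks for proof of consent.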
How Usercentrics consent management can help your company
The maximum fine for a GDPR breach can be financially devastating. In the UK, the maximum financial penalty for breaching the UK GDPR is just as serious.
These aren’t rare occurrences. The large GDPR fines issued to companies like Meta and Amazon might seem unrelatable to smaller businesses, but they are not immune to consequences for violating the law.
Here’s the good news: compliance doesn’t have to be overwhelming.
Usercentrics Consent Management Platform (CMP) helps companies like yours simplify GDPR compliance. With our robust and scalable consent management solution, we make it easy to manage user consent, understand what data your website is collecting, and prove compliance when it matters most.
No legal jargon or guesswork, just clear, practical solutions to reduce your risk and support your data strategy.
Whether you’re trying to avoid the maximum penalty for a GDPR breach, prepare for audits, or simply build user trust, we give you the visibility and control you need to manage data responsibly.
Who is responsible for enforcing the General Data Protection Regulation (GDPR)? The answer is more complex than just regulatory authorities.
The GDPR is one of the most comprehensive data privacy laws in the world, and enforcement isn’t limited to external authorities. Responsibility for GDPR compliance belongs to organizations, departments, and even individuals.
We’ll look at who is responsible for data privacy and protection and how to implement best practices. We will also outline GDPR enforcement from a government level down to day-to-day corporate operations.
What is GDPR?
The General Data Protection Regulation (GDPR) is the European Union’s foundational data privacy law. It was introduced in 2016 and took effect in May 2018, replacing the 1995 Data Protection Directive. Unlike directives, which require national governments to pass their own local versions, the GDPR is a regulation that applies directly and uniformly across all EU and European Economic Area (EEA) member states.
The GDPR was designed to give individuals more control over their personal data and to align data protection laws across Europe. It governs how personal data is collected, processed, stored, shared, and deleted. It also introduces strict requirements around user consent, transparency, security, and organizational accountability.
The regulation affects any organization, regardless of location, that processes the personal data of EU residents. This means that whether you’re based in Berlin, Boston, or Bangalore, if you have users in the EU, you have to comply with the GDPR.
Learn more about the EU’s General Data Protection Regulation (GDPR).
Who is responsible for GDPR compliance in companies?
GDPR compliance is not solely the job of regulators or legal advisors. It should be built into businesses’ day-to-day operations. Two roles hold the most responsibility: data controllers and data processors.
Data controllers, data processors, and GDPR compliance
Data controllers and data processors collect and process users’ personal data, and are thus responsible at the day-to-day level for data security and privacy.
Under the GDPR, a data controller is a person or organization that collects personal data and determines the purposes and means of its processing. Data processing can mean anything from creating customer profiles to aggregating demographic information for sale.
A data processor is a person or organization that processes personal data on behalf of a data controller. Advertising partners are a good example of data processors.
GDPR requirements apply to both data controllers and data processors, but their specific responsibilities differ. Ultimately, data security and privacy compliance are usually the controller’s responsibility, including for the actions (or negligence) of contracted processors.
This is why it’s critical, and to a degree required, to enter into clear, comprehensive contracts with all prospective data processors and to review their activities.
Responsibilities of data controllers under the GDPR
Data controllers are primarily responsible for GDPR compliance, so they must obtain valid consent, as defined in Art. 7 GDPR, from individuals for data processing. Their additional responsibilities include:
- Maintaining secure records of consent preferences
- Keeping data accurate and up to date
- Correcting or deleting data when requested, under certain circumstances
- Implementing appropriate technical and organizational measures to protect data
Data controllers must also verify with contractual agreements that any third-party data processors they work with are GDPR-compliant.
In practice, this means that the controller doesn’t just decide how data is used. They also have to demonstrate accountability at every stage of the data lifecycle. This includes transparency with users, cooperation with supervisory authorities, and full documentation of compliance measures.
In short, the data controller sets the tone for how an organization approaches data privacy and is ultimately the one who bears the most legal responsibility.
Responsibilities of data processors under the GDPR
Data processors must process personal data only according to the instructions of the contractual agreement with the data controller. Their additional responsibilities include:
- Implementing appropriate technical and organizational measures to protect data
- Notifying the data controller of any data breaches
- Keeping records of processing activities
- Complying with data deletion requirements after processing ends
Processors do not have the freedom to decide how personal data is used, but they still play a critical role in keeping it safe. This includes handling data with care, applying encryption and access controls, and executing proper deletion once processing is complete.
If a data breach occurs or if a processor fails to follow the agreed-upon terms, they can be held legally responsible, especially if negligence is involved. That’s why it’s crucial for processors to stay current on security best practices and to regularly review their compliance procedures.
Data Protection Authority (DPA)
Data Protection Authorities (DPAs) are independent public authorities that oversee GDPR compliance and enforcement in each EU member state. Typically, each EU member country has its own DPA that enforces the GDPR and other local or regional privacy laws, like the CNIL in France or Datatilsynet in Denmark. DPAs have the power to investigate GDPR violations, issue fines, and order organizations to take corrective actions.
Who has a duty to monitor compliance with the GDPR? DPAs, certainly, but organizations need to monitor data processing and security themselves every day. This includes which third-party vendors are handling user data.
Additionally, companies should enlist the help of legal counsel or a privacy expert to keep up with changes to the legal landscape as more countries implement and update data privacy laws.
Monitoring can also be supported by a consent management solution, which can help automate compliance with the GDPR’s requirements surrounding cookies.
How does GDPR enforcement work?
GDPR enforcement is decentralized but coordinated. Each EU member state designates a national DPA to oversee compliance within its borders. These authorities investigate complaints, conduct audits, and issue penalties when organizations fail to meet GDPR requirements.
In cross-border cases — when a company operates in more than one EU country or processes data from individuals across several member states — a lead supervisory authority is appointed. This authority streamlines enforcement. Oversight is further supported by the European Data Protection Board (EDPB), which helps apply the law consistently across Europe.
Enforcement can begin through various channels: user complaints, data breach notifications, proactive DPA audits, or cooperation among authorities.
DPAs have broad power to investigate, restrict processing activities, or impose corrective actions. But they also serve in an advisory role, helping organizations improve their data handling and avoid future violations.
What are the exemptions under GDPR?
While the GDPR applies broadly, there are a few specific exemptions that limit its scope in certain contexts.
- Personal or household activities: If data is processed purely for personal use, such as keeping a private contact list or sharing family photos, that processing is exempt from the GDPR.
- Law enforcement and public security: Activities involving crime prevention, national security, or public safety are typically regulated by separate legislation, such as the Law Enforcement Directive.
- Journalism, academia, art, and literature: These sectors may receive limited exemptions when data processing is necessary to balance freedom of expression with privacy rights.
Even in these cases, however, basic data protection principles apply to some degree, like fairness, transparency, and security. Organizations should seek legal advice if they believe their processing might fall into an exempt category.
What are the penalties for noncompliance with the GDPR?
GDPR penalties can be significant and reflect the severity of the violation. The regulation outlines a two-tiered structure.
- Up to EUR 10 million or two percent of the organization’s annual global turnover, whichever is greater, for violations related to record keeping, security, and data breach notifications.
- Up to EUR 20 million or four percent of global turnover, whichever is greater, for more serious breaches, such as unlawful data processing, lack of user consent, or violating data subject rights.
These fines are not automatic. DPAs take multiple factors into account when determining penalties, such as:
- Nature, gravity, and duration of the infringement
- Whether the violation was intentional or due to negligence
- Categories of data affected
- Efforts made to mitigate the damage
- Any past violations and/or history of compliance
In addition to financial penalties, data protection authorities can impose corrective actions. These may include temporary or permanent bans on processing, mandatory data deletion, or requirements to adjust data handling practices.
Reputational damage can also be substantial, another reason why proactive compliance should be both a legal and strategic priority.
The largest GDPR fine to date was issued to US-based tech company Meta — parent company of Facebook, Instagram, WhatsApp, and others — in response to its handling of user data. The fine amounted to USD 1.3 billion.
EU privacy regulators gave the company five months to stop transferring data from EU-based users to the United States. The EU and US have an “on again, off again” relationship with regards to international data transfers and adequacy agreements regarding data protection.
Unlike some other data privacy laws, the GDPR does not include a “cure period.” In some jurisdictions, organizations may be allowed time to fix issues and avoid penalties.
Under the GDPR, however, once a violation is identified, fines and corrective actions can be applied even if the organization remediates the issue right away.
Common GDPR compliance issues and challenges
GDPR compliance can be challenging, especially for small and medium-sized businesses. In many cases, it requires the appointment of a Data Protection Officer (DPO). In smaller organizations, that may mean assigning those duties to someone who already holds another role.
Common compliance challenges include:
- Understanding the organization’s specific compliance responsibilities
- Obtaining valid user consent
- Setting up and maintaining a consent management solution
- Implementing appropriate data security measures
- Complying with data subject rights requests in a timely manner
- Reporting data breaches to DPAs within 72 hours
Best practices for GDPR compliance
To stay compliant, companies should follow data protection and privacy best practices. Some actions are legally required in certain countries, while in others they are only recommended. It’s important to review both GDPR and local regulatory requirements to understand what applies to your business.
Best practices include:
- Conducting audits to fully understand the data you hold and data processing activities
- Conducting data protection impact assessments
- Implementing data protection policies and procedures
- Training employees on GDPR compliance
- Appointing a qualified and well-informed DPO where required
- Working with trusted third-party vendors and service providers that are GDPR-compliant and implementing clear and comprehensive contracts before data processing begins
- Using a comprehensive consent management solution to collect and store valid user consent on websites and apps
Want to know more? Here’s everything you need to know about GDPR compliance.
GDPR responsibilities and enforcement
Data controllers and data processors each have defined roles under the GDPR, and organizations should take steps to make sure those responsibilities are being met.
That includes limiting how much personal data is collected, securing it properly and limiting access to it, and working only with trusted partners. Falling short can lead to more than just fines — it can erode user trust and hurt your reputation.
To stay on track, appoint a Data Protection Officer if needed, review your security practices, and make sure your vendor contracts are specific about data protection.
A consent management platform can also help keep things simple, enabling you to collect valid consent and stay transparent with users across your website and marketing tools.
The General Data Protection Regulation (GDPR) sets strict standards for how organizations must handle personal data collected from individuals in the European Union (EU) and European Economic Area (EEA). This comprehensive data protection regulation applies to all organizations that collect or process this data — regardless of where the organization is located — if they offer goods or services to EU/EEA residents or monitor their behavior.
Among its many requirements, the GDPR places specific legal obligations on how organizations may handle special categories of personal data or sensitive personal data. These data categories receive additional protections due to their potential impact on an individual’s rights and freedoms if they are misused.
In this article, we’ll look at what constitutes sensitive personal data under the GDPR, what additional protections it receives, and the steps organizations can take to achieve compliance with the GDPR’s requirements.
What is sensitive personal data under the GDPR?
Sensitive personal data includes specific categories of data that require heightened protection under the GDPR, because their misuse could significantly impact an individual’s fundamental rights and freedoms.
Under Art. 9 GDPR, sensitive personal data is:
- data revealing an individual’s racial or ethnic origin
- information related to a person’s political opinions or affiliations
- data concerning a person’s religious or philosophical beliefs
- information indicating whether a person is a member of a trade union
- data that provides unique insights into a natural person’s inherent or acquired genetic characteristics
- biometric data that can be used to uniquely identify a natural person, such as fingerprints or facial recognition data
- information regarding an individual’s past, current, or future physical or mental health
- data concerning a person’s sex life or sexual orientation
Recital 51 GDPR elaborates that the processing of photographs is not automatically considered processing of sensitive personal data. Photographs fall under the definition of biometric data only when processed through specific technical means that allow the unique identification or authentication of a natural person.
By default, the processing of sensitive personal data is prohibited under the GDPR. Organizations must meet specific conditions to lawfully handle such information.
This higher standard of protection reflects the potential risks associated with the misuse of sensitive personal data, which could lead to discrimination, privacy violations, or other forms of harm.
What is the difference between personal data and sensitive personal data?
Under the GDPR, personal data includes any information that can identify a natural person — known as a data subject under the regulation — either directly or indirectly. This may include details such as an individual’s name, phone number, email address, physical address, ID numbers, and even IP address and information collected via browser cookies.
While all personal data requires protection, sensitive personal data faces stricter processing requirements and heightened protection standards. Organizations must meet specific conditions before they can collect or process it.
The distinction lies in both the nature of the data and its potential impact if misused. Regular personal data helps identify an individual, while sensitive personal data can reveal intimate details about a person’s life, beliefs, health, financial status, or characteristics that could lead to discrimination or other serious consequences if compromised.
Conditions required for processing GDPR sensitive personal data
Under the GDPR, processing sensitive personal data is prohibited by default. However, Art. 9 GDPR outlines specific conditions under which processing is allowed.
- Explicit consent: The data subject can provide explicit consent for specific purposes, unless EU or member state law prohibits consent. Data subjects must also have the right to withdraw consent at any time (Art. 7 GDPR).
- Employment and social protection: Processing is required for employment, social security, and social protection obligations or rights under law or collective agreements.
- Vital interests: If processing protects the vital interests of the data subject or another natural person who physically or legally cannot give consent.
- Nonprofit activities: A foundation, association, or other nonprofit body with a political, philosophical, religious, or trade union aim can process sensitive data, but only in relation to members, former members, or individuals in regular contact with the organization. The data cannot be disclosed externally without consent.
- Public data: Data may be processed if the data subject has made the personal data publicly available.
- Legal claims: Processing is required for establishing, exercising, or defending legal claims, or when courts are acting in their judicial capacity.
- Substantial public interest: Processing may be necessary for substantial public interest reasons, based on law that is proportionate and includes safeguards.
- Healthcare: Processing may be required for medical purposes, including preventive or occupational medicine, medical diagnosis, providing health or social care treatment, or health or social care system management. The data must be handled by professionals bound by legal confidentiality obligations under EU or member state law, or by others subject to similar secrecy requirements.
- Public health: Processing may be necessary for public health reasons, such as ensuring high standards of quality and the safety of health care, medicinal products, or medical devices.
- Archiving and research: Processing may be required for public interest archiving, scientific or historical research, or statistical purposes.
The GDPR authorizes EU member states to implement additional rules or restrictions for processing genetic, biometric, or healthcare data. They may establish stricter standards or safeguards beyond the regulation’s requirements.
What is explicit consent under the GDPR?
Art. 4 GDPR defines consent as “any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her.”
Although the GDPR does not separately define explicit consent, it does require a clear and unambiguous action from users to express their acceptance of data processing. In other words, users must take deliberate steps to consent to their personal data being collected. Pre-ticked boxes, inactivity, or implied consent through continued use of a service do not meet GDPR requirements for explicit consent.
Common examples of explicit consent mechanisms include:
- ticking an opt-in checkbox, such as selecting “I Agree” in a cookie banner
- confirming permission for marketing emails, particularly with a double opt-in process
- permitting location tracking for a map application by responding to a direct authorization request
Additional compliance requirements for processing sensitive personal data under the GDPR
Organizations processing personal data under the GDPR must follow several core obligations. These include maintaining records of processing activities, providing transparent information on data practices, and adhering to principles such as data minimization and purpose limitation. However, processing sensitive personal data requires additional safeguards due to the potential risks involved.
Data Protection Officer (DPO)
Organizations with core activities that involve large-scale processing of sensitive personal data must appoint a Data Protection Officer (DPO) under Art. 37 GDPR. The DPO may be an employee of the organization or an outside consultant.
Among other responsibilities, the DPO monitors GDPR compliance, advises on data protection obligations, and acts as a point of contact for regulatory authorities.
Data Protection Impact Assessment (DPIA)
Art. 35 GDPR requires a Data Protection Impact Assessment (DPIA) for processing operations that are likely to result in high risks to individuals’ rights and freedoms. A DPIA is particularly important when processing sensitive data on a large scale. This assessment helps organizations identify and minimize data protection risks before beginning processing activities.
Restrictions on automated processing and profiling
Art. 22 GDPR prohibits automated decision-making, including profiling, based on sensitive personal data unless one of the following applies:
- the data subject has explicitly consented
- the processing is necessary for reasons of substantial public interest under the law
If automated processing of sensitive personal data is permitted under these conditions, organizations must implement safeguards to protect individuals’ rights and freedoms.
Penalties for noncompliance with the GDPR
GDPR penalties are substantial. There are two tiers of fines based on the severity of the infringement or if it’s a repeat offense.
For severe infringements, organizations face fines up to:
- EUR 20 million, or
- four percent of total global annual turnover of the preceding financial year, whichever is higher
Less severe violations can result in fines up to:
- EUR 10 million, or
- two percent of global annual turnover of the preceding financial year, whichever is higher
While violations involving sensitive personal data are often categorized as severe, supervisory authorities will consider the specific circumstances of each case when determining penalties.
Practical steps for organizations to protect GDPR sensitive personal data
Organizations handling sensitive personal data must take proactive measures to meet GDPR requirements and protect data subjects’ rights.
Conduct data mapping
Organizations should identify and document all instances in which sensitive personal data is collected, processed, stored, or shared. This includes tracking data flows across internal systems and third-party services. A thorough data inventory helps organizations assess risks, implement appropriate safeguards, and respond to data subject requests efficiently.
Develop internal policies
Establish clear internal policies and procedures to guide employees through the proper handling of sensitive personal data. These policies should cover, among other things, data access controls, storage limitations, security protocols, and breach response procedures, as well as specific procedures for data collection, storage, processing, and deletion. Organizations should conduct regular training programs to help employees understand their responsibilities and recognize potential compliance risks.
Obtain explicit consent
The GDPR requires businesses to obtain explicit consent before processing sensitive personal data. Consent management platforms (CMPs) like Usercentrics CMP provide transparent mechanisms for users to grant or withdraw explicit consent, which enables organizations to be transparent about their data practices and maintain detailed records of consent choices.
Manage third-party relationships
Many businesses rely on third-party vendors to process sensitive personal data, so it’s essential that these partners meet GDPR standards. Organizations should implement comprehensive data processing agreements (DPAs) that define each party’s responsibilities, outline security requirements, and specify how data will be handled, stored, and deleted. Businesses should also conduct due diligence on vendors to confirm their compliance practices before engaging in data processing activities.
Perform regular audits
Conducting periodic reviews of data processing activities helps businesses identify compliance gaps and address risks before they become violations. Review consent management practices, security controls, and third-party agreements on a regular basis to maintain GDPR compliance and respond effectively to regulatory scrutiny.
Checklist for GDPR sensitive personal data handling compliance
Below is a non-exhaustive checklist to help your organization handle sensitive personal data in compliance with the GDPR. This checklist includes general data processing requirements as well as additional safeguards specific to sensitive personal data.
For advice specific to your organization, we strongly recommend consulting a qualified legal professional or data privacy expert.
- Obtain explicit consent before processing sensitive personal data. Do so using a transparent mechanism that helps data subjects understand exactly what they’re agreeing to.
- Create straightforward processes for users to withdraw consent at any time, which should be as easy as giving consent. Stop data collection or processing immediately or as soon as possible if consent is withdrawn.
- Implement robust security measures such as encryption, access controls, and anonymization to protect sensitive personal data from unauthorized access or breaches (a minimal pseudonymization sketch follows this checklist)
- Keep comprehensive records of all data processing activities involving sensitive personal data. Document the purpose, legal basis, and retention periods.
- Publish clear and accessible privacy policies that inform users how their sensitive data is collected, used, stored, and shared.
- Update your data protection policies regularly to reflect changes in processing activities, regulations, or organizational practices.
- Train employees on GDPR requirements and proper data handling procedures, emphasizing security protocols and compliance obligations.
- Create clear protocols for detecting, reporting, and responding to data breaches. Include steps for notifying affected individuals and supervisory authorities when required.
- Conduct data protection impact assessments (DPIAs) before starting new processing activities involving sensitive data.
- Determine if your organization requires a Data Protection Officer based on the scale of sensitive personal data processing.
- Verify that all external processors that handle sensitive data meet GDPR requirements through formal agreements and regular audits.
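As one concrete illustration of the security-measures item above, here is a minimal pseudonymization sketch in TypeScript for Node.js: a direct identifier is replaced with a keyed hash before storage. The key handling and names are simplified assumptions, not a complete security program, and note that pseudonymized data is still personal data under the GDPR.

```typescript
import { createHmac } from "crypto"; // Node.js built-in module

// PSEUDONYM_KEY is a placeholder. In practice, the key must be stored
// separately from the dataset (e.g. in a secrets manager) so that holding
// the data alone is not enough to re-identify individuals.
const PSEUDONYM_KEY = process.env.PSEUDONYM_KEY ?? "replace-me";

// Replace a direct identifier with a keyed hash (HMAC-SHA-256).
function pseudonymize(identifier: string): string {
  return createHmac("sha256", PSEUDONYM_KEY).update(identifier).digest("hex");
}

// Store pseudonymize("jane.doe@example.com") instead of the raw email.
```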
Usercentrics does not provide legal advice, and information is provided for educational purposes only. We always recommend engaging qualified legal counsel or privacy specialists regarding data privacy and protection issues and operations.
Google Analytics is a powerful tool for understanding website performance, user behavior, and traffic patterns. However, its compliance with the General Data Protection Regulation (GDPR) has been a subject of concern and controversy, particularly in the European Union (EU). The data protection authorities of several EU countries have weighed in on privacy compliance issues with Google Analytics, with similar complaints that focus on its insufficient protections and data transfer practices.
In this article, we’ll examine the timeline of EU-US data transfers and the law, the relationship between Google Analytics and data privacy, and whether Google’s popular service is — or can be — GDPR-compliant.
Google Analytics and data transfers between the EU and US
One of the key compliance issues with Google Analytics is its storage of user data, including EU residents’ personal information, on US-based servers. Because Google is a US-owned company, the data it collects is subject to US surveillance laws, potentially creating conflicts with EU privacy rights.
The EU-US Privacy Shield was invalidated in 2020 with the Schrems II ruling, and there was no framework or Standard Contractual Clauses (SCC) in place for EU to US data transfers until September 2021, when new SCCs were implemented. These were viewed as a somewhat adequate safeguard if additional measures, like encryption or anonymization, were in place to make data inaccessible to US authorities.
A wave of rulings against Google Analytics after the invalidation of the Privacy Shield
The Schrems II ruling sparked a series of legal issues and decisions by European Data Protection Authorities (DPAs), which declared the use of Google Analytics as noncompliant with the GDPR.
- Austria: Austrian DPA Datenschutzbehörde (DSB) ruled that the use of Google Analytics violated the GDPR in light of the Schrems II ruling.
- France: Commission Nationale de l’Informatique et des Libertés (CNIL) found that the use of Google Analytics was not compliant with Art. 44 GDPR due to international data transfers without adequate protection; organizations were given one month to update their usage.
- Italy: Garante ruled that the transfer of data to the US via Google Analytics violated the GDPR, and that valid legal bases and reasonable protections were required.
- Netherlands: Dutch data protection authority AP announced investigations into two complaints against Google Analytics, with the complaints echoing issues raised in other EU countries.
- United Kingdom: The UK implemented its own version of the GDPR after Brexit, and the UK data protection authority removed Google Analytics from its own website after the Austrian ruling.
- Norway: Datatilsynet stated it would align with Austria’s decision against Google Analytics and publicly advised Norwegian companies to seek alternatives to the service.
- Denmark: Datatilsynet stated that lawful use of Google Analytics “requires the implementation of supplementary measures in addition to the settings provided by Google.” Companies that could not implement additional measures were advised to stop using Google Analytics.
- Sweden: IMY ordered four companies to stop using Google Analytics on the grounds that these companies’ additional security measures were insufficient for protecting personal data.
- European Parliament: European Data Protection Supervisor (EDPS) sanctioned the European Parliament for using Google Analytics on its COVID testing sites due to insufficient data protections.
The European Parliament sanction came a week before the Austrian ruling and is viewed as one of the earliest post-Schrems II decisions, setting the tone for subsequent legal complaints.
The EU-U.S. Data Privacy Framework
On July 10, 2023, the European Commission adopted its adequacy decision for the EU-U.S. Data Privacy Framework, which covers data transfers among the EU, European Economic Area (EEA) and the US in compliance with the GDPR.
The framework received some criticism from experts and stakeholders. Some privacy watchdogs, including the European Data Protection Board (EDPB), pointed out striking similarities between the new and the previous agreements, raising doubts about its efficacy in protecting EU residents’ data.
As of early 2025, the EU-U.S. Data Privacy Framework and adequacy for EU-US data transfers are in jeopardy. President Trump fired all of the Democratic Party members of the Privacy and Civil Liberties Oversight Board (PCLOB). As a result, the number of PCLOB board members is below the threshold that enables it to operate as an oversight body for the EU-U.S. Data Privacy Framework.
This action will likely undermine the legal validity of the Framework for EU authorities, particularly the courts. The EU Commission could withdraw its adequacy decision for the EU-U.S. Data Privacy Framework, which would invalidate it. The Court of Justice of the EU (CJEU) could also overturn the Commission’s adequacy decision following a legal challenge. The latter is how the Framework’s predecessor agreements were struck down, e.g. via the Schrems II ruling.
Should the EU-U.S. Data Privacy Framework be struck down, it could have significant effects on data transfers, cloud storage, and the function of platforms based outside of the EU, like those from Google, including Analytics. At the very least, Google may be required to make further changes to the function of tools like Google Analytics, along with related data storage, to meet European privacy standards.
Is Google Analytics 4 GDPR-compliant?
Google Analytics 4 includes several significant changes compared to Universal Analytics. The new version adopts an event-based measurement model, in contrast to the session-based data model of Universal Analytics. This shift enables Google Analytics 4 to capture more granular user interactions, better reflecting the customer journey across devices and platforms. Website owners can turn off this granular collection to stop the platform from recording data such as city or latitude and longitude, among other details. Website owners also have the option to delete user data upon request.
Another notable feature is that Google Analytics 4 does not log or store IP addresses from EU-based users. According to Google, this is part of Google Analytics 4’s EU-focused data and privacy measures. This potentially addresses one of the key privacy concerns raised by the Data Protection Authorities, which found that anonymizing IP addresses was not an adequate level of protection.
The EU-U.S. Data Privacy Framework alone doesn’t make Google Analytics 4 GDPR-compliant. The framework can make data transfers to the US compliant, if they are with a certified US company, but the onus is on website owners to ensure that the data was collected in compliance with the legal requirements of the GDPR in the first place.
How to make Google Analytics GDPR compliant
1. Enable explicit or opt-in consent
All Google Analytics cookies should be set up and controlled so they only activate after users have granted explicit consent. Users should also have granular control so that they can choose to allow cookies for one purpose while rejecting cookies for another.
A consent management platform (CMP) like Usercentrics can block the activation of services until user consent has been obtained. That way, Google Analytics cannot transfer user data it never collected in the first place.
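As a minimal illustration of this prior-consent blocking, the TypeScript sketch below injects the Google Analytics loader tag only after consent is reported. The `onAnalyticsConsent` callback is a hypothetical stand-in for whatever consent-changed hook your CMP provides; the gtag.js loader URL is Google’s standard one.

```typescript
// Hypothetical CMP hook: fires when the user's analytics consent changes.
declare function onAnalyticsConsent(cb: (granted: boolean) => void): void;

// Inject the GA4 loader script only once analytics consent is granted.
function loadGoogleAnalytics(measurementId: string): void {
  const script = document.createElement("script");
  script.async = true;
  script.src = `https://www.googletagmanager.com/gtag/js?id=${measurementId}`;
  document.head.appendChild(script);
}

onAnalyticsConsent((granted) => {
  if (granted) {
    loadGoogleAnalytics("G-XXXXXXX"); // placeholder measurement ID
  }
});
```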
2. Use Google Consent Mode
Google Consent Mode allows websites to dynamically adjust the behavior of Google tags based on the user’s consent choices regarding cookies. This feature ensures that measurement tools, such as Google Analytics, are only used for specific purposes if the user has given their consent, even though the tags are loaded onto the webpage before the cookie consent banner appears. By implementing Google Consent Mode, websites can modify the behavior of Google tags after the user allows or rejects cookies so that they don’t collect data without consent.
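For reference, Consent Mode is driven by `gtag('consent', ...)` calls. Below is a minimal sketch, assuming the standard Google tag snippet is installed and showing two of the documented consent types; a CMP typically issues these calls for you.

```typescript
// gtag is provided by the Google tag snippet; declared here for type-checking.
declare function gtag(...args: unknown[]): void;

// Before any Google tags fire: default all storage to denied.
gtag("consent", "default", {
  ad_storage: "denied",
  analytics_storage: "denied",
});

// Later, once the user accepts analytics in the consent banner:
gtag("consent", "update", {
  analytics_storage: "granted",
});
```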
Read more about Google Consent Mode and GA4.
3. Have a detailed privacy policy and cookie policy
Website operators must provide clear, transparent data processing information for users on the website. This information is included in the privacy policy. Information related specifically to cookies should be provided in the cookie policy, with details of the Google Analytics cookies and other tracking technologies that are used on the site, including the data collected by these cookies, provider, duration and purpose. The cookie policy is often a separate document, but can be a section within the broader privacy policy.
The GDPR requires user consent to be informed, which is what the privacy policy is intended to enable. To help craft a GDPR-compliant privacy policy, extensive information on the requirements can be found in Articles 12, 13 and 14 GDPR.
4. Enter into a Data Processing Agreement with Google
A data processing agreement (DPA) is a legally binding contract and a crucial component of GDPR compliance. The DPA covers important aspects such as confidentiality, security measures and compliance, data subjects’ rights, and the security of processing. It helps to ensure that both parties understand their responsibilities and take appropriate measures to protect personal data. Google has laid down step-by-step instructions on how to accept its DPA.
Can server-side tracking make Google Analytics more privacy-friendly?
Server-side tracking allows for the removal or anonymization of personally identifiable information (PII) before it reaches Google’s servers. This approach can improve data accuracy by circumventing client-side blockers, and it offers a way to better align with data protection regulations like the GDPR. By routing data through your own server first, you gain more control over what eventually gets sent to Google Analytics.
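Here is a minimal sketch of this pattern, assuming Node.js 18+ (built-in fetch) and the GA4 Measurement Protocol endpoint: the server receives an event, strips fields it treats as PII, and forwards the remainder. The measurement ID, API secret, event shape, and PII field list are all placeholders.

```typescript
const MEASUREMENT_ID = "G-XXXXXXX";   // placeholder GA4 measurement ID
const API_SECRET = "your-api-secret"; // placeholder Measurement Protocol secret

type IncomingEvent = {
  clientId: string;
  name: string;
  params: Record<string, unknown>;
};

// Fields this example treats as PII and strips before anything leaves
// our own infrastructure; adjust to your own data inventory.
const PII_KEYS = ["email", "full_name", "phone"];

async function forwardEvent(event: IncomingEvent): Promise<void> {
  const params = Object.fromEntries(
    Object.entries(event.params).filter(([key]) => !PII_KEYS.includes(key))
  );

  // Forward the sanitized event to the GA4 Measurement Protocol.
  await fetch(
    `https://www.google-analytics.com/mp/collect?measurement_id=${MEASUREMENT_ID}&api_secret=${API_SECRET}`,
    {
      method: "POST",
      body: JSON.stringify({
        client_id: event.clientId,
        events: [{ name: event.name, params }],
      }),
    }
  );
}
```

Note that routing data server-side does not by itself make tracking lawful; valid consent and a proper legal basis are still required.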
Impact of the Digital Markets Act on Google Analytics 4
The implementation of the Digital Markets Act (DMA) has had some impact on Google Analytics 4, affecting functions, data collection practices, and privacy policies. Website owners who use the platform have been encouraged to take the following steps for ongoing compliance.
- Audit your privacy policy, cookie policy, and data practices.
- Conduct a data privacy audit to check compliance with GDPR, and take any corrective steps if necessary.
- Install a CMP that enables GDPR compliance to obtain valid user consent per the regulation’s requirements.
- Seek advice from qualified legal counsel and/or a privacy expert, like a Data Protection Officer, on measures required specific to your business.
Learn more about DMA compliance.
How to use Google Analytics 4 and achieve GDPR compliance with Usercentrics CMP
To meet the conditions of Art. 7 GDPR for valid user consent, website operators must obtain explicit end-user consent for all Google Analytics cookies set by the website. Consent must be obtained before these cookies are activated and in operation. Usercentrics’ DPS Scanner helps identify and communicate to users all cookies and tracking services in use on websites to ensure full consent coverage.
Next steps with Google Analytics and Usercentrics
Google Analytics helps companies pursue growth and revenue goals, so understandably, businesses are caught between not wanting to give that up and not wanting to risk GDPR violation penalties or the ire of their users over lax privacy or data protection.
The Usercentrics team closely monitors regulatory changes and legal rulings, makes updates to our services and posts recommendations and guidance as appropriate.
However, website operators should always get relevant legal advice from qualified counsel regarding data privacy, particularly in jurisdictions relevant to them. This includes circumstances where there could be data transfers outside of the EU to countries without adequacy agreements for data privacy protection.
As the regulatory landscape and privacy compliance requirements for companies are complex and ever-changing, we’re here to help.
The General Data Protection Regulation (GDPR) is one of the strictest privacy laws globally, and since it came into effect in 2018 it has reshaped how businesses handle personal data. It has also influenced subsequent data privacy legislation around the world.
But the GDPR is more than a legal requirement; it’s also an opportunity to demonstrate to customers that you prioritize their privacy.
Although GDPR compliance may seem complex, the long-term benefits it can bring to your business are worth it. It offers an opportunity to build trust and enhance data management practices.
Below is all the essential information you need to know about GDPR compliance, from its technical requirements to the key steps to take to achieve and maintain compliance.
What is the General Data Protection Regulation (GDPR)?
The General Data Protection Regulation (GDPR) is an EU law designed to protect individuals’ personal data and privacy. At its core, it gives people more control over how their information is collected, stored, and used. It also establishes strict requirements for companies and other organizations that handle personal data, governing how they collect and process it.
The GDPR replaced the Data Protection Directive in May 2018. Unlike a directive, which typically requires individual countries to pass their own laws based on its guidelines, the GDPR applies directly across all EU member states and the European Economic Area (EEA). This means that all organizations must comply with a single set of rules, and even though each member state handles its own enforcement, application of the law is more consistent.
What is considered personal data under the GDPR?
Personal data under the GDPR includes any information that can identify an individual, either directly or indirectly. This includes:
- Names, addresses, and phone numbers
- Email addresses and online identifiers (e.g., IP addresses, cookie data)
- Financial data (e.g., credit card details, banking information)
- Health and biometric data
- Employment details
Sensitive personal data includes information like racial or ethnic origin, political opinions, religious beliefs, and genetic data. Because of the harm that could occur if it were shared, it is subject to more stringent requirements and greater protections than other personal data.
What is GDPR compliance?
Compliance with the GDPR means following the regulation’s rules for handling personal data. Organizations must ensure transparency, security, and accountability in their data processing practices. This includes obtaining user consent when required, implementing security measures, and responding to requests from data subjects.
Achieving GDPR compliance involves an ongoing combination of legal, technical, and organizational measures.
Who needs to comply with the GDPR?
If you collect or process personal data of people in the EU — whether through a website, app, or other services — you need to comply with the GDPR.
This requirement applies to companies worldwide. Even if your business isn’t physically based in the EU, offering services or products to people within the EU means you must follow GDPR compliance rules.
Understanding whether your company falls under the jurisdiction of the GDPR is the first step in protecting your business. It’s important to note that compliance isn’t determined by the size of your company but rather by how you manage and process customer data.
Do US companies have to comply with the GDPR?
Many US companies must comply with the GDPR, even if they do not have a physical presence in the EU. The regulation applies to any organization that:
- Offers goods or services to individuals in the EU, regardless of whether a payment is involved
- Monitors the behavior of individuals in the EU, such as tracking website visitors or analyzing customer data
For example, a US-based ecommerce store selling to customers in Germany must comply with the GDPR. Similarly, a marketing company using online tracking tools to analyze the behavior of European users is subject to the regulation.
Why is complying with the GDPR important?
EU GDPR compliance isn’t just about avoiding penalties, though that’s certainly a reason to take it seriously. It can also give your business a competitive edge. Demonstrating to customers that you prioritize their privacy and security builds trust and loyalty, leading to return business and recommendations.
But the benefits don’t stop with your customers. Privacy compliance helps streamline your data practices, making it easier to organize and secure data. This reduces the risk of mishandling and helps prevent data breaches, so your business remains protected. Just as importantly, it reduces legal risk from data protection authority actions or consumer complaints.
When implemented properly, GDPR compliance can enhance your company’s efficiency and boost your reputation, so it’s a smart investment for long-term success.
What are key GDPR compliance requirements?
The regulation sets strict rules to protect individuals’ privacy and promote transparency in how their data is used. Ignoring these rules can lead to hefty fines and serious reputational damage.
Below are key GDPR requirements companies should know about to avoid these consequences.
Data protection principles
The GDPR is built on seven core principles that guide how organizations should manage personal data:
- Lawfulness, fairness, and transparency: Data must be collected and processed legally, and individuals must be clearly informed about how their information is used.
- Purpose limitation: Data can only be used for specific, legitimate purposes.
- Data minimization: Only collect the data necessary for those purposes.
- Accuracy: Organizations must keep data up to date and correct any errors.
- Storage limitation: Personal data should not be retained longer than necessary.
- Integrity and confidentiality: Strong security measures must be in place to prevent unauthorized access or breaches.
- Accountability: Businesses must not only comply with the GDPR but also be able to demonstrate their compliance.
These principles form the foundation of responsible data management and impact every aspect of GDPR compliance.
Legal bases for processing
Before processing personal data, organizations must identify a valid legal basis for doing so. There are several lawful grounds for data processing, including consent, which means that individuals explicitly agree to data processing.
Data may also be processed if it’s necessary for contractual obligations, such as fulfilling a service agreement, or to comply with a legal obligation, like adhering to tax laws or employment regulations.
In some cases, data may be processed for vital interests, such as protecting someone’s life in an emergency, or for public interest, when the data is necessary for official tasks. Legitimate interests such as fraud prevention or security can also justify processing, as long as those interests do not override individuals’ rights.
Data subject rights
The GDPR gives individuals greater control over their personal data than previous laws, via data subject rights. These rights include individuals’ ability to access their data, correct inaccuracies, and even request data deletion, commonly referred to as “the right to be forgotten.”
In certain circumstances, individuals can also restrict the processing of their data, transfer it to another service provider, or object to processing altogether.
Companies must be prepared to handle these requests in a timely manner and respect individual rights throughout the data processing lifecycle.
Documentation and accountability
Compliance requires clear documentation. Businesses must maintain records of:
- The types of data they process
- The purpose of processing
- Where data is stored
- How long data is retained
- The security measures in place to protect the data
Documentation promotes transparency and helps demonstrate compliance in case of regulatory inquiries or audits.
Data Protection Officer (DPO) requirements
Some organizations, particularly those handling large volumes of data or sensitive personal data, must appoint a Data Protection Officer (DPO). The DPO oversees privacy compliance, advises on data protection policies, and acts as the main point of contact for regulators and data subjects.
Even when not legally required, having a dedicated individual who is responsible for data protection can help organizations manage risks and stay aligned with GDPR standards.
How to comply with the GDPR

Now that we’ve covered the basics, let’s dive into the steps your company can take to achieve GDPR compliance. While the specific approach will vary depending on your organization, there are key actions you can start taking today to meet GDPR standards.
1. Understand the GDPR principles
Start by familiarizing yourself with the core principles of the GDPR. These include lawfulness, fairness, transparency, data minimization, accuracy, and accountability. Align your processes with these principles for effective data protection.
2. Conduct a data audit
Identify all personal data you collect, store, and process, including customer, employee, and third-party data. Determine where it’s stored, how long it’s kept, and who has access. This step will help you understand and manage the data you handle.
3. Define data processing activities
Document how and why you process personal data, including the types of data and legal basis for processing. Be clear about your purposes for data collection, whether it’s for consent, contractual necessity, or legitimate interest.
4. Update privacy policies
Review and update your privacy policies to reflect GDPR requirements. Include details on the types of data collected, purposes for processing, parties that may access it, legal bases for processing, data retention, and individuals’ rights. Your policy should be easily accessible, clear for users, and updated regularly.
5. Obtain consent where necessary
If consent is your legal basis for processing, make sure it is both informed and unambiguous. Obtain consent separately from other terms and enable users to withdraw consent at any time. Track and manage consent records securely.
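As an illustration of what securely tracking consent records can look like in practice, here is a minimal TypeScript sketch of an append-only consent log. The field names are illustrative assumptions, not prescribed by the GDPR.

```typescript
// A minimal sketch of a consent record, assuming you keep an append-only
// log of decisions so consent (and its withdrawal) can be demonstrated
// later. Field names here are illustrative, not mandated by the GDPR.

interface ConsentRecord {
  subjectId: string;         // pseudonymous identifier for the user
  purpose: string;           // e.g. "analytics" or "marketing_email"
  granted: boolean;          // true = consent given, false = withdrawn
  timestamp: string;         // ISO 8601 time of the decision
  policyVersion: string;     // privacy policy version shown to the user
  collectionContext: string; // e.g. "cookie_banner" or "signup_form"
}

// Withdrawals are appended as new entries rather than overwriting history,
// preserving a full audit trail of each user's decisions.
function recordDecision(log: ConsentRecord[], entry: ConsentRecord): void {
  log.push(entry);
}
```

Recording withdrawal as a new entry, rather than deleting the original grant, makes it possible to show regulators what the user had consented to at any point in time.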
6. Ensure data subject rights
Implement systems to enable individuals to exercise their rights under the GDPR. These rights include access, correction, erasure, data portability, and objection to processing. Respond to these requests within the required timeframes.
7. Implement data protection measures
Apply technical and organizational measures to protect personal data. These may include encryption, access control, audits, and incident response plans. Build data protection into your systems from the outset to implement privacy by design, a concept embedded in GDPR requirements.
8. Appoint a data protection officer (DPO) if necessary
If your activities involve large-scale or sensitive data processing or the regular monitoring of individuals, appoint a DPO. The DPO’s job is to ensure ongoing compliance, advise on data protection issues, and act as a contact for data subjects and authorities.
9. Review contracts with third-party vendors
Ensure that you have contracts in place with third-party vendors and that they include GDPR-compliant data processing agreements. These should outline the data processed, the purpose for processing, security measures, and data retention policies. Verify that third parties also comply with the GDPR.
10. Create a breach response plan
Develop a response plan in case of a data breach. You must notify the relevant authorities within 72 hours and inform affected individuals if the breach poses a high risk. Your plan should include identification, containment, and reporting procedures.
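As a small illustration of the 72-hour rule, here is a TypeScript sketch that computes the notification deadline from the moment a breach is detected, assuming the clock starts when your team becomes aware of it.

```typescript
// A minimal sketch of tracking the GDPR's 72-hour notification window,
// assuming the clock starts when the controller becomes aware of the breach.

function notificationDeadline(detectedAt: Date): Date {
  const SEVENTY_TWO_HOURS_MS = 72 * 60 * 60 * 1000;
  return new Date(detectedAt.getTime() + SEVENTY_TWO_HOURS_MS);
}

// Example: a breach detected at 09:00 UTC on 10 January must be reported
// to the supervisory authority by 09:00 UTC on 13 January.
console.log(notificationDeadline(new Date("2025-01-10T09:00:00Z")).toISOString());
// => 2025-01-13T09:00:00.000Z
```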
11. Conduct regular training and awareness programs
Provide regular training on GDPR principles, data security, and breach reporting. Make sure staff understand how to handle data access requests and recognize potential breaches. Training should be ongoing to help prevent errors and promote a compliance culture.
12. Monitor compliance and update processes
Review and update your data protection practices regularly to maintain compliance. Stay informed about changes in the regulation and adapt your processes as needed. Ongoing monitoring will help you continue to meet GDPR requirements.
Technical requirements for GDPR compliance
For marketing teams handling customer data, GDPR compliance requires the right technical safeguards to keep information secure.
One key measure is data encryption. Whether you’re storing customer details in a marketing database or transferring data between platforms, encryption keeps sensitive information protected.
Email platforms, analytics tools, and customer relationship management (CRM) systems should follow current encryption standards. When sharing data with vendors, always use secure, encrypted connections.
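To illustrate, here is a minimal sketch of encrypting personal data at rest with Node’s built-in crypto module using AES-256-GCM. Key management (generation, rotation, and storage in a secrets manager) is deliberately out of scope here, and is where most real-world effort goes.

```typescript
// A minimal sketch of symmetric encryption at rest using Node's built-in
// crypto module with AES-256-GCM (an authenticated cipher, so tampering
// with the ciphertext is detected on decryption).

import { randomBytes, createCipheriv, createDecipheriv } from "node:crypto";

const key = randomBytes(32); // in practice, load from a secrets manager

function encrypt(plaintext: string): { iv: Buffer; tag: Buffer; data: Buffer } {
  const iv = randomBytes(12); // must be unique per message
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return { iv, tag: cipher.getAuthTag(), data };
}

function decrypt(payload: { iv: Buffer; tag: Buffer; data: Buffer }): string {
  const decipher = createDecipheriv("aes-256-gcm", key, payload.iv);
  decipher.setAuthTag(payload.tag); // verifies integrity during final()
  return Buffer.concat([
    decipher.update(payload.data),
    decipher.final(),
  ]).toString("utf8");
}
```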
Beyond encryption, access controls help minimize risk. Not everyone on your team needs full visibility into customer data. Your social media team may only require engagement metrics, while your email marketers need subscriber details but not full website analytics.
Setting up permissions based on necessity and enforcing strong authentication reduces exposure and strengthens security.
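As a sketch of necessity-based permissions, the following snippet maps hypothetical marketing roles to the narrowest data scopes each team needs; the role and scope names are illustrative, not a prescribed model.

```typescript
// A minimal sketch of necessity-based access control for marketing data.
// Each role sees only the data scopes its work actually requires.

type DataScope = "engagement_metrics" | "subscriber_details" | "web_analytics";

const rolePermissions: Record<string, DataScope[]> = {
  social_media: ["engagement_metrics"],
  email_marketing: ["subscriber_details"],
  analytics: ["web_analytics", "engagement_metrics"],
};

function canAccess(role: string, scope: DataScope): boolean {
  return rolePermissions[role]?.includes(scope) ?? false;
}

// Example: the social media team cannot read subscriber details.
console.assert(canAccess("social_media", "subscriber_details") === false);
```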
Even with these safeguards in place, regular security checks are essential. Periodic audits help you stay on top of access permissions, data retention settings, and inactive accounts that could pose security risks.
Having a clear process for reporting potential breaches is just as important. The GDPR requires notification within 72 hours, so a well-defined response plan can make all the difference in hitting that benchmark.
Finally, compliance doesn’t stop with your own team. Any marketing tools or platforms you use must also meet GDPR standards. Before adopting new software, verify that vendors have robust data security measures in place to avoid compliance risks down the line.
GDPR compliance for different business sizes
Businesses of all sizes operating in the EU must comply with the GDPR, but their approach to compliance will vary based on resources, data processing activities, and operational complexity.
GDPR compliance for enterprise companies
Enterprises processing large volumes of personal data must take a structured approach to GDPR compliance. This includes appointing a Data Protection Officer, conducting Data Protection Impact Assessments (DPIAs), and integrating compliance into company-wide policies and systems. Managing third-party risk is also crucial to ensure that all vendors meet GDPR requirements.
GDPR compliance for small businesses
Small businesses, even sole proprietors, are subject to GDPR regulations if they handle the personal data of EU residents. While they may not always need a DPO, they must still follow data protection principles, obtain valid consent, and implement security measures. Given smaller teams and budgets, leveraging automated compliance tools can help contain resource demands.
GDPR compliance for startups
Startups collecting or processing personal data must integrate data protection compliance from the beginning. This is important legally, and ongoing compliance is also far easier than trying to retrofit it into operations that have grown substantially.
This includes designing privacy directly into products, websites, and other touchpoints, setting up consent management solutions, and ensuring secure data storage. Early adoption of compliance best practices can prevent costly legal issues and build customer trust to help a small business grow.
Assessing your GDPR compliance
Regular GDPR compliance assessments help organizations identify gaps and improve data protection practices. A structured framework like the one below can guide businesses through evaluation areas.
If you’re looking to assess your business practices to see if your processes comply with the GDPR, here are the aspects to evaluate:
- Data processing review: Determine if your business processes EU personal data and under which legal basis.
- Data collection and storage: Assess how data is collected, stored, and shared.
- Consent management: Verify that you are obtaining and documenting valid consent, and that users can easily withdraw it.
- Security measures: Evaluate the effectiveness of encryption, access controls, and incident response plans.
- Data subject rights: Establish clear processes for handling requests related to data access, correction, and deletion.
By following this structured compliance framework, your company can proactively address potential gaps and stay prepared for regulatory audits.
What are the penalties for noncompliance with the GDPR?
Failure to comply with the GDPR can result in significant fines. The regulation sets two levels of penalties:
- Up to EUR 10 million or 2 percent of annual global turnover, whichever is higher, for first-time or less severe violations, such as inadequate record-keeping or failing to notify authorities of a data breach.
- Up to EUR 20 million or 4 percent of annual global turnover, whichever is higher, for serious or repeat violations, such as unlawful data processing or failing to obtain valid consent.
Regulators also have the authority to issue warnings, impose processing bans, or require corrective actions. Businesses should prioritize compliance to avoid financial and reputational damage.
How Usercentrics can help you achieve GDPR compliance
Achieving and maintaining GDPR compliance can feel daunting, but the right tools and resources can help make the process simpler, less resource intensive, and enable better insights.
A Consent Management Platform (CMP) is one such tool that can help your company manage user consent in a structured and compliant way.
Usercentrics CMP enables you to collect, store, and manage consent while promoting transparency in your data collection practices.
We deliver features like cookie and tracker scanning, which provides visibility into the tracking technologies in use on your website and helps you stay aligned with GDPR requirements.
We also offer customizable privacy policies and automated reporting to further simplify compliance efforts. These features reduce the need for manual updates and help ensure ongoing adherence to the GDPR.
With Usercentrics CMP, businesses can not only meet GDPR requirements but also build trust with customers by maintaining clear and transparent data practices.
In 2019, New York’s data breach laws underwent significant changes when the SHIELD Act was signed into law. The regulation has continued to evolve, with new amendments in December 2024. This article outlines the SHIELD Act’s requirements for businesses handling New York state residents’ private information, from security safeguards to breach notifications.
What is the New York SHIELD Act?
The New York Stop Hacks and Improve Electronic Data Security Act (New York SHIELD Act) established data breach notification and security requirements for businesses that handle the private information of New York state residents. The law updated the state’s 2005 Information Security Breach and Notification Act with expanded definitions and additional safeguards for data protection.
The New York SHIELD Act introduced several requirements to protect New York residents’ data. These include:
- a broader definition of what constitutes private information
- updated criteria for what qualifies as a security or data breach
- specific notification procedures for data breaches
- implementation of administrative, technical, and physical safeguards
- expansion of the law’s territorial scope
The law also increased penalties for noncompliance with its data security and breach notification requirements.
The New York SHIELD Act was implemented in two phases:
- breach notification requirements became effective on October 23, 2019
- data security requirements became effective on March 21, 2020
Who does the New York SHIELD Act apply to?
The New York SHIELD Act applies to any person or business that owns or licenses computerized data containing the private information of New York state residents. It applies regardless of whether the business itself is located in New York. This scope marked a significant expansion from the previous 2005 law, which only applied to businesses operating within New York state. The law’s extraterritorial reach means that organizations worldwide must comply with its requirements if they possess private information of New York residents, even if they conduct no business operations within the state.
What is a security breach under the New York SHIELD law?
The New York SHIELD Act expanded the definition of a security breach beyond the 2005 law’s limited scope. The previous law only considered unauthorized acquisition of computerized data as a security breach. The New York SHIELD Act includes the following actions that compromise the security, confidentiality, or integrity of private information:
- unauthorized access to computerized data
- acquisition of computerized data without valid authorization
The law provides specific criteria to determine unauthorized access by examining whether an unauthorized person viewed, communicated with, used, or altered the private information.
What is private information under the New York SHIELD Act?
The New York SHIELD law defines two types of information: personal and private.
Personal information includes any details that could identify a specific person, such as their name or phone number.
Under the 2005 law, private information was defined as personal information concerning a natural person combined with one or more of the following:
- Social Security number
- driver’s license number
- account numbers with security codes or passwords
The New York SHIELD Act expands this definition of private information to include additional elements:
- account numbers and credit or debit card numbers that could enable access to a financial account without additional security codes, passwords, or other identifying information
- biometric information that is used to authenticate and ascertain an individual’s identity, such as a fingerprint, voice print, or retina or iris image
- email addresses or usernames combined with passwords or security questions and answers
The law specifically states that publicly available information is not considered private information.
This definition is set to expand once again. On December 21, 2024, Governor Kathy Hochul signed two bills that strengthened New York’s data breach notification laws. Under one of the amendments, effective March 21, 2025, private information will include:
- medical information, including medical history, conditions, treatments, and diagnoses
- health insurance information, including policy numbers, subscriber identification numbers, unique identifiers, claims history, and appeals history
What are the data security requirements under the New York SHIELD Act?
This New York data security law requires any person or business that maintains private information to implement reasonable safeguards for its protection. There are three categories of safeguards required: administrative, technical, and physical.
Administrative safeguards include:
- appointing one or more specific employees to manage security programs
- identifying potential risks from internal and external sources
- reviewing existing safeguards to check their effectiveness
- training employees on the organization’s security practices and procedures
- choosing qualified service providers who meet security requirements through contracts
- modifying security programs when business needs change
Technical safeguards include:
- assessing risks in network structure and software design
- evaluating risks in information processing, transmission, and storage
- detecting, preventing, and responding to attacks or system failures
- regularly testing and monitoring the effectiveness of key controls, systems, and procedures
Physical safeguards include:
- assessing risks related to information storage and disposal methods
- implementing systems to detect and prevent intrusions
- protecting private information from unauthorized access or use during collection, transportation, and disposal
- disposing of private information within a reasonable timeframe by erasing electronic media when it is no longer needed for business purposes, so that the information cannot be read or reconstructed
Businesses are deemed compliant with these safeguard requirements if they are subject to and compliant with certain federal laws, such as the Gramm-Leach-Bliley Act (GLBA), the Health Insurance Portability and Accountability Act (HIPAA), and the Health Information Technology for Economic and Clinical Health Act (HITECH).
What are the data breach notification requirements under the New York SHIELD law?
The New York SHIELD Act sets specific requirements for how and when businesses must notify individuals and authorities about data breaches involving private information.
The law previously required businesses that discover a security breach of computer data systems containing private information to notify affected consumers “in the most expedient time possible and without unreasonable delay.” The December 2024 amendment added a specific timeline to this requirement. Businesses now have a maximum of 30 days in which to notify affected New York state residents of data breaches. The 30-day time limit came into effect immediately upon the bill being signed.
The New York SHIELD Act also previously required businesses to notify three state agencies about security breaches:
- the Office of the New York State Attorney General
- the New York Department of State
- the New York State Police
The December 2024 amendment added a fourth state agency to be notified, with immediate effect: the New York State Department of Financial Services.
These notices must include information about the timing, content, distribution of notices, and approximate number of affected persons, as well as a copy of the template of the notice sent to affected persons. If more than 5,000 New York state residents are affected and notified, businesses must also notify consumer reporting agencies about the timing, content, distribution of notices, and approximate number of affected persons.
The law introduced specific restrictions on methods for notifying affected consumers. Email notifications are not permitted if the compromised information includes an email address along with a password or security question and answer that could allow access to the online account.
All notifications must provide contact information for the person or business notifying affected persons as well as telephone numbers and websites for relevant state and federal agencies that offer guidance on security breach response and identity theft prevention.
Enforcement of the New York SHIELD Act and penalties for noncompliance
The New York Attorney General has the authority to enforce the New York SHIELD Act, with the power to pursue injunctive relief, restitution, and penalties against businesses that violate the law.
The law establishes different levels of penalties based on the nature and severity of the violations. When businesses fail to provide proper breach notifications, but their actions are not reckless or intentional, courts may require them to pay damages that cover the actual costs or losses experienced by affected persons.
More severe penalties apply to knowing and/or reckless violations of notification requirements. In these cases, courts can impose penalties of up to USD 5,000 or USD 20 per instance of failed notification, whichever amount is greater. These penalties are capped at USD 250,000.
Businesses that fail to implement reasonable safeguards as required by the law face separate penalties. Courts can impose fines of up to USD 5,000 for each violation of these security requirements.
Impact of the New York SHIELD Act on businesses
The New York SHIELD law imposes significant obligations for any organization handling New York residents’ private information, regardless of location. Businesses must implement comprehensive data security programs with specific safeguards, meet strict breach notification deadlines, and prepare for expanded data protection requirements.
Key impacts include:
- 30-day mandatory breach notification requirement (currently in effect)
- the implementation of administrative, technical, and physical security safeguards
- expanded private information definition, in effect March 21, 2025
- potential penalties up to USD 250,000 for notification violations and USD 5,000 per security requirement violation
New York SHIELD Act Compliance Checklist
Below is a non-exhaustive checklist to help your business comply with the New York SHIELD Act. For advice specific to your organization, it’s strongly recommended to consult a qualified legal professional.
- Implement reasonable administrative, technical, and physical safeguards to protect the private information of New York residents.
- Create and maintain a process to detect data breaches affecting private information.
- Establish procedures to notify affected New York state residents within 30 days of discovering a breach.
- Set up a system to report breaches to the Attorney General, Department of State, State Police, and Department of Financial Services.
- Include contact information and agency resources for breach response and identity theft prevention in all notifications.
- Use appropriate notification methods (for instance, do not use email if the breach involves email/password combinations).
- Notify consumer reporting agencies if more than 5,000 New York state residents are affected by a breach.
- Train employees on security practices and procedures.
- Review and update security programs when business circumstances change.
- Prepare to protect additional categories of private information (medical and health insurance data) starting March 21, 2025.
California passed the first US state data privacy law in 2018 with the California Consumer Privacy Act (CCPA), the same year the General Data Protection Regulation (GDPR) came into force. Progress beyond that state was slow for the next several years, with the Virginia Consumer Data Protection Act (VCDPA) being the main state-level regulation passed.
New momentum started in 2023, with six states passing laws. The European Union and United States also replaced the struck-down Privacy Shield with their new data privacy framework: the EU-U.S. Data Privacy Framework.
The momentum continued into 2024, with seven more US state privacy laws being passed and federal legislation being made public for review. Eight state privacy laws are currently scheduled to come into effect in 2025, and three more in 2026. More state-level data privacy laws are expected to be passed, and some states are already enacting updates to their existing laws. Federal legislation has not made progress, which is expected to continue to be the case.
We also expect to see more topical or industry-specific laws being proposed or passed in the US, like Washington’s My Health My Data Act and Colorado’s AI Act.
What are the states with privacy laws?
There is a long way to go before US states with data privacy laws are the majority, or a federal law is passed that supplants them. However, momentum is growing, and states drafting legislation now have a substantial number of implemented regulations to draw from, as well as a wealth of evolving thought regarding data privacy, technology, and consumers’ rights.
To date, all the data privacy laws in the US at a state level have implemented an opt-out consent model, so in most cases personal data can be collected and processed without consent, though individuals have the right to opt out of sale, sharing, targeted advertising, and/or profiling, depending on the specific regulation. California remains the only state to enable a private right of action, allowing consumers to directly sue companies for damages if they are involved in a data breach or other violation.
Which modern US state privacy laws are considered comprehensive?
Due to its somewhat narrower focus and broader exclusions, the Florida Digital Bill of Rights (FDBR) is not considered among the comprehensive modern data privacy laws in the US. The same goes for the Nevada Privacy of Information Collected on the Internet from Consumers Act (NPICICA) and Amendment SB-260, though that law is older and predates even California’s CCPA.
What are the compliance requirements for US state privacy laws?
Compliance threshold standards vary across states, with thresholds like company revenue not being included in more recently passed laws. We are also seeing advancements in technology and social issues being reflected in the laws, e.g. with more explicit considerations for “automated decision-making” (e.g. AI tools) and inclusion of information like gender identity under the category of sensitive data.
While some of the US states with privacy laws tout themselves as being more “business-friendly” or more strict, they all remain fairly similar. It is important, however, to consult with qualified legal counsel or a data privacy expert to ensure that your business meets the requirements for all states where it’s required to comply with regulations.
Let’s look at a comparison of the US data privacy laws at the state level and what they mean for businesses and consumers.
What are the effective dates of the US state privacy laws?
US data privacy laws tend to draw on existing privacy regulations when they’re drafted. When the CCPA was drafted, there were fewer models than when other US state data privacy legislation was in progress. However, the EU’s GDPR was already in effect in 2018 when the CCPA was passed.
Typically, there has been a lead time of a couple of years between when legislation is passed and a new law comes into effect, giving businesses and other organizations time to familiarize themselves with the law’s contents and requirements. However, with laws passed in 2024, that period of time is getting shorter. The Nebraska Data Privacy Act (NDPA) comes into effect less than nine months after being signed into law by the governor, for example.
*The California Privacy Rights Act (CPRA) amends and expands the California Consumer Privacy Act (CCPA). In this article, they will be displayed as one regulation, and we will include the most up to date requirements, i.e. those introduced with the CPRA.
Who is protected in US states with data privacy laws?
Data privacy laws passed by the states are designed primarily to protect consumers, the data subjects from whom businesses and other organizations collect personal data. These days that data comes from an increasing number of sources as we live and work more and more online. Web browsers, mobile devices, connected appliances, and more all result in consumers generating vast amounts of data about their identities, preferences, and activities every day.
The US data privacy laws apply to residents of the state in question. This means that a company does not need to be headquartered in a state, or even have an office there, to be subject to the state’s privacy law, if their users or customers include residents of that state. Many of the state-level laws explicitly protect people and their data in a personal or household context, excluding those acting in a commercial or employment context (which is covered by other laws).
State | Protected Parties |
---|---|
California (CCPA/CPRA) | Residents of California, acting in an individual or household context, with specific rights for people acting in an employment context |
Colorado | Residents of Colorado, acting in an individual or household context |
Connecticut | Residents of Connecticut, acting in an individual or household context |
Delaware | Residents of Delaware, acting in an individual or household context |
Florida | Residents of Florida, acting in an individual or household context |
Indiana | Residents of Indiana, acting in an individual or household context |
Iowa | Residents of Iowa, acting in an individual or household context |
Kentucky | Residents of Kentucky, acting in an individual or household context |
Maryland | Residents of Maryland, acting in an individual or household context |
Minnesota | Residents of Minnesota, acting in an individual or household context |
Montana | Residents of Montana, acting in an individual or household context |
Nebraska | Residents of Nebraska, acting in an individual or household context |
Nevada | Residents of Nevada in their online activities |
New Hampshire | Residents of New Hampshire, acting in an individual or household context |
New Jersey | Residents of New Jersey, acting in an individual or household context |
Oregon | Residents of Oregon, acting in an individual or household context |
Rhode Island | Residents of Rhode Island, acting in an individual or household context |
Tennessee | Residents of Tennessee, acting in an individual or household context |
Texas | Residents of Texas, acting in an individual or household context |
Virginia | Residents of Virginia, acting in an individual or household context |
Utah | Residents of Utah, acting in an individual or household context |
Who has to comply with state-level US data privacy laws?
State privacy laws are primarily aimed at businesses, i.e. commercial enterprises intended to earn revenue. Those that obtain revenue from selling personal data bear particular responsibility for compliance. While the number of people whose data is sold is a common criterion, a company revenue threshold is only in use for some laws, and is increasingly being left out of states’ legislation.
Who is exempt from complying with state-level US data privacy laws?
Some of the laws also explicitly exempt small businesses. All of the laws have other exemptions, mainly for personal data covered under other laws, like that collected and processed by healthcare and financial institutions. Nonprofits and institutions of higher education are also often exempt (though not in all states), so as always, requirements of specific laws should be checked with input from qualified legal counsel.
All the thresholds listed below, except where noted, are for a calendar year or the preceding calendar year.
State | Compliance Thresholds |
---|---|
California (CCPA/CPRA) | – have gross annual revenue greater than USD 26,625,000 in the preceding calendar year, or – alone or in combination, annually buy, sell or share the personal data of 100,000 or more consumers or households, or – derive 50% or more of annual revenue from selling or sharing consumers’ personal data |
Colorado | – process personal data of at least 100,000 consumers, or – process personal data of at least 25,000 consumers, and – derive at least 50% of gross revenue from selling personal data |
Connecticut | – process personal data of at least 100,000 consumers, or – process personal data of at least 25,000 consumers, and – receive a discount on goods or services from selling personal data |
Delaware | – control or process personal data of at least 35,000 Delaware residents, excluding personal data controlled or processed solely for the purpose of completing a payment transaction, or – control or process personal data of at least 10,000 Delaware residents, and – derive more than 20 percent of gross revenue from the sale of personal data |
Florida | – are organized or operated for the profit or financial benefit of their shareholders or owners – conduct business in the state of Florida – collect personal data about consumers, or are the entity on behalf of which such information is collected – determine the purposes and means of processing personal data about consumers alone or jointly with others – make in excess of USD 1 billion in global gross annual revenue and satisfy at least one of the following: – derive 50 percent or more of global gross annual revenues from the sale of advertisements online, including providing targeted advertising or the sale of ads online – operate a consumer smart speaker and voice command component service with an integrated virtual assistant connected to a cloud computing service that uses hands-free verbal activation – operate an app store or a digital distribution platform that offers at least 250,000 different software applications for consumers to download and install |
Indiana | – control or process personal data of at least 100,000 Indiana residents – control or process personal data of at least 25,000 Indiana residents and – derive over 50 percent of gross revenue from the sale of personal data |
Iowa | – control or process personal data of at least 100,000 consumers, or – control or process personal data of more than 25,000 consumers, and – derive over 50 percent of gross revenue from the sale of personal data |
Kentucky | – control or process personal data of at least 100,000 consumers, or – control or process personal data of at least 25,000 consumers, and – derive over 50 percent of gross revenue from the sale of personal data |
Maryland | – control or process the personal data of at least 35,000 consumers, excluding personal data controlled or processed only for completing a payment transaction, or – control or process the personal data of at least 10,000 consumers, and – derive more than 20 percent of their gross revenue from the sale of personal data |
Minnesota | – control or process personal data of at least 100,000 consumers, or – control or process personal data of at least 25,000 consumers, and – derive over 50 percent of gross revenue from the sale of personal data – not a small business as defined under the U.S. Small Business Act, unless they are engaged in the sale of sensitive data without consumer consent |
Montana | – control or process the personal data of at least 35,000 consumers, excluding personal data controlled or processed only for completing a payment transaction, or – control or process the personal data of at least 10,000 consumers, and – derive more than 20 percent of their gross revenue from the sale of personal data |
Nebraska | – process or engage in the sale of personal data – not a small business as defined under the U.S. Small Business Act, unless they are engaged in the sale of sensitive data without consumer consent |
Nevada | – own or operate a website or an online service for business purposes, and – collect and maintain the personal information of consumers who reside in Nevada and use or visit the website or the online service, and – engage in activities catered towards Nevada and conduct transactions with the State of Nevada, or its consumers or residents, and – have more than 20,000 visitors per year |
New Hampshire | – control or process personal data of 100,000 or more consumers, excluding data for the purpose of completing payment transactions, or – control or process personal data of 25,000 or more consumers, and – derive 25 percent or more of the gross revenue from selling personal data *The first state that does not limit the amount of data to a specific time period, e.g. “preceding calendar year” |
New Jersey | – control or process the personal data of at least 100,000 consumers, excluding personal data processed solely for the purpose of completing a payment transaction, or – control or process the personal data of at least 25,000 consumers, and – derive revenue or receive a discount on the price of any goods or services from the sale of personal data |
Oregon | – control or process personal data of at least 100,000 consumers, or – control or process personal data of at least 25,000 or more consumers, and – derive 25 percent or more of the annual gross revenue from selling personal data |
Rhode Island | – control or process the personal information of at least 10,000 Rhode Island consumers, and – derive more than 20 percent of their gross revenue from the sale of personal information |
Tennessee | – exceed USD 25 million in revenue, and – control or process the personal information of at least 25,000 Tennessee consumers, and – derive more than 50 percent of their gross revenue from the sale of personal information, or – control or process the personal information of at least 175,000 Tennessee residents during a calendar year |
Texas | – conduct business in Texas or generate products or services consumed by Texas residents, and – process or engage in the sale of personal data, and – not identify as a small business as defined by the U.S. Small Business Administration (independent for-profit entity with fewer than 500 employees) |
Virginia | – process personal data of at least 100,000 consumers, or – process personal data of at least 25,000 consumers, and – derive at least 50 percent of gross annual revenue from selling personal data |
Utah | – gross annual revenue of at least USD 25 million, and – process personal data of at least 100,000 consumers, or – process personal data of at least 25,000 consumers, and – derive at least 50 percent of gross revenue from selling personal data |
Who is the enforcement authority in US states with data privacy laws?
Each state manages enforcement of its data privacy law, including investigations and penalties. The creation of the California Privacy Protection Agency was included in the CPRA; to date, California is the only state with a separate agency to enforce privacy law. All the other states have these functions under the Attorney General’s office.
What are the penalties for violation or noncompliance with the US state privacy laws?
Most penalties are monetary, though some can include cessation of data processing. Some of the privacy laws specify fine amounts, and others defer to laws governing deceptive trade practices, or to the Attorney General’s discretion. Outside of official channels, companies can also suffer loss of brand reputation, customer trust, and, ultimately, revenue as the result of a publicized violation or data breach.
Do the US state privacy laws provide a cure period for violations?
Many of the US states with privacy laws provide companies with a “right to cure”, which is a specific number of days during which they have the opportunity to fix any violation they’ve been notified about without being penalized for it. If they don’t cure the violation, proceedings to levy fines and/or other penalties can then commence.
Some laws have put a time limit of one to two years on the cure period, specifying a sunset date. After that time, companies will not have a right to cure, but can be granted a cure period at the Attorney General’s discretion. In some cases, like with repeat or willful (known) violations, there is no cure period.
State | Fines, Penalties, and Cure Periods |
---|---|
California (CCPA/CPRA) | – up to USD 2,663 for each violation (e.g. negligence) or USD 7,988 for willful violations – fines for violations involving minors increased to USD 7,988 from USD 2,663 – provides consumers with private right of action only when their unencrypted or unredacted personal information is breached – no cure period |
Colorado | – fines not specified under the CPA, penalties governed by the Colorado Consumer Protection Act – from USD 2,000 to USD 20,000 per violation, or between USD 10,000 to USD 50,000 per violation against an elderly person – violations can lead to criminal charges – cure period has sunset |
Connecticut | – fines not specified under the CTDPA, penalties governed by the Connecticut Unfair Trade Practices Act (CUTPA) – USD 5,000 for willful violations – restraining orders, which can lead to cessation of data collection (violation of a restraining order could result in an additional USD 25,000 penalty) – cure period has sunset |
Delaware | – fines not specified under the DPDPA, but the regulation references Subchapter II of Chapter 25 of Title 29, which provides the Attorney General standing to investigate, initiate administrative proceedings, sanction unlawful conduct, and/or seek remedies on behalf of the state for violations – willful violations can result in fines up to USD 10,000 per violation |
Florida | – fines not specified under the FDBR, as violations are considered deceptive trade practices – fines up to USD 50,000 per violation – penalties can be tripled in certain circumstances – includes prohibition that no government entity can request that a social media platform remove content or user accounts unless the content or account is used to commit a crime or otherwise violates Florida public records law |
Indiana | – fines up to USD 7,500 per violation – 30-day cure period (no sunset date) |
Iowa | – fines up to USD 7,500 per violation (paid into the fund for consumer education and litigation) – 90-day cure period (no sunset date) |
Kentucky | – fines up to USD 7,500 per violation – 30-day cure period (no sunset date) |
Maryland | – fines up to USD 10,000 per violation, fines for repeat violations up to USD 25,000 for each subsequent violation – 60-day cure period (sunsets April 1, 2027) – individuals do not have private right of action, but MODPA specifically notes that they are not prohibited from pursuing any other remedy provided by law |
Minnesota | – fines up to USD 7,500 per violation – 30-day cure period (sunsets July 31, 2026) |
Montana | – fines not specified under the MTCDPA, but notes that the Attorney General can “bring an action” – 60-day cure period (sunsets April 1, 2026) |
Nebraska | – fines up to USD 7,500 per violation – 30-day cure period (no sunset date) |
Nevada | – violations are considered deceptive trade practices, so NRS 598A applies – fines up to USD 5,000 per violation (which can mean per website visitor) – a data collector can pursue damages against a person or entity that has unlawfully obtained or benefitted from personal data obtained from the data collector’s records – the Attorney General or any county’s district attorney can bring action against a suspected violator, enabling them to obtain a temporary or permanent injunction against the violating activity, including cessation of data collection |
New Hampshire | – fines not specified under the NHPA, as violations are considered deceptive trade practices, but the regulation references Section 358-A:2 – Attorney General can seek civil penalties up to USD 10,000 per violation – 60-day cure period (sunsets January 1, 2026) |
New Jersey | – fines up to USD 10,000 for an initial violation and up to USD 20,000 for subsequent violations – 30-day cure period (sunsets July 16, 2026) |
Oregon | – fines up to USD 7,500 per violation – 30-day cure period (sunsets January 1, 2026) |
Rhode Island | – fines up to USD 10,000 per violation – 30-day cure period (sunsets January 31, 2026) |
Tennessee | – fines up to USD 15,000 per violation – fines can be up to three times higher for willful violations – 60-day cure period (no sunset date) |
Texas | – fines up to USD 7,500 per violation – 30-day cure period (no sunset date) |
Virginia | – fines up to USD 7,500 per violation – 30-day cure period (no sunset date) |
Utah | – fines up to USD 7,500 per violation – 30-day cure period (no sunset date) |
How are consent and Global Privacy Control managed under the US data privacy laws?
Opt-in consent means that in most cases a business or other organization must obtain informed, valid consent from users and customers (data subjects) before collecting or processing their personal data. Opt-out consent means that in most cases a business can collect and use data subjects’ personal data without requiring consent.
Under state privacy laws, data subjects must have the option to opt out of sale, sharing, targeted advertising, profiling, automated decision-making, or other use of their personal data, depending on the specific data privacy law. Under most of the US privacy laws, prior consent is required if the data to be processed is categorized as sensitive or belongs to a known child. Most of the laws defer to the Children’s Online Privacy Protection Act (COPPA) regarding access to and use of children’s personal data.
What are the notification requirements under US data privacy laws?
All of the American privacy laws require that data subjects be notified in all cases about what data is collected, for what purposes, who it’s shared with, and so on. The United States is the main country utilizing an opt-out consent model; in much of the rest of the world, the opt-in model is the standard.
Are companies required to recognize the Global Privacy Control under US state privacy laws?
The Global Privacy Control (GPC), or universal opt-out mechanism, enables individuals to set their consent preferences once in their web browser and have those preferences respected automatically by all websites they subsequently visit. Some of the state-level data privacy laws stipulate that this signal must be respected; others do not reference it at all. Some states have provided a grace period of a year or so before GPC signals must be respected.
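In browsers that implement the GPC proposal (such as Firefox and Brave), the signal is exposed to page scripts as `navigator.globalPrivacyControl`, and to servers as a `Sec-GPC: 1` request header. Here is a minimal sketch of honoring it on the client; the `applyOptOut` function is a hypothetical hook into your own consent logic.

```typescript
// A minimal sketch of honoring a Global Privacy Control signal in the
// browser. navigator.globalPrivacyControl is set by browsers that implement
// the GPC proposal; it is undefined elsewhere, so the cast below keeps the
// check safe in TypeScript without extra type declarations.

function hasGpcOptOut(): boolean {
  return (navigator as any).globalPrivacyControl === true;
}

// Hypothetical hook into your own consent logic.
declare function applyOptOut(purposes: string[]): void;

if (hasGpcOptOut()) {
  // Treat the signal as an opt-out of sale/sharing before loading any
  // third-party tags that would sell or share personal data.
  applyOptOut(["sale", "sharing", "targeted_advertising"]);
}
```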
State | Consent Model |
---|---|
California (CCPA/CPRA) | – opt out in most cases – “Do Not Sell Or Share My Personal Information” link required on websites – if sensitive personal information is processed, “Limit the Use of My Sensitive Personal Information” link required on websites – prior consent required for sensitive or children’s personal data |
Colorado | – opt out in most cases – prior consent required for sensitive or children’s personal data |
Connecticut | – opt out in most cases – if a controller sells personal data to third parties or processes it for targeted advertising, the controller must provide a “clear and conspicuous link” on their website that enables consumers to opt out of either of those activities (explicit wording for the link is not specified) – prior consent required for sensitive or children’s personal data |
Delaware | – opt out in most cases – controllers must provide “a clear and conspicuous link on the controller’s Internet web site to an Internet web page that enables a consumer, or an agent of the consumer, to opt out of the targeted advertising or the sale of the consumer’s personal data” – prior consent required for sensitive or children’s personal data |
Florida | – opt out in most cases – prior consent required for sensitive or children’s personal data – definition of a child is anyone under the age of 18 (under 13 is the standard under most of the state-level privacy laws) |
Indiana | – opt out in most cases – prior consent required for sensitive or children’s personal data |
Iowa | – opt out in most cases – prior consent required for sensitive or children’s personal data |
Kentucky | – opt out in most cases – prior consent required for sensitive or children’s personal data |
Maryland | – opt out in most cases – prior consent required for sensitive or children’s personal data – sale of sensitive data or children’s data is banned without exception |
Minnesota | – opt out in most cases – prior consent required for sensitive or children’s personal data |
Montana | – opt out in most cases – prior consent required for sensitive or children’s personal data |
Nebraska | – opt out in most cases – prior consent required for sensitive or children’s personal data |
Nevada | – opt out |
New Hampshire | – opt out in most cases – prior consent required for sensitive or children’s personal data |
New Jersey | – opt out in most cases – prior consent required for sensitive or children’s personal data |
Oregon | – opt out in most cases – prior consent required for sensitive or children’s personal data |
Rhode Island | – opt out in most cases – prior consent required for sensitive or children’s personal data |
Tennessee | – opt out in most cases – prior consent required for sensitive or children’s personal data |
Texas | – opt out in most cases – prior consent required for sensitive or children’s personal data |
Virginia | – opt out in most cases – prior consent required for sensitive or children’s personal data |
Utah | – opt out in most cases – prior consent required for sensitive or children’s personal data |
What are the privacy notice/policy requirements of the US state privacy laws?
While in many cases the data privacy laws in the US do not require consent before data collection or use, all of them require users to be notified with information about what data is collected, for what purposes, what parties it gets shared with, what consumers’ rights are and how to exercise them, etc. This is most commonly presented in a privacy notice or privacy policy.
State | Privacy Notice/Policy Requirements |
---|---|
California (CCPA/CPRA) | – a business that controls the collection of a consumer’s personal information must, before or at the point of collection, inform consumers about: – categories of personal information to be collected – purposes for which the categories of personal information are collected or used and whether that information is sold or shared – categories of sensitive personal information to be collected, if any – purposes for which the categories of sensitive personal information are collected or used, and whether that information is sold or shared, if any – the length of time the business intends to retain each category of personal information, including sensitive personal information, if possible – if providing the data retention period is not possible, the criteria used to determine that period, provided that a business does not retain a consumer’s personal information for each disclosed purpose for longer than is reasonably necessary |
Colorado | – controllers must include an accessible, clear, and meaningful privacy notice, which must include the following information: |
Connecticut | – controllers must include an accessible, clear, and meaningful privacy notice, which must include the following information: |
Delaware | – a controller must include an accessible, clear, and meaningful privacy notice, which must include all of the following information: |
Florida | – data controller must include an accessible and simple to read privacy notice on their website, which must contain at least the following information: |
Indiana | – a controller must include an accessible, clear, and meaningful privacy notice, which must contain at least the following information: – a secure and reliable means for consumers to submit a request to exercise their rights |
Iowa | – data processors must include an accessible and simple to read privacy notice on their website, which must contain at least the following information: |
Kentucky | – controllers must provide consumers with a reasonably accessible, clear, and meaningful privacy notice that includes: |
Maryland | – controller must provide consumers with a reasonably accessible, clear, and meaningful privacy notice that includes: |
Minnesota | – controllers must provide consumers with a reasonably accessible, clear, and meaningful privacy notice that includes: |
Montana | – controllers must provide consumers with a reasonably accessible, clear, and meaningful privacy notice that includes: |
Nebraska | – a controller shall provide each consumer with a reasonably accessible and clear privacy notice that includes: |
Nevada | – data processors need to provide an accessible and simple to read privacy notice on their website, which must contain at least the following information: |
New Hampshire | – a controller shall provide each consumer with a reasonably accessible, clear, and meaningful privacy notice that includes: |
New Jersey | – an operator that collects the personally identifiable information of a consumer through a commercial Internet website or an online service shall provide on its commercial Internet website or online service, notification to a consumer that shall include, but not be limited to: |
Oregon | – a controller must provide an accessible, clear, and meaningful privacy notice on their website, which must contain at least the following information: |
Rhode Island | – controllers must provide consumers with a reasonably accessible, clear, and meaningful privacy notice that includes: |
Tennessee | – upon receipt of an authenticated consumer request, a controller must provide the consumer with a reasonably accessible, clear, and meaningful privacy notice that includes: |
Texas | – a controller must provide consumers with a reasonably accessible and clear privacy notice that includes: |
Virginia | – controllers shall provide consumers with a reasonably accessible, clear, and meaningful privacy notice that includes: |
Utah | – a controller must provide an accessible and clear privacy notice, which must contain at least the following information: |
How is personal data defined under US state privacy laws?
To qualify as personal data or personal information, information generally must be able to identify a person, either on its own or in combination with other data points (e.g. name, address, credit card number, IP address). Note that there are differences between what is categorized as personal data and what is categorized as personally identifiable information.
How is sensitive personal information defined and handled under US data privacy laws?
Many US data privacy laws also give explicit consideration to “sensitive personal data”, which can include information belonging to children, or about racial or ethnic origin, medical or genetic data, sexual orientation, and more. Generally, this category covers information that is particularly likely to enable discrimination or cause harm if misused.
Typically, sensitive personal information (and children’s information) requires consent before it can be collected or processed, as well as additional security measures. Check specific US data privacy laws for their definitions of and requirements for sensitive personal data. Data that is publicly available, like government records, is typically not considered personal data.
State | Definition of Personal Data/Information |
---|---|
California (CCPA/CPRA) | “…information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household.” (Examples in Section 1798.140 CCPA) |
Colorado | “…information that is linked or reasonably linkable to an identified or identifiable individual… does not include de-identified data or publicly available information.” |
Connecticut | “…any information that is linked or reasonably linkable to an identified or identifiable individual… does not include de-identified data or publicly available information.” |
Delaware | “…any information that is linked or reasonably linkable to an identified or identifiable individual… does not include de-identified data or publicly available information.” |
Florida | Personal data: “…information that is linked or reasonably linkable to an identified or identifiable child, including biometric information and unique identifiers to the child.”Personal information: “…any information, including sensitive data, which is linked or reasonably linkable to an identified or identifiable individual. The term includes pseudonymous data when the data is used by a controller or processor in conjunction with additional information that reasonably links the data to an identified or identifiable individual. The term does not include deidentified data or publicly available information.” |
Indiana | “…information that is linked or reasonably linkable to an identified or identifiable individual… does not include: (1) de-identified data, (2) aggregate data, (3) publicly available information” |
Iowa | “…any information that is linked or reasonably linkable to an identified or identifiable natural person… does not include de-identified or aggregate data or publicly available information.” |
Kentucky | “…any information that is linked or reasonably linkable to an identified or identifiable natural person… does not include de-identified data or publicly available information,” |
Maryland | “…any information that is linked or can be reasonably linked to an identified or identifiable consumer… does not include de-identified data or publicly available information.” |
Minnesota | “… any information that is linked or reasonably linkable to an identified or identifiable natural person… does not include deidentified data or publicly available information.” |
Montana | “…any information that is linked or reasonably linkable to an identified or identifiable individual… does not include deidentified data or publicly available information.” |
Nebraska | “any information, including sensitive data, that is linked or reasonably linkable to an identified or identifiable individual, and includes pseudonymous data when the data is used by a controller or processor in conjunction with additional information that reasonably links the data to an identified or identifiable individual… does not include deidentified data or publicly available information” |
Nevada | Covered information: “…any one or more of the following items of personally identifiable information about a consumer collected by an operator through an Internet website or online service and maintained by the operator or a data broker in an accessible form: 1. A first and last name. 2. A home or other physical address which includes the name of a street and the name of a city or town. 3. An electronic mail address. 4. A telephone number. 5. A social security number. 6. An identifier that allows a specific person to be contacted either physically or online. 7. Any other information concerning a person collected from the person through the Internet website or online service of the operator and maintained by the operator or data broker in combination with an identifier in a form that makes the information personally identifiable.” |
New Hampshire | “…any information that is linked or reasonably linkable to an identified or identifiable individual… does not include deidentified data or publicly available information.” |
New Jersey | “…any information that is linked or reasonably linkable to an identified or identifiable individual… does not include deidentified data or publicly available information.” |
Oregon | “…data, derived data or any unique identifier that is linked to or is reasonably linkable to a consumer or to a device that identifies, is linked to or is reasonably linkable to one or more consumers in a household… does not include deidentified data or data that is lawfully available through federal, state or local government records or through widely distributed media; or a controller reasonably has understood to have been lawfully made available to the public by a consumer.” |
Rhode Island | “… any information that is linked or reasonably linkable to an identified or identifiable individual and does not include de-identified data or publicly available information.” |
Tennessee | “…information that identifies, relates to, or describes a particular consumer or is reasonably capable of being directly or indirectly associated or linked with a particular consumer… does not include information that is: publicly available information; or de-identified or aggregate consumer information” (Examples in Section 2, 47-18-3201, 16B) |
Texas | “…any information, including sensitive data, that is linked or reasonably linkable to an identified or identifiable individual. The term includes pseudonymous data when the data is used by a controller or processor in conjunction with additional information that reasonably links the data to an identified or identifiable individual. The term does not include deidentified data or publicly available information.” |
Virginia | “…any information that is linked or reasonably linkable to an identified or identifiable natural person… does not include de-identified data or publicly available information.” |
Utah | “…information that is linked or reasonably linkable to an identified individual or an identifiable individual… does not include deidentified data, aggregated data, or publicly available information.” |
What are consumers’ rights under the states’ data privacy laws?
Some rights are consistent across all of the state-level US data privacy laws to date, though some laws get more granular than others. California is currently the only state that enables consumers to sue for a data breach in specific circumstances (private right of action). Not all data privacy laws enable portability of one’s data, either.
How do companies have to handle consumer requests under the US state privacy laws?
Businesses commonly have 45 days from receiving a consumer’s request to exercise their rights in which to fulfill it, with an option to extend that period under certain circumstances. Review specific US data privacy laws to confirm the exact time frames for responding to requests, any extensions, and/or the ability to refuse requests, and become familiar with each law’s specific consumer rights so that consumers can exercise them or appeal a decision.
What are the requirements for consent management to comply with US data privacy laws?
The US data privacy laws to date all use an opt-out model of consent that does not require businesses to obtain consent before collecting personal data in most cases, with the typical exceptions being sensitive data and data belonging to known children. However, the laws do consistently require that consumers be notified about data collection and use and be provided with an option to opt out (of collection, sale, or sharing of their personal data, or of targeted advertising or profiling, depending on the law), as well as instructions and at least one mechanism to contact the company with requests or complaints.
That said, a number of the states’ regulations don’t specify how consent or opting out must be handled or what form it needs to take. A high-performance consent management platform (CMP), like Usercentrics CMP, can help companies flexibly and scalably provide the required notifications and consent options in the states where they need to comply with privacy regulations. One recurring requirement, honoring a universal opt-out signal such as Global Privacy Control, is illustrated in the sketch below.
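Several of the laws in the table that follows require honoring a universal opt-out mechanism such as Global Privacy Control (GPC). Below is a minimal sketch, not a full CMP implementation, of how a site script might check the browser’s `navigator.globalPrivacyControl` flag defined by the GPC proposal; the `applyOptOut` function is a hypothetical placeholder for a site’s own opt-out handling.

```typescript
// Minimal sketch of honoring a Global Privacy Control (GPC) signal in the browser.
// The GPC proposal exposes the signal as navigator.globalPrivacyControl (a boolean)
// and, on HTTP requests, as the "Sec-GPC: 1" header. applyOptOut() is a
// hypothetical stand-in for a site's own opt-out handling.

type GpcNavigator = Navigator & { globalPrivacyControl?: boolean };

function applyOptOut(): void {
  // Hypothetical: stop selling/sharing this visitor's personal data, disable
  // targeted advertising, and record the preference for auditability.
  console.log("GPC detected: treating visitor as opted out of sale/sharing.");
}

function honorGpcSignal(): void {
  if (typeof navigator === "undefined") return; // not running in a browser
  const gpc = (navigator as GpcNavigator).globalPrivacyControl;
  if (gpc === true) {
    applyOptOut();
  }
}

honorGpcSignal();
```

Server-side code would check the `Sec-GPC: 1` request header for the same preference.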
State | Consent Management Requirements |
---|---|
California (CCPA/CPRA) | – clearly and conspicuously display a link reading “Do Not Sell or Share My Personal Information” to enable consumers to submit an opt-out request – must honor the Global Privacy Control signal |
Colorado | – clearly and conspicuously display a link on the website that enables the consumer to submit an opt-out request – by January 1st, 2025, websites must be able to honor preference signals that communicate the consumer’s opt-out choice (Global Privacy Control) |
Connecticut | – no specific requirements regarding how an opt-out option needs to be presented – must honor a Universal Opt-Out Mechanism |
Delaware | – clearly and conspicuously display a link on the website that enables the consumer to submit an opt-out request – must honor a Universal Opt-Out Mechanism (as of January 2025) |
Florida | – no specific requirements regarding how an opt-out option needs to be presented, except that “methods must be secure, reliable, and clearly and conspicuously accessible” – if a controller engages in the sale of sensitive personal data, the controller must provide the following notice: “NOTICE: This website may sell your sensitive personal data.” – if a controller engages in the sale of personal data that is biometric data, the controller must provide the following notice: “NOTICE: This website may sell your biometric personal data.” |
Indiana | – no specific requirements regarding how an opt-out option needs to be presented |
Iowa | – no specific requirements regarding how an opt-out option needs to be presented |
Kentucky | – no specific requirements regarding how an opt-out option needs to be presented |
Maryland | – clearly and conspicuously display a link on the website that enables the consumer to submit an opt-out request – must honor a Universal Opt-Out Mechanism |
Minnesota | – a clear and conspicuous method outside the privacy notice for a consumer to opt out: “This method may include but is not limited to an Internet hyperlink clearly labeled ‘Your Opt-Out Rights’ or ‘Your Privacy Rights’ that directly effectuates the opt-out request or takes consumers to a web page where the consumer can make the opt-out request” – must honor a Universal Opt-Out Mechanism |
Montana | – clearly and conspicuously display a link on the website that enables the consumer to submit an opt-out request – must honor a Universal Opt-Out Mechanism |
Nebraska | – no specific requirements regarding how an opt-out option needs to be presented – must honor a Universal Opt-Out Mechanism |
Nevada | – no specific requirements regarding how an opt-out option needs to be presented – privacy policy is required |
New Hampshire | – clearly and conspicuously display a link on the website that enables the consumer to submit an opt-out request – must honor a Universal Opt-Out Mechanism |
New Jersey | – clearly and conspicuously display a link on the website that enables the consumer to submit an opt-out request – must honor a Universal Opt-Out Mechanism (with specific reference to user profiling) |
Oregon | – clearly and conspicuously display a link on the website that enables the consumer to submit an opt-out request – must honor a Universal Opt-Out Mechanism (as of January 2026) |
Rhode Island | – no specific requirements regarding how an opt-out option needs to be presented |
Tennessee | – clearly and conspicuously display a link on the website that enables the consumer to submit an opt-out request |
Texas | – clearly and conspicuously display a link on the website that enables the consumer to submit an opt-out request – must honor a Universal Opt-Out Mechanism |
Virginia | – no specific requirements regarding how an opt-out option needs to be presented |
Utah | – no specific requirements regarding how an opt-out option needs to be presented, beyond that the controller must clearly and conspicuously provide an option on the website that enables the consumer to submit an opt-out request |
The United States does not have a comprehensive federal data privacy law that governs how businesses access or use individuals’ personal information. Instead, privacy protections and regulation are currently left to individual states. California led the way in 2020 with the California Consumer Privacy Act (CCPA), later strengthened by the California Privacy Rights Act (CPRA). As of January 2025, 20 states have passed similar laws. The variations in consumers’ rights, companies’ responsibilities, and other factors make compliance challenging for businesses operating in multiple states.
The American Data Privacy and Protection Act (ADPPA) sought to simplify privacy compliance by establishing a comprehensive federal privacy standard. The ADPPA emerged in June 2022 when Representative Frank Pallone introduced HR 8152 to the House of Representatives. The bill gained strong bipartisan support in the House Energy and Commerce Committee, passing with a 53-2 vote in July 2022. It also received amendments in December 2022. However, the bill did not progress any further.
As proposed, the ADPPA would have preempted most state-level privacy laws, replacing the current multi-state compliance burden with a single federal standard.
In this article, we’ll examine who the ADPPA would have applied to, its obligations for businesses, and the rights it would have granted US residents.
What is the American Data Privacy and Protection Act (ADPPA)?
The American Data Privacy and Protection Act (ADPPA) was a proposed federal bill that would have set consistent rules for how organizations handle personal data across the United States. It aimed to protect individuals’ privacy with comprehensive safeguards while requiring organizations to meet strict standards for handling personal data.
Under the ADPPA, an individual is defined as “a natural person residing in the United States.” Organizations that collect, use, or share individuals’ personal data would have been responsible for protecting it, including measures to prevent unauthorized access or misuse. By balancing individual rights and business responsibilities, the ADPPA sought to create a clear and enforceable framework for privacy nationwide.
What data would have been protected under the American Data Privacy and Protection Act (ADPPA)?
The ADPPA aimed to protect the personal information of US residents, which it refers to as covered data. Covered data is broadly defined as “information that identifies or is linked, or reasonably linkable, alone or in combination with other information, to an individual or a device that identifies or is linked or reasonably linkable to an individual.” In other words, any data that would either identify or could be traced to a person or to a device that is linked to an individual. This includes data that may be derived from other information and unique persistent identifiers, such as those used to track devices or users across platforms.
The definition excludes:
- Deidentified data
- Employee data
- Publicly available information
- Inferences made exclusively from multiple separate sources of publicly available information, so long as they don’t reveal private or sensitive details about a specific person
Sensitive covered data under the ADPPA
The ADPPA, like other data protection regulations, would have required stronger safeguards for sensitive covered data that could harm individuals if it was misused or unlawfully accessed. The bill’s definition of sensitive covered data is extensive, going beyond many US state-level data privacy laws.
Protected categories of data include, among other things:
- Personal identifiers, including government-issued IDs like Social Security numbers and driver’s licenses, except when legally required for public display.
- Health information, including details about past, present, or future physical and mental health conditions, treatments, disabilities, and diagnoses.
- Financial data, such as account numbers, debit and credit card numbers, income, and balance information. The last four digits of payment cards are excluded.
- Private communications, such as emails, texts, calls, direct messages, voicemails, and their metadata. This does not apply if the device is employer-provided and individuals are given clear notice of monitoring.
- Behavioral data, including sexual behavior information when collected against reasonable expectations, video content selections, and online activity tracking across websites.
- Personal records, such as private calendars, address books, photos, and recordings, except on employer-provided devices with notice.
- Demographic details, including race, color, ethnicity, religion, and union membership.
- Biological identifiers, including biometric and genetic information, as well as precise location data and information about minors.
- Security credentials, such as login details and security or access codes for an account or device.
Who would the American Data Privacy and Protection Act (ADPPA) have applied to?
The ADPPA would have applied to a broad range of entities that handle covered data.
Covered entity under the ADPPA
A covered entity is “any entity or any person, other than an individual acting in a non-commercial context, that alone or jointly with others determines the purposes and means of collecting, processing, or transferring covered data.” This definition matches similar terms like “controller” in US state privacy laws and the European Union’s General Data Protection Regulation (GDPR). To qualify as a covered entity under the ADPPA, the organization would have had to be in one of three categories:
- Businesses regulated by the Federal Trade Commission Act (FTC Act)
- Telecommunications carriers
- Nonprofits
Although the bill did not explicitly address international jurisdiction, its reach could have extended beyond US borders. Foreign companies would have needed to comply if they handled US residents’ data for commercial purposes and met the FTC Act’s jurisdictional requirements, such as conducting business activities in the US or causing foreseeable injury within the US. This type of extraterritorial scope is common among international data privacy laws.
Service provider under the ADPPA
A service provider was defined as a person or entity that engages in either of the following:
- Collects, processes, or transfers covered data on behalf of a covered entity or government body
OR
- Receives covered data from or on behalf of a covered entity or government body
This role mirrors what other data protection laws call a processor, including most state privacy laws and the GDPR.
Large data holders under the ADPPA
Large data holders were not considered a third type of organization. Both covered entities and service providers could have qualified as large data holders if, in the most recent calendar year, they had gross annual revenues of USD 250 million or more, and collected, processed, or transferred:
- Covered data of more than 5,000,000 individuals or devices, excluding data used solely for payment processing
- Sensitive covered data from more than 200,000 individuals or devices
Large data holders would have faced additional requirements under the ADPPA.
Third-party collecting entity under the ADPPA
The ADPPA introduced the concept of a third-party collecting entity, which refers to a covered entity that primarily earns its revenue by processing or transferring personal data it did not collect directly from the individuals to whom the data relates. In other contexts, they are often referred to as data brokers.
However, the definition excluded certain activities and entities:
- A business would not be considered a third-party collecting entity if it processed employee data received from another company, but only for the purpose of providing benefits to those employees
- A service provider would also not be classified as a third-party collecting entity under this definition
An entity is considered to derive its principal source of revenue from data processing or transfer if, in the previous 12 months, either:
- More than 50 percent of its total revenue came from these activities
or
- The entity processed or transferred the data of more than 5 million individuals that it did not collect directly
Third-party collecting entities that process data from more than 5,000 individuals or devices in a calendar year would have had to register with the Federal Trade Commission by January 31 of the following year. Registration would require a fee of USD 100 and basic information about the organization, including its name, contact details, the types of data it handles, and a link to a website where individuals can exercise their privacy rights.
Exemptions under the ADPPA
While the ADPPA potentially would have had a wide reach, certain exemptions would have applied.
- Small businesses: Organizations with less than USD 41 million in annual revenue or those that process data for fewer than 50,000 individuals would be exempt from some provisions.
- Government entities: The ADPPA would not have applied to government bodies or their service providers handling covered data. It also excluded congressionally designated nonprofits that support victims and families with issues involving missing and exploited children.
- Organizations subject to other federal laws: Organizations already complying with certain existing privacy laws, including the Health Insurance Portability and Accountability Act (HIPAA), the Gramm-Leach-Bliley Act (GLBA), and the Family Educational Rights and Privacy Act (FERPA), among others, were deemed compliant with similar ADPPA requirements for the specific data covered by those laws. However, they would have still been required to comply with Section 208 of the ADPPA, which contains provisions for data security and protection of covered data.
Definitions in the American Data Privacy and Protection Act (ADPPA)
Like other data protection laws, the ADPPA defined several terms that are important for businesses to know. While many — like “collect” or “process” — can be found in other regulations, there are also some that are unique to the ADPPA. We look at some of these key terms below.
Knowledge under the ADPPA
“Knowledge” refers to whether a business is aware that an individual is a minor. The level of awareness required depends on the type and size of the business.
- High-impact social media companies: These are large platforms that are primarily known for user-generated content, with at least USD 3 billion in annual revenue and at least 300 million monthly active users for at least three months in the preceding year. They would be considered to have knowledge if they were aware or should have been aware that a user was a minor. This is the strictest standard.
- Large data holders: These are organizations that have significant data operations but do not qualify as high-impact social media. They have knowledge if they knew or willfully ignored evidence that a user was a minor.
- Other covered entities or service providers: Those that do not fall into the above categories are required to have actual knowledge that the user is a minor.
Some states — like Minnesota and Nebraska — define “known child” but do not adjust the criteria for what counts as knowledge based on the size or revenue of the business handling the data. Instead, they apply the same standard to all companies, regardless of their scale.
Affirmative express consent under the ADPPA
The ADPPA uses the term “affirmative express consent,” which refers to “an affirmative act by an individual that clearly communicates the individual’s freely given, specific, and unambiguous authorization” for a business to perform an action, such as collecting or using their personal data. Consent for data collection would have to be obtained after the covered entity provides clear information about how it will use the data.
Like the GDPR and other data privacy regulations, consent would have needed to be freely given, informed, specific, and unambiguous.
Under this definition, consent cannot be inferred from an individual’s inaction or continued use of a product or service. Additionally, covered entities cannot trick people into giving consent through misleading statements or manipulative design. This includes deceptive interfaces meant to confuse users or limit their choices.
Transfer under the ADPPA
Most data protection regulations include a definition for the sale of personal data or personal information. While the ADPPA did not define sale, it instead defined “transfer” as “to disclose, release, disseminate, make available, license, rent, or share covered data orally, in writing, electronically, or by any other means.”
What are consumers’ rights under the American Data Privacy and Protection Act (ADPPA)?
Under the ADPPA, consumers would have had the following rights regarding their personal data.
- Right of awareness: The Commission must publish and maintain a webpage describing the provisions, rights, obligations, and requirements of the ADPPA for individuals, covered entities, and service providers. This information must be:
- Published within 90 days of the law’s enactment
- Updated quarterly as needed
- Available in the ten most commonly used languages in the US
- Right to transparency: Covered entities must provide clear information about how consumer data is collected, used, and shared. This includes which third parties would receive their data and for what purposes.
- Right of access: Consumers can access their covered data (including data collected, processed, or transferred within the past 24 months), categories of third parties and service providers who received the data, and the purpose(s) for transferring the data.
- Right to correction: Consumers can correct any substantial inaccuracies or incomplete information in their covered data and instruct the covered entity to notify all third parties or service providers that have received the data.
- Right to deletion: Consumers can request that their covered data processed by the covered entity be deleted. They can also instruct the covered entity to notify all third parties or service providers that have received the data of the deletion request.
- Right to data portability: Consumers can request their personal data in a structured, machine-readable format that enables them to transfer it to another service or organization (a minimal sketch follows this list).
- Right to opt out: Consumers can opt out of the transfer of their personal data to third parties and its use for targeted advertising. Businesses are required to provide a clear and accessible mechanism for exercise of this right.
- Private right of action: Consumers can sue companies directly for certain violations of the act, with some limitations and procedural requirements. (California is the only state to provide this right as of early 2025.)
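To make the data portability right concrete, the following is a minimal, hypothetical sketch of an export payload in a structured, machine-readable format. The interface, field names, and the choice of JSON are illustrative assumptions; the ADPPA did not prescribe a format.

```typescript
// Hypothetical sketch of a data portability export: returning a consumer's
// covered data in a structured, machine-readable format (JSON here).
// Types and fields are illustrative, not drawn from the bill.

interface PortableDataExport {
  exportedAt: string;              // ISO 8601 timestamp of the export
  coveredData: Record<string, unknown>;
  thirdPartyCategories: string[];  // categories of recipients of the data
  transferPurposes: string[];      // purposes for transferring the data
}

function buildPortabilityExport(
  coveredData: Record<string, unknown>,
  thirdPartyCategories: string[],
  transferPurposes: string[]
): string {
  const payload: PortableDataExport = {
    exportedAt: new Date().toISOString(),
    coveredData,
    thirdPartyCategories,
    transferPurposes,
  };
  // JSON is one common structured, machine-readable choice; CSV or XML could
  // also qualify depending on the data.
  return JSON.stringify(payload, null, 2);
}

console.log(
  buildPortabilityExport(
    { emailAddress: "jane.doe@example.com", adPreferences: ["sports"] },
    ["analytics providers"],
    ["measurement"]
  )
);
```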
What are privacy requirements under the American Data Privacy and Protection Act (ADPPA)?
The ADPPA would have required organizations to meet certain obligations when handling individuals’ covered data. Here are the key privacy requirements under the bill.
Consent
Organizations must obtain clear, explicit consent through easily understood standalone disclosures. Consent requests must be accessible, available in all service languages, and give equal prominence to accept and decline options. Organizations must provide mechanisms to withdraw consent that are as simple as giving it.
Organizations must avoid using misleading statements or manipulative designs, and must obtain new consent for different data uses or significant privacy policy changes. While the ADPPA works alongside the Children’s Online Privacy Protection Act (COPPA)’s parental consent requirements for children under 13, it adds its own protections for minors up to age 17.
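As an illustration of these consent rules, here is a hedged sketch of how an application might record affirmative express consent so that it is purpose-specific, never inferred from inaction, and as easy to withdraw as to give. All type and method names are hypothetical; the ADPPA did not prescribe any data model.

```typescript
// Illustrative sketch only: one way to model affirmative express consent
// records. Consent is explicit, tied to a purpose and to the disclosure
// shown, and withdrawal takes a single call.

type ConsentPurpose = "targeted_advertising" | "sensitive_data_processing" | "data_transfer";

interface ConsentRecord {
  userId: string;
  purpose: ConsentPurpose;     // consent must be specific to a purpose
  granted: boolean;            // true only after a clear affirmative act
  grantedAt: Date | null;
  withdrawnAt: Date | null;    // withdrawal must be as simple as granting
  disclosureVersion: string;   // the standalone disclosure shown at the time
}

class ConsentStore {
  private records = new Map<string, ConsentRecord>();

  private key(userId: string, purpose: ConsentPurpose): string {
    return `${userId}:${purpose}`;
  }

  // Record consent only as an explicit act; inaction never creates a record.
  grant(userId: string, purpose: ConsentPurpose, disclosureVersion: string): void {
    this.records.set(this.key(userId, purpose), {
      userId,
      purpose,
      granted: true,
      grantedAt: new Date(),
      withdrawnAt: null,
      disclosureVersion,
    });
  }

  // A single call withdraws consent: no extra friction versus granting it.
  withdraw(userId: string, purpose: ConsentPurpose): void {
    const record = this.records.get(this.key(userId, purpose));
    if (record) {
      record.granted = false;
      record.withdrawnAt = new Date();
    }
  }

  hasConsent(userId: string, purpose: ConsentPurpose): boolean {
    return this.records.get(this.key(userId, purpose))?.granted === true;
  }
}
```

Storing the version of the disclosure shown alongside each record also supports the requirement to obtain new consent when data uses or privacy policies materially change.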
Privacy policy
Organizations must maintain clear, accessible privacy policies that detail their data collection practices, transfer arrangements, retention periods, and rights granted to individuals. These policies must specify whether data goes to countries like China, Russia, Iran, or North Korea, which could present a security risk, and they must be available in all languages where services are offered. When making material changes, organizations must notify affected individuals in advance and give them a chance to opt out.
Data minimization
Organizations can only collect and process data that is reasonably necessary to provide requested services or for specific allowed purposes. These allowed purposes include activities like completing transactions, maintaining services, protecting against security threats, meeting legal obligations, and preventing harm, including where there is a risk of death, among others. Collected data must also be proportionate to these activities.
Privacy by design
Privacy by design is a default requirement under the ADPPA. Organizations must implement reasonable privacy practices that consider the organization’s size, data sensitivity, available technology, and implementation costs. They must align with federal laws and regulations and regularly assess risks in their products and services, paying special attention to protecting minors’ privacy and implementing appropriate safeguards.
Data security
Organizations must establish, implement, and maintain appropriate security measures, including vulnerability assessments, preventive actions, employee training, and incident response plans. They must implement clear data disposal procedures and match their security measures to their data handling practices.
Privacy and data security officers
Organizations with more than 15 employees must appoint both a privacy officer and data security officer, who must be two distinct individuals. These officers are responsible for implementing privacy programs and maintaining ongoing ADPPA compliance.
Privacy impact assessments
Organizations other than small businesses and large data holders (the latter have separate requirements, covered below) must conduct regular privacy assessments that evaluate the benefits and risks of their data practices. These assessments must be documented and maintained, and must consider factors like data sensitivity and potential privacy impacts.
Loyalty with respect to pricing
Organizations cannot discriminate against individuals who exercise their privacy rights. While they can adjust prices based on necessary financial information and offer voluntary loyalty programs, they cannot retaliate through changes in pricing or service quality, e.g. if an individual exercises their rights and requests their data or does not consent to certain data processing.
Special requirements for large data holders
In addition to their general obligations, large data holders would have had unique responsibilities under the proposed law.
Privacy policy
Large data holders would have been required to maintain and publish 10-year archives of their privacy policies on their websites. They would need to keep a public log documenting significant privacy policy changes and their impact. Additionally, they would need to provide a short-form notice (under 500 words) highlighting unexpected practices and sensitive data handling.
Privacy and data security officers
At least one of the appointed officers would have been designated as a privacy protection officer who reports directly to the highest official at the organization. This officer, either directly or through supervised designees, would have been required to do the following:
- Establish processes to review and update privacy and security policies, practices, and procedures
- Conduct biennial comprehensive audits to ensure compliance with the proposed law and make them accessible to the Commission upon request
- Develop employee training programs about ADPPA compliance
- Maintain detailed records of all material privacy and security practices
- Serve as the point of contact for enforcement authorities
Privacy impact assessments
While all organizations other than small businesses would be required to conduct privacy impact assessments under the proposed law, large data holders would have had additional requirements.
- Timing: While other organizations must conduct assessments within one year of the ADPPA’s enactment, large data holders would have been required to do so within one year of either becoming a large data holder or the law’s enactment, whichever came first.
- Scope: Both must consider nature and volume of data and privacy risks, but large data holders would need to specifically assess “potential adverse consequences” in addition to “substantial privacy risks.”
- Approval: Large data holders’ assessments would need to be approved by their privacy protection officer, while other entities would have no specific approval requirement.
- Technology review: Large data holders would need to include reviews of security technologies (like blockchain and distributed ledger technologies); this review would be optional for other entities.
- Documentation: While both would need to maintain written assessments until the next assessment, large data holders’ assessments would also need to be accessible to their privacy protection officer.
Metrics reporting
Large data holders would be required to compile and disclose annual metrics related to verified access, deletion, and opt-out requests. These metrics would need to be included in their privacy policy or published on their website.
Executive certification
An executive officer would have been required to annually certify to the FTC that the large data holder has internal controls and a reporting structure in place to achieve compliance with the proposed law.
Algorithm impact assessments
Large data holders using covered algorithms that could pose a consequential risk of harm would be required to conduct an annual impact assessment of these algorithms. This requirement would be in addition to privacy impact assessments and would need to begin no later than two years after the Act’s enactment.
American Data Privacy and Protection Act (ADPPA) enforcement and penalties for noncompliance
The ADPPA would have established a multi-layered enforcement approach that set it apart from other US privacy laws.
- Federal Trade Commission: The FTC would serve as the primary enforcer, treating violations as unfair or deceptive practices under the Federal Trade Commission Act. The proposed law required the FTC to create a dedicated Bureau of Privacy for enforcement.
- State Attorneys General: State Attorneys General and State Privacy Authorities could bring civil actions on behalf of their residents if they believed violations had affected their state’s interest.
- California Privacy Protection Agency (CPPA): The CPPA, established under the California Privacy Rights Act, would have had special enforcement authority. The CPPA could enforce the ADPPA in California in the same manner as it enforces California’s privacy laws.
Starting two years after the law would have taken effect, individuals would gain a private right of action, or the right to sue for violations. However, before filing a lawsuit, they would need to notify both the Commission and their state Attorney General.
The ADPPA itself did not establish specific penalties for violations. Instead, violations of the ADPPA or its regulations would be treated as violations of the Federal Trade Commission Act, subject to the same penalties, privileges, and immunities provided under that law.
The American Data Privacy and Protection Act (ADPPA) compared to other data privacy regulations
As privacy regulations continue to evolve worldwide, it’s helpful to understand how the ADPPA would compare with other comprehensive data privacy laws.
The EU’s GDPR has set the global standard for data protection since 2018. In the US, the CCPA (as amended by the CPRA) established the first comprehensive state-level privacy law and has influenced subsequent state legislation. Below, we’ll look at how the ADPPA compares with these regulations.
The ADPPA vs the GDPR
There are many similarities between the proposed US federal privacy law and the EU’s data protection regulation. Both require organizations to implement privacy and security measures, provide individuals with rights over their personal data (including access, deletion, and correction), and mandate clear privacy policies that detail their data processing activities. Both also emphasize data minimization principles and purpose limitation.
However, there are also several important differences between the two.
Aspect | ADPPA | GDPR |
---|---|---|
Territorial scope | Would have applied to individuals residing in the US. | Applies to EU residents and any organization processing their data, regardless of location. |
Consent | Not a standalone legal basis; required only for specific activities like targeted advertising and processing sensitive data. | One of six legal bases for processing; can be a primary justification. |
Government entities | Excluded federal, state, tribal, territorial and local government entities. | Applies to public bodies and authorities. |
Privacy officers | Required “privacy and security officers” for covered entities with more than 15 employees, with stricter rules for large data holders. | Requires a Data Protection Officer (DPO) for public authorities or entities engaged in large-scale data processing. |
Data transfers | No adequacy requirements; focus on transfers to specific countries (China, Russia, Iran, North Korea). | Detailed adequacy requirements and transfer mechanisms. |
Children’s data | Extended protections to minors up to age 17. | Focuses on children under 16 (can be lowered to 13 by member states). |
Penalties | Violations would have been treated as violations of the Federal Trade Commission Act. | Imposes fines up to 4% of annual global turnover or €20 million, whichever is higher. |
The ADPPA vs the CCPA/CPRA
There are many similarities between the proposed US federal privacy law and California’s existing privacy framework. Both include comprehensive transparency requirements, including privacy notices in multiple languages and accessibility for people with disabilities. They also share similar approaches to prohibiting manipulative design practices and requirements for regular security and privacy assessments.
However, there are also differences between the ADPPA and CCPA/CPRA.
Aspect | ADPPA | CCPA/CPRA |
---|---|---|
Covered entities | Would have applied to organizations under the jurisdiction of the Federal Trade Commission, including nonprofits and common carriers; excluded government agencies. | Applies only to for-profit businesses meeting any of these specific thresholds: – gross annual revenue of over USD 26,625,000 – receives, buys, sells, or shares personal information of 100,000 or more consumers or households – earns more than half of its annual revenue from the sale of consumers’ personal information |
Private right of action | Broader right to sue for various violations. | Limited to data breaches only. |
Data minimization | Required data collection and processing to be limited to what is reasonably necessary and proportionate. | Similar requirement, but the CPRA allows broader processing for “compatible” purposes. |
Algorithmic impact assessments | Required large data holders to conduct annual assessments focusing on algorithmic risks, bias, and discrimination. | Requires risk assessments weighing benefits and risks of data practices, with no explicit focus on bias. |
Executive accountability | Required executive certification of compliance. | No executive certification requirement. |
Enforcement | Would have been enforced by the Federal Trade Commission, State Attorneys General, and the California Privacy Protection Agency (CPPA). | Enforced by the CPPA and authorities within California. |
Consent management and the American Data Privacy and Protection Act (ADPPA)
The ADPPA would have required organizations to obtain affirmative express consent for certain data processing activities through clear, conspicuous standalone disclosures. These consent requests would need to be easily understood, equally prominent for either accepting or declining, and available in all languages where services are offered. Organizations would also need to provide simple mechanisms for withdrawing consent that would be as easy to use as giving consent was initially. The bill also required organizations to honor opt-out requests for practices like targeted advertising and certain data transfers. These opt-out mechanisms would need to be accessible and easy to use, with clear instructions for exercising these rights.
Organizations would need to clearly disclose not only the types of data they collect but also the parties with whom this information is shared. Consumers would also need to be informed about their data rights and how to act on them, such as opting out of processing, through straightforward explanations and guidance.
To support transparency, organizations would also be required to maintain privacy pages that are regularly updated to reflect their data collection, use, and sharing practices. These pages would help provide consumers with access to the latest information about how their data is handled. Additionally, organizations would have been able to use banners or buttons on websites and apps to inform consumers about data collection and provide them with an option to opt out.
Though the ADPPA was not enacted, the US does have an increasing number of state-level data privacy laws. A consent management platform (CMP) like the Usercentrics CMP for website consent management or app consent management can help organizations streamline compliance with the many existing privacy laws in the US and beyond. The CMP securely maintains records of consent, automates opt-out processes, and enables consistent application of privacy preferences across an organization’s digital properties. It also helps to automate the detection and blocking of cookies and other tracking technologies that are in use on websites and apps.
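As a simplified illustration of that blocking technique (a generic sketch under assumed attribute names, not Usercentrics’ actual implementation), tracking scripts can be shipped in a non-executable form and activated only for categories the visitor has consented to:

```typescript
// Simplified sketch of consent-based script blocking, a common CMP technique.
// Attribute and function names here are hypothetical. Scripts ship with
// type="text/plain" so the browser does not execute them, then are swapped
// for real scripts once consent is given, e.g.:
//
// <script type="text/plain" data-consent-category="marketing"
//         src="https://example.com/tracker.js"></script>

function activateConsentedScripts(consentedCategories: Set<string>): void {
  const blocked = document.querySelectorAll<HTMLScriptElement>(
    'script[type="text/plain"][data-consent-category]'
  );
  blocked.forEach((placeholder) => {
    const category = placeholder.dataset.consentCategory ?? "";
    if (!consentedCategories.has(category)) return; // stays blocked

    // Recreate the script element; merely changing `type` would not execute it.
    const script = document.createElement("script");
    if (placeholder.src) script.src = placeholder.src;
    script.text = placeholder.text;
    placeholder.replaceWith(script);
  });
}

// Example: the visitor consented to functional scripts only.
activateConsentedScripts(new Set(["functional"]));
```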
Usercentrics does not provide legal advice, and information is provided for educational purposes only. We always recommend engaging qualified legal counsel or privacy specialists regarding data privacy and protection issues and operations.
Protecting personal data is more critical than ever. As organizations handle vast amounts of information, understanding the distinctions between various data types — such as Personally Identifiable Information (PII), Personal Information (PI), and sensitive data — becomes essential.
These classifications play a significant role in data privacy and security, helping companies determine compliance requirements with global privacy regulations while safeguarding individual privacy.
By differentiating among these types of data, organizations and website owners can implement appropriate security measures and build trust with their customers.
Understanding various data types
Understanding the nuances among different data types is essential for effective data privacy and security management. Distinguishing between Personally Identifiable Information (PII) vs Personal Information (PI) vs sensitive data enables companies to safeguard individuals’ privacy and comply with relevant regulations.
Before we delve into the specifics of each data type, here’s a brief overview of PII vs PI vs sensitive data:
- PII: This includes any information that can identify an individual, like names, Social Security numbers, or email addresses.
- PI: This broader category covers any information related to a person, even if it doesn’t identify them on its own, such as a common name or web browsing activity.
- Sensitive data: This subset of PI requires extra protection due to its potential for harm if exposed, like medical records, sexual orientation, or financial information.
Recognizing these data types is essential for regulatory compliance, as laws like the General Data Protection Regulation (GDPR) and the California Privacy Rights Act (CPRA) have specific requirements for handling personal data.
Accurate classification supports compliance and enhances risk management by enabling organizations to implement tailored security measures that mitigate the risk of data breaches and data exposures. Moreover, a deep understanding of data types strengthens user trust, as companies that implement smart data collection strategies and prioritize data protection foster stronger, more reliable relationships with their customers.
What you need to know about Personally Identifiable Information (PII)
What is PII?
Personally Identifiable Information (PII) refers to any data that can be used to identify a specific individual. This includes information that can directly identify a person or can be used in combination with other data to identify someone.
This definition is widely used by privacy professionals and aligns with interpretations from organizations like the National Institute of Standards and Technology (NIST) in the United States. We specify this because there is not a single, global definition of Personally Identifiable Information or what types of information it encompasses. As a result, specific definitions of PII can differ across organizations and borders. Different regulations also use different language and have different levels of detail in describing these categories.
What are the different types of PII?
There are two main types of PII:
- Direct identifiers: Information that can immediately identify an individual, such as full name, Social Security number, or passport number.
- Indirect identifiers: Data that, when combined with other information, can lead to the identification of an individual, like date of birth, place of work, or job title.
Additionally, PII can be classified as sensitive or non-sensitive, depending on the potential harm that could result from its disclosure or misuse.
Sensitive PII refers to information that, if disclosed or breached, could result in substantial harm, embarrassment, inconvenience, or unfairness to an individual. This type of PII requires stricter protection measures due to its potential for misuse. Many data privacy laws specifically address sensitive data and apply additional restrictions and protection requirements to it.
Non-sensitive PII, on the other hand, is information that can be transmitted in an unencrypted form without resulting in harm to the individual. While it still requires protection, the security measures may not be as stringent as those for sensitive PII.
Examples of PII
PII encompasses a wide range of data points that can be used to identify an individual, so it’s important to understand specific examples for each category. Doing so enables your company to implement appropriate security measures and factor data protection into its data strategy for marketing and other operations.
Sensitive PII includes information that, if disclosed, could lead to significant harm or privacy violations. Examples of sensitive PII are:
- Social Security number
- driver’s license number
- financial account numbers (e.g., bank account, credit card)
- passport number
- biometric data (fingerprints, retinal scans)
- medical records
- genetic information
On the other hand, non-sensitive PII refers to information that is less likely to cause harm if disclosed but still requires protection. Examples of non-sensitive PII include:
- full name
- email address
- phone number
- physical address
- IP address
- date of birth
- place of birth
- race or ethnicity
- educational records
- employment information
It’s important to note that even non-sensitive PII can pose privacy risks when combined with other data. Therefore, it’s recommended that companies aim to protect all types of PII data that they collect and handle.
PII under GDPR
While the term “Personally Identifiable Information” is not explicitly used in the GDPR, the regulation encompasses this concept within its broader definition of “personal data.”
However, there are some key differences in how PII is treated under the GDPR compared to other data privacy laws:
- Expanded scope: The GDPR takes a more expansive view of what constitutes identifiable information. It includes data that might not traditionally be considered PII in other contexts, such as IP addresses, cookie identifiers, and device IDs.
- Context-dependent approach: Under the GDPR, whether information is classified as personal data (and thus protected) depends on the context and the potential to identify an individual, rather than fitting into specific predefined categories of PII.
- Pseudonymized data: The GDPR introduces pseudonymization, a process that changes personal data so it can’t be linked to a specific individual without additional information (see the sketch after this list). While pseudonymized data is still classified as personal data under GDPR, it is subject to slightly relaxed requirements.
- Data minimization principle: The GDPR emphasizes the importance of data minimization, which aligns with but goes beyond traditional PII protection practices. Organizations are required to collect and process only the personal data that is necessary for the specific purpose they have declared.
- Risk-based approach: The GDPR requires companies to evaluate the risk of processing personal data, including what is traditionally considered PII. This assessment determines the necessary security measures and safeguards.
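To make pseudonymization concrete, here is a minimal sketch of one common technique: replacing a direct identifier with a keyed hash. This is an illustration only; the function and key names are invented, and a real system would hold the secret key separately (for example, in a key management service) so the dataset alone cannot re-identify anyone.

```typescript
import { createHmac } from "crypto";

// Replace a direct identifier with a keyed hash (HMAC-SHA-256).
// Without the secret key, the token cannot be linked back to the person;
// with the key, re-identification is possible, which is why pseudonymized
// data still counts as personal data under the GDPR.
function pseudonymize(identifier: string, secretKey: string): string {
  return createHmac("sha256", secretKey)
    .update(identifier.trim().toLowerCase())
    .digest("hex");
}

// The same input always yields the same token, so pseudonymized records
// can still be joined for analytics without exposing the raw identifier.
const token = pseudonymize("jane.doe@example.com", "key-held-elsewhere");
console.log(token);
```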
The key takeaway brands should understand is that the GDPR offers a detailed framework for protecting personal data, covering more types of identifiable information than traditional PII definitions. Companies need to understand these distinctions to achieve compliance and protect individuals’ privacy.
PII compliance best practices
To effectively protect PII data and enable compliance with relevant regulations, organizations can implement best practices tailored to their specific data handling processes. Doing so not only helps mitigate risks associated with data breaches but also fosters trust among customers and stakeholders.
Here are some key best practices for PII compliance:
- Conduct regular data audits to identify and classify PII (a minimal sketch follows this list).
- Use encryption and access controls to protect sensitive information.
- Develop and enforce clear policies for how PII is collected, processed, and stored.
- Train employees regularly on data protection and privacy best practices.
- Apply data minimization techniques to collect only necessary information.
- Implement secure methods for disposing of PII when it is no longer needed.
- Keep privacy policies updated and obtain user consent for data collection and processing.
- Perform periodic risk assessments and vulnerability scans to identify and address security weaknesses.
- Have an incident response plan ready to manage potential data breaches effectively.
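As a concrete illustration of the first practice, a data audit can begin with simple pattern matching to flag likely PII in free-form fields. The patterns below are simplified assumptions for illustration; production scanners use far more robust detection.

```typescript
// Flag values that look like direct identifiers so they can be
// classified and protected. Simplified illustrative patterns only.
const PII_PATTERNS: Record<string, RegExp> = {
  email: /[\w.+-]+@[\w-]+\.[\w.-]+/,
  usSocialSecurity: /\b\d{3}-\d{2}-\d{4}\b/,
  usPhone: /\b\d{3}[ .-]\d{3}[ .-]\d{4}\b/,
};

// Return the label of every pattern that matches the given value.
function classifyValue(value: string): string[] {
  return Object.entries(PII_PATTERNS)
    .filter(([, pattern]) => pattern.test(value))
    .map(([label]) => label);
}

// Illustrative usage against a free-form field found during an audit.
const note = "Reached Jane at jane.doe@example.com, SSN 123-45-6789";
console.log(classifyValue(note)); // ["email", "usSocialSecurity"]
```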
PII violations and their consequences
Violations of PII protection can have serious consequences for both individuals and organizations. For individuals, a violation can lead to identity theft, financial fraud, and reputational damage, causing emotional and financial stress.
For organizations, the risks are significant. Non-compliance can result in hefty legal penalties, such as fines of up to EUR 20 million or 4 percent of global annual revenue, whichever is higher, under regulations like the GDPR. Companies may also face reputational damage, loss of customer trust, and reduced revenue. They could also experience operational disruptions and increased costs from addressing data breaches, including legal fees, reporting obligations to data protection authorities, and the need to implement stronger security measures.
What you need to know about PI (personal information)
What is personal data?
Personal data is any information that relates to an identified or identifiable individual. It encompasses a broader range of data points than PII, including both direct identifiers (like names and Social Security numbers) and indirect identifiers (like location data and online IDs) that can identify someone when combined with other information.
In short, all PII is personal data, but not all personal data is considered PII.
Personal data is a key concept in data protection laws, including the GDPR and the California Consumer Privacy Act (CCPA).
Personal information examples
Personal information can include a variety of data types, both objective and subjective:
Objective data types are factual, measurable, and verifiable. This includes:
- full name
- date of birth
- Social Security number
- phone number
- email address
- IP address
- financial information (e.g., bank account numbers, credit card details)
- biometric data (e.g., fingerprints, facial recognition data)
Subjective data types are based on personal opinions, interpretations, or evaluations. Examples include:
- Performance reviews
- Customer feedback
- Personal preferences
- Medical symptoms described by a patient
- Personality assessments
Both objective and subjective data can be considered personal information if they can be linked to an identifiable individual.
It’s important to note that the treatment of publicly available information varies by jurisdiction. Under the CCPA, publicly available information is generally excluded from the definition of personal information. Under the GDPR, however, even publicly available information can still be considered personal data.
Personal data under the GDPR
The GDPR defines personal data in Article 4(1) as, “‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.”
This definition encompasses a broad scope and includes both direct identifiers (like names) and indirect identifiers (like location data). Given this definition, here are the key features of personal data as defined under the GDPR:
- Direct and indirect identifiers: Both are considered personal data, emphasizing the need to understand the context of information to identify individuals.
- Data collection context: The specifics of how and why data is collected and processed determine if it qualifies as personal data.
- Pseudonymized data: Even if data is pseudonymized, it is still classified as personal data if it can be re-identified. In contrast, anonymized data, where the possibility of re-identification has been eliminated, falls outside the scope of the GDPR.
- Applicability: The GDPR covers both automated and manual processing of personal data.
- Special categories: The regulation also includes sensitive data such as racial or ethnic origin, political opinions, religious beliefs, and health information.
PI compliance and best practices
To achieve and maintain compliance with data protection regulations and safeguard people’s personal information, companies can adopt the following best practices.
- Conduct regular data audits: Identify and classify all personal information within your company.
- Implement data minimization: Collect and retain only the personal data necessary for specific and legitimate purposes. Regularly delete unnecessary data.
- Manage consent and preferences: Use a consent management platform (CMP) to clearly explain how you’ll use personal information. Provide easy-to-use opt-in and opt-out options, allowing people to control their data preferences. A CMP can help automate this process, making it easier to comply with regulations and manage user choices across your digital properties.
- Check partners’ data collection: Make sure any third parties you work with protect personal information properly. Be transparent about your data-selling practices, and confirm that all partners have strong safeguards, as you could still be held responsible for how they handle data on your behalf.
- Train your team: Regularly educate all employees about the importance of protecting personal information and how to do it.
- Handle requests efficiently: Set up a system to quickly respond when people ask to see, change, or delete their personal information, depending on their particular rights.
- Assign responsibility: If required by law or as a best practice, designate a Data Protection Officer to oversee data protection compliance.
By implementing these best practices, companies can better protect personal information, build trust with their customers, and reduce the risk of data breaches and penalties.
What you need to know about sensitive information
What is sensitive data?
Sensitive data is confidential information that requires protection from unauthorized access or disclosure. If this data is compromised, it could lead to harm, discrimination, or other negative consequences for the affected individual or organization. Sensitive information covers a broad range of data, including certain kinds of PII as well as financial records, health data, and proprietary business details.
Examples of sensitive information
Sensitive information comes in various forms, and understanding these categories is essential for effective data protection. Common examples of sensitive personal data include:
- Personal data: Full names, home addresses, phone numbers, Social Security numbers, driver’s license numbers
- Financial information: Bank account numbers, credit card details, payment information
- Health data: Medical records, health insurance information, protected health information (PHI)
- Employee data: Payroll information, performance reviews, background checks
- Intellectual property: Trade secrets, proprietary code, product specifications
- Access credentials: Usernames, passwords, PINs, biometric data
- Industry-specific data: Retail sales figures, legal case information, research data
- Identity data: Political affiliation, religious beliefs, sexual orientation or gender identity
How GDPR treats sensitive data
Under the GDPR, sensitive personal data, also known as special categories of data, includes information about a person’s race, political beliefs, religion, union membership, genetic and biometric data, health, and sexual orientation.
Processing this type of data is generally only allowed if specific conditions are met. For instance, individuals must give explicit consent for their sensitive data to be used. It can also be processed if necessary for employment, legal claims, public interest, healthcare, or research.
How to safeguard sensitive data
Organizations must take extra precautions to protect sensitive data. Here are some recommendations for companies looking to safeguard sensitive information.
- Implement data classification: Categorize data based on sensitivity levels to minimize processing and apply appropriate security measures.
- Limit access: Restrict access to sensitive data on a need-to-know basis and implement strong authentication methods.
- Use encryption: Encrypt sensitive data both at rest and in transit to prevent unauthorized access (see the sketch after this list).
- Conduct regular audits: Perform security assessments to identify vulnerabilities, identify processes or data that are no longer needed, and maintain compliance with data protection regulations.
- Train employees: Educate staff on an ongoing basis about data security best practices and the importance of protecting sensitive information.
- Implement security technologies: Utilize firewalls, intrusion detection systems, and data loss prevention tools to safeguard sensitive data.
- Develop incident response plans: Create and maintain policies and procedures for responding to data breaches or unauthorized access attempts and communicating with authorities and affected data subjects.
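To illustrate the encryption recommendation above, here is a minimal sketch of sealing a sensitive value at rest with authenticated encryption (AES-256-GCM via Node.js’s built-in crypto module). Key handling is deliberately simplified; in practice the key would come from a key management service, not application code.

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "crypto";

// Encrypt a sensitive value with AES-256-GCM, which protects both
// confidentiality and integrity (via the authentication tag).
function encrypt(plaintext: string, key: Buffer) {
  const iv = randomBytes(12); // unique nonce for every encryption
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return { iv, ciphertext, authTag: cipher.getAuthTag() };
}

function decrypt(sealed: { iv: Buffer; ciphertext: Buffer; authTag: Buffer }, key: Buffer): string {
  const decipher = createDecipheriv("aes-256-gcm", key, sealed.iv);
  decipher.setAuthTag(sealed.authTag); // tampering makes final() throw
  return Buffer.concat([decipher.update(sealed.ciphertext), decipher.final()]).toString("utf8");
}

// Illustrative usage with a throwaway key.
const key = randomBytes(32);
const sealed = encrypt("medical record #4711", key);
console.log(decrypt(sealed, key)); // "medical record #4711"
```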
By following these practices, companies can significantly reduce the risk of sensitive data exposure and maintain compliance with relevant data protection regulations.
PII vs. PI vs. sensitive data comparison
- PII: data that can identify a specific individual, either directly or in combination with other information (e.g., full name, Social Security number, passport number)
- PI (personal data): any information relating to an identified or identifiable person, even if it doesn’t identify them on its own (e.g., location data, web browsing activity); all PII is personal data, but not all personal data is PII
- Sensitive data: the subset of personal data requiring extra protection because its exposure could cause harm or discrimination (e.g., health records, financial details, biometric data)
Know your data types to better comply with global privacy laws
Safeguarding personal data — whether it falls under PII, PI, or sensitive data — is a fundamental responsibility of any organization. Each data type requires specific protection strategies, from encryption to strict access controls, to prevent unauthorized access and potential breaches.
Understanding the nuances between these data categories not only supports compliance with global privacy laws but also fortifies the trust between your company and your customers. As the regulatory landscape continues to evolve, maintaining a proactive approach to data protection will be key to securing both sensitive information and organizational reputation.
Oregon was the twelfth state in the United States to pass comprehensive data privacy legislation with SB 619. Governor Tina Kotek signed the bill into law on July 18, 2023, and the Oregon Consumer Privacy Act (OCPA) came into effect for most organizations on July 1, 2024. Nonprofits have an extra year to prepare, so their compliance is required as of July 1, 2025.
In this article, we’ll look at the Oregon Consumer Privacy Act’s requirements, who they apply to, and what businesses can do to achieve compliance.
What is the Oregon Consumer Privacy Act (OCPA)?
The Oregon Consumer Privacy Act protects the privacy and personal data of over 4.2 million Oregon residents. The law establishes rules for any individual or entity conducting business in Oregon or those providing goods and services to its residents and processing their personal data. Affected residents are known as “consumers” under the law.
The OCPA protects Oregon residents’ personal data when they act as individuals or in household contexts. It does not cover personal data collected in a work context. This means information about individuals acting in their professional roles, rather than as consumers, is not covered under this law.
Consistent with the other US state-level data privacy laws, the OCPA requires businesses to inform residents about how their personal data is collected and used. This notification — usually included in a website’s privacy policy — must cover key details such as:
- What data is collected
- How the data is used
- Whether the data is shared and with whom
- Information about consumers’ rights
The Oregon privacy law uses an opt-out consent model, which means that in most cases, organizations can collect consumers’ personal data without prior consent. However, they must make it possible for consumers to opt out of the sale of their personal data and its use in targeted advertising or profiling. The law also requires businesses to implement reasonable security measures to protect the personal data they handle.
Who must comply with the Oregon Consumer Privacy Act (OCPA)?
Similar to many other US state-level data privacy laws, the OCPA sets thresholds to determine which organizations must comply with its requirements. However, unlike some other laws, it does not contain a revenue-only threshold.
To fall under the OCPA’s scope, an organization must, during a calendar year, control or process the personal data of:
- 100,000 or more consumers, not including consumers whose data is processed solely to complete a payment transaction, or
- 25,000 or more consumers, if 25 percent or more of the organization’s annual gross revenue comes from selling personal data
Exemptions to OCPA compliance
The OCPA is different from some other data privacy laws because many of its exemptions focus on the types of data being processed and what processing activities are being conducted, rather than just on the organizations themselves.
For example, instead of exempting healthcare entities under the Health Insurance Portability and Accountability Act (HIPAA), the OCPA exempts protected health information handled in compliance with HIPAA. This means protected health information is outside of the OCPA’s scope, but other data that a healthcare organization handles could still fall under the law. Organizations that may be exempt from compliance with other state-level consumer privacy laws should consult a qualified legal professional to determine if they are required to comply with the OCPA.
Exempted organizations and their services or activities include:
- Governmental agencies
- Consumer reporting agencies
- Financial institutions regulated by the Bank Act and their affiliates or subsidiaries, provided they focus exclusively on financial activities
- Insurance companies
- Nonprofit organizations established to detect and prevent insurance fraud
- Press, wire, or other information services (and the non-commercial activities of media entities)
Personal data collected, processed, sold, or disclosed under the following federal laws is also exempt from the OCPA’s scope:
- Health Insurance Portability and Accountability Act (HIPAA)
- Gramm-Leach-Bliley Act (GLBA)
- Health Care Quality Improvement Act
- Fair Credit Reporting Act (FCRA)
- Driver’s Privacy Protection Act
- Family Educational Rights and Privacy Act (FERPA)
- Airline Deregulation Act
Definitions in the Oregon Consumer Privacy Act (OCPA)
This Oregon data privacy law defines several key terms related to the data it protects and relevant data processing activities.
What is personal data under the OCPA?
The Oregon privacy law protects consumers’ personal data, which it defines as “data, derived data or any unique identifier that is linked to or is reasonably linkable to a consumer or to a device that identifies, is linked to or is reasonably linkable to one or more consumers in a household.”
The law specifically excludes data that is:
- deidentified
- made legally available through government records or widely distributed media
- made public by the consumer
The law does not specifically list what constitutes personal data. Common types of personal data that businesses collect include a consumer’s name, phone number, email address, Social Security number, or driver’s license number.
It should be noted that personal data (also called personal information under some state privacy laws) and personally identifiable information are not always the same thing, and distinctions between the two are often made in data privacy laws.
What is sensitive data under the OCPA?
Sensitive data is personal data that requires special handling because it could cause harm or embarrassment if misused or unlawfully accessed. It refers to personal data that would reveal an individual’s:
- Racial or ethnic background
- National origin
- Religious beliefs
- Mental or physical condition or diagnosis
- Genetic or biometric data
- Sexual orientation
- Status as transgender or non-binary
- Status as a victim of crime
- Citizenship or immigration status
- Precise present or past geolocation (within 1,750 feet or 533.4 meters)
All personal data belonging to children is also considered sensitive data under the OCPA.
Oregon’s law is the first US privacy law to classify transgender or non-binary status, or status as a victim of crime, as sensitive data. The definition of biometric data excludes facial geometry or mapping unless it is performed for the purpose of identifying an individual.
An exception to the law’s definition of sensitive data includes “the content of communications or any data generated by or connected to advanced utility metering infrastructure systems or equipment for use by a utility.” In other words, the law does not consider sensitive information to include communications content, like that in emails or messages, or data generated by smart utility meters and related systems used by utilities.
What is consent under the OCPA?
Like many other data privacy laws, the Oregon data privacy law follows the European Union’s General Data Protection Regulation (GDPR) regarding the definition of valid consent. Under the OCPA, consent is “an affirmative act by means of which a consumer clearly and conspicuously communicates the consumer’s freely given, specific, informed and unambiguous assent to another person’s act or practice…”
The definition also includes conditions for valid consent:
- the consumer’s inaction does not constitute consent
- the user interface used to request consent must not attempt to obscure, subvert, or impair the consumer’s choice
These conditions are highly relevant to online consumers and reflect that the use of manipulative dark patterns is increasingly frowned upon by data protection authorities, and increasingly prohibited. The Oregon Department of Justice (DOJ) website also clarifies that the use of dark patterns may be considered a deceptive business practice under Oregon’s Unlawful Trade Practices Act.
What is processing under the OCPA?
Processing under the OCPA means any action or set of actions performed on personal data, whether manually or automatically. This includes activities like collecting, using, storing, disclosing, analyzing, deleting, or modifying the data.
Who is a controller under the OCPA?
The OCPA uses the term “controller” to describe businesses or entities that decide how and why personal data is processed. While the law uses the word “person,” it applies broadly to both individuals and organizations.
The OCPA definition of controller is “a person that, alone or jointly with another person, determines the purposes and means for processing personal data.” In simpler terms, a controller is anyone who makes the key decisions about why personal data is collected and how it will be used.
Who is a processor under the OCPA?
The OCPA defines a processor as “a person that processes personal data on behalf of a controller.” Like the controller, while the law references a person, it typically refers to businesses or organizations that handle data for a controller. Processors are often third parties that follow the controller’s instructions for handling personal data. These third parties can include advertising partners, payment processors, or fulfillment companies, for example. Their role is to carry out specific tasks without deciding how or why the data is processed.
What is profiling under the OCPA?
Profiling is increasingly becoming a standard inclusion in data privacy laws, particularly as it can relate to “automated decision-making” or the use of AI technologies. The Oregon privacy law defines profiling as “an automated processing of personal data for the purpose of evaluating, analyzing or predicting an identified or identifiable consumer’s economic circumstances, health, personal preferences, interests, reliability, behavior, location or movements.”
What is targeted advertising under the OCPA?
Targeted advertising may involve emerging technologies like AI tools. It is also becoming a standard inclusion in data privacy laws. The OCPA defines targeted advertising as advertising that is “selected for display to a consumer on the basis of personal data obtained from the consumer’s activities over time and across one or more unaffiliated websites or online applications and is used to predict the consumer’s preferences or interests.” In simpler terms, targeted advertising refers to ads shown to a consumer based on their interests, which are determined by personal data that is collected over time from different websites and apps.
However, some types of ads are excluded from this definition, such as those that are:
- Based on activities within a controller’s own websites or online apps
- Based on the context of a consumer’s current search query, visit to a specific website, or app use
- Shown in response to a consumer’s request for information or feedback
The definition also excludes processing of personal data solely to measure or report an ad’s frequency, performance, or reach.
What is a sale under the OCPA?
The OCPA defines sale as “the exchange of personal data for monetary or other valuable consideration by the controller with a third party.” This means a sale doesn’t have to involve money. Any exchange of data for something of value, even if it’s non-monetary, qualifies as a sale under the law.
The Oregon privacy law does not consider the following disclosures of personal data to be a “sale”:
- Disclosures to a processor
- Disclosures to an affiliate or a third party to help the controller provide a product or service requested by the consumer
- Disclosures or transfers of personal data as part of a merger, acquisition, bankruptcy, or similar transaction in which a third party takes control of the controller’s assets, including personal data
- Disclosures of personal data that occur because the consumer:
- directs the controller to disclose the data
- intentionally discloses the data while directing the controller to interact with a third party
- intentionally discloses the data to the public, such as through mass media, without restricting the audience
Consumers’ rights under the Oregon Consumer Privacy Act (OCPA)
The Oregon privacy law grants consumers a range of rights over their personal data, comparable to other US state-level privacy laws.
- Right to access: consumers can request confirmation of whether their personal data is being processed and the categories of personal data being processed, gain access to the data, and receive a list of the specific third parties it has been shared with (other than natural persons), all subject to some exceptions.
- Right to correction: consumers can ask controllers to correct inaccurate or outdated information they have provided.
- Right to deletion: consumers can request the deletion of their personal data held by a controller, with some exceptions.
- Right to portability: consumers can obtain a copy of the personal data they have provided to a controller, in a readily usable format, with some exceptions.
- Right to opt out: consumers can opt out of the sale of their personal data, targeted advertising, or profiling used for decisions with legal or similarly significant effects.
Consumers can designate an authorized agent to opt out of personal data processing on their behalf. The OCPA also introduces a requirement for controllers to recognize universal opt-out signals, further simplifying the opt-out process.
This Oregon data privacy law stands out by giving consumers the right to request a specific list of third parties that have received their personal data. Unlike many other privacy laws, this one requires controllers to maintain detailed records of the exact entities they share data with, rather than just general categories of recipients.
Children’s personal data has special protections under the OCPA. Parents or legal guardians can exercise rights for children under the age of 13, whose data is classified as sensitive personal data and subject to stricter rules. For minors between 13 and 15, opt-in consent is required for specific processing activities, including the use of their data for targeted advertising or profiling. “Opt-in” means that explicit consent is required before the data can be used for these purposes.
Consumers can make one free rights request every 12 months, to which an organization has 45 days to respond. Organizations can extend that period by another 45 days if reasonably necessary. They can also deny consumer requests for a number of reasons, including cases in which the consumer’s identity cannot reasonably be verified, or in which the consumer has made too many requests within a 12-month period.
Oregon’s privacy law does not include private right of action, so consumers cannot sue data controllers for violations. California remains the only state that allows this provision.
What are the privacy requirements under the Oregon Consumer Privacy Act (OCPA)?
Controllers must meet the following OCPA requirements to protect the personal data they collect from consumers.
Privacy notice and transparency under the OCPA
The Oregon privacy law requires controllers to be transparent about their data handling practices. Controllers must provide a clear, easily accessible, and meaningful privacy notice for consumers whose personal data they may process. The privacy notice, also known as the privacy policy, must include the following:
- Purpose(s) for processing personal data
- Categories of personal data processed, including the categories of sensitive data
- Categories of personal data shared with third parties, including categories of sensitive data
- Categories of third parties with which the controller shares personal data and how each third party may use the data
- How consumers can exercise their rights, including:
- How to opt out of processing for targeted advertising or profiling
- How to submit a consumer rights request
- How to appeal a controller’s denial of a rights-related request
- The identity of the controller, including any business name the controller uses or has registered in Oregon
- At least one actively monitored online contact method, such as an email address, for consumers to directly contact the organization
- A “clear and conspicuous description” for any processing of personal data for the purpose of targeted advertising or profiling “in furtherance of decisions that produce legal effects or effects of similar significance”
According to the Oregon DOJ website, the third-party categories requirement must strike a particular balance. It should offer consumers meaningful insight into the relevant types of businesses or processing activities, without making the privacy notice overly complex. Acceptable examples include “analytics companies,” “third-party advertisers,” and “payment processors,” among others.
The privacy notice or policy must be easy for consumers to access. It is typically linked in the website footer for visibility and accessibility from any page.
Data minimization and purpose limitation under the OCPA
The OCPA requires controllers to limit the personal data they collect to only what is “adequate, relevant, and reasonably necessary” for the purposes stated in the privacy notice. If the purposes for processing change, controllers must notify consumers and, where applicable, obtain their consent.
Data security under the OCPA
The Oregon data privacy law requires controllers to establish, implement, and maintain reasonable safeguards for protecting “the confidentiality, integrity and accessibility” of the personal data under their control. The data security measures also apply to deidentified data.
Oregon’s existing laws about privacy practices remain in effect as well. These laws include requirements for reasonable administrative, technical, and physical safeguards for data storage and handling, IoT device security features, and truth in privacy and consumer protection notices.
Data protection assessments (DPAs) under the OCPA
Controllers must perform data protection assessments (DPAs), also known as data protection impact assessments, for processing activities that present “a heightened risk of harm to a consumer.” These activities include:
- Processing for the purposes of targeted advertising
- Processing sensitive data
- The sale of personal data
- Processing for the purposes of profiling if there is a reasonably foreseeable risk to the consumer of:
- Unfair or deceptive treatment
- Financial, physical, or reputational injury
- Intrusion into a consumer’s private affairs
- Other substantial injury
The Attorney General may also require a data controller to conduct a DPA or share the results of one in the course of an investigation.
Consent requirements under the OCPA
The OCPA primarily uses an opt-out consent model. This means that in most cases controllers are not required to obtain consent from consumers before collecting or processing their personal data. However, there are specific cases where consent is required:
- Processing sensitive data requires explicit consent from consumers.
- For children’s data, the OCPA follows the federal Children’s Online Privacy Protection Act (COPPA) and requires consent from a parent or legal guardian before processing the personal data of any child under 13.
- Controllers must obtain explicit consent to use the personal data of minors between the ages of 13 and 15 for targeted ads, profiling, or sale.
- Controllers must obtain consent to use personal data for purposes other than those originally disclosed in the privacy notice.
To help consumers make informed decisions about their consent, controllers must clearly disclose details about the personal data being collected, the purposes for which it is processed, who it is shared with, and how consumers can exercise their rights. Controllers must also provide clear, accessible information on how consumers can opt out of data processing.
Consumers must be able to revoke consent at any time, as easily as they gave it. Data processing must stop after consent has been revoked, and no later than 15 days after receiving the revocation.
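As a simple illustration of that 15-day rule, here is a minimal TypeScript sketch of the deadline arithmetic; the record shape and names are hypothetical:

```typescript
// Once consent is revoked, related processing must stop no later than
// 15 days after the revocation is received. Hypothetical record shape.
interface ConsentRecord {
  consumerId: string;
  revokedAt: Date | null; // null while consent is still active
}

const REVOCATION_GRACE_DAYS = 15;

// Returns the date by which processing must stop, or null if consent
// has not been revoked.
function processingMustStopBy(record: ConsentRecord): Date | null {
  if (!record.revokedAt) return null;
  const deadline = new Date(record.revokedAt);
  deadline.setDate(deadline.getDate() + REVOCATION_GRACE_DAYS);
  return deadline;
}

// A revocation received on July 1 sets a stop deadline of July 16.
console.log(processingMustStopBy({ consumerId: "c-123", revokedAt: new Date(2025, 6, 1) }));
```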
Nondiscrimination under the OCPA
The OCPA prohibits controllers from discriminating against consumers who exercise their rights under the law. This includes actions such as:
- Denying goods or services
- Charging different prices or rates than those available to other consumers
- Providing a different level of quality or selection of goods or services to the consumer
For example, if a consumer opts out of data processing on a website, that individual cannot be blocked from accessing that website or its functions.
Some website features and functions do not work without certain cookies or trackers being activated, so if a consumer does not opt in to their use because they collect personal data, the site may not work as intended. This is not considered discriminatory.
This Oregon privacy law permits website operators and other controllers to offer voluntary incentives for consumers’ participation in activities where personal data is collected. These may include newsletter signups, surveys, and loyalty programs. Offers must be proportionate and reasonable to the request as well as the type and amount of data collected. This way, they will not look like bribes or payments for consent, which data protection authorities frown upon.
Third party contracts under the OCPA
Before starting any data processing activities, controllers must enter into legally binding contracts with third-party processors. These contracts govern how processors handle personal data on behalf of the controller, and must include the following provisions:
- The processor must ensure that all individuals handling personal data are bound by a duty of confidentiality
- The contract must provide clear instructions for data processing, detailing:
- The nature and purpose of processing
- The types of data being processed
- The duration of the processing
- The rights and obligations of both the controller and the processor
- The processor must delete or return the personal data at the controller’s direction or after the services have ended, unless legal obligations require the data to be retained
- Upon request, the processor must provide the controller with all necessary information to verify compliance with contractual obligations
- If the processor hires subcontractors, the processor must have contracts in place requiring those subcontractors to meet the processor’s obligations
- The contract must allow the controller or their designee to conduct assessments of the processor’s policies and technical measures to ensure compliance
These contracts are known as data processing agreements under some data protection regulations like the GDPR.
Universal opt-out mechanism under the OCPA
As of January 1, 2026, organizations subject to the OCPA must comply with a universal opt-out mechanism. Also called a global opt-out signal, it includes tools like the Global Privacy Control.
This mechanism enables a consumer to set their data processing preferences once and have those preferences automatically communicated to any website or platform that detects the signal. Preferences are typically set via a web browser plugin.
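Technically, browsers that support the Global Privacy Control expose the signal in two standard ways: a “Sec-GPC: 1” request header sent to servers, and a navigator.globalPrivacyControl property readable by scripts. Here is a minimal sketch of detecting and honoring the signal; the disableTargetedAdvertising hook is a hypothetical placeholder for a site’s own consent logic:

```typescript
// Client-side: check the Global Privacy Control signal before loading
// anything that sells or shares data or runs targeted advertising.
// Browsers without GPC support simply report undefined here.
function gpcOptOutRequested(): boolean {
  return (navigator as any).globalPrivacyControl === true;
}

// Hypothetical placeholder for a site's own tag/consent logic.
function disableTargetedAdvertising(): void {
  console.log("GPC detected: suppressing targeted advertising tags");
}

if (gpcOptOutRequested()) {
  // Treat the signal as a valid opt-out of sale and targeted advertising.
  disableTargetedAdvertising();
}

// Server-side, the same preference arrives as a "Sec-GPC: 1" header:
function gpcFromHeaders(headers: Record<string, string | undefined>): boolean {
  return headers["sec-gpc"] === "1";
}
```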
While this requirement is not yet standard across all US or global data privacy laws, it is becoming more common in newer legislation. Other states that require controllers to recognize global opt-out signals include California, Minnesota, Nebraska, Texas, and Delaware.
How to comply with the Oregon Consumer Privacy Act (OCPA)
Below is a non-exhaustive checklist to help your business and website address key OCPA requirements. For advice specific to your organization, consulting a qualified legal professional is strongly recommended.
- Provide a clear and accessible privacy notice detailing data processing purposes, shared data categories, third-party recipients, and consumer rights.
- Maintain a specific list of third parties with whom you share consumers’ personal data.
- Limit data collection to what is necessary for the specified purposes, and notify consumers if those purposes change.
- Obtain consent from consumers if you plan to process their data for purposes other than those that have been communicated to them.
- Implement reasonable safeguards to protect the confidentiality, integrity, and accessibility of personal and deidentified data.
- Conduct data protection assessments for processing activities with heightened risks, such as targeted advertising, activities involving sensitive data, or profiling.
- Implement a mechanism for consumers to exercise their rights, and communicate this mechanism to consumers.
- Obtain explicit consent for processing sensitive data, children’s data, or for purposes not initially disclosed.
- Provide consumers with a user-friendly method to revoke consent.
- Once consumers withdraw consent, stop all data processing related to that consent within the required 15-day period.
- Provide a simple and clear method for consumers to opt out of data processing activities.
- Avoid discriminatory practices against consumers exercising their rights, while offering reasonable incentives for data-related activities.
- Include confidentiality, compliance obligations, and terms for data return or deletion in binding contracts with processors.
- Comply with global opt-out signals like the Global Privacy Control by January 1, 2026.
Enforcement of the Oregon Consumer Privacy Act (OCPA)
The Oregon Attorney General’s office is the enforcement authority for the OCPA. Consumers can file complaints with the Attorney General regarding data processing practices or the handling of their requests. The Attorney General’s office must notify an organization of any complaint and of any investigation that is launched. During investigations, the Attorney General can request controllers to submit data protection assessments and other relevant information. Enforcement actions must be initiated within five years of the last violation.
Controllers have the right to have an attorney present during investigative interviews and can refuse to answer questions. The Attorney General cannot bring in external experts for interviews or share investigation documents with non-employees.
Until January 1, 2026, controllers have a 30-day cure period during which they can fix OCPA violations. If the issue is not resolved within this time, the Attorney General may pursue civil penalties. The right to cure sunsets January 1, 2026, after which the opportunity to cure will only be at the discretion of the Attorney General.
Fines and penalties for noncompliance under the OCPA
The Attorney General can seek civil penalties up to USD 7,500 per violation. Additional actions may include seeking court orders to stop unlawful practices, requiring restitution for affected consumers, or reclaiming profits obtained through violations.
If the Attorney General succeeds, the court may require the violating party to cover legal costs, including attorney’s fees, expert witness fees, and investigation expenses. However, if the court determines that the Attorney General pursued a claim without a reasonable basis, the defendants may be entitled to recover their attorney’s fees.
How does the Oregon Consumer Privacy Act (OCPA) affect businesses?
The OCPA introduces privacy law requirements that are similar to other state data protection laws. These include obligations around notifying consumers about data practices, granting them access to their data, limiting data use to specific purposes, and implementing reasonable security measures.
One notable distinction is that the law sets different compliance timelines based on an organization’s legal status. The effective date for commercial entities is July 1, 2024, while nonprofit organizations are given an additional year and must comply by July 1, 2025.
Since the compliance deadline for commercial entities has already passed, businesses that fall under the OCPA’s scope should ensure they meet its requirements as soon as possible to avoid penalties. Nonprofits, though they have more time, should actively prepare for compliance.
Businesses covered by federal laws like HIPAA and the GLBA, which may exempt them from other state data privacy laws, should confirm with a qualified legal professional whether they need to comply with the OCPA.
The Oregon Consumer Privacy Act (OCPA) and consent management
Oregon’s law is based on an opt-out consent model. In other words, consent does not need to be obtained before collecting or processing personal data unless it is sensitive or belongs to a child.
Controllers do need to inform consumers about what data is collected and used and for what purposes, as well as with whom it is shared, and whether it is to be sold or used for targeted advertising or profiling.
Consumers must also be informed of their rights regarding data processing and how to exercise them. This includes the ability for consumers to opt out of processing of their data or change their previous consent preferences. Typically, this information is presented on a privacy page, which must be kept up to date.
As of January 1, 2026, organizations must also recognize and respect consumers’ consent preferences as expressed via a universal opt-out signal.
Websites and apps can use a banner to inform consumers about data collection and enable them to opt out. This is typically done using a link or button. A consent management platform (CMP) like the Usercentrics CMP for website consent management or app consent management also helps to automate the detection of cookies and other tracking technologies that are in use on websites and apps.
A CMP can streamline sharing information about data categories and the specific services in use by the controller and/or processor(s), as well as third parties with whom data is shared.
The United States still only has a patchwork of state-level privacy laws rather than a single federal law. As a result, many companies doing business across the country, or foreign organizations doing business in the US, may need to comply with a variety of state-level data protection laws.
A CMP can make this easier by enabling banner customization and geotargeting. Websites can display data processing, consent information, and choices for specific regulations based on specific user location. Geotargeting can also improve clarity and user experience by presenting this information in the user’s preferred language.
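As a schematic illustration of that geotargeting idea, banner behavior can be modeled as a per-region configuration. The region codes, fields, and fallback policy below are illustrative assumptions, not the API of Usercentrics or any other CMP:

```typescript
// Pick a consent model and banner language per detected region.
// Region detection (e.g., IP-based lookup) is assumed to happen upstream.
type ConsentModel = "opt-in" | "opt-out";

interface BannerConfig {
  consentModel: ConsentModel;
  language: string;
  showDoNotSellLink: boolean;
}

const REGION_CONFIGS: Record<string, BannerConfig> = {
  "EU": { consentModel: "opt-in", language: "auto", showDoNotSellLink: false },
  "US-CA": { consentModel: "opt-out", language: "en", showDoNotSellLink: true },
  "US-OR": { consentModel: "opt-out", language: "en", showDoNotSellLink: true },
};

function bannerFor(region: string): BannerConfig {
  // Fall back to the strictest model when the region is unknown,
  // a conservative choice that avoids under-asking for consent.
  return REGION_CONFIGS[region] ?? { consentModel: "opt-in", language: "en", showDoNotSellLink: false };
}

console.log(bannerFor("US-OR")); // opt-out model with a "do not sell" link
```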
Usercentrics does not provide legal advice, and information is provided for educational purposes only. We always recommend engaging qualified legal counsel or a privacy specialist regarding data privacy and protection issues and operations.