Published by Usercentrics
Mar 25, 2025

What global data privacy laws in 2025 mean for organizations

The General Data Protection Regulation (GDPR) is often the first thing that comes to mind when discussing data privacy. Since its enforcement began in 2018, this European Union (EU) regulation has shaped global privacy standards and inspired countries around the world to create their own privacy laws.

But the GDPR was not the first law of its kind. Sweden introduced Datalagen, the world’s first national data privacy law, in 1973. Today, more than 170 countries have enacted data privacy regulations, with new data protection laws introduced each year. The Vatican City State even passed its own data protection law in 2024 for the processing of personal data by its Governorate.

As businesses increasingly serve international markets, organizations need a clear understanding of their global data protection obligations to avoid regulatory violations, operational penalties, and reputational damage.

In this guide, we examine some of the major global data privacy laws in 2025, who they protect, and how they impact the personal data of millions.

Data privacy laws summary

Data privacy laws regulate how organizations collect, use, store, and share personal data. These laws aim to give individuals more control over their personal information and to hold businesses accountable for protecting it. 

While core principles — like requiring transparency and limiting data use — are similar across regulations, specific rights and requirements vary from country to country. Some laws focus heavily on consent, others prioritize data security or user access rights. Together, they shape how businesses handle personal information around the world.

1. EU laws

The European Union has established a comprehensive legal framework to protect the personal data of individuals across its 27 member states and the European Economic Area (EEA). 

A range of regulations work together to uphold individuals’ privacy rights, promote transparency, and set clear requirements for data handling and AI deployment across member states.

General Data Protection Regulation (GDPR)

The General Data Protection Regulation (GDPR) requires organizations that offer goods or services to individuals in the EU or EEA or monitor their behavior to uphold certain privacy rights and protect those individuals’ personal data. This landmark regulation took effect on May 25, 2018.

Unlike a directive, which requires each country to pass and enforce its own implementing laws, the GDPR applies automatically across all EU/EEA member states. 

Its reach extends to any organization that processes the personal data of individuals in EU/EEA territory, whether payment is involved or not, and regardless of the organization’s physical location. It also applies to any data controller or data processor established in the EU, even if the actual data processing takes place outside EU borders.

The regulation makes no exceptions based on organization size or revenue and applies equally to public and private organizations, nonprofits, and government bodies.

Personal data must be processed following these principles:

  • Lawfulness, fairness, and transparency: There must be a legal basis for processing data, and individuals must be given clear information about how their data will be used
  • Purpose limitation: Personal data can only be used for the specific purpose(s) for which it was collected
  • Data minimization: Organizations must collect only the data that is necessary to fulfill the data processing purpose(s)
  • Accuracy: Organizations must keep data up to date and correct any errors or outdated information when they become aware of them
  • Storage limitations: Personal data should not be kept for any longer than the organization needs it to fulfill the specified purposes
  • Integrity and confidentiality: Organizations must implement stringent security measures to keep personal data safe from unauthorized access or other breaches
  • Accountability: Organizations must be able to demonstrate their compliance with the regulation’s requirements

Consent is one of the legal bases for processing personal data, and the GDPR sets strict requirements for what makes it valid. It must be a “freely given, specific, informed and unambiguous indication of the data subject’s wishes.” 

Valid consent must involve a clear affirmative action that signals agreement to the processing of personal data. In other words, pre-ticked boxes, lack of response, or inactivity do not constitute valid consent.
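
To make this concrete, here is a minimal TypeScript sketch of how a consent record might be captured so that only an explicit affirmative action produces a valid record. The types and function names are illustrative assumptions, not a prescribed implementation.

```typescript
// Hypothetical shape of a recorded consent decision; field names are illustrative.
interface ConsentDecision {
  purpose: string;           // e.g. "analytics", "marketing"
  granted: boolean;
  timestamp: string;         // ISO 8601, when the user acted
  method: "explicit_action"; // only affirmative actions are accepted
}

// Reject anything that isn't a clear affirmative action: defaults,
// pre-ticked boxes, or inferred agreement never produce a valid record.
function recordConsent(
  purpose: string,
  userClickedAccept: boolean
): ConsentDecision | null {
  if (!userClickedAccept) {
    // Silence or inactivity is not consent under the GDPR.
    return null;
  }
  return {
    purpose,
    granted: true,
    timestamp: new Date().toISOString(),
    method: "explicit_action",
  };
}
```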

The GDPR grants individuals several rights regarding their personal data, including the right to access their data, correct any inaccuracies, request data be deleted, restrict its processing, obtain data for portability, or object to certain processing activities.

Failure to comply with the GDPR can trigger significant fines. There are two levels of penalties for violations:

  • For first-time or less severe violations: up to EUR 10 million or 2 percent of annual global turnover, whichever is greater
  • For serious or repeat violations: up to EUR 20 million or 4 percent of annual global turnover, whichever is greater
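
As a rough illustration of how the "whichever is greater" rule plays out, consider this small TypeScript helper; the function name and example figures are purely illustrative of the two tiers described above.

```typescript
// Illustrative only: the GDPR fine ceiling is the larger of a fixed cap
// and a percentage of annual global turnover.
function gdprFineCapEur(annualGlobalTurnoverEur: number, severe: boolean): number {
  const fixedCap = severe ? 20_000_000 : 10_000_000;            // EUR 20M vs EUR 10M
  const turnoverCap = annualGlobalTurnoverEur * (severe ? 0.04 : 0.02); // 4% vs 2%
  return Math.max(fixedCap, turnoverCap);
}

// A company with EUR 2 billion in turnover faces a ceiling of EUR 80 million
// for a serious violation, because 4 percent of turnover exceeds EUR 20 million.
console.log(gdprFineCapEur(2_000_000_000, true)); // 80000000
```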

ePrivacy Directive

The ePrivacy Directive (ePD), sometimes called the “cookie law,” was enacted in 2002 and updated in 2009. It specifically addresses privacy issues in electronic communication, complementing the broader data protection framework established by the GDPR.

The ePrivacy Directive applies to any organization that provides electronic communications services to EU residents or that processes personal data from them. 

This includes businesses that process personal data, third parties that use tracking technologies, electronic communications services providers, and website operators. Like the GDPR, the ePrivacy Directive has extraterritorial reach.

The Directive requires:

  • That communications over public networks remain confidential
  • That explicit user consent be obtained before storing or accessing cookies, unless strictly necessary for a requested service
  • Restrictions on unsolicited marketing messages via email, SMS, or other electronic means, with a right to opt out
  • That individuals be notified of directory listings and have the right to verify, correct, or withdraw their data
  • The implementation of technical and organizational measures to secure communication services
  • That metadata (such as location data and call times) not be processed without user consent or another legal basis

Cookie consent banners became more common after the ePrivacy Directive’s enactment as they provide a practical way to notify users about data collection and obtain explicit and granular consent on websites, apps, and other connected platforms.
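
In practice, that means non-essential tags stay blocked until the user opts in. Below is a simplified TypeScript sketch of this gating pattern; the category names and loader function are assumptions, not any specific CMP's API.

```typescript
// Minimal sketch: load non-essential scripts only after explicit opt-in.
type CookieCategory = "strictly_necessary" | "analytics" | "marketing";

const userConsent: Record<CookieCategory, boolean> = {
  strictly_necessary: true, // exempt: needed for the requested service
  analytics: false,         // blocked until the user opts in
  marketing: false,
};

function loadScriptIfConsented(category: CookieCategory, src: string): void {
  if (!userConsent[category]) return; // no consent, no tracking
  const script = document.createElement("script");
  script.src = src;
  document.head.appendChild(script);
}

// Called only after the banner records an affirmative choice:
loadScriptIfConsented("analytics", "https://example.com/analytics.js");
```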

Unlike the GDPR, which is a regulation directly applicable across the EU, the ePrivacy Directive requires each member state to enact national laws to implement its requirements. Member states have done so through a variety of national laws and executive orders.

The ePrivacy Directive was intended to evolve into the ePrivacy Regulation, which would have had EU-wide applicability like the GDPR does, but that law was delayed for many years, and the proposal was finally withdrawn in February 2025.

Digital Markets Act (DMA)

The Digital Markets Act (DMA), which came into effect on November 1, 2022, regulates large online platforms designated as gatekeepers by the European Commission (EC). This regulation aims to foster competition, strengthen consumer protection, and safeguard privacy in the digital sector by placing specific obligations on these dominant market players. The DMA significantly impacts user privacy and personal data.

While the GDPR makes website owners and business operators responsible for how they collect and process personal data, the DMA imposes additional requirements on gatekeepers due to their significant market power. These additional obligations promote fair competition and strengthen user privacy protections beyond standard GDPR requirements.

The EC has officially designated seven companies as gatekeepers under the DMA:

  • Alphabet, the parent company of Google, which controls various digital services including search, advertising, and mobile operating systems
  • Amazon, the ecommerce and cloud services giant, with substantial market influence
  • Apple, whose ecosystem spans hardware, software, and digital marketplaces
  • Booking.com, a dominant online travel and accommodation booking platform
  • ByteDance, owner of TikTok and other content platforms with growing global market presence
  • Meta, which controls multiple social media platforms including Facebook, Instagram, and WhatsApp, and also has a dominant ad network
  • Microsoft, whose corporate dominance and influence span operating systems, cloud services, and business applications

Gatekeepers must obtain explicit user consent before they can process personal data for advertising or combine data across different services. The DMA’s standard of consent matches that of the GDPR: it must be freely given, specific, informed, and unambiguous.

Gatekeepers face strict limitations on cross-platform data sharing, which prevents practices like Meta using data from WhatsApp to target Facebook ads or for profiling without specific user consent.

The regulation also mandates data portability options that enable individuals to transfer their data between competing services, giving users greater control over their digital footprint.

Organizations must clearly document and explain their profiling techniques, including how user data is used to build consumer profiles. They must inform users of the purpose, duration, and impact of profiling, and provide clear mechanisms to deny or withdraw consent.

The DMA also prohibits gatekeepers from processing, for advertising purposes, personal data collected through third-party services that use their platforms, which limits the scope of their data collection.

Digital Services Act (DSA)

The Digital Services Act (DSA) regulates online intermediaries and platforms such as marketplaces, social networks, content‑sharing services, app stores, and online travel and accommodation sites. It aims to stop illegal content, harmful activities, and the spread of false information online.

The DSA applies to all digital intermediary services that connect EU users to products, services, or content. It uses a tiered approach, which means the largest platforms and search engines — those with more than 45 million monthly EU users — face stricter requirements because of their wider influence.

The DSA strengthens data privacy protection with:

  • Transparent ad targeting: Platforms must explain why each user sees a particular ad, including data sources and profiling criteria, and offer a one‑click opt‑out from personalized advertising.
  • Stronger protections for minors and sensitive data: Ads based on profiling that uses children’s personal data or sensitive personal data under the GDPR (such as health, religion, or ethnicity) are strictly banned.
  • No deceptive design: The user interfaces on these platforms must avoid misleading designs or dark pattern tactics by making data privacy controls and consent choices clearly visible so individuals can make informed decisions.

EU AI Act

The EU AI Act, adopted in March 2024 and in effect as of August 1, 2024, is the world’s first comprehensive law to govern artificial intelligence (AI). Enforcement of its initial requirements — banning AI practices that pose unacceptable risk and introducing AI literacy measures — began on February 2, 2025. Most provisions will apply from August 2, 2026. 

This regulation creates rules for AI technologies across the EU, focusing on safety, transparency, and the protection of basic rights. It applies to all AI systems used within the EU, regardless of where the company, educational institution, or other organization using or developing AI is located.

The EU AI Act sorts AI systems into four categories based on risk:

  • Unacceptable risk (banned)
  • High risk (strict rules)
  • Limited risk (transparency rules)
  • Minimal risk (no special rules)

These classifications are based on factors like the system’s intended purpose, how independently it operates, and its potential impact on health, safety, and fundamental rights.

The regulation bans AI applications that pose unacceptable risks, including:

  • Systems that manipulate human behavior, especially those targeting vulnerable groups like children
  • Social scoring systems that rank people based on behavior, socio-economic status, or personal characteristics
  • Biometric categorization systems that classify people based on sensitive characteristics, such as race or religious beliefs
  • Real-time and remote biometric identification systems like facial recognition in public spaces

Penalties for noncompliance with these prohibited AI practices are steep. They can include administrative fines of up to EUR 35 million or up to 7 percent of an organization’s global annual turnover, whichever is higher.

The regulation permits high-risk AI systems, but requires them to maintain use logs, provide transparency reports, allow human oversight, and undergo risk assessments both before and after market entry.

In rare cases, high‑risk providers may process sensitive data (like health, racial, or religious information) solely for the purpose of detecting and correcting biases. This is allowed only under specific conditions with strict access limits, security measures, and data deletion rules. Such processing is permitted only when bias detection can’t be done effectively using other data types.

The EU AI Act requires that personal data processed by AI systems comply with existing data protection laws like the GDPR. It requires robust data management practices that cover data collection, processing, and storage to preserve data integrity and security.

The Digital Operational Resilience Act (DORA)

While not primarily focused on data privacy, the Digital Operational Resilience Act (DORA) represents an important regulatory development for EU financial institutions that handle personal data. 

DORA entered into force on January 16, 2023, and has applied since January 17, 2025. It establishes standardized cybersecurity and operational risk requirements for the financial sector. 

This regulation complements data privacy laws — like the GDPR — by addressing how financial institutions must protect all data, including personal information, throughout their digital operations.

While the GDPR centers on data privacy, DORA focuses on operational resilience and ICT risk management, with data security measures forming a key component of that framework.

DORA obliges financial entities to protect the integrity, confidentiality, and availability of all data — including customer data — throughout its lifecycle in the financial ecosystem. 

Among DORA’s compliance requirements are that financial entities must:

  • Perform ongoing cyber threat assessments and evaluate risks whenever they make significant changes to network or IT systems
  • Set up ICT security monitoring systems and create policies that safeguard data availability, authenticity, integrity, and confidentiality to prevent corruption, loss, and unauthorized data access
  • Restrict access to necessary functions and implement authentication protocols — such as multi‑factor authentication — to verify user identity with each access
  • Establish structured protocols for ICT incident identification, management, and reporting, including data breach situations
  • Strengthen vendor oversight by vetting third‑party ICT providers, updating contracts to reflect DORA’s resilience standards, and holding partners to the same security expectations

Overview of key EU data privacy laws

| Law | Effective date | Key scope |
|---|---|---|
| ePrivacy Directive | July 12, 2002 | Requires explicit, prior consent for cookies and traffic data |
| GDPR (General Data Protection Regulation) | May 25, 2018 | Applies to any organization offering goods or services to, or monitoring the behavior of, individuals in the EU/EEA |
| Digital Markets Act (DMA) | November 1, 2022 | Imposes obligations on designated gatekeeper platforms around data handling and user consent |
| Digital Services Act (DSA) | November 16, 2022 | Establishes transparency and content-moderation rules, with tiered obligations for very large online platforms |
| EU AI Act | August 1, 2024 | Governs AI systems by risk category, bans unacceptable practices, and mandates transparency, use logs, and human oversight |

2. Other European laws

Beyond the EU’s regulatory framework, several European countries outside the EU have established their own data privacy regulations to protect personal data and digital rights. 

These laws often reflect similar principles to the GDPR while addressing specific national priorities. This variance creates important compliance considerations for organizations operating across the broader European region.

United Kingdom General Data Protection Regulation (UK-GDPR)

The United Kingdom General Data Protection Regulation (UK-GDPR) governs the processing of personal data belonging to individuals located in the UK. It works alongside the Data Protection Act of 2018 (DPA) and the Privacy and Electronic Communications (EC Directive) Regulations of 2003 to form the UK’s data protection framework.

The UK-GDPR took effect on January 1, 2021, following Brexit, ensuring there was no gap in data protection after the EU GDPR ceased to apply in the UK at the end of the transition period on December 31, 2020.

The UK-GDPR is nearly identical to the EU GDPR, but slightly adapted to suit UK-specific requirements. Key differences include:

  • The age for obtaining valid consent is lowered to 13 years in the UK (compared to 16 years in the EU)
  • Provisions related to cooperation between multiple EU supervisory authorities have been removed, as they don’t apply to the UK

The UK-GDPR has extraterritorial jurisdiction and applies to:

  • Any person or entity in the UK that processes personal data (regardless of where the processing takes place) 
  • Any person or entity outside the UK that processes the personal data of UK citizens or residents when offering goods or services or monitoring behavior 
  • The processing of personal data in places where UK law applies by virtue of public international law

Exemptions exist for data processing for personal or household activities, law enforcement purposes, and intelligence services.

The UK-GDPR maintains the same seven key principles of processing as the EU GDPR. Like the EU GDPR, valid consent under the UK-GDPR must be freely given, specific, informed, unambiguous, and indicated by a clear affirmative action. Pre-checked boxes, silence, or inactivity do not constitute valid consent.

Data subjects have the same rights as those under the EU GDPR. These include the rights to:

  • Be informed about how their data is used
  • Access their personal data
  • Request corrections (rectification)
  • Have their data erased
  • Restrict how their data is processed 
  • Transfer data to another provider (data portability)
  • Object to certain types of processing 
  • Challenge decisions made through automated decision processing

The UK-GDPR imposes significant penalties for violations:

  • For less severe violations: Up to GBP 8.7 million or 2 percent of annual global turnover, whichever is higher
  • For serious violations: Up to GBP 17.5 million or 4 percent of annual global turnover, whichever is higher

The Information Commissioner’s Office (ICO), headed by the Information Commissioner, is responsible for enforcing the UK-GDPR.

European Economic Area (EEA)

On July 6, 2018, the GDPR became applicable to the non-EU EEA countries of Iceland, Liechtenstein, and Norway through a Joint Committee Decision. National legislatures then enacted laws to formally adopt those provisions and to regulate electronic communications.

Liechtenstein implemented the GDPR through its Datenschutzgesetz (Data Protection Act) and accompanying Datenschutzverordnung (Privacy Regulation), both of which took effect on January 1, 2019.

Iceland adopted Act 90/2018 to implement the GDPR. The law took effect in July of 2018. Additionally, cookie usage in Iceland falls under the Electronic Communications Act No. 70/2022, which governs how websites must handle tracking technologies.

Norway incorporated the GDPR through the “Act of 15 June 2018 no. 38 relating to the processing of personal data,” commonly known as the Personal Data Act. The law became effective in Norway on July 20, 2018. The Electronic Communications Act (Ecom Act), updated January 1, 2025, regulates tracking cookies.

Switzerland

The overhauled Federal Act on Data Protection (FADP) came into effect on September 1, 2023, largely replacing Switzerland’s previous data privacy law from 1992. It applied immediately, with no transition period for organizations to adapt.

Similar to the GDPR, Switzerland’s data protection law extends beyond its borders. The FADP applies to data processing activities that impact individuals in Switzerland, regardless of where the organizations engaged in these activities are located. It applies to both private and public sector organizations.

3. US data privacy laws

The United States lacks a comprehensive federal data privacy law comparable to the GDPR. Instead, the responsibility for protecting personal data falls to individual states, many of which have created their own privacy laws.

While there have been several attempts to introduce a comprehensive national data privacy law, no such legislation has passed to date. 

Some federal laws do exist, but these address personal data only in specific contexts or for certain industries, leaving broader privacy protection to state lawmakers.

Federal Trade Commission Act (FTC Act)

The Federal Trade Commission Act (FTC Act) gives the US Federal Trade Commission the authority to stop unfair methods of competition and deceptive or unfair practices in commerce. Although the law doesn’t explicitly address personal data, the FTC has repeatedly used its powers to protect consumer privacy and personal information.

The law applies to individuals, partnerships, and corporations whose business engages in or affects commerce. Exceptions include financial institutions, insurance companies, air carriers, nonprofits, and transportation and communications carriers.

The FTC has pursued enforcement actions against companies for various data protection failures. In recent years, the FTC has brought enforcement actions for personal data breaches, failing to meet data security requirements, sharing individuals’ personal data, unlawful tracking of personal data, and selling sensitive data.

Children’s Online Privacy Protection Act (COPPA)

The Children’s Online Privacy Protection Act (COPPA) is a federal privacy law enacted in 1998 that became effective in 2000. It regulates how commercial websites and online services collect personal data from children under the age of 13. 

COPPA applies to any operator that: 

  • Directs services specifically to children under 13
  • Knowingly collects personal information from children under 13 on a general audience site
  • Knowingly collects personal information through third-party child‑directed websites or services, such as plugins or ad networks

Before collecting, using, or disclosing personal information from children, entities covered by COPPA must obtain verifiable consent from a parent or legal guardian. This consent must be obtained before the initial collection of the child’s information. Additional, separate consent is required before disclosing that information to any third parties.

Companies must also provide clear notice explaining what personal information they will collect from children, how they will use it, and whether they intend to share it with third parties.
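
Here is a hedged TypeScript sketch of that flow, with an assumed getParentalConsent callback standing in for whatever verifiable consent method an operator actually uses; none of these names come from the law itself.

```typescript
// Hypothetical COPPA-style gate: collection from under-13 users is blocked
// until verifiable parental consent exists, and disclosure to third parties
// requires its own, separate consent.
interface ParentalConsent {
  collection: boolean; // obtained before any collection begins
  disclosure: boolean; // separate consent for sharing with third parties
}

async function mayCollectFromUser(
  age: number,
  getParentalConsent: () => Promise<ParentalConsent>
): Promise<boolean> {
  if (age >= 13) return true; // COPPA's protections apply to under-13s
  const consent = await getParentalConsent();
  return consent.collection;  // disclosure must be checked separately
}
```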

Health Insurance Portability and Accountability Act (HIPAA)

The Health Insurance Portability and Accountability Act (HIPAA) mandates federal standards to safeguard protected health information (PHI) from disclosure without a patient’s consent. 

It applies to “covered entities,” which include:

  • Healthcare providers, such as hospitals, doctors, dentists, and pharmacies
  • Health plans, including private insurers, employer-sponsored plans, Medicare, and Medicaid
  • Healthcare clearinghouses that perform administrative tasks or process healthcare information
  • Business associates that access PHI on behalf of covered entities for services such as billing, data storage, or legal consulting
  • Consultants providing advice or analysis related to health information or operations that require the handling of PHI
  • Contractors or subcontractors offering services like claims processing or data analysis involving PHI

HIPAA requires that healthcare information remain protected from the moment it is created until it is destroyed. Access to this information must be limited, and safeguards must be in place whenever it is used or transmitted for healthcare purposes.

Covered entities must obtain a signed HIPAA authorization before selling or sharing PHI, using it for marketing or fundraising activities, disclosing psychotherapy notes, or releasing PHI to research organizations. 

The law also mandates that covered entities implement comprehensive privacy policies and procedures, establish processes for handling data subject access requests (DSAR), maintain appropriate data security measures, and conduct regular risk assessments to protect health information.

Gramm-Leach-Bliley Act (GLBA)

The Gramm-Leach-Bliley Act (GLBA), enacted in 1999, sets federal standards for data privacy and security in the US financial industry. It applies to financial institutions, which the law defines as “any institution the business of which is engaging in activities that are financial in nature or incidental to such financial activities.” 

This definition includes banks, insurance companies, payday lenders, mortgage brokers, non‑bank lenders, debt collectors, real estate appraisers, professional tax preparers, and financial advisors and planners.

The GLBA protects nonpublic personal information (NPI), which can include:

  • Information a consumer provides to a financial institution to obtain a product or service
  • Information resulting from a transaction between the consumer and the institution involving a financial product or service
  • Any information a financial institution otherwise obtains about a consumer in connection with providing a financial product or service

The law requires financial institutions to provide clear privacy notices that explain how they collect, use, and share customer data. These notices must also inform consumers of their right to opt out of having their NPI shared with nonaffiliated third parties. 

Financial institutions must also develop, implement, and maintain comprehensive data security programs to protect consumer data from unauthorized access, misuse, and breaches.

The law also makes it illegal to obtain or disclose — or attempt to obtain or disclose — customer information under false pretenses.

Family Educational Rights and Privacy Act (FERPA)

The Family Educational Rights and Privacy Act (FERPA) grants parents the right to access their children’s education records, request corrections, and control the disclosure of personally identifiable information (PII).

Those rights transfer to the student when they turn 18 or enter an educational institution after high school.

The law applies to any educational institution that receives federal funding.

FERPA has two goals: 

  • To give parents (and eligible students) access to education records
  • To protect those records from being disclosed to third parties without consent

Under FERPA, schools must provide parents or eligible students an opportunity to inspect and review education records on request. Parents can request that their child’s education records be corrected if the information is inaccurate or misleading, or if it violates the child’s privacy rights.

While an educational institution is not obligated to make the requested amendment, it must consider the request, inform the parent of its decision, and, if the request is denied, notify the parent of their right to a hearing on the matter.

Generally, a school cannot disclose PII from a student’s education records to a third party without prior written consent from the parent or eligible student, although some exceptions exist. Schools are also required to annually notify parents of their rights under FERPA.

Video Privacy Protection Act (VPPA)

The Video Privacy Protection Act (VPPA) is a federal data privacy law focused on safeguarding individual privacy related to video rental and viewing histories. It restricts how companies can share records of video rentals and purchases.

The law targets “video tape service providers,” a term that originally covered businesses involved in renting, selling, or delivering physical video materials. Courts have since applied this definition to modern services such as the streaming platforms Hulu and Netflix, recognizing that they perform similar functions in the digital era.

Under the VPPA, providers may not knowingly disclose PII that links a consumer to specific video materials, though certain exceptions exist. To disclose such PII, providers must first obtain informed, written consent from the consumer. This consent process has specific requirements: 

  • Consent must be given separately from other agreements
  • It must be obtained either at the time of disclosure or up to two years in advance (with revocation possible earlier)
  • There must be a clear method for users to withdraw their consent anytime, either for specific disclosures or in full
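
One way to model these rules in code, as a non-authoritative TypeScript sketch (the field names are assumptions):

```typescript
// Illustrative VPPA-style disclosure consent: standalone, valid for at most
// two years, and revocable at any time.
interface VppaConsent {
  obtainedAt: Date;
  standalone: boolean; // must be separate from other agreements
  revokedAt?: Date;    // set when the user withdraws consent
}

const TWO_YEARS_MS = 2 * 365 * 24 * 60 * 60 * 1000;

function mayDiscloseViewingData(consent: VppaConsent, now = new Date()): boolean {
  if (!consent.standalone) return false; // bundled consent is invalid
  if (consent.revokedAt && consent.revokedAt.getTime() <= now.getTime()) {
    return false; // consent has been withdrawn
  }
  return now.getTime() - consent.obtainedAt.getTime() <= TWO_YEARS_MS;
}
```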

US state-level privacy laws

In the absence of comprehensive federal legislation, many individual US states have enacted their own consumer data privacy laws with varying scopes and requirements. 

California led the movement with the California Consumer Privacy Act (CCPA), enacted in 2018 and effective in 2020, and many other states have since established privacy frameworks that grant their residents specific rights regarding their personal information.

Most of these state privacy laws operate on an opt-out consent model, which means that in most circumstances businesses can collect and process personal information or data without prior consumer consent until a user actively opts out. 

However, certain types of data — particularly sensitive personal information, which includes children’s data — do require explicit opt-in consent before processing across most state laws.
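
One common mechanism for honoring opt-outs in the browser is the Global Privacy Control (GPC) signal, which some state laws (such as California’s) recognize as a valid opt-out request. A minimal TypeScript sketch follows; note that navigator.globalPrivacyControl is still an emerging browser property, hence the defensive cast.

```typescript
// Sketch of honoring an opt-out signal in the browser.
function userHasOptedOut(): boolean {
  const gpc = (navigator as Navigator & { globalPrivacyControl?: boolean })
    .globalPrivacyControl;
  return gpc === true;
}

// Under an opt-out model, selling/sharing may proceed by default,
// but must stop once the user opts out.
if (userHasOptedOut()) {
  // disable sale/sharing and targeted-advertising pipelines here
}
```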

Get a comprehensive breakdown of US state-level data privacy laws and what they mean for organizations.

California Consumer Privacy Act (CCPA) / California Privacy Rights Act (CPRA)

The California Consumer Privacy Act (CCPA) was passed in 2018 and took effect on January 1, 2020. It was amended and expanded by the California Privacy Rights Act (CPRA) on January 1, 2023. Following a legal challenge, enforcement of the CPRA began in February 2024. These two laws are commonly referred to together as the CCPA/CPRA. 

The CCPA/CPRA protects the personal information of California’s nearly 40 million residents, defined as:

  • Any individual in California for non-temporary purposes
  • Anyone domiciled in the state who is temporarily outside California

The law applies to for-profit businesses operating in California that collect personal information from state residents and meet any of these thresholds:

  • Their annual gross revenue exceeds USD 26,625,000 for the previous calendar year
  • They receive, buy, sell, or share personal information of 100,000 or more consumers or households
  • They earn more than half of their annual revenue from the sale of consumers’ personal information

Like European data privacy regulations, the CCPA/CPRA has extraterritorial jurisdiction. Businesses that meet any threshold must comply with CCPA/CPRA obligations when doing business with California residents, regardless of where the company itself is based.

The CCPA/CPRA grants consumers the following rights regarding their personal information:

  • Right to delete: Consumers may request that a business delete personal information it collected from them
  • Right to correct: They may request that a business correct incomplete or inaccurate personal information
  • Right to know and access: Consumers may learn the categories of personal information a business holds about them, the purposes for collection, the sources of the information, the categories of third-party recipients, and the specific personal information the business has gathered
  • Right to know regarding sale or disclosure: Consumers must be told which categories of personal information have been sold, shared, or disclosed, and to which categories of third parties
  • Right to opt out: Consumers may stop the sale or sharing of personal information, or its use for targeted advertising or profiling
  • Right to limit: Consumers may restrict the use or disclosure of sensitive personal information
  • Right of nondiscrimination: Consumers must receive the same price and service, and retain access to services, even if they exercise their privacy rights

Violations of the CCPA/CPRA can result in civil penalties up to: 

  • USD 2,663 per non-intentional violation
  • USD 7,988 per intentional violation or any violation involving minors’ personal information

Both the California Attorney General and the California Privacy Protection Agency (CPPA) — which was established under the CPRA — have enforcement authority, though a business cannot be penalized by both entities for the same violation. The CPPA must stay its administrative action or investigation when requested by the Attorney General.

California law provides consumers with a private right of action to sue businesses directly following certain data breaches. This provision applies when a security breach involves non-encrypted or non-redacted personal information that was stolen due to the business’s failure to implement reasonable security measures. Consumers can seek statutory damages between USD 107 and USD 799 per incident.

To date, California remains the only state to grant this private right of action.

The Virginia Consumer Data Protection Act (VCDPA)

The Virginia Consumer Data Protection Act (VCDPA) was signed into law in March 2021 and took effect on January 1, 2023. It protects the personal data of Virginia’s 8.8 million residents.

The law applies to for-profit companies that conduct business in Virginia or produce products and services targeting Virginia residents and that either:

  • Control or process the personal data of 100,000 or more consumers during a calendar year
  • Control or process the personal data of 25,000 or more consumers and derive over 50 percent of their gross revenue from the sale of that personal data

The VCDPA has extraterritorial reach, so companies do not need to be headquartered in Virginia for the law to apply to them.

The VCDPA differs from some state laws, like those in California, as to what it considers the sale of personal data. It narrows “sale” to mean only an exchange of personal data for monetary payment from the controller to a third party. Many other states cast a wider net, treating any transfer of personal data for money or “other valuable consideration” as a sale.

The Utah Consumer Privacy Act (UCPA)

The Utah Consumer Privacy Act (UCPA) came into effect on December 31, 2023. It gives nearly 4 million Utah residents control over how businesses collect and use their personal data and establishes obligations for companies operating in the state or offering goods and services to its consumers.

The UCPA applies to businesses with annual revenue of at least USD 25 million that conduct business in Utah or target Utah consumers and that either:

  • Control or process the personal data of 100,000 or more consumers

or

  • Derive more than 50 percent of their gross revenue from the sale or control of personal data of 25,000 or more consumers

Consumers have fewer rights under the UCPA than under laws like the CCPA/CPRA and the VCDPA. The UCPA provides consumers with four main rights: 

  • Right to access their personal data and confirm whether a controller is processing it
  • Right to delete personal data they directly provided to a controller
  • Right to data portability, which means they can obtain a copy of their data in a readily usable format and transfer it to another controller
  • Right to opt out of personal data sales or targeted advertising

Unlike more comprehensive state privacy laws, the UCPA does not include the right to appeal decisions or the right to correct inaccuracies in personal data.

The UCPA also does not require prior consent when processing data categorized as sensitive. Instead, businesses must notify consumers about its collection and use and offer a clear opt-out option.

Like the VCDPA, Utah’s privacy law also maintains a narrow definition of a “sale” as any “exchange of personal data for monetary consideration by a controller to a third party” and does not consider exchange for other valuable consideration as a sale.

Utah is the first state to enact an AI-focused consumer protection law. The Utah Artificial Intelligence Policy Act (UAIP), effective May 1, 2024, modifies the UCPA by placing additional requirements on businesses using generative AI. Regulated industries — in which professionals need a license or state certificate — must disclose when customers interact with generative AI or content created by it.

The Florida Digital Bill of Rights (FDBR)

The Florida Digital Bill of Rights (FDBR) establishes data privacy protections for more than 23 million Florida residents and sets obligations for companies doing business in the state or offering goods and services to its residents. The FDBR came into effect on July 1, 2024.

The law applies to organizations conducting business in Florida or offering products or services targeted to Florida residents that meet both of these criteria:

  • Global gross annual revenue exceeding USD 1 billion
  • At least one of the following:
    • Derives 50 percent or more of global gross annual revenues from online advertising sales, including targeted advertising
    • Operates a consumer smart speaker with voice command service and cloud-connected virtual assistant using hands-free verbal activation (excluding vehicle systems operated by motor vehicle manufacturers or their affiliates)
    • Runs an app store or digital distribution platform offering at least 250,000 different software applications

Florida’s USD 1 billion revenue threshold — much higher than other states’ limits — targets large corporations instead of smaller businesses. Due to its requirements and targeting of larger companies, the FDBR is often not considered one of the comprehensive US data privacy laws.

The FDBR also functions as a social media regulation. It prohibits government entities from requesting content or account removal from social media platforms unless the content or account is used for criminal activity or violates Florida public records law. 

The law defines social media platforms as “a form of electronic communication through which users create online communities or groups to share information, ideas, personal messages, and other content.”

The FDBR also enhances protections for children by defining anyone under age 18 as a child and tripling financial penalties — which are ordinarily up to USD 50,000 — for violations affecting known minors.

Other instances that can triple penalties are when:

  • A controller fails to delete personal data after a verified consumer request, or a processor ignores the controller’s instruction
  • A controller continues to sell or share a consumer’s personal data after the consumer exercises their opt-out rights

The Maryland Online Data Privacy Act (MODPA)

The Maryland Online Data Privacy Act (MODPA) was signed into law on May 9, 2024. While the law goes into effect on October 1, 2025, it will not impact personal data processing activities until April 1, 2026. 

MODPA protects the privacy and personal data of Maryland’s roughly 6.2 million residents by setting rules for how businesses collect, process, and use that information.

The Maryland privacy law applies to businesses that operate in Maryland or target its residents with products and services and that, during the previous calendar year, either:

  • Controlled or processed the personal data of at least 35,000 consumers, excluding data processed solely for payment transactions

or

  • Controlled or processed the personal data of at least 10,000 consumers and derived more than 20 percent of their gross revenue from personal data sales

Any company meeting these thresholds must comply with MODPA’s requirements, regardless of where the company itself is based.

Maryland takes a stricter approach to sensitive data than most other states. Under MODPA, controllers are prohibited from:

  • Collecting, processing, or sharing sensitive data unless “strictly necessary” to provide or maintain a specific product or service requested by the consumer
  • Selling sensitive data under any circumstances

Unlike other state laws, MODPA provides no option for controllers to obtain consent for these sensitive data processing activities, making Maryland’s restrictions on sensitive information among the strictest in the US.

Washington’s My Health My Data Act

The Washington My Health My Data Act (MHMDA) establishes a targeted state-level privacy framework focused exclusively on consumer health data. Washington does not yet have a comprehensive data privacy law, though legislation has been introduced several times.

The MHMDA extends privacy protections to consumer health data collected by entities outside HIPAA’s scope, such as mobile apps, websites, and small businesses. Whereas HIPAA covers health information held by healthcare providers and plans, this law reaches any business that handles health data belonging to Washington residents.

The Washington MHMDA applies to three categories of entities: regulated entities, small businesses, and processors.

Regulated entities 

These are legal entities that conduct business in Washington State or target Washington consumers with products or services and that determine how consumer health data is collected, processed, shared, or sold. Government agencies, tribal nations, and their service providers are exempt from this category.

Small businesses

These are regulated entities that meet one of the following thresholds: 

  • They collect, process, sell, or share health data of fewer than 100,000 consumers in a year

or 

  • They derive under 50 percent of gross revenue from the collection, processing, selling, or sharing of consumer health data while handling data of fewer than 25,000 consumers

Processors

These are individuals or organizations that process consumer health data on behalf of regulated entities or small businesses. This category includes out-of-state providers working for Washington-based entities.

The Washington MHMDA prohibits any collection of consumer health data except when:

  • The consumer gives explicit, opt-in consent for a specified purpose
  • The data is strictly necessary to provide a product or service the consumer has requested

Once collected lawfully, businesses may not share consumer health data without separate opt-in consent — which must be distinct from initial consent for collection — or unless sharing is needed to deliver the requested product or service.

The law also restricts geofencing technology within 2,000 feet of in-person healthcare services when used to collect health data, track individuals, or deliver targeted ads.
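
As an illustration of how a business might screen for that restriction, here is a TypeScript sketch using a haversine great-circle distance; the coordinates, function names, and threshold check are assumptions about one possible implementation, not legal guidance.

```typescript
// Illustrative check for the MHMDA's 2,000-foot geofencing restriction.
const FEET_PER_METER = 3.28084;

function distanceFeet(
  lat1: number, lon1: number, lat2: number, lon2: number
): number {
  const toRad = (d: number) => (d * Math.PI) / 180;
  const R = 6_371_000; // Earth radius in meters
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a)) * FEET_PER_METER;
}

// Block health-data geofencing near an in-person healthcare facility.
function geofenceAllowed(userLat: number, userLon: number,
                         clinicLat: number, clinicLon: number): boolean {
  return distanceFeet(userLat, userLon, clinicLat, clinicLon) > 2000;
}
```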

The Washington MHMDA grants consumers a private right of action. In fact, on February 10, 2025, the first class action lawsuit based on this law was filed against an online retailer.

The New York SHIELD Act

The New York Stop Hacks and Improve Electronic Data Security Act (New York SHIELD Act) introduced breach notification obligations and data security standards for businesses processing the private information of New York residents. It updated the state’s 2005 Information Security Breach and Notification Act by broadening the definition of private data and adding extra protections.

The law applies to any person or business that owns or licenses computerized data that contains the private information of New York state residents, regardless of whether the business itself is located in New York. This law’s reach is significantly expanded — the previous 2005 law only applied to businesses operating within New York state.

The New York SHIELD Act includes:

  • A broader definition of private information
  • Updated criteria for what is a security or data breach
  • Specific procedures for notifying individuals and regulators after a breach
  • Requirements for implementing administrative, technical, and physical safeguards for data protection
  • Extended reach to businesses that are out of state and even outside the US

The law also increased penalties for noncompliance with its data security and breach notification requirements.

Enforcement rolled out in two phases: 

  • Breach notification requirements took effect on October 23, 2019
  • Data security obligations began on March 21, 2020

4. Global data privacy laws

Beyond Europe and the US, countries around the world have established their own comprehensive data privacy regulations for protecting personal data and individual privacy rights.

These national regulations often draw on international norms — such as those established by the GDPR — while tailoring requirements to reflect local legal and cultural contexts.

Brazil’s LGPD

Brazil’s Lei Geral de Proteção de Dados Pessoais (LGPD), also known as the General Data Protection Law, took effect on August 16, 2020. While it draws heavily on the GDPR, the LGPD extends protections in areas such as data transfers and sensitive processing.

The LGPD establishes 10 key principles for data processing:

  • Purpose: Processing must serve legitimate, specific purposes with no incompatible uses
  • Adequacy: Activities must align with purposes communicated to data subjects
  • Necessity: Processing must be limited to minimum required data proportional to stated purposes
  • Free access: Subjects are entitled to cost-free consultation about their data and processing
  • Data quality: Information must be accurate, clear, relevant, and up to date
  • Transparency: Organizations should provide clear information while protecting business secrets
  • Security: Technical safeguards should be implemented against unauthorized access or unintended disclosure
  • Prevention: Organizations should take proactive measures to prevent data-related damages
  • Nondiscrimination: Processing should not be done for discriminatory purposes
  • Accountability: Organizations must demonstrate compliance with data protection rules

Brazil follows an opt-in consent model similar to the GDPR. Under Brazilian law, when a data subject agrees to the processing of their personal data for a specific purpose, that consent must be “free, informed and unambiguous.”

Canada’s PIPEDA

Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA) received royal assent in 2000, with its provisions coming into force in stages between 2001 and 2004. One of Canada’s earliest privacy laws, it was designed to build consumer trust in the emerging ecommerce market. 

PIPEDA sets rules for how private organizations can collect, use, and disclose personal information, and includes rules for electronic documents.

The law applies to private-sector organizations that collect, use, or disclose personal information in the course of commercial activities, though some exceptions apply. It also covers the personal information of employees and job applicants of federally regulated works, undertakings, or businesses.

PIPEDA establishes several fundamental consumer rights:

  • Right to be informed about why organizations collect and use personal information, and to access and correct this information
  • Right to responsible use, which means organizations must handle personal data reasonably and only for purposes to which consumers have agreed
  • Right to security, which requires organizations to implement appropriate protections and identify staff responsible for safeguarding personal information
  • Right to rectification, which means consumers can expect accurate, complete information and request corrections when needed
  • Right to complain when consumers believe organizations have violated their privacy rights

Additional Canadian data privacy laws include:

  • Québec Law 25
  • Provincial private-sector privacy laws in British Columbia and Alberta
  • Health-information acts in Ontario, New Brunswick, Nova Scotia, and Newfoundland and Labrador

South Africa’s POPIA

The Protection of Personal Information Act (POPIA) is a comprehensive framework that protects South African residents’ personal data. Although POPIA received Presidential assent in 2013, it only reached full effect in 2020, and enforcement began in 2021.

The law applies to any natural or juristic person who processes personal information by either automated or non-automated means. While this includes individuals, the law most commonly affects companies, organizations, and government entities.

The law specifies six justifications for processing personal information, similar to the legal bases found in the GDPR. These include: 

  • Consent from the data subject
  • Necessity for contract performance
  • Compliance with legal obligations
  • Protection of data subjects’ legitimate interests
  • Fulfillment of public law duties by public bodies
  • Pursuit of legitimate interests by the responsible party or third parties receiving the information

POPIA follows an opt-in consent approach similar to the GDPR and LGPD. Generally, organizations must obtain consent from legally competent individuals before collecting or processing their personal information. 

For data regarding children, this consent must come from a “competent person,” such as a parent, guardian, or other legal representative, although the law permits certain exceptions. POPIA defines children as individuals under 18 years old — a higher age threshold than the GDPR.

China’s PIPL

China’s Personal Information Protection Law (PIPL) was passed on August 20, 2021, and took effect on November 1, 2021. It complements the Data Security Law of June 2021 and establishes a comprehensive framework for protecting the personal information of Chinese citizens.

Chinese organizations and foreign companies operating in China, as well as those outside China that handle the personal information of its citizens, must implement compliance measures to meet the law’s requirements.

Personal information handlers may process personal information only when they satisfy at least one of these conditions:

  • They obtain the individual’s explicit consent
  • Processing is necessary to:
    • Conclude or perform a contract to which the individual is a party, or to manage human resources under valid labor rules and collective agreements
    • Fulfill statutory duties, responsibilities, or legal obligations
    • Respond to public health emergencies, or safeguard life, health, or property in urgent conditions
  • Processing falls within a reasonable scope for news reporting, public opinion oversight, or other public interest activities
  • The personal information was lawfully disclosed by the individual or another party, and processing stays within a reasonable scope
  • Other circumstances authorized by laws or administrative regulations apply

Unlike the GDPR, the PIPL does not include a “legitimate interest” basis. Handlers must secure prior consent rather than rely on a broader justification for processing.

Consent under the PIPL must be voluntary, informed, and explicit. When laws or regulations demand it, handlers must obtain separate or written consent. Individuals have the right to withdraw consent at any time.

For minors under age 14, handlers must obtain prior consent from a parent or other legal guardian before collecting or processing their personal information.

How consent management supports compliance

Consent often serves as the legal basis for collecting personal data under many data protection regulations. When organizations rely on consent, many laws — like the GDPR and Brazil’s LGPD — require it to be explicit: individuals must take a clear, affirmative opt-in action before any personal data processing occurs.

In contrast, laws following an opt-out model, like the US state-level laws, require specific mechanisms for individuals to opt out, such as California’s mandated “Do Not Sell Or Share My Personal Information” link. These laws also frequently mandate prior opt-in consent for processing sensitive data or children’s data. 

For organizations to meet their obligations under global data privacy legislation, they need an effective consent management process. 

Consent management platforms (CMPs) like Usercentrics CMP help organizations request, receive, document, and manage user consent decisions, whether they’re dealing with opt-ins or opt-outs. They are particularly useful for managing consent related to cookies and tracking technologies and assist with updating consent flows as regulations change. 

Tools like cookie pop-ups or banners and privacy notices help organizations obtain consent and transparently inform users about data collection, usage practices, and their rights. In other words, they help organizations fulfill requirements found in data privacy laws.

Documenting and securely storing consent choices over time creates an audit-ready record, streamlines data access requests, reduces the risk of regulatory penalties or lawsuits, and demonstrates respect for individual preferences, ultimately strengthening trust.
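
As a rough sketch of what such an audit-ready record could look like, consider the TypeScript shape below; this is an assumed structure for illustration, not the Usercentrics data model.

```typescript
// A minimal, assumed shape for an audit-ready consent record. Each decision
// stores enough context to reconstruct what the user agreed to and when.
interface ConsentLogEntry {
  userId: string;                   // or a pseudonymous identifier
  timestamp: string;                // ISO 8601
  policyVersion: string;            // which privacy notice the user saw
  choices: Record<string, boolean>; // per-purpose decisions
  source: "banner" | "preference_center" | "api";
}

const entry: ConsentLogEntry = {
  userId: "pseudonymous-123",
  timestamp: new Date().toISOString(),
  policyVersion: "2025-03-25",
  choices: { analytics: true, marketing: false },
  source: "banner",
};
// Append-only storage of such entries supports audits and DSAR responses.
```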

By managing consent transparently, organizations can satisfy regulatory demands and demonstrate respect for individual preferences.

Usercentrics does not provide legal advice, and information is provided for educational purposes only. We always recommend engaging qualified legal counsel or privacy specialists regarding data privacy and protection issues and operations.