
Dark patterns: What you need to know and examples

Dark patterns manipulate user behavior, create a poor customer experience, and are strongly frowned upon by regulators, as they can negate valid consent. We look at examples of dark patterns and how you can prevent them to maintain trust.
Published by Usercentrics
12 mins to read
Feb 24, 2021

Good user experience (UX) design should make digital experiences intuitive and user-friendly. However, some companies use deceptive tactics to influence user decisions. These tactics, known as nudging or dark patterns, can exploit cognitive biases to manipulate users into taking actions they didn’t intend, like subscribing to unwanted services, sharing personal data, or making unnecessary purchases. 

Understanding how dark patterns work can help marketers avoid these practices, and enable companies to create ethical designs and avoid regulatory risks.

What is a dark pattern?

The term “dark pattern” was coined by London-based UX designer Harry Brignull. He defines a dark pattern as, “a type of user interface that appears to have been carefully crafted to trick users into doing things that are not in their interest and is usually at their expense.”

At their core, dark patterns exploit cognitive biases. They leverage the way our brains process information to nudge us toward specific decisions, often against our best interests.

These deceptive design tactics can pop up across websites, forms, emails, and apps. They are designed to trick or pressure users into doing things they may not have meant to do, or even things they actively wanted to avoid. Instead of using clear communication or fair persuasion, they manipulate users into giving up data, staying on a site, or keeping a subscription.

Types of dark patterns

Infographic presenting the types of dark patterns

Dark patterns take many forms, and different sources may categorize them in various ways. Below, we’ve outlined the most commonly used dark patterns. Each is designed to manipulate users into making choices that benefit the company rather than the user.

Sneaking

Sneaking occurs when companies hide crucial information or actions to push users toward a specific decision. It can take many forms, such as hidden fees that only appear at checkout, automatic signups for paid services without obtaining clear consent, or pre-selected checkboxes that commit users to unwanted subscriptions. The goal is to get users to agree to something without realizing it until it’s too late.

Forced continuity

Many subscription services rely on forced continuity, which means that users sign up for a free trial but are automatically charged once the trial period ends. Often, these services provide little to no warning before billing begins and make cancellation difficult.

Instead of a simple “Cancel” button, users might have to navigate a complex process involving multiple confirmation steps or even direct customer service interactions.

Roach motel

A roach motel is a design that makes signing up for a subscription or service easy, but canceling or opting out deliberately difficult. Users may find themselves jumping through hoops — contacting support, mailing a cancellation request, navigating a maze of menu options, etc. — just to leave a service. The inconvenience discourages cancellations and prolongs unwanted subscriptions.

Misdirection

Misdirection steers users toward an action they didn’t intend by using visual tricks. An example of this dark pattern might be a brightly colored, bold “Subscribe” button that is positioned prominently, while the option to opt out is smaller, less visible, only an unobtrusive text link, or written in a confusing way. This tactic plays on user expectations, nudging them toward decisions they might not have made otherwise.

Privacy Zuckering

Named after Facebook’s founder, Privacy Zuckering is a deceptive practice that tricks users into sharing more personal data than they intended. This often happens through misleading privacy settings, vague descriptions of data usage, or default options that enable extensive tracking and data collection. Users may believe they are protecting their privacy, only to discover later that they unknowingly consented to extensive data sharing.

Confirmshaming

Confirmshaming relies on guilt or social pressure to influence user behavior. This tactic is commonly seen in popups that present declining an offer in a negative light.

For example, a website might prompt users with a message like, “No thanks, I prefer to stay uninformed” when they try to reject a newsletter subscription, or “That’s okay, I like paying full price” if a prospect declines to sign up for something when offered a discount. The goal is to make users feel guilty or ashamed about their decision and reconsider.

Bait and switch

A bait and switch occurs when users expect one outcome but experience another. A common example is a button that appears to close a popup but instead triggers a signup or redirects the user to an unrelated page. This deceptive approach exploits user expectations to drive engagement in misleading ways.

Disguised ads

Disguised ads blend seamlessly with regular content, which encourages users to click on them. They may appear as recommended articles, in-site navigation elements, or even user-generated content. Because they mimic real content, users engage with them without realizing they are advertisements.

Hidden costs

Hidden costs are additional fees that only appear at the final step of a transaction. A user might add an item to their cart expecting a certain total, only to discover extra charges right before checkout, such as processing fees, mandatory add-ons, or inflated shipping costs. By revealing these costs late in the process, companies increase the likelihood that users will proceed with the purchase rather than abandon it.

Trick questions

Trick questions use confusing language to manipulate user responses, often through double negatives or misleading phrasing.

For example, a form might include a checkbox labeled “Uncheck this box if you don’t want to receive promotional emails.” This type of wording may confuse users, increasing the chance they will make an unintended selection.
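As a sketch, the difference between the double-negative wording above and a plain opt-in can be reduced to how each checkbox state maps to the user's actual intent. The function names and labels below are illustrative, not from any real form library:

```python
def wants_emails_double_negative(opt_out_checked: bool) -> bool:
    # Label: "Check this box if you do not want promotional emails."
    # The user must parse a negative: checked means NO emails.
    return not opt_out_checked

def wants_emails_plain(opt_in_checked: bool) -> bool:
    # Label: "Send me promotional emails."
    # Checked means yes; unchecked (the default) means no.
    return opt_in_checked

# Both boxes default to unchecked, but the defaults mean opposite things:
assert wants_emails_double_negative(False) is True   # doing nothing = subscribed
assert wants_emails_plain(False) is False            # doing nothing = not subscribed
```

The tricky wording turns inaction into consent, which is exactly what the plain opt-in version avoids.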

Relentless repeated requests

A newer tactic, seen for example in the recent BeReal case, involves providing a much better user experience for users who do what the service wants, and relentless repeated requests and a disrupted experience for those who don’t.

For example, a user opens a new app and sees a banner requesting consent for targeted advertising. If they accept, they never see the banner again. But if they decline, the banner reappears every time they attempt to use the app. In the BeReal case, it’s an app users are encouraged to use multiple times a day, likely leading to significant frustration.

How do dark patterns work?

Dark patterns take advantage of how humans process information. They exploit cognitive biases, mental shortcuts that help us make decisions quickly, but that can also lead to manipulation.

For example, the “default effect” makes people more likely to stick with pre-selected options. That’s why companies often use pre-checked boxes for newsletter signups or data-sharing permissions. Users may not notice these settings or assume they are necessary.

Similarly, dark patterns use urgency and scarcity to make people worry that they are missing out on something. Fake countdown timers or messages like “only two left in stock” create artificial pressure, nudging users to act impulsively.

These tactics can lead to unintentional purchases, privacy violations, and frustration. So, while they may initially benefit the business, they also damage trust in the long run.

Dark patterns examples

Dark patterns are more common than you may think. A European Commission study from 2018 found that 97% of the most popular websites and apps used at least one deceptive design tactic to manipulate users. 

The numbers improved a bit with time, with the FTC in the US, as well as the International Consumer Protection and Enforcement Network (ICPEN) and Global Privacy Enforcement Network (GPEN), discovering in 2024 that 75.7% of 642 companies’ sites and apps used at least one dark pattern, with 66.8% using two or more.

Whether it’s making subscriptions hard to cancel, sneaking in extra fees, or creating fake urgency for purchases, these tactics are designed to push people into decisions they wouldn’t otherwise make.

Here are three real-world dark pattern examples.

Amazon’s Prime cancellation process

Until recently, Amazon made it notoriously difficult to cancel Prime subscriptions. Users had to click through multiple pages filled with reminders about Prime benefits, special renewal offers, and unclear buttons before they could finally cancel.

This is an example of a “roach motel” dark pattern, where signing up is easy, but getting out is deliberately complicated. The goal was to discourage cancellations by making the process time-consuming and confusing.

Hidden fees on ticket websites 

Ticketing platforms, such as Ticketmaster, often advertise low prices upfront, only to reveal extra fees at checkout. A ticket listed at USD 50 might end up costing USD 80 or more after service charges, facility fees, and processing costs.

These hidden fees can add 20 percent or more to a ticket’s price, with some platforms charging an extra USD 30-60 per ticket, plus handling fees. The names of some fees also don’t necessarily make clear what they are even for.

By waiting until the final step to display the full cost, these platforms take advantage of users’ commitment to the purchase process.
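The drip-pricing math is simple to illustrate. The fee names and amounts below are hypothetical (not any platform's actual fee schedule), chosen to match the "USD 50 ticket ends up costing more" pattern described above:

```python
# Hypothetical fee schedule illustrating drip pricing at checkout.
advertised_price = 50.00
fees = {
    "service charge": 12.50,
    "facility fee": 5.00,
    "order processing": 4.50,
}

checkout_total = advertised_price + sum(fees.values())
markup = (checkout_total - advertised_price) / advertised_price

print(f"Advertised:  ${advertised_price:.2f}")
print(f"At checkout: ${checkout_total:.2f} (+{markup:.0%})")
```

Here a USD 50 ticket becomes USD 72, a 44 percent markup that the user only sees after investing time in the purchase flow.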

Mobile games using fake urgency

Many mobile games — such as Candy Crush Saga and Clash of Clans — use false urgency to encourage spending. They display limited time offers for in-game items, often with countdown timers suggesting the deal will disappear soon.

However, when the timer runs out, the same or a nearly identical offer usually appears again. This tactic creates the illusion of scarcity, pressuring players into impulse purchases, even though the offer was never truly limited.
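The mechanic behind this can be sketched in a few lines: the countdown is real, but the scarcity is not, because the offer quietly recreates itself on expiry. This is illustrative code, not taken from any actual game:

```python
import time

class LimitedTimeOffer:
    """Sketch of a fake-urgency offer: the timer counts down honestly,
    but an 'expired' deal restarts with a fresh countdown, so the same
    discount is effectively always available."""

    def __init__(self, item: str, duration_s: float):
        self.item = item
        self.duration_s = duration_s
        self.expires_at = time.monotonic() + duration_s

    def seconds_left(self) -> float:
        remaining = self.expires_at - time.monotonic()
        if remaining <= 0:
            # The offer silently renews instead of disappearing.
            self.expires_at = time.monotonic() + self.duration_s
            remaining = self.duration_s
        return remaining
```

From the player's perspective the deal always seems about to vanish; from the code's perspective it never does.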

Laws and regulations governing dark patterns

Infographic presenting the laws and regulations governing dark patterns

Governments and regulators worldwide have begun taking action against deceptive UX practices and dark patterns in advertising. Regulations are being implemented to enforce stricter rules and protect users from manipulation.

Several key regulations specifically address dark patterns. We’ll cover these below.

The General Data Protection Regulation

The General Data Protection Regulation (GDPR), enforced in the European Union, requires that user consent for data processing be freely given, specific, informed, and unambiguous. This means companies cannot rely on deceptive tactics to obtain consent, such as pre-checked boxes, vague language, or hidden settings.

Dark patterns that pressure users into agreeing to data collection are considered a violation of the GDPR and can result in hefty fines. These tactics can include misleading button designs or confusing opt-out options.

The California Consumer Privacy Act

Under the California Consumer Privacy Act (CCPA), companies are required to provide clear and conspicuous notices about their data collection practices and consumer rights. It also mandates that opting out of data sales must be as easy as opting in. It seeks to prevent companies from using manipulative designs to make the process confusing or burdensome.

The California Privacy Rights Act (CPRA), which amends and expands the CCPA, takes a stronger stance against dark patterns. It explicitly states that any interface designed to “subvert or impair” consumer choices regarding privacy rights is unlawful. This includes deceptive UI elements that discourage users from opting out of data collection, or those that make it unnecessarily difficult to delete personal information.

The California Privacy Protection Agency (CPPA) enforces these regulations and can issue hefty fines and penalties for noncompliance.

The Children’s Online Privacy Protection Act

Designed to protect children under 13, the Children’s Online Privacy Protection Act (COPPA) prohibits companies from collecting personal information from minors without verifiable parental consent.

Many dark patterns can violate this law. Examples include nudging children into sharing data, making unintended purchases through deceptive in-app mechanisms, or using manipulative design to encourage excessive screen time.

Companies that fail to comply with COPPA can face significant fines from the Federal Trade Commission (FTC), which has been increasingly aggressive in enforcing these protections.

How to avoid dark patterns

Checklist of actions that help avoid dark patterns

With regulations like the GDPR and the CPRA cracking down on deceptive UX practices, businesses need to go beyond bare minimum compliance and actively build trust with their users. Dark patterns might drive short-term gains, but they damage customer relationships, increase customer churn, and can even lead to legal consequences.

For marketers, designers, and enterprise companies, ethical design can help you avoid fines, but more importantly, it’ll help build long-term engagement, loyalty, and a positive brand reputation.

Here’s how to keep your UX transparent, fair, and user-friendly.

Prioritize transparency

Transparency is key to maintaining user trust. When users feel informed and in control of their experience, they’re more likely to engage with your brand for the long term. This includes being up front about pricing, data collection, and subscription terms. Incomplete or deceptive information, whether it’s about costs or how data is handled, creates confusion and frustration. This often leads to users feeling manipulated.

Here are a few ways to prioritize transparency in your UX:

  • Pricing and fees: Display total costs upfront, including taxes, shipping, and recurring fees. Avoid hidden charges or last-minute surprises at checkout.
  • Subscription terms: Clearly outline renewal policies, trial expiration dates, and how to cancel. If a subscription auto-renews, make it explicit before the user signs up and send a reminder that renewal is coming up.
  • Data collection policies: Use plain language to explain what data you collect, why you collect it, and how it will be used. Don’t bury privacy details in fine print.
  • Marketing communications: Users should know when they are subscribing to emails, texts, or push notifications. Provide clarity by avoiding pre-checked boxes and deceptive consent prompts.

Offer truthful choices to users

When users feel they have control over their decisions, they are more likely to trust and return to your brand. This means offering users genuine choices, rather than pushing them toward decisions through hidden opt-outs or unnecessarily complex processes.

To offer truthful choices, follow these principles:

  • Opt-in instead of opt-out: Users should actively choose to receive marketing emails, data tracking, or additional services. Requiring users to actively opt out of unwanted services via default opt-ins is misleading and can violate privacy laws.
  • Easy subscription management: The process for unsubscribing or canceling should be as simple as signing up. Avoid deterrents like hidden steps, long wait times, or forced calls to customer service.
  • Fair comparisons: If your business offers multiple pricing plans, present them objectively instead of using design tricks to steer users toward a specific option.
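The opt-in principle above translates directly into how consent state is modeled: every non-essential purpose should default to off, so that doing nothing never grants consent. The field names below are illustrative, not any real CMP's data model:

```python
from dataclasses import dataclass

@dataclass
class ConsentPreferences:
    """Opt-in defaults: only strictly necessary processing is on by
    default; everything else requires an affirmative user action."""
    strictly_necessary: bool = True   # required for the site to function
    analytics: bool = False
    marketing_emails: bool = False
    targeted_ads: bool = False

    def granted(self) -> list[str]:
        # List the purposes the user has actually consented to.
        return [name for name, value in vars(self).items() if value]

# A user who ignores the banner entirely has granted nothing optional.
prefs = ConsentPreferences()
assert prefs.granted() == ["strictly_necessary"]
```

Flipping those defaults to `True` is what turns an opt-in design into the opt-out dark pattern the bullet list warns against.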

Simplify consent and privacy settings

Consent and privacy management should be simple and straightforward. Your website visitors should never feel overwhelmed by complex or confusing privacy settings.

Here’s how you can simplify consent and privacy settings:

  • Clear, jargon-free language: Avoid using complex legal terminology that users may not understand. Instead, use simple, direct language when explaining privacy policies or consent forms.
  • One-click privacy controls: Enable users to adjust cookie settings, ad preferences, and data-sharing options with minimal steps.
  • No forced data sharing: Give users access to basic features without requiring unnecessary personal information. Offer options to limit data sharing while still enjoying core functionalities.

Test for ethical UX

Even with careful planning, designs may unintentionally mislead or frustrate users. To avoid dark patterns, regularly test your user experience and gather feedback. Continuous testing will help keep your platform user-friendly, transparent, and aligned with ethical principles.

To test for ethical UX, use these approaches:

  • User testing and feedback: Conduct A/B testing, usability studies, and direct feedback sessions to identify any confusing or misleading design elements.
  • Accessibility audits: Make it easy for users with disabilities to navigate and understand your platform. Ethical design includes inclusivity.
  • Regular compliance reviews: Stay updated with global privacy regulations to help ensure your UX meets both legal and ethical standards.

Dark patterns may deliver quick wins, but they come at a high cost. Your business may face frustrated users, lost trust, and potential legal penalties. As consumers become more aware and regulations tighten, businesses that rely on these deceptive design tactics take a risky gamble, trading short-term gains for long-term damage.

But privacy compliance and data-driven success don’t have to be at odds. Businesses can collect the data they need while respecting user consent, fostering transparency, and building stronger customer relationships.

A consent management platform helps businesses achieve and maintain compliance and enables transparent data practices. Solutions like Usercentrics CMPs enable globally compliant cookie consent management and support a robust data privacy framework. We’ll help your business process data responsibly and ethically.