What are dark patterns and how do they affect consent?

With more privacy laws in place around the world, use of dark patterns and similar manipulative design techniques is increasingly risky.
Published by Usercentrics
Dec 17, 2021

Enforcement of data privacy law is growing, and new legislation to protect consumers’ privacy continues to be enacted around the world. This means that businesses that rely on user data must be sure to obtain consent for its collection and processing. How that data and consent are obtained varies, however, and manipulative web design techniques used to gather data, obtain permission to set cookies and more remain common.

These techniques are generally referred to as dark patterns. But with Gartner predicting that 65% of the world’s population will have its personal data covered by one or more privacy regulations by 2023, employing such tactics is increasingly risky.

Not sure if your company is using dark patterns on your website, or how to avoid them in favor of more transparent, user-friendly experiences? We’ll break down why dark patterns happen, why they’re not a great idea, and how businesses can avoid them while still obtaining high rates of user consent. Transparent approaches help businesses build more customer trust while also staying attentive to privacy compliance.

What are dark patterns?

US Federal Trade Commissioner Rohit Chopra defined dark patterns as: “design features used to deceive, steer, or manipulate users into behavior that is profitable for an online service, but often harmful to users or contrary to their intent.”

These practices on websites, forms, emails, apps, etc. are intended to trick or manipulate visitors into actions that an entity wants them to take, that the user didn’t intend, or sometimes that the user actively didn’t want to take. They eschew transparent communication or more benign persuasion. They aim to reduce or remove the chance that users will choose to limit access to their data, cancel a subscription, leave the website, or take other actions the business doesn’t want.

The term “dark patterns” was coined by UK-based user experience researcher Harry Brignull. While it sounds ominous and dramatic, dark patterns aren’t always overtly manipulative. They can also include poorly designed UI or UX elements that aren’t intended to be problematic, but that can nonetheless frustrate users or produce results contrary to their wishes.

Some examples of ways dark patterns can manifest include:

  • buttons or other user interface elements that encourage selecting one option over others via color, size, placement or text format
  • necessary text that is intentionally made hard to notice via size, color, or placement
  • interactive elements (like a toggle) that are extremely difficult to select or deselect
  • making the entity’s preferred action the default selection
  • a sign-up form that uses complex or confusing language and obscures what the user is really agreeing to
  • hiding or obscuring information, like pricing
  • omitting or downplaying potentially negative information
  • requiring the user to actively decline or remove options (like purchase add-ons) they don’t want and didn’t select
  • making it difficult to cancel a subscription
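
To make a few of these concrete, here is a minimal TypeScript/DOM sketch. It is purely illustrative; the element names, copy and styling are invented for this example and don’t come from any real consent tool. It contrasts a banner that nudges users via a pre-checked default and unequal button styling with a neutral alternative:

```typescript
// Illustrative sketch only: a hypothetical banner render function showing how
// defaults and visual weighting can nudge users. All names and styles are invented.

function renderConsentButtons(container: HTMLElement, useDarkPatterns: boolean): void {
  const accept = document.createElement("button");
  accept.textContent = "Accept all";

  const reject = document.createElement("button");
  // Dark pattern: the decline option is hidden behind vaguer wording.
  reject.textContent = useDarkPatterns ? "Manage options" : "Reject all";

  const marketing = document.createElement("input");
  marketing.type = "checkbox";
  // Dark pattern: the entity's preferred action is pre-selected for the user.
  marketing.checked = useDarkPatterns;

  if (useDarkPatterns) {
    // Dark pattern: color, size and contrast pull attention toward "Accept all".
    accept.style.cssText = "background:#2e7d32;color:#fff;font-size:18px;padding:12px 24px;";
    reject.style.cssText = "background:none;color:#bbb;font-size:11px;border:none;";
  } else {
    // Transparent alternative: both choices get identical styling and clear labels.
    const neutral = "background:#eee;color:#222;font-size:16px;padding:12px 24px;";
    accept.style.cssText = neutral;
    reject.style.cssText = neutral;
  }

  container.append(accept, reject, marketing);
}
```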

So why use them? Because they remove uncertainty. They make more money (at least for a while). They get companies what they want (e.g. user data). They’re easier than implementing transparent best practices. Or, sometimes, they’re simply the result of ignorance or negligence.


Types of dark patterns

Dark patterns can be part of the text or visual design elements in websites or apps. Which ones get used likely depends on what the organization is trying to get from visitors or customers. These are a few that could relate to user data privacy or consent. (Learn more about types of dark patterns.)

Trick questions

This could be a question on a form or a banner with options that at first glance appears to say one thing, but read more closely and you’re answering or agreeing to something else entirely. Double negatives and similar tricks often show up here.

Roach motels

This is a user experience tactic that obfuscates navigation. Basically, it’s easy to get into a situation, but much harder to get out. This is often found when you want to stop doing something you may have initially agreed to, like providing your consent for data collection, or cancelling a subscription. Under many privacy laws, this is illegal. It must be as easy to withdraw consent as it was to give it initially.
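
As a rough illustration of the “as easy to withdraw as to give” principle, here is a minimal, hypothetical TypeScript sketch; the class and method names are invented for this example, not taken from any real consent platform. The point is simply that withdrawal mirrors granting, with no extra hurdles:

```typescript
// Illustrative sketch only: a hypothetical consent record where withdrawing
// consent is a single, symmetric call rather than a multi-step "roach motel".

type ConsentStatus = { granted: boolean; timestamp: string };

class ConsentRecord {
  private services = new Map<string, ConsentStatus>();

  // Granting consent for a service is one call...
  grant(service: string): void {
    this.services.set(service, { granted: true, timestamp: new Date().toISOString() });
  }

  // ...and withdrawing it is exactly one call too.
  withdraw(service: string): void {
    this.services.set(service, { granted: false, timestamp: new Date().toISOString() });
  }

  hasConsent(service: string): boolean {
    return this.services.get(service)?.granted ?? false;
  }
}

// Usage: the withdrawal path takes no more steps than the acceptance path.
const record = new ConsentRecord();
record.grant("analytics");
record.withdraw("analytics");
console.log(record.hasConsent("analytics")); // false
```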

Privacy “Zuckering”

Named after the Facebook CEO, this involves using tricks to get users to share more information about themselves than they wanted or meant to. Regarding data privacy, this can show up where users are not given equal ability to consent to or refuse data collection, or the most permissive consent settings are made the default.

Misdirection

Not just for magicians! This tactic is meant to distract your attention from one thing by focusing you on another, usually while something happens with the thing you’re not supposed to be paying attention to. It can tie in to nudging, where design elements encourage you to select one option over another, like consent over rejection. Or the text could go on about how the company needs your consent for the best customer experience, or to make the website work correctly. That’s probably true, but at the same time you’re being nudged toward giving consent for marketing data collection, which isn’t essential.

Confirmshaming

This could be considered a cousin of several of the other tactics. It’s meant to psychologically manipulate users, guilting you into making the choice the organization wants, or making you feel like you’re doing something wrong or missing out if you make the other choice or no choice at all. It can range from a gray area to straight-up illegal. The portmanteau “manipulinks” is also used for a version of this tactic.

Trying to convince you that the company can’t deliver products or services (or won’t remain in business) without your consent to collect your data is an example of this. Telling you that bad things will result from saying no or cancelling a service or subscription is also common. As is implying you’re doing something stupid if you decline the “offer”. This is also referred to as “negative opt-out”. 

Nudging

Nudging itself is not a dark pattern, but depending on how it’s used can become one. Nudging refers to the use of user interface or design elements, sometimes referred to as “choice architecture”, to guide user behavior. Some nudges you don’t even notice, and they’re innocuous, even beneficial. Others are more manipulative, possibly even illegal, and that’s where they move into the category of being dark patterns.

When users encounter a cookie banner on a website, unless all relevant information is easily available to them and all of their consent (or rejection) options are displayed equally (text, color, size, placement), they are being nudged, most likely in the direction of providing more consent rather than less.
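
One way to sanity-check a banner for this kind of nudging is to compare the visual weight of the accept and reject controls. The following hypothetical TypeScript sketch does just that; the selectors and the specific properties compared are illustrative assumptions, not a definitive test:

```typescript
// Illustrative sketch only: a hypothetical check that "accept" and "reject"
// controls on a cookie banner get equal visual weight. Selectors are invented
// for this example and would differ on a real site.

function equallyProminent(accept: HTMLElement, reject: HTMLElement): boolean {
  const a = getComputedStyle(accept);
  const r = getComputedStyle(reject);
  // Compare a few of the properties the article mentions: text size, color, dimensions.
  return (
    a.fontSize === r.fontSize &&
    a.backgroundColor === r.backgroundColor &&
    a.color === r.color &&
    accept.offsetWidth === reject.offsetWidth
  );
}

const acceptBtn = document.querySelector<HTMLElement>("#accept-all");
const rejectBtn = document.querySelector<HTMLElement>("#reject-all");
if (acceptBtn && rejectBtn && !equallyProminent(acceptBtn, rejectBtn)) {
  console.warn("Consent options are not displayed with equal prominence; users are being nudged.");
}
```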

Some privacy laws do state that organizations cannot deny access to products or services if a user refuses consent — if consent is not strictly necessary for providing them — but nudging is often less obvious and more of a gray area. (Learn more: Cookie walls – what’s allowed and what isn’t?)


Why are dark patterns effective?

Dark patterns are effective for a variety of mainly psychological reasons. We do experience fear of missing out (or “FOMO”) or want to take advantage of a good deal. These tactics may play into patterns and norms so ingrained in us that the cognitive load to do the opposite — or even be aware of any manipulation — is a major effort. 

With our often limited attention spans online, nudging tactics can be highly effective without people even noticing. At this point, most people online have been exposed to them many times and we have been well trained to take the path of least resistance, which happens to be the one that gets companies what they want.

In the past, many people also haven’t been fully aware of how companies collect and use their data, but that is changing, as is concern about these practices and willingness to act on them. Awareness of and resistance to dark patterns is also growing, among consumers as well as legislators and privacy authorities.

Dark patterns and consumer relationship building

Dark patterns are designed and implemented to benefit the company using them. They may not do or request anything illegal, and may still achieve compliance with relevant privacy laws. But these actions comply with the letter, rather than the spirit, of the law, and aren’t doing anything to endear these organizations to their visitors, customers or prospects. 

Manipulation, and doing the least required to comply with the law (or skate under the radar), isn’t a great way to let your potential user base know that you’re a reputable organization that respects them and values doing business with them. It makes people less likely to give the organization what it wants, be it money or data, and also less likely to become a return customer or develop an ongoing business relationship. It can also result in people speaking negatively about the brand and thus influencing others.

It seems counterintuitive to spend time, money, and resources on web or app design and development, marketing campaigns and brand-building efforts only to employ trust-harming tactics. And the slope is a slippery one. While not all of these behaviors are clearly illegal, if there’s a complaint against a company regarding privacy or data protection, the authorities aren’t likely to give the offending organization much benefit of the doubt if it has a track record of ethically questionable or negligent behavior.

How the GDPR and CCPA address dark patterns

Interpretation and enforcement of the GDPR continue to evolve as the geopolitical and technology landscapes change, and EU regulators are looking at preventing dark pattern usage. However, dark patterns are not explicitly mentioned in the GDPR to date. A study published in January 2020, Dark Patterns after the GDPR: Scraping Consent Pop-ups and Demonstrating their Influence, found that dark patterns remain common practice, ranging from mildly questionable gray-area tactics to outright illegal, noncompliant activities.

Some EU enforcement authorities are taking more aggressive action. French data protection authority CNIL released a report in April 2019 called Shaping Choices in the Digital World, with the subtitle “From dark patterns to data protection: the influence of ux/ui design on user empowerment”. The CNIL’s €50 million fine against Google was also in part due to dark pattern usage regarding privacy settings.

Several bills have been introduced in the US Senate to address dark pattern usage, like the Deceptive Experiences to Online Users Reduction Act (DETOUR Act), which would lean on the Federal Trade Commission’s powers to curb dark pattern usage. None of the introduced legislation has been passed or become law, however.

The requirement for transparency and informed, voluntary consent is central to the current California Consumer Privacy Act (CCPA), which in theory guards against the kinds of activities that dark pattern usage encompasses. Modifications to the CCPA were proposed in October 2020, among them:

  • limiting the number of steps required for a consumer to opt out of the sale of their personal information (no more than are required to opt in)
  • prohibiting businesses from using confusing language to prevent consumer opt out
  • prohibiting businesses from requesting personal information from consumers trying to opt out when it is not necessary to complete the request
  • prohibiting businesses from forcing the consumer to read or listen to a list of reasons not to opt out while they are trying to opt out
  • prohibiting businesses from requiring consumers to search or scroll through a privacy policy, web page, etc. to find how to submit an opt out request when they have clicked “Do Not Sell My Personal Information”

Advertising and marketing associations were not fans of these proposed amendments, particularly the one about making consumers read or listen to a list of reasons not to opt out. They claimed that they would unduly hinder consumers’ receipt of “factual, critical information about the nature of the ad-supported Internet” thus “undermining a consumer’s ability to make an informed decision.”

California’s upcoming privacy law, the California Privacy Rights Act (CPRA), explicitly addresses dark patterns, defining them as: “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice, as further defined by regulation”.

Dark patterns are also referenced in the CPRA section about consent, noting that: “agreement obtained through use of dark patterns does not constitute consent.”

It is reasonable to expect that dark patterns and comparable tactics will be addressed in new privacy legislation introduced in various countries, and likely added to updates of existing laws, especially as technologies change and new applications become possible.

Conclusion

Though they’ve been in use for a long time, dark patterns are increasingly noticed, disliked, and potentially illegal. Consumers are becoming more savvy and legislators more strict. Businesses that look to trick customers and visitors, rather than opting for transparency in their communications and user interfaces, are taking a risky gamble. They are trading possible short-term data or revenue gains for long-term damage to customer goodwill and brand reputation, if not millions in regulatory fines.

It is more than possible for businesses to be privacy compliant and get the data they need, with users’ valid, informed consent, and to build better long-term customer relationships with transparency rather than trickery. 

Run a website audit to find out what your cookie risk is under the GDPR or CCPA; it’s a great first step in ensuring your website is consumer-friendly and clear about data collection requests. Or, if you have questions, chat with one of our experts. We’re here to help.