
12 Days of Privacy

‘Tis the season for transparency and trust (and treats!)

Day 1 of privacy

The holiday season is a festive feast of customer data, but responsible collection and use are vital. Every interaction — from click to check out — can build or erode trust, impacting your brand long after the holidays.

Here are key holiday shopping touchpoints where customer consent is required.

Give (and get) the gift of trust this holiday season. Privacy-centric brands aren’t just compliant. They convert better. Stay on your customers’ nice list all year round.

Day 2 of privacy

The holidays can bring digital privacy risks along with delight, and it’s not just your credit card transactions that you need to keep an eye on. Smart speakers, consoles, and connected toys often request logins, microphone access, or account pairing, all of which can open the door to new forms of data collection. With a few proactive steps, families can enjoy their new tech while limiting privacy exposure.

Start with setup: Make security part of unboxing

Before batteries go in or the device powers on, take a moment to review privacy settings, permissions, and connection options. Most devices now provide granular controls that can help to support safer and more private use.

Recommendations for better security and data privacy
Create unique logins and avoid reusing passwords

Across toys, streaming services, and email accounts. A password manager helps generate strong credentials, and masked email addresses can reduce spam and reveal when data has been shared.
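If you're curious what "strong credentials" looks like in practice, here is a minimal TypeScript sketch that builds one with Node's built-in crypto module; the length and character set are illustrative choices, not a recommendation from any particular password manager.

```typescript
import { randomInt } from "node:crypto";

// Build a long, random password from a mixed character set, using Node's
// cryptographically secure random number generator (not Math.random()).
function generatePassword(length = 20): string {
  const charset =
    "ABCDEFGHIJKLMNOPQRSTUVWXYZ" +
    "abcdefghijklmnopqrstuvwxyz" +
    "0123456789" +
    "!@#$%^&*()-_=+";
  let password = "";
  for (let i = 0; i < length; i++) {
    password += charset[randomInt(charset.length)];
  }
  return password;
}

// One unique credential per toy, app, or streaming account.
console.log(generatePassword());
```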

Turn on multi-factor authentication

Wherever offered, especially for gaming, social platforms, and device ecosystems. It’s a simple way to reinforce access security.

Check network access

And ensure your Wi-Fi is properly secured. Creating separate networks for work, guests, and devices can help isolate risks.

Update firmware right away

As many patches are released only after devices leave the factory. Older toys or gadgets may be several updates behind.

Smart devices, smarter privacy

Connected toys and AI-enabled assistants are increasingly common. They can capture snippets of conversation or behavioral patterns, so it’s important to understand how they listen — and how to limit what they store.

A brief look at how these devices operate can help you make informed choices as you configure them.

How connected toys and digital assistants listen and respond

Voice assistants listen locally for a “wake word,” such as “Alexa” or “Hey Siri.” When detected, the device begins recording and interpreting speech. Commands are typically processed in two stages: on-device wake word detection, followed by cloud-based natural language processing.
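If it helps to picture that two-stage flow, here is a short TypeScript sketch; the wake-word detector, the captured utterance, and the endpoint URL are simplified stand-ins, not any vendor's actual implementation.

```typescript
// Hypothetical two-stage pipeline: local wake-word detection, then
// cloud-based natural language processing. All names and the endpoint
// URL are illustrative stand-ins.

type AudioFrame = Float32Array;

// Stage 1 stand-in: a real device uses a small on-device model; here a
// trivial loudness threshold stands in for "did we hear the wake word?"
function detectWakeWord(frame: AudioFrame): boolean {
  const energy = frame.reduce((sum, sample) => sum + sample * sample, 0);
  return energy / frame.length > 0.5;
}

// Stage 2 stand-in: pretend we captured a short utterance after the wake word.
function recordUtterance(): Blob {
  return new Blob(["(captured audio would go here)"]);
}

async function handleFrame(frame: AudioFrame): Promise<void> {
  // The device is continuously *listening* (stage 1 runs on every frame)...
  if (!detectWakeWord(frame)) return;

  // ...but only *recording and uploading* after an activation (stage 2),
  // which is also where false activations become a privacy concern.
  const utterance = recordUtterance();
  await fetch("https://assistant.example.com/nlp", {
    method: "POST",
    body: utterance,
  });
}
```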

While wake word detection means devices aren’t continuously recording, they are continuously listening, which is what creates privacy questions for many families. 

Devices’ “listening” isn’t foolproof

False activations are possible. Music, similar-sounding words, or background noise can trigger recordings unexpectedly. There have also been cases where snippets were reviewed by human evaluators to improve accuracy.

Such incidents have raised concerns, particularly when mistaken activations resulted in private conversations being transmitted to unintended contacts. In addition, some criminal investigations have involved requests for smart speaker recordings. Providers are increasingly shifting more processing on-device to minimize exposure.

Checklist for better security and data privacy
Follow the principle of least privilege

And grant only the permissions a device truly needs.

Choose on-device processing

Where possible so information remains local.

Disable microphones or cameras

When not needed, or use a physical mic-mute switch during sensitive conversations.

Turn off always-listening modes

And review companion app access to photos, recordings, contacts, and location.

Place voice-enabled devices away

From areas where private or work-related discussions happen.

Delete recordings or history

Periodically if cloud storage is used by default. Opt out of human review programs when available.

Turn off optional features

Like drop-in or voice purchasing, especially when credit cards are linked.

Use parental controls

Voice profiles, and periodic audits of linked integrations or calendars.

Register toys to an adult

Rather than a child to reduce the amount of identifying data collected. Many children’s accounts require parental authorization by law.

Notable enforcement actions for toys, games, and connected platforms

Regulators have taken significant action in recent years against platforms, toys, games, and ed-tech providers that mishandled children’s data. These cases underscore the importance of parental consent, transparent defaults, and appropriate data handling.

Examples include:

Learn about the privacy policies of major platforms in our guide.

Bringing privacy and safety into the season

As new devices, apps, and connected toys enter your home, a few mindful steps can help your household enjoy them safely all year. Strong credentials, sensible permissions, and understanding how listening technologies operate all help reinforce confidence and control at a time when data can travel quickly.

As children explore new tools, adult guidance becomes part of their digital foundation. Helping them make informed choices about privacy equips them with safe, long-lasting habits. With a thoughtful approach to holiday tech, you can enjoy the season’s excitement while protecting your family’s data well into the new year.

Day 3 of privacy
[Embedded YouTube video]
Day 4 of Privacy

California, USA

California Consumer Privacy Act (CCPA) / California Privacy Rights Act (CPRA)

January 1: comprehensive amendments in force

  • Expanded consumer rights and contractor / processor rules
  • New dark pattern restrictions
  • Stricter purpose limitation and data minimization obligations
  • Read more

    CA AB 45: Privacy: health data: location and research

    January 1: in force

  • Stronger limits on collecting precise location and health data
  • Explicit consent required; retention limits
  • Read more

    Defending Californians’ Data Act (SB 361)

    January 1: in force

  • Requires a universal deletion mechanism
  • More disclosures required to CalPrivacy
  • Read more

    Account Cancellation Act (AB 656)

    January 1: in force

  • Platforms must provide simple, accessible account deletion flows
  • Clear confirmation and timing requirements
  • Read more

    China

    Personal Information Protection Law (PIPL)

    January 1: updates in force

  • Mandatory China-based assessments before data export
  • Requires approved transfer mechanisms or certifications
  • Heavy penalties for noncompliance
  • Read more

    Kentucky, USA

    Kentucky Consumer Data Protection Act (KCDPA)

    January 1: in force

  • Must provide opt-out for targeted ads and data sales
  • Must perform DPIAs for sensitive-data uses
  • Requires consumer rights portals (access, delete, etc.)
  • Read more

    Oregon, USA

    Oregon Consumer Privacy Act (OCPA)

    January 1: updates in force

  • Tighter rules for teens and collection of precise geolocation
  • Must honor universal opt-out signals
  • Stronger sensitive data requirements
  • Read more

    Rhode Island, USA

    Rhode Island Data Transparency and Privacy Protection Act (RIDTPPA)

    January 1: in force

  • Clear privacy notices required
  • Limits data collection to what is “reasonably necessary”
  • Prohibits selling minors’ data without consent
  • Read more

    Vietnam

    Personal Data Protection Law (PDPL) (Law No. 91/2025/QH15)

    January 1: in force

  • Requires Data Protection Officers and impact assessments
  • Explicit consent for most processing
  • Strict cross-border transfer requirements
  • Read more

    Law on the Digital Technology Industry

    January 1: in force

  • Tightens rules on digital data exports
  • Requires government approval for certain transfers
  • Additional cybersecurity obligations
  • Read more

    Colorado, USA

    Colorado Artificial Intelligence Act

    February 1: in force

  • Risk management program required for high-risk AI
  • Transparency obligations for AI decision-making
  • Consumers can appeal adverse automated decisions
  • Read more

    Brazil

    Digital Child and Adolescent Statute (ECA Digital)

    March 17: in force

  • Bans behavioral ads targeted at minors
  • Prohibits emotional analysis and manipulative interfaces
  • Requires strict parental consent and transparency
  • Read more

    Maryland, USA

    Maryland Online Data Privacy Act (MODPA)

    April 1: enforcement regarding personal data processing

  • Strict data minimization rules
  • Broad restrictions on processing children’s data
  • Purpose limitation and sensitive data restrictions strengthened
  • Read more

    United States

    Children’s Online Privacy Protection Act (COPPA)

    April 22: compliance deadline for new requirements

  • Stricter parental consent verification requirements
  • New limits on profiling, tracking, and data retention for children
  • Expanded obligations for ed-tech and connected devices
  • Read more

    United Kingdom

    Data (Use and Access) Act

    June: full implementation expected

  • Opens certain datasets for regulated access
  • New rules for data-sharing, innovation, and oversight
  • Strengthens safeguards for high-risk processing
  • Read more

    Cameroon

    Personal Data Protection Act (Law No. 2024/017)

    June 23: compliance deadline

  • Requires registration with the data authority
  • Strong consent and security requirements
  • Limits cross-border data transfers
  • Read more

    Arkansas, USA

    Arkansas Children and Teens’ Online Privacy Protection Act

    July 1: in force

  • Applies to children under 13 and teens 13-16
  • Prohibits using children’s or teens’ personal data for targeted advertising
  • Businesses must minimize data collection, provide clear privacy notices, obtain prior consent, and enable exercising rights
  • Read more

    Connecticut, USA

    Connecticut Data Privacy Act (CTDPA)

    July 1: updates in force

  • Expands definition of sensitive data (opt-in required)
  • More entities fall under scope (fewer exemptions)
  • Stricter children’s data protections and AI disclosure duties
  • Read more

    Indiana, USA

    Indiana Consumer Data Protection Act (INCDPA)

    July 2: in force

  • Must honor consumer rights (access, delete, correct)
  • Requires data protection assessments for high-risk processing
  • Consent required for sensitive data
  • Read more

    European Union

    EU AI Act

    August 2: most provisions in force

  • Requires risk assessments, transparency, and documentation
  • Strict rules for biometric and predictive AI systems
  • Read more

    EU Data Act

    September 12: new product design requirements

  • Must allow users access to data generated by connected devices
  • Requires product design enabling portability and interoperability
  • New rules for cloud service switching
  • Read more

    GDPR and Digital Omnibus

    EC released proposal November 18, 2025, expected implementation in 2026

  • Likely tighter rules for dark patterns and adtech
  • Expanded children’s privacy safeguards
  • Potential broader exemptions for tracking/cookies
  • Harmonization of enforcement procedures
  • Read more

    India

    Digital Personal Data Protection Rules, 2025 (DPDP)

    November 13: Rule 4: Registration and governance framework for Consent Managers

  • A Consent Manager is not mandatory for every company or processing activity, but registered Consent Managers must meet eligibility criteria
  • The Data Protection Board oversees Consent Managers
  • Data Principals can give/manage/withdraw consent via the Consent Manager
  • Data Fiduciaries must be able to interoperate with the platforms of registered Consent Managers
  • Read more

    Chile

    Law 21,719 (nDPL)

    December 1: in force

  • GDPR-like framework with strong rights and penalties
  • Requires DPO for many organizations
  • Tightens cross-border transfer rules
  • Read more

    Australia

    Australia Privacy Act

    December 10: updates on automated decision-making and the Children’s Online Privacy Code in force

  • New rules for automated decision-making notices
  • More specific privacy-policy obligations
  • Higher penalties and broader individual rights
  • Read more

    International

IAB Tech Lab Global Privacy Platform (GPP) expansion and Data Deletion Request Framework (DDRF) v2

    Public comment period closed December 1, 2025, expected release of final versions in 2026

  • Standardizes global consent and privacy signaling
  • Introduces a unified deletion-request framework
  • Helps enterprises manage multi-jurisdiction compliance
  • Read more

    Day 5 of Privacy

    The holiday season can be hectic, so anything that simplifies planning or shopping can feel like a lifesaver. Many families are using AI-powered tools to plan entertaining, shop efficiently, and coordinate busy schedules.

    These tools raise questions about privacy, data collection, and control. With the right knowledge and a few practical habits, holiday browsing can be smoother, safer, and more enjoyable.

Recent research from Pew Research Center shows that 73 percent of Americans are willing to use AI for everyday tasks. But 57 percent feel they have little or no control over use of these systems in their lives, and many consumers encounter AI without realizing it.

    Understanding how AI collects and uses your data

    As AI becomes more common in shopping apps, voice assistants, and browser extensions, many people use these tools without fully realizing how much data they collect. AI systems can interpret prompts, track browsing behavior, and learn from user interactions to improve recommendations.

    Common AI touchpoints during holiday shopping

These tools often rely on user inputs to tailor suggestions, including:

    • Answering questions about stores or products
    • Making recommendations for gifts, menus, or travel
    • Suggesting nearby shops and checking hours of operation or delivery cutoffs
    • Price-checking or deal-finding extensions

    What data do AI-powered tools collect?

    Depending on settings, AI tools may collect:

    • Search queries and browsing activity
    • Written or spoken prompts
    • Location data (depending on settings)
    • Shopping cart contents or purchase history

    Surfshark looks at 10 popular shopping apps and breaks down how much data they collect from you as you browse.

    Practical privacy habits for adults

    Limit what you share. AI tools rarely need very personal details to be useful. Try using neutral phrasing such as:

    • “Gift ideas for a 10-year-old who enjoys drawing” instead of naming the child
    • “Create a holiday budget template” rather than entering card details or specific purchases
    • “Turn the lights on at 5 p.m.” without providing a specific address or travel dates

    The Federal Trade Commission (FTC) provides guidance on protecting personal information and privacy online, including browser use, advertising permissions, and app usage.

    Smart data privacy management continues once your AI assistant has given you website or app recommendations. Treat consent banners as quick privacy dashboards.

Rather than automatically selecting “Accept all”, take a moment to review, and perhaps take the opportunity to explain these actions to children in an age-appropriate way:

    • Look for clear descriptions of what data is collected and for what purposes
    • Adjust granular settings if available
    • Decline optional permissions that don’t relate to your task

    Strengthen your device and account protections

Simple security checks provide valuable support for safer browsing and reduce the risk of data exposure:

    • Update devices and apps
    • Use strong, unique passwords (and ideally a password manager)
    • Enable two-factor authentication on key accounts
    • Prefer secure home networks over public Wi-Fi when logging in or checking out
    • Turn on browser fraud warnings and password-breach alerts
    • Consider checking out as a guest when possible to limit stored information

    The U.K.’s National Cyber Security Centre has helpful advice for safe shopping online for individuals.

    Check the value exchange

    If an AI tool, website, or app asks for information, it should be for a clear reason that directly relates to what you want to do.

    If asked for access to contacts, precise location, photos, or full browsing history, pause and consider:

    • Why does it need this information?
    • Does the request clearly relate to what I’m doing?
    • Can I decline and still use the tool?

    If the purpose isn’t clear, it’s reasonable to say no and go elsewhere. This is especially true for platforms aimed at children. Data privacy laws generally have strict requirements for platforms regarding access to kids’ information. 

    Teaching children to use AI safely

Children often use AI tools without realizing that some details they share may be personal. And even though the family may talk to a household AI assistant often, it’s not a friend, or even a person.

    Set boundaries and model good habits

    Kids learn by watching adults. When they see you reading or phrasing prompts carefully, adjusting cookie permissions, or avoiding oversharing, they absorb those behaviors. Where possible:

    • Enable built-in child protections or family settings
    • Use supervised accounts for younger children
    • Disable or restrict purchasing features in smart devices

Exercise strong caution with smart toys using AI, as there are few controls on them to date, and they have been found to say inappropriate and disturbing things. The impact of such smart toys on children also hasn’t yet been well studied.

    Explain “private information” in simple terms

    One helpful guideline is: If you wouldn’t tell a stranger, don’t tell an AI tool. Examples of information you don’t want to share include:

    • Full names or birthdates
    • Passwords or usernames
    • Names and locations for home, school, or activities
    • Family travel plans
    • Photos revealing locations, identities, or personal details

    Encourage kids to ask an adult before sharing anything new or if they are confused or concerned about any online information request.

    UNICEF’s guidance on AI and children outlines ways to support young users.

    Make safe browsing a shared activity

The holiday season provides opportunities for shared activities, like deciding which cookies to bake or shopping together. Be intentional about how you phrase questions or prompts to AI assistants, and identify secure websites and trusted retailers.

    Common Sense Media has some great resources on AI, like their Ultimate Parents’ Guide, AI Risk Assessments, and Guide to ChatGPT.

    Evaluating AI tools before you use them

    Take a moment to look into the design and data practices of AI-powered tools. Transparency is a strong indicator that a company respects data privacy and security.

    • Do they describe what data they collect and why?
    • Are there reports of privacy violations or problematic practices with the provider? 
    • Are there disproportionate promises in exchange for information?
    • Are users able to opt out of various tracking or personalization?
    • Can users easily customize account settings and user experience? 

    Safer, more secure holidays for a happier new year

    The benefits of AI-powered tools rely on thoughtful use. A little clarity and care can help families enjoy the convenience of AI while staying in control of their data and privacy, not just during the holidays but throughout the year.

    Day 6 of Privacy
[Embedded YouTube video]
    Day 7 of Privacy

    Demonstrating respect for data privacy builds trust and encourages ongoing donor relationships. Safe and trustworthy donation experiences are a gift that charities can give individuals and companies.

How to vet causes before giving
    Check legitimacy
    • Registration numbers
    • Verified directories
    • Transparency ratings
    Review privacy practices
    • Clear notice
    • What data is collected and why
    • Cookie/tracking disclosures
    Evaluate security
    • HTTPS (check website URL)
    • Reputable payment processor
    • Multi-factor authentication option for logins
    Watch for red flags
    • Urgency pressure
    • Unsolicited messages, especially DMs
    • Requests for excessive personal information

Privacy compliance for charities
    Follow legal requirements and privacy best practices
    • Provide a lawful basis for personal data collection and use (where required)
    • When obtaining consent for data collection and processing, ensure it’s informed and voluntary
    • Use donor data only for clearly defined and communicated purposes
    • Collect only the data necessary for the stated purposes, e.g., processing the donation
    • Ensure opt-out or consent withdrawal options are easily accessible
    • Ensure any third-party partners with access to donor data maintain appropriate data privacy and protection practices

Day 8 of Privacy

The holiday season brings high-intent shoppers and a flood of behavioral and transactional data. But are you collecting and activating it the right way?

    Trust increasingly determines whether consumers choose you or move on, and it’s fast becoming a performance metric in its own right.

    Here are 12 quick tips to help refine consent flows while reinforcing credibility, control, and transparency. Keep the season merry and bright for customers and marketers alike.

Day 1: Refresh your consent banner

You want to make a great impression on all of those holiday shoppers, not just with great deals.

    When was the last time you updated your tracking technologies or your consent banner design?

    A fresh layout with short, clear explanations and easy-to-navigate choices is a great gift for all. Make it fun with festive visuals.

Day 2: Make consent contextual

Consent doesn’t need to be a hurdle. Integrate it naturally when people create wishlists, playlists, or check out. Contextual prompts reinforce clarity and keep the journey smooth.

    People say yes more often when they understand how consent can benefit what they’re already doing.

    But remember, consent for one thing isn’t consent for everything. Just because someone bought Grandma a scarf doesn’t mean you can subscribe them to your newsletter.
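For teams wondering what purpose-specific, contextual consent can look like in code, here is a minimal TypeScript sketch; the ConsentStore shape and the purpose names are hypothetical, not a specific CMP's API.

```typescript
// Hypothetical purpose-specific consent record: each purpose is granted
// separately, so a checkout consent never implies a newsletter consent.
type Purpose = "order_processing" | "newsletter" | "personalized_ads";

interface ConsentStore {
  granted: Set<Purpose>;
}

function hasConsent(store: ConsentStore, purpose: Purpose): boolean {
  return store.granted.has(purpose);
}

// Contextual prompt: ask for the newsletter purpose at a natural moment
// (e.g., while creating a wishlist), not bundled into the purchase flow.
function onWishlistCreated(store: ConsentStore): void {
  if (!hasConsent(store, "newsletter")) {
    // Render an explicit, optional opt-in here; never pre-check it.
    console.log("Show newsletter opt-in prompt alongside the wishlist");
  }
}

// Buying a scarf only grants what was actually agreed to:
const store: ConsentStore = { granted: new Set<Purpose>(["order_processing"]) };
console.log(hasConsent(store, "newsletter")); // false, so no auto-subscribe
onWishlistCreated(store);
```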

    Day 3: Check for privacy compliance updates

    Regulations and platform requirements evolve as quickly as your marketing stack.

    A quick pre-campaign audit of your CMP settings supports privacy-compliant, dependable data flows throughout the holiday rush.

    And with new regulations arriving 1 January 2026, now isn’t the time for a long winter’s nap (or a trip to the beach if you’re in the southern hemisphere).

    Day 4: Optimize for mobile-first shoppers

Mobile usage surges during peak season. Make sure your consent banner loads quickly, displays clearly, and offers easily accessible choices.

    Small improvements can enhance trust and encourage users to stick with you. 

    Mobile users are already less patient, so don’t make the journey feel longer than a winter’s night at the North Pole.

    Day 5: Use consented data for better personalization

    With the right permissions, you can deliver fun and relevant experiences.

    Zero- and first-party data is the most accurate and reflective of customers’ needs and interests. 

    Highlight how personalization benefits them to reinforce the value exchange. For example, quicker, more targeted gift suggestions or restock alerts. A hectic time becomes easier, even enchanting.

    Day 6: Keep your purposes clear and simple

    During a busy season, clarity wins. Decision fatigue is always lurking.

    Keep purpose descriptions short and in plain language so people understand what you want, why, and how it improves their experience.

    When they trust the exchange, they engage longer and share more.

Day 7: A/B test your consent banner

If you’re optimizing holiday content, optimize your consent banner too.

    A/B testing microcopy, placement, or button text reveals what improves engagement. Watch opt-in rates and interaction patterns to guide refinements. 
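If you want a quick way to judge whether a difference in opt-in rates is more than noise, a simple two-proportion z-test works; the sketch below uses made-up numbers and a hypothetical optInZTest helper, not a feature of any particular testing tool.

```typescript
// Two-proportion z-test comparing opt-in rates for banner variants A and B.
function optInZTest(
  optInsA: number,
  totalA: number,
  optInsB: number,
  totalB: number
): number {
  const rateA = optInsA / totalA;
  const rateB = optInsB / totalB;
  const pooled = (optInsA + optInsB) / (totalA + totalB);
  const standardError = Math.sqrt(
    pooled * (1 - pooled) * (1 / totalA + 1 / totalB)
  );
  // z-score: |z| greater than about 1.96 is significant at the 5% level.
  return (rateB - rateA) / standardError;
}

// Made-up example: variant B's clearer microcopy lifts opt-ins from 58% to 63%.
const z = optInZTest(5800, 10000, 6300, 10000);
console.log(z.toFixed(2)); // about 7.2, well beyond 1.96, so unlikely to be chance
```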

    What you learn will be valuable year-round.

    Day 8: Strengthen trust through transparent messaging

    Trust drives conversions, especially when shoppers have endless options.

    Be clear about how consent and data support safer, more relevant, more personal experiences.

    Avoid vague claims. Short, well-timed explanations go a long way toward building confidence.

    Day 9: Improve page performance

    Peak traffic strains websites, and delays increase drop-offs. 

    Reducing non-essential scripts, reviewing tag load order, and avoiding heavy elements can help pages load faster — supporting user satisfaction and higher opt-in rates.
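One common pattern is to hold back non-essential tags until the page has settled and consent has been given. The browser-side TypeScript sketch below illustrates the idea; hasMarketingConsent() and the script URL are placeholders you would wire to your own CMP and tags.

```typescript
// Load a non-essential third-party tag only after the page has settled and
// the relevant consent has been granted. The consent check and the script
// URL are illustrative placeholders, not a CMP API.

function hasMarketingConsent(): boolean {
  // Placeholder: in practice, read the stored consent decision from your CMP.
  return document.documentElement.dataset.marketingConsent === "granted";
}

function loadScriptWhenIdle(src: string): void {
  const inject = () => {
    if (!hasMarketingConsent()) return; // respect the user's choice
    const tag = document.createElement("script");
    tag.src = src;
    tag.async = true; // don't block rendering of the page content
    document.head.appendChild(tag);
  };
  // Defer until the browser is idle so product and checkout content paints first.
  if ("requestIdleCallback" in window) {
    requestIdleCallback(inject);
  } else {
    setTimeout(inject, 2000);
  }
}

loadScriptWhenIdle("https://tags.example.com/marketing-pixel.js");
```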

    Give the gift of helping people get what they want, when they want it.

    Day 10: Review server-side integrations

    Reliable data flows matter even more during busy seasons. Gaps in privacy compliance or attribution are worse than a lump of coal.

    Review your server-side tagging setup to confirm consent signals are honored across the stack. Better measurement, better attribution, and more respect for user choices.
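As an illustration of what honoring consent signals can mean server-side, here is a minimal TypeScript sketch; the event shape, consent fields, and downstream endpoint are assumptions for the example, not any specific vendor's schema.

```typescript
// Minimal server-side sketch: only forward an event downstream if the
// consent signal attached to it covers the purpose in question.

interface IncomingEvent {
  name: string;
  clientId: string;
  consent: { analytics: boolean; marketing: boolean };
}

async function forwardEvent(event: IncomingEvent): Promise<void> {
  if (!event.consent.analytics) {
    // No analytics consent: don't forward, don't enrich, don't store.
    console.log(`Dropped "${event.name}" (no analytics consent)`);
    return;
  }
  await fetch("https://analytics.example.com/collect", {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify(event),
  });
}

// Example: a purchase event whose user declined analytics is never forwarded.
forwardEvent({
  name: "purchase",
  clientId: "abc-123",
  consent: { analytics: false, marketing: false },
});
```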

    It also keeps you firmly on the “Nice” list.

Day 11: Keep consent consistent across channels

Consistency across websites, apps, email, connected TV, and more reinforces trust and reduces confusion and frustration.

    You don’t need magic. Just a solid CMP to support cross-device and cross-platform consent management.

It’s especially helpful for returning customers engaging across multiple touchpoints.

    Day 12: Measure and celebrate improvements

    Seasonal data offers insights for the whole year.

    Review changes in opt-in rates, performance, and engagement to see what resonated.

Carry forward successes. Better transparency, UX, and messaging will help you build momentum for the new year and sustainable growth all year long.

    Day 9 of privacy
[Embedded YouTube video]
    Day 10 of Privacy

    Online retailers and service providers compete intensely for attention and sales during the holidays. Many resort to dark patterns that blur the line between persuasion and manipulation. We selected these five examples for how frequently they appear, how significantly they affect spending and consumers’ rights, and how much regulatory pressure they’re currently attracting.

1. Drip pricing and hidden fees
    What it is

    Customers are attracted by low upfront prices, but mandatory fees appear at checkout and inflate the final price.

    Common in

    Retail and travel.

    Spotlight

    EU Commission price-transparency sweeps have repeatedly flagged drip pricing, and University College London conducted influential research into airlines’ “99p flights” and card surcharges, which led to regulatory and industry change.

2. Obstructing cancellation (“roach motel”)
    What it is

    Easy to subscribe but difficult, confusing, or lengthy to cancel.

    Common in

    Subscriptions, memberships, streaming services, and delivery services.

    Spotlight

    The Federal Trade Commission in the U.S. alleged Amazon’s Prime cancellation flow was “labyrinthine,” resulting in a $2.5 billion settlement. Publishers Clearing House (PCH) also settled with the FTC for $18.5 million over dark patterns used to mislead consumers about entering their sweepstakes.

3. Forced continuity / subscription traps
    What it is

    “Free” or ultra-low-price trials with weak disclosure, which auto-renew at higher rates.

    Common in

    Apps and mobile games, streaming services, fitness/meal services, and SaaS.

    Spotlight

    Weight-loss app Noom settled a U.S. class action over deceptive auto-renewal and cancellation for $56 million cash and $6 million in credits. Meal kit service HelloFresh paid $7.5 million (penalties, restitution, and costs) to resolve a California enforcement action led by the Automatic Renewal Task Force regarding their practices.

4. Scarcity and urgency pressure
    What it is

    Timers, “Only 1 left,” “21 people are viewing this now,” or resetting countdowns.

    Common in

    E-commerce product pages, travel bookings, and discount events like Black Friday.

    Spotlight

    The UK’s Competition and Markets Authority (CMA) forced major hotel booking sites (e.g., Booking.com, Expedia) to halt “pressure selling” tactics such as misleading scarcity and fake urgency. Similarly, the Netherlands’ Authority for Consumers & Markets (ACM) took enforcement action against Chinese webshop Temu for using fake discounts, countdown clocks, and scarcity claims.

5. Confusing consent and privacy settings
    What it is

    Asymmetric banners, hidden or removed “reject” option, vague labels, or long opt-out paths.

    Common in

    Website cookie banners, account settings, and app onboarding.

    Spotlight

    The Norwegian Consumer Council’s “Deceived by Design” report showed how major platforms used defaults, misleading wording, and added friction. The European Data Protection Board’s (EDPB) dark pattern guidelines highlight tactics such as overloading, skipping, and obstructing.

Day 11 of Privacy

Smarter, safer holiday tech for families (and how brands can help)

    The holiday season often comes with new gadgets, games, and connected toys, along with the rush to set them up. In the excitement, it’s easy to skip privacy checks that shape how much data you and your family share. 

    A few minutes of mindful setup can help support safer and more transparent experiences for everyone, especially children.

    Resources to help make informed privacy choices

    You don’t have to investigate every device alone. There are good sources that offer clear, consumer-friendly reviews of privacy practices. These resources can help you spot red flags before you buy or set up accounts.

    • BBC reporting often highlights digital privacy issues, breaches, and risks, which are helpful when researching a device or brand.
    • Consumer Reports Digital Lab evaluates common tech products and explores what happens behind “Agree” buttons.
• Common Sense Media has Parents’ Ultimate Guides with reviews, parental control setup instructions, and more, filterable by age and platform.

    Account hygiene: Seasonal cleaning for your accounts and data

    New devices usually mean new accounts for games, apps, streaming services, and cloud backups. All of these expand your digital footprint. Taking time for basic account hygiene supports safer year-round use.

    Recommendations for better security and data privacy

    Safer digital experiences begin with small, practical steps that help you stay in control of your data — including preventing access to it entirely.

    Teaching kids to be privacy-savvy

    Kids adopt digital habits quickly. Introducing privacy concepts early helps them navigate devices and apps with confidence.

    Explain, in age-appropriate terms, why personal data matters, what devices can collect, and why some information shouldn’t be widely shared. Many kids are naturally vigilant once they understand the risks, and often remind others.

    Examples you can discuss together:

    • “Your toy can listen like a microphone. Let’s decide when that’s OK and when it should stay muted.”
    • “Usernames and account details shouldn’t include your real name, school, or address.”
    • “If an app asks for your photo or location, check with me first.”

    The Children’s Online Privacy Protection Act (COPPA) governs how children’s personal data online is accessed, used, and protected in the United States.

    Check before you connect

    Before downloading an app, pairing a toy, or creating a new account, run a quick check of the essentials. A few extra minutes can prevent unwanted data exposure and create a safer digital environment.

    How companies can give the gift of privacy and trust

    While families carry much of the work of safeguarding personal data, businesses share responsibility. Developers, retailers, manufacturers, marketers, and website owners all influence how transparent and secure consumer experiences can be.

    Building trust starts with privacy by design and considering privacy, data protection, and user rights at every stage, from concept to post-purchase support.

    A transparent approach doesn’t just reduce regulatory risk. It helps strengthen customer trust, which is the foundation of long-term relationships.

What companies can do
    Disclose

    Provide clear, accessible information within websites, apps, and consent management platforms.

    Clarify

    Communicate what data is collected, why, who can access it, and how individuals can exercise their rights.

    Support

    Offer guidance on how to adjust settings to limit data collection or processing.

    Activate

    Adopt Privacy-Led Marketing to support the responsible activation of high-quality, consented data while respecting the privacy of children and families.

    Day 12 of privacy
[Embedded YouTube video]
    Easter egg
    Questions about how you can have a more Privacy-Led 2026?

    Laws, policies, and consumers’ expectations are constantly evolving. We’re here to help.