
Predatory commerce: How AI-driven personalized pricing could kill the internet

AI & emerging tech

Surge and dynamic pricing aren’t new. But AI-driven technology is enabling personalization of pricing to a degree that consumers don’t understand, haven’t consented to, and that is potentially predatory. Here’s how it works, what the risks are, and what can be done about it.

Written by Tilman Harmeling
Read time: 5 mins
Published: Sep 2, 2025

Let’s start with a short story: You go to a bakery every morning to buy bread. So does your neighbor. One day, it occurs to you that the price of bread keeps going up and up. You decide to ask your neighbor if he has the same impression, and he says no. 

You find out that he still pays the previous prices. How can that be? You ask the baker. She stays vague and denies any differences in prices. You lose trust in buying from that bakery and start going to another one.

What happened in the bakery happens every day on many websites. We pay individual prices and accept it, because in many cases we simply don’t know that other people could be paying less than we are, based on factors we’re not informed about and have no control over.

  • From market efficiency to digital predation: AI-powered dynamic pricing has moved beyond supply and demand into individualized psychological exploitation.
  • The trust collapse: Hyper-personalized, opaque pricing can erode trust in online transactions, leading to reduced user engagement and market participation.
  • The slow death spiral: Unchecked algorithmic manipulation could fragment the internet into gated, unequal experiences, where privacy is a privilege and fair pricing disappears.

The differences among surge, dynamic, and personalized pricing

When it comes to pricing, not all algorithms are created equal. Dynamic pricing is the broadest strategy: prices ebb and flow with the market. Airline ticket prices climb as seats fill up, hotel room prices drop midweek, and electricity costs more during peak hours.

Surge pricing is a sharper, event-driven cousin. Fares spike dramatically when demand suddenly overwhelms supply, like Uber ride pricing doubling during a rainstorm or after a concert. Both models are visible and shared: everyone pays the same amount at that moment, whether elevated or not.

Personalized pricing, however, breaks this social contract. Instead of reacting to market conditions, it reacts to you. The same product might cost one shopper more than another, based on browsing history, device type, or predicted willingness to pay. 

If dynamic pricing is the tide, and surge pricing the sudden storm, personalized pricing is the invisible undertow — hidden, unequal, and potentially the most corrosive to hard-earned trust.
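To make that distinction concrete, here is a minimal Python sketch. The shopper attributes, factors, and weights are invented for illustration and don’t describe any real retailer’s model; the point is simply that the first two functions react only to market inputs, while the third reacts to the shopper.

```python
from dataclasses import dataclass

@dataclass
class Shopper:
    """Illustrative (hypothetical) profile a personalization engine might infer."""
    device_type: str               # e.g. "ios", "android", "desktop"
    visits_to_product: int         # how often they've viewed this item
    predicted_willingness: float   # 0.0-1.0 score from a behavioral model

def dynamic_price(base: float, demand_ratio: float) -> float:
    """Dynamic pricing: everyone sees the same price; it tracks market demand."""
    return round(base * (0.8 + 0.4 * demand_ratio), 2)  # demand_ratio in [0, 1]

def surge_price(base: float, demand: int, supply: int) -> float:
    """Surge pricing: a shared multiplier kicks in when demand outstrips supply."""
    multiplier = max(1.0, demand / max(supply, 1))
    return round(base * min(multiplier, 3.0), 2)  # capped at 3x for illustration

def personalized_price(base: float, shopper: Shopper) -> float:
    """Personalized pricing: the price reacts to *you*, not to the market."""
    markup = 0.0
    if shopper.device_type == "ios":
        markup += 0.05                                   # proxy for spending power
    markup += 0.02 * min(shopper.visits_to_product, 5)   # repeated interest
    markup += 0.15 * shopper.predicted_willingness       # behavioral prediction
    return round(base * (1 + markup), 2)

# Two shoppers, same product, same moment -- different prices.
base = 100.0
alice = Shopper("ios", visits_to_product=4, predicted_willingness=0.9)
bob = Shopper("desktop", visits_to_product=0, predicted_willingness=0.2)
print(personalized_price(base, alice))  # 126.5
print(personalized_price(base, bob))    # 103.0
```

Run side by side, the two shoppers get different prices for the same product at the same moment, which is exactly the shared social contract that dynamic and surge pricing still respect and personalized pricing breaks.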

Problem statement: from market efficiency to digital (personalized) predation

Once, dynamic pricing was celebrated as a triumph of economic logic, balancing supply and demand with pinpoint precision. But now, AI has shifted it into a darker realm. 

No longer simply responsive to market signals, pricing algorithms are morphing into personalized predators, tracking your every click, mood, and vulnerability to squeeze out more profit — sometimes before you even realize you’re willing to pay it.

Personalized (surveillance) pricing is no sci-fi threat — it’s here. Airlines like Delta are expanding AI-powered fare adjustments tailored to you, the individual, based on location, browsing habits, and device type. 


It’s not just airlines. Research from Carnegie Mellon University published in June 2025 shows that personalized ranking systems can actually raise prices across the board, eroding consumer welfare regardless of perceived fairness.

The trust collapse: when personalized pricing shatters consumer confidence

Prof. Maximilian von Grafenstein has published research (DE) in recent years on individuals’ attitudes toward how their data is used.

Fundamentally, he wanted to answer the question of what individuals really want to avoid when it comes to how their personal data is treated: intrusions into their private lives, behavioral manipulation, societal risks, and material harm, which can include paying more than someone else.

No one wants to pay more than someone else, unless they are convinced that they are getting more for their money than the other person and that the extra cost is worth it.

Here trust comes into play. Trust is a positive attitude towards uncertainty. It means that I may not have control in a situation, but I think everything will end up fine and no one will harm me. 

Again, individuals’ fundamental expressed belief is that their personal data should not be used to materially harm them. But what is personalized pricing if not material harm? 

When you start losing trust, the slope gets slippery fast

Trust is the fragile currency of the internet. When users realize that identical goods may carry different price tags depending on who’s shopping, the idea of “fairness” begins to erode. 

This triggers a kind of algorithm aversion. Consumers may recoil from choices endorsed by machines — even overtly rational decisions — simply because they feel manipulated.

Studies underline this shift. Personalized search rankings may feel tailored to you, but if those adjustments result in higher prices, you’ve lost, not gained. A lack of transparency deepens people’s suspicion.

When pricing operates as a black box, neither users nor businesses can easily explain why one person paid more than another. Trust bleeds away.

The slow death spiral: fragmented web, gated access, lost equality

Unchecked, algorithmic pricing fractures the internet. It morphs it into a world of digital gated communities, where only the affluent or technically savvy enjoy privacy and fair treatment, while the rest are funnelled into zones of exploitation.

The consequence? An online world where privacy becomes a luxury. Fair pricing becomes a relic. The internet risks devolving into a bifurcated society: some in free creative zones, others in algorithmic cages of ever-inflating costs. The “real world” already often operates this way: it’s incredibly expensive to be poor.

The tension: innovation vs. exploitation

AI teases dazzling efficiency gains. Think real-time pricing engines, perfectly timed offers, dynamic ticket deals that fill seats — even during off-peak hours. But there’s a fork in the road. Will these tools serve users or turn them into targets?

Do we champion innovation at the cost of eroded trust? Or do we demand the limits necessary to preserve the internet as a level playing field? That’s the defining tension.

Solution: transparency, regulation and ethical AI governance

This isn’t a call to dismantle AI. It’s a call to humanize it. It’s another facet of privacy by design. Some of the important elements that need to be included:

  1. Transparent algorithms: Pricing systems should be demystified. Explain what factors influence your price and let users opt out of invasive personalization (see the sketch after this list for one hypothetical way this could look).
  2. Ethical governance frameworks: Let’s create and enforce Dynamic Pricing Governance (DPG) to balance profitability with responsibility through transparency, empowerment, and regulatory collaboration.
  3. Regulatory guardrails: Policies could cap pricing disparities, restrict opaque personalization practices, and enforce disclosure requirements (per Xu et al., 2022).
  4. Fairness-aware algorithms: Reinforcement learning techniques can align pricing with fairness and long-term trust, not just short-term gains (per Maestre et al., 2018).
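As a rough illustration of how the transparency, opt-out, and disparity-cap ideas above could translate into code, here is a minimal Python sketch. Everything in it (the function name, the factor names, and the 5% cap) is a hypothetical assumption for illustration, not a reference to any real system, regulation, or library.

```python
from dataclasses import dataclass, field

@dataclass
class PriceQuote:
    """A price plus the factors that produced it, so it can be explained and audited."""
    final_price: float
    base_price: float
    adjustments: dict = field(default_factory=dict)  # factor name -> price impact

def fair_personalized_price(
    base_price: float,
    adjustments: dict[str, float],
    opted_out: bool,
    max_disparity: float = 0.05,   # hypothetical regulatory cap: 5% of base price
) -> PriceQuote:
    """Personalize only with consent, cap the total deviation from the base price,
    and return an itemized breakdown (the transparency requirement)."""
    if opted_out:
        # Opt-out: the shopper simply gets the base price, no profiling applied.
        return PriceQuote(final_price=base_price, base_price=base_price)

    total = sum(adjustments.values())
    cap = base_price * max_disparity
    if abs(total) > cap:
        # Regulatory guardrail: scale all factors down so the disparity stays within the cap.
        scale = cap / abs(total)
        adjustments = {name: value * scale for name, value in adjustments.items()}
        total = sum(adjustments.values())

    return PriceQuote(
        final_price=round(base_price + total, 2),
        base_price=base_price,
        adjustments={name: round(value, 2) for name, value in adjustments.items()},
    )

quote = fair_personalized_price(
    base_price=100.0,
    adjustments={"demand": 4.0, "loyalty_discount": -2.0, "device_type": 8.0},
    opted_out=False,
)
print(quote.final_price, quote.adjustments)
# 105.0 {'demand': 2.0, 'loyalty_discount': -1.0, 'device_type': 4.0}
```

The design choice here is that the breakdown travels with the price: a shopper, an auditor, or a regulator can see which factors moved the number and by how much, rather than having to trust a black box.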

Call to action: save the internet, one fair price at a time

Personalized pricing need not become the silent executioner of the internet’s soul. But time is running out. There are ways that we can all make a difference.

  • Individuals: Speak up. Demand clarity from companies you do business with. (Or before you agree to do business with them.) Advocate for your fair digital share.
  • Businesses: Recognize consumer confidence as a sustainable advantage. Don’t fall for the short-term benefits of exploitation. Use AI responsibly.
  • Policymakers: Enact transparent, consumer-first rules. Ban surveillance pricing: not just its abuses, but the mechanisms themselves.