
You check prices online for a flight to Melbourne today. It’s $300. You leave your browser open. Two hours later, it’s $320. Half a day later, $280. Welcome to the world of algorithmic pricing, where technology tries to figure out what price you’re willing to pay.
Artificial intelligence (AI) is quietly remaking how companies set prices. Not only do prices shift with demand (dynamic pricing), but firms are increasingly tailoring prices to individual customers (personalised pricing).
This change isn’t just technical – it raises big questions about fairness, transparency and regulation.
How different pricing models work
Dynamic pricing reacts to the market and has been used for years on travel and retail websites.
Algorithms track supply, demand, timing and competitor prices. When demand peaks, prices rise for everyone. When it eases, they fall. Think Uber’s surge fares, airline ticket jumps in school holidays, or hotel rates during major events. This kind of variable pricing is now commonplace.
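The demand-responsive logic behind surge pricing can be sketched in a few lines. The thresholds, cap and fares below are invented for illustration – real platforms use far richer models – but the core idea is the same: the price scales with the ratio of demand to supply.

```python
def surge_multiplier(active_requests: int, available_drivers: int) -> float:
    """Illustrative surge factor: the fare rises as demand outstrips supply.
    The 3x cap and the linear scaling are invented for this sketch."""
    if available_drivers == 0:
        return 3.0  # hard cap when there is no supply at all
    ratio = active_requests / available_drivers
    # No surge while supply covers demand; scale with the ratio, capped at 3x.
    return min(3.0, max(1.0, ratio))

base_fare = 20.0
# Quiet period: demand below supply, so riders pay the base fare.
print(base_fare * surge_multiplier(5, 10))   # 20.0
# Peak period: demand triple the supply, so the fare hits the cap.
print(base_fare * surge_multiplier(30, 10))  # 60.0
```

Every rider sees the same surge at the same moment – which is what distinguishes this market-wide dynamic pricing from the per-person pricing described below.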
Personalised pricing goes further. AI uses personal data – your browsing history, purchase habits, device, even postcode – to predict your willingness to pay. The price varies with the individual. Some call this “surveillance pricing”.
Two people looking at the same product at the same time might see different prices. A person who always abandons carts might get a discount, while someone who rarely shops might see a premium price.
A study by the European Parliament defines personalised pricing as “price differentiation for identical products or services at the same time based on information a trader holds about a potential customer”.
Whereas dynamic pricing depends on the market, personalised pricing depends on the individual consumer.
It started with airfares
This shift began with the airline industry. Since deregulation in the 1990s, airlines have used “yield management” to alter fares depending on how many seats are left or how close to the departure date a booking is made.
More recently, airlines combine that with personalisation. They draw on shopping behaviour, social media context, device type, past browsing history – all to craft fare offers uniquely for you.
Hotels followed. A hotel might raise its base rate, but send a special “member only” discount to someone who has stayed before, or offer a price drop to someone lingering on a booking page. In hotel revenue management, pricing strategies enable companies to target distinct customer segments with different benefits (such as leisure versus business travellers).
AI enhances this process by enabling automated integration of large amounts of customer data into individual pricing.
Now the trend is spreading. E-commerce platforms such as Booking.com routinely test personalised discounts tailored to your profile. Ride-share apps, grocery promos, digital subscription plans – the reach can be broad.
How AI-driven personalised pricing works
At its core, such systems mine data – a lot of it. Every click, the time spent on a page, prior purchases, abandoned carts, location, device type, browsing path – all of it feeds into a profile. Machine learning models use that profile to predict your “willingness to pay”. The system then picks the price that maximises revenue without, it hopes, losing the sale.
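The pipeline just described – predict willingness to pay, then choose the revenue-maximising price – can be sketched in miniature. The logistic demand curve, the candidate prices and the willingness-to-pay figures here are all invented for illustration; a real system would learn them from the behavioural data described above.

```python
import math

def purchase_probability(price: float, willingness_to_pay: float) -> float:
    """Toy demand curve: the further the price sits above the predicted
    willingness to pay, the less likely the sale. Purely illustrative."""
    return 1.0 / (1.0 + math.exp((price - willingness_to_pay) / 5.0))

def choose_price(willingness_to_pay: float, candidates: list[float]) -> float:
    """Pick the candidate price with the highest expected revenue
    (price multiplied by the probability of making the sale)."""
    return max(candidates,
               key=lambda p: p * purchase_probability(p, willingness_to_pay))

# A habitual cart-abandoner (low predicted WTP) vs a loyal shopper (high WTP)
prices = [80.0, 100.0, 120.0, 140.0]
print(choose_price(90.0, prices))   # 80.0 – a discount to close the sale
print(choose_price(130.0, prices))  # 120.0 – a premium this shopper will bear
```

The two shoppers are offered different prices for the same product at the same moment – the defining feature of personalised pricing.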
Some platforms go further. At Booking.com, teams used modelling to select which users should receive a special offer, while meeting budget constraints. This drove a 162% increase in sales, while limiting the cost of promotions for the platform.
So you might not be seeing a standard price; you might be seeing a price engineered for you.
The risk is consumer backlash
There are, of course, risks to the strategy of personalised pricing.
First, fairness. If two households in the same suburb pay different rent or mortgage rates, that seems arbitrary. Pricing that uses income proxies (such as device type or postcode) might entrench inequality. Algorithms may discriminate (even unintentionally) against certain demographics.
Second, alienation. Consumers often feel cheated when they find a lower price later. Once trust is lost, customers might turn away or seek to game the system (clear cookies, browse in incognito mode, switch devices).
Third, accountability. Currently, transparency is low; firms rarely disclose the use of personalised pricing. If AI sets a price that breaches consumer law by being misleading or discriminatory, who’s liable – the firm or the algorithm designer?
What the regulators say
In Australia, the Australian Competition and Consumer Commission (ACCC) is taking notice. A five-year inquiry published in June 2025 flagged algorithmic transparency, unfair trading practices and consumer harms as central issues.
The commission said current laws are insufficient and regulatory reform is urgently needed. It recommended stronger oversight of digital platforms, economy-wide unfair trading rules, and mechanisms to force algorithmic disclosure.
Is this efficient, or creepy?
We’re entering a world where your price might differ from mine — even in real time. That can unlock efficiency, new forms of loyalty pricing, or targeted discounts. But it can also feel Orwellian, unfair or exploitative.
The challenge for business is to deploy AI pricing ethically and transparently, in ways customers can trust. The challenge for regulators is to catch up. The ACCC’s actions suggest Australia is moving in that direction, but many legal, technical and philosophical questions remain.
Nitika Garg does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.