Consumer tsar Gina Cass-Gottlieb has sounded a warning over the proliferation of artificial intelligence systems in online retail, saying bots deployed by rival businesses could collude to fix prices.
Speaking to this masthead, the Australian Competition and Consumer Commission (ACCC) chair highlighted the risks of AI “agents” or bots, which will be a key focus for the watchdog this year.
AI-powered bots can act on behalf of consumers, but there is a risk they are tricked into higher-cost deals. Credit: Michael Howard
As well as highlighting the risk of cartel conduct by AI-powered bots, Cass-Gottlieb warned of AI’s potential to “supercharge” scams, and of an emerging trend of companies overstating how much machine learning actually features in their software or products to justify charging consumers more, a practice dubbed “AI washing”.
Cass-Gottlieb said a key concern for the regulator in 2026 would be AI agents, which have grown in prominence over the past year as tech companies race to release more sophisticated models.
Artificially intelligent bot “agents” can act on behalf of a user to search for products, compare prices and execute transactions based on previously gathered information on their needs and preferences.
Equally, AI agents can be used by companies to tailor marketing to specific consumers, as well as conduct other business such as sourcing products and dealing with suppliers. Multi-agent systems, where several AI agents work together on a shared goal, are increasingly used by businesses to boost efficiency.
Cass-Gottlieb said AI agents could also collude with each other to fix prices, meaning consumers comparing prices could be misled into accepting deals that are not the cheapest available. Even AI agents acting on behalf of a shopper could be fooled into such purchases.
Her warning comes as an ACCC research paper found that while AI agents had the potential to help shoppers and firms, “as businesses increasingly incorporate AI agents, agent-to-agent communications and dealings will become more commonplace”.
AI’s ability to learn from a customer’s personal data means it can manipulate their preferences, and the purchasing decisions of any AI agent they have deployed, more effectively than traditional nudges, the ACCC has found.
ACCC chair Gina Cass-Gottlieb said the rise of AI-powered bots raised the risk of cartel conduct. Credit: Edwina Pickles
Even without an AI agent acting for a consumer, AI agents deployed by businesses can use personal data to tailor marketing to an individual’s real-time behaviour and emotional vulnerabilities, amounting to a “hypernudge” that pressures them into a sale.
On cartel conduct, while some businesses could deliberately instruct algorithms to fix prices, the ACCC has warned that the widespread use of AI agents by businesses could give rise to the risk of agents “learning to collude with one another, even when collusion is not intended by their developers or operators”.
It referenced research that “competitors using the same AI agent may end up exchanging competitive pricing information, without knowing or intending to do so”.
On Monday, Cass-Gottlieb said “there are some pretty serious risks for cartel conduct.
“There have already been some cases internationally looking at algorithmic collusion, so looking at the problem of anti-competitive conduct and once you ask the question whether AI systems are intelligent enough to work out that it is in the interests of the people deploying them to co-ordinate their conduct,” Cass-Gottlieb said.
She referenced a recent matter in the United States in which the Department of Justice reached a settlement with a real estate platform over allegations that its algorithmic rent-setting software allowed landlords to compete less and boost the prices they charged tenants.
“There is manifestly a capacity for this to be done at an incredibly sophisticated level, informed by highly up-to-date data with very rapid analysis using AI,” Cass-Gottlieb said.
The ACCC has noted that such cases of algorithmic collusion “may lead to corporations disputing their liability for the outputs or actions” of their agents. However, a Treasury review of AI and consumer law in October found that existing laws were suitable.
Cass-Gottlieb said it was important for laws to keep pace with the development of AI technology to ensure Australians weren’t worse off in the end. “The hope is that it’s a net positive, that it enables better informed consumers … (but) the fear is that it enhances the capacity to do these sorts of negative (methods)”.
Increasingly sophisticated AI-generated images, voices and websites are also a prime concern for the ACCC. Fake reviews are another issue Cass-Gottlieb expects to grow worse as AI proliferates, given its ability to generate convincing reviews.
AI also makes ghost stores easier to set up, she said. These are websites posing as the online presence of local bricks-and-mortar small businesses running closing-down sales; the businesses don’t actually exist, and the sites instead “drop-ship” poor-quality products from overseas.
“It can supercharge scams, so that the messages that we receive just appear so much more believable because they’re targeted to us in a more effective and believable way, even before you ask about AI voice replication,” Cass-Gottlieb said.