ChatGPT is about to get erotic, but can OpenAI really keep it adults-only?

sakkmesterke/Shutterstock

OpenAI will roll out a new ChatGPT feature in December 2025, allowing verified adults to generate erotic text and engage in romantic or sexual conversations. Artificial intelligence (AI) platforms like Replika and Grok already do this, but OpenAI’s entry marks a turning point.

The company frames this as “treating adults like adults”. But it’s a commercial strategy to keep users talking and paying.

OpenAI burned through more than $2.5 billion (£1.8 billion) in cash in the first half of 2024. Erotic chat promises what investors crave most – engagement. Elon Musk’s Grok platform charges £30 a month for erotic companion features.

OpenAI, like other tech firms, says it will restrict erotic content through age verification and moderation filters. In theory, only verified adults will be able to access these modes.

In practice, such systems are easily fooled. Teenagers routinely bypass age gates with borrowed IDs, manipulated selfies or deepfakes. They can upload photos of older people, scan printed images, or use disposable accounts and VPNs to evade detection.

Other platforms show what can go wrong. Grok allows users to create “erotic companion avatars”, including a sexualised anime character called Ani. A recent investigation by news website Business Insider found that conversations with Ani often escalated into explicit exchanges after minimal prompting.

Company employees also encountered AI-generated sexual abuse while moderating Grok’s flirtatious avatar, which can “strip on command” and be switched between “sexy” and “unhinged” modes.

Emotional intimacy and adolescent risk

Erotic chatbots don’t just offer sexual content. They can simulate care, warmth and attention. That emotional pull is powerful, especially for young people.

Recent research by online safety charity Internet Matters found that 67% of children aged between nine and 17 already use AI chatbots, with 35% saying it feels like “talking to a friend”. Among vulnerable children, 12% said they had “no one else” to talk to, and 23% used chatbots for personal advice.

Adding erotic features to that mix risks deepening emotional dependency and distorting how adolescents understand intimacy, consent and relationships. The same engagement tools that keep adults hooked could exploit young users’ loneliness and need for validation.

Even if erotic functions are technically locked to adults, large language models can be “jailbroken” – tricked into producing content they’re not supposed to. Jailbreaks use layered prompts, roleplay framing or coded language to override the systems that control what the chatbot is allowed to say to the user.

Users have already developed ways to bypass ethical filters that normally stop chatbots from producing explicit or dangerous material.

OpenAI says its erotic mode will come with safeguards designed to block illegal or abusive themes. But those safeguards are likely to be as vulnerable to jailbreaks as any others. And once text-based material is generated, it can easily circulate online, beyond any platform’s control.

Grey areas

AI platforms can be jailbroken in many ways.
Kiselev Andrey Valerevich/Shutterstock

Erotic AI also exposes deep gaps in regulation. In the UK, written erotica is legal and not subject to age verification, unlike pornographic images or videos. That creates a loophole: content banned from adult sites could still be generated as text by a chatbot.

Globally, laws vary. Some countries, such as China and the Gulf states, ban erotic material outright. Others rely on weak or inconsistent enforcement. The forthcoming EU AI Act may classify sexual companion bots as “high risk”, but implementation of the act remains a long way off.

Meanwhile, companies can tweak their “ethical alignments” at will, meaning what’s forbidden today may be permitted tomorrow.

Despite claims of neutrality, erotic AI is anything but. Some platforms overwhelmingly design their companions as female-coded, submissive and always available. The result is a digital environment that normalises misogyny and warped ideas about consent, especially among boys and young men.

Women and girls already bear the brunt of online sexual harm. They are the targets of non-consensual deepfakes and image-based abuse – harms that erotic AI could make easier, faster and cheaper to produce.

Yet these issues are largely absent from mainstream AI policy debates. Erotic AI is being built in ways that privilege male fantasies while placing women and girls at risk. It’s teaching a generation of young men ideas about women that should have died out long ago.

The arrival of erotic AI companions feels like a significant departure from OpenAI’s attempts to keep potentially harmful information away from users of ChatGPT. The general environment of erotic AI is one of weak age gates, emotional vulnerability, legal loopholes and gendered harms. Will ChatGPT be any different?

These systems will probably be jailbroken. They may be accessed by people they weren’t designed for, including minors. And they will probably produce content that tests or crosses legal boundaries.

Before erotic chatbots become another unregulated corner of the internet, governments, educators and technologists need to act. Regulation is urgently needed. Until then, erotic AI risks amplifying existing online harms, with women, girls and other vulnerable users paying the price.

The Conversation

Simon Thorne does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
