It’s 1 a.m. The argument is over, but you keep running it back in your head anyway. You replay the tone, timing and that one sentence that landed wrong. So you open an artificial intelligence (AI) chatbot and type, “Am I right or am I overreacting? What do I say to what they said? What did they mean by XYZ?”
Research on attachment, emotion regulation and online discourse helps explain why turning to AI is becoming increasingly popular. The reassurance it provides, however, can consolidate a one-sided interpretation far too quickly and, ultimately, train expectations that real relationships struggle to meet.
But for many, that’s where relationship support begins nowadays. The privacy of AI chatbots has made them the space people go to first, especially when the alternative — professional help or family and friends — often involves paying, explaining yourself at length or risking judgment at exactly the moment you feel least steady.
However, while it’s private and only a keyboard click away, should we be looking to AI chatbots for neutral relationship advice?
Why does AI feel like support?
At a time when therapy is expensive or out of reach, and most relationship learning comes from media rather than practical skill-building, immediacy can be deeply appealing for some.
The appeal intensifies when relationship talk implicates identity. Questions like “Am I needy? Am I unlovable? Am I the problem?” carry shame, which makes disclosure feel risky. A chatbot offers a low-stakes space to narrate events and voice what might feel too exposed with friends or family.
Notably, chat-based relationship coaching can feel immediately satisfying, and research on reward-based engagement in online platforms suggests that quick, reinforcing feedback can encourage people to come back again and again, forming an addictive effect that chatbot interfaces may amplify.
Related work on chatbots also finds that when users feel a sense of closeness with AI, they report higher satisfaction and stronger intentions to reuse it, which helps explain why the use of these tools can become a habit instead of a one-time check-in. Interestingly, recent research also notes that people with anxious attachment styles are more likely to become emotionally dependent on AI.
From anonymous forums to algorithmic advice
Before chatbots, people often did this work through anonymous crowds in forums like Reddit, and research on online disclosure and support communities shows that anonymity and low social cost can increase willingness to share, especially around stigmatized or emotionally charged experiences.
In those spaces, you can disclose without being fully known, gather language from strangers and feel less alone with your own thoughts. AI distills that and suggests next steps, which can make disclosure easier while also nudging one reading of the situation into something that feels settled.
A quick, overly simplified fix.
However, over time, instant affirmation can train expectations for constant reassurance and rapid closure that intimate relationships rarely sustain, since intimacy develops through slower, reciprocal work under strain.
AI as a relationship rehearsal room
In practice, people use AI for far more than moments of crisis.
Many use it as a communication coach, such as for drafting messages after conflict, softening tone and practising repair language before they speak.
Others use it as a rehearsal room for difficult conversations or as a planning tool for reconnecting, whether through date ideas, routines or small rituals that rebuild intimacy after distance.
It also shows up readily in the less visible work of relationships.
That could look like asking AI about the benefits of planning sex, how to navigate menopause and vaginal dryness or what lubricant to use with a dilator after cancer treatment. Here, AI helps make sense of situations that may be difficult to discuss with others and helps bring clarity to an unfamiliar field.
Where this becomes complicated is not simply that people use AI, but how its structure changes what counts as a good explanation. Because the system only has access to one narrated perspective, it can produce a coherent interpretation with high confidence while omitting potentially major details like context, history, power dynamics or what the other person actually said.
Assisting, but not replacing, relational work
Though its output may seem coherent, AI readily compresses nuance into a single storyline and commits to a single conclusion. A chatbot can only respond to what it is given; trained professionals probe, clarify and notice gaps.
This isn’t only informal use of general chatbots either. Some tools are explicitly designed to mimic relationship coaching and therapeutic support, like Mojo or Amanda.ai, and some are even designed to function as an “AI companion” and romantic partner.
AI’s appeal also comes with real costs and risks, including energy-intensive infrastructure, corporate and political interests that shape what these systems learn and reproduce, the possibility of misinformation when nuance is missing and privacy concerns when an individual’s intensely personal disclosures are routed through data systems they do not control.
AI can support reflection and communication, but the substance of a relationship is still built and repaired in real time through choices partners make with each other. So, if you want a nuanced human answer, then just ask the humans in your life what they meant when they said “XYZ.”
Maha Khawaja does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.