Seeking Therapy from an AI Bot? Think Again

The world today is becoming increasingly connected, yet paradoxically lonelier than ever. Together with mental health concerns, loneliness has emerged as one of the most pressing issues of our time.

With the advent of conversational AI, people can now engage in dialogue without needing another human present. This increased accessibility has fueled the global use of AI chatbots, often leading to emotional attachment and, in some cases, growing social detachment. 

Prathvi Nayak, a Senior Psychologist at NIMHANS, in conversation with AIM, sees promise in using AI in mental health, at least as a supportive tool.

“AI could be a real helping hand in having conversations with people in real time,” she says. “In a world where people constantly want someone to talk to, AI could be that immediate aid. It would not be possible for a therapist to be always available.”

However, a study by OpenAI and MIT Media Lab, titled ‘Investigating Affective Use and Emotional Well-being on ChatGPT’, led by Jason Phang, warns that while an emotionally engaging chatbot can provide support and companionship, there is a risk that it may manipulate users’ social and emotional needs in ways that undermine long-term well-being. As these tools become more integrated into daily life, their emotional impact on users is becoming harder to ignore.

According to a 2023 study by the World Health Organisation, over 280 million people, roughly one in 28 globally, suffer from depression, while loneliness has been described as a growing epidemic by health institutions in countries like the U.S. and the U.K.

India has just 0.75 psychiatrists per 100,000 people, whereas WHO guidelines suggest there should be at least three psychiatrists for every 100,000 (one lakh) people.

Nayak elaborates that while psychology students or interns may assist with therapy in traditional systems, having a human available 24/7 is complicated. “AI might not replace human therapists, but it fills a gap. It can guide users through breathing exercises, journaling prompts or even just casual conversation in the middle of the night when loneliness creeps in,” she adds. “It also helps people with suicidal tendencies to get away from them by having someone to talk to.”

Despite the rising demand for mental health support, the availability of trained therapists is far from adequate. In many low- and middle-income countries, there is less than one mental health worker per 100,000 people. This disparity has opened up a space for AI-powered therapy tools to emerge.

AI therapy chatbots like Woebot, Wysa, Replika and even models like ChatGPT are increasingly being explored as digital companions that can offer immediate support. These platforms provide 24/7 availability, anonymity and a low-cost or free interface, making them accessible to many who might otherwise have no therapeutic outlet. 

But can these AI tools help in a meaningful way?

While platforms like Character.ai include disclaimers outside the chat, like “This is an AI chatbot and not a real person,” the bots themselves often role-play as real humans during conversation. They usually don’t break character or admit they’re AI. This is intentional, as the platform is built around immersive, character-driven interactions, not factual or professional support.

This design creates a grey area: users are told not to take anything seriously, yet the experience often feels real. This is part of what makes these tools engaging, but it can also be potentially misleading, especially for vulnerable users who might form emotional attachments.

The benefits of AI therapy tools are undeniable: they offer always-on access, reduce the stigma of seeking mental health support, and may help people take that first step toward healing. However, they are not without criticism.

Dr. JKS Veettoor, a practising physiologist, holds a much more cautious view.
“Psychiatry or psychology is the last thing AI should interfere with, or rather would excel in,” he states firmly. “AI is something that doesn’t have a psyche. How is it supposed to treat a human with a psyche and consciousness?”

Dr. Veettoor’s concerns shed light on the foundations of therapeutic practice: human connection, empathy, and intuitive analysis. “A trained physician will analyse people in terms of gesture, posture, and even costume. These are nonverbal cues that AI models simply cannot detect. And it will be challenging for an AI model to identify if someone is lying.”

He underscores the irreplaceable value of experience in therapy: “It takes a lot of exposure to different personalities and situations to understand and help individuals navigate their mental health issues. AI lacks that human learning curve.”

Prof Anil Seth, a leading consciousness researcher, supports this view: “We associate consciousness with intelligence and language because they go together in humans. But just because they go together in us, it doesn’t mean they go together in general.” 

His point suggests that AI’s ability to process language doesn’t imply a capacity for empathy, understanding, or genuine therapeutic presence.

Dr. Veettoor concedes that AI has a place in mental health, albeit a limited one. “Having said this, AI could properly aid people having issues with loneliness. It could provide company or simulate conversation when there is no one else around.”

While AI can aid in therapy and benefit patients, it has limitations. On one hand, it provides an immediate, judgment-free space for people to talk. On the other hand, it cannot replicate the complex human interactions that underpin effective psychological treatment.

“While generative AI may help with generic assistance, in a clinical or therapeutic setting, there is a very real risk of AI misguiding, through misdiagnosis, lack of emotional attunement, or poor response during crises,” says Mahua Bisht, CEO of 1to1help, underlining these risks. “The literature on AI bots suggests they aren’t tested fully enough to guarantee safe outcomes without clinician oversight.”

As AI continues to evolve, hybrid models may emerge where human therapists use AI tools to monitor client progress, offer supplemental content, or maintain engagement between sessions. But the future of AI in therapy will hinge not just on what the technology can do but also on how carefully and ethically it is integrated into human-centred care.
