ChatGPT as your therapist? UAE mental health experts warn of serious risks

AI therapy: A helpful tool, but not a replacement for human empathy

Lakshana N Palat, Assistant Features Editor
A healing relationship isn't just based on knowledge or clinical skills. It arises from a therapist's presence and the emotional sensitivity of understanding what a person went through in their adulthood and childhood.
AP

What was life even like before we could say, ‘Just ChatGPT it?’ 

Well, there was Google.

Google’s still here. But for many today, the first stop isn’t a search engine. It’s ChatGPT. Or Grok. Ask a question, and you’ll get a sharp, condensed answer in seconds. You’re saved the extra effort of wading through links. 

And now, beyond answering everything from dinner ideas to historical trivia (with the usual disclaimer that it can make mistakes), ChatGPT has taken on another role: For some, it’s become a form of therapy.

Beyond the growing body of research, a scroll through Reddit turns up a range of answers. In one thread, a user tentatively asks, “Can I use AI as a therapist?”

The responses are mixed. One user answered, “Absolutely, if you know what you are doing.” They go on to explain how to discuss problems with ChatGPT, saying that ‘you have to bring individual well-defined problems as opposed to trying to have long-winded conversations’. They add, “Create multiple coaches to help with big areas that you are trying to work on, like procrastination, exercise, or breaking out of your shell.”

Others shared emotional experiences, describing how ‘calm and soothing’ ChatGPT was and how it had helped them in different ways. But some expressed scepticism: “The power of a therapist is them being able to understand your situation and using their knowledge to tailor your experience. I am not sure ChatGPT is up to par with this.”

But what do mental health professionals say? Can AI really delve into the messiness and complexity of healing and trauma?

‘It’s a complement to traditional therapy. Not a replacement’ 

Dr Saliha Afridi, director of LightHouse Arabia, a wellness clinic in Dubai, acknowledges AI’s virtues, but offers a few caveats. “It’s a great tool that supplements mental health care. It is an excellent tool to help in areas like screening and diagnostic support, and to provide relevant, practical coping skills based on evidence-based modalities like CBT (Cognitive Behavioural Therapy), ACT (Acceptance and Commitment Therapy), or DBT (Dialectical Behavioural Therapy).”

It closes the gap in learning, especially for communities that have limited access to mental health professionals. But, as she says, it is a complement to traditional therapy, not a replacement.

‘AI cannot replace human empathy’

Yet a healing relationship isn’t built on knowledge or clinical skills alone. It arises from a therapist’s presence and the emotional sensitivity to understand what a person went through in their childhood and adulthood. AI, by contrast, “doesn’t suffer, rejoice, or understand what it means to be vulnerable. Humans interpret emotions in rich, often ambiguous contexts, shaped by culture, history, body language, tone and timing,” explains Dr Elena Gaga, a clinical psychologist at the Hummingbird Clinic, Dubai.

As Dr Afridi emphatically puts it, there is a significant and irreducible difference between support tools and therapeutic relationships. She highlights that research shows the most powerful healing factor in talk therapy is not the specific technique or modality, but the quality of the relationship between therapist and client.

When people process traumatic life experiences in the presence of a therapist who is grounded, attuned, regulated and containing, their capacity increases and they are able to meet that difficult experience in a different way. “That is the power of co-regulation, something that AI cannot do,” she adds.

Empathy is not scripted, rehearsed or mere pattern recognition. It is living, personal and shaped by a client’s experience, body language and emotional cues. AI may simulate empathetic knowledge, but it cannot embody the deeply human process of emotional resonance. “That’s what makes therapy transformative,” adds Dr Afridi.

Empathy cannot be coded

Even if empathy can be mimicked, healing is a more layered, turbulent process altogether.

Asmaa Alkuwari, executive coach, TEDx speaker, author, and founder of You’re Not Alone, a community dedicated to creating a safe, supportive, and empowering space for women, expresses a similar view. “Empathy is a presence, not a pattern. AI might mimic what empathy sounds like, but as a coach, I know that healing comes from noticing subtle shifts, cultural undercurrents and body language. That’s not something that you can code, at least not meaningfully.”

When AI-based therapy can turn harmful

Some traumas and wounds run far too deep to be healed by AI, as both Dr Afridi and Alkuwari say. “When the person presents with complex trauma, psychosis or personality disorders, it requires a high degree of nuance, containment and deep relational attunement,” explains Dr Afridi. “Some of these disorders also require empathic confrontation, which an AI will not provide. An AI might inadvertently retraumatise, invalidate or misinterpret meaning.”

Alkuwari echoes similar sentiments, particularly “when dealing with trauma, abuse, or identity-based challenges, where family dynamics, shame, and societal expectations run deep. An AI system, no matter how advanced, doesn’t carry the cultural literacy or contextual understanding to navigate these complexities. As someone trained in digital humanities, I know how power structures, gender norms, and language shape experience. Using AI without understanding these nuances risks reducing people’s pain to algorithms, and that’s dangerous.”

What does the research say?
A 2024 study published in PLOS Mental Health asked over 800 Americans to evaluate therapy responses written by ChatGPT-4 and real therapists.

  • Most people couldn’t tell the difference, guessing just 5% better than chance.

  • Surprisingly, ChatGPT scored higher on empathy, cultural sensitivity, and warmth. But when people thought a human had written the reply, they rated it more favourably, highlighting a bias against AI rather than the quality of the response.

Empathy is not scripted, rehearsed or mere pattern recognition. It is living, personal and shaped by a client’s experience, body language and emotional cues. AI may simulate empathetic knowledge, but it cannot embody the deeply human process of emotional resonance
Dr Saliha Afridi, Clinical Psychologist and Founder of LightHouse Arabia

Relying on ‘quick fix’ healing rather than actual psychological growth

You don’t heal with information alone. You heal through processing: real, uncomfortable, painful confrontation with yourself. Sometimes, you need to reach that point before the steps that follow can work.

And while many AI mental health apps are built around immediate feedback loops, quick journaling, mood check-ins, or supportive messages, as Dr Gaga points out, these tools can condition users to seek fast relief rather than engage in the often slow, nonlinear path of growth. “Over time, users might come to expect emotional regulation on demand, rather than developing tolerance for ambiguity or distress.”

By contrast, deep therapeutic work often happens within a relationship, one that mirrors, challenges, and reshapes a person’s relational patterns. AI lacks the interpersonal presence and responsiveness that foster attachment repair, emotional validation, or corrective experiences.

As Dr Afridi adds, “Knowledge isn’t the same as wisdom. People can learn to cope, but deep processing, integration and wisdom-making are best facilitated with the guidance of a trained therapist. AI can support, but it cannot shepherd someone through the terrain of their psyche, the way a human therapist can.” 

What’s more, responses depend on the user’s input. Dr Alia El Naggar, Assistant Professor at the School of Health Sciences and Psychology, Canadian University Dubai, explains, “AI-generated feedback is shaped by the way a question is framed. The prompts it receives determine the direction and quality of its answers. This makes AI highly reactive, but not necessarily insightful in the way a human therapist would be.”

A real therapist reads between the lines, challenges the thinking, and redirects a conversation based on intuition and nonverbal cues. “AI lacks true interpretive depth. It cannot assess context beyond the literal language it’s given. If a user frames a question in a biased or leading way, AI may fall into the pattern of confirmation bias, reinforcing the user’s assumptions rather than challenging them constructively.”

Therapists are trained to notice when a client avoids a topic, manipulates a narrative, or uses cognitive distortions like black-and-white thinking or catastrophising. “They can guide the conversation with intentional therapeutic strategies,” explains Dr El Naggar.

AI does have the advantage of being available 24/7, offering instant support and structure outside scheduled therapy hours. In that sense, AI can do certain things that therapists can’t, like providing consistent reminders, managing progress logs or delivering structured exercises on demand...
Dr Alia El Naggar, Assistant Professor at the School of Health Sciences and Psychology at Canadian University Dubai

The rise of AI-based therapy

Where do we stand with AI-based therapy? Will it actually replace mental health professionals in future? As Dr Gaga explains, “It is possible that in the future we will see a more sophisticated use of AI providing manualised therapies such as CBT-based self-help and psychoeducation. We might also see chatbots and virtual therapists analyse behavioural data to provide tailored, personalised services.”

She adds that there might also be advances in multimodal AI and biometric monitoring. “This could in the future enable passive monitoring of users’ emotional states, allowing for non-invasive and continuous mental health assessments.” Nevertheless, she maintains these tools will most likely assist with data collection and progress tracking, freeing clinicians to focus on high-impact therapy work.

If the AI fails to respond empathetically or misunderstands the user’s distress, it can lead to feelings of invalidation, loneliness, or betrayal. AI tools can also misjudge the severity of a mental health crisis, especially in nuanced or culturally specific expressions of distress...
Dr Elena Gaga, clinical psychologist at Hummingbird Clinic

‘Use the tool, but know its limits’ 

As the experts make clear, AI tools are certainly helpful. They can generate quick task-oriented solutions and tracking forms, monitor daily mood logs and remind clients to complete therapy homework. As Dr El Naggar explains, “AI does have the advantage of being available 24/7, offering instant support and structure outside scheduled therapy hours. In that sense, AI can do certain things that therapists can’t, like providing consistent reminders, managing progress logs or delivering structured exercises on demand. While it can’t replace the emotional depth, intuition or human connection that a therapist provides, it does support therapy by taking over logistical and repetitive tasks.”

AI can be programmed to respond, but it cannot witness and use instinct as a means to connect, build rapport and reflect back. And there’s a world of difference between being heard and being held. That’s what we must preserve in a world that’s becoming increasingly digital.
Asmaa Alkuwari, Executive coach and TEDx speaker

They firmly maintain that it cannot be the sole form of support for clients with complex or severe mental health issues, where human judgement and clinical supervision are essential. In those cases, relying solely on AI is dangerous.

AI has its place. But it must be used with caution, clarity and always with the understanding that it cannot replace the therapeutic relationship. As Alkuwari concludes, the space between a coach and client holds stories, tears, silence, and breakthroughs. AI can be programmed to respond, but it cannot witness and use instinct as a means to connect, build rapport and reflect back. And there’s a world of difference between being heard and being held. That’s what we must preserve in a world that’s becoming increasingly digital.

Lakshana N Palat, Assistant Features Editor
Lakshana is an entertainment and lifestyle journalist with over a decade of experience. She covers a wide range of stories—from community and health to mental health and inspiring people features. A passionate K-pop enthusiast, she also enjoys exploring the cultural impact of music and fandoms through her writing.
