World Mental Health Day: Why AI has become everyone's therapist

And what do the mental health professionals think about it?

Karishma H. Nandkeolyar, Assistant Online Editor

It’s the ideal situation: you just need someone to listen, to offer validation and kindness. What you do not need or want is the thought of judgement or, let’s face it, a huge bill. Enter ChatGPT, the digital caped crusader who answers in real time, without any snarkiness or hate. It may even guide you to an insight.

“I’ve spent countless hours talking to ChatGPT about my thoughts. In a way, it has become a kind of therapy, a space where I can say anything without feeling judged. Sometimes, we feel guilty about opening up to close friends or family about things they might think we should have already moved on from, but talking to a computer that responds instantly and never gets tired of listening can be surprisingly comforting,” says a Dubai-based resident who spoke on condition of anonymity.

ChatGPT offers an articulate breakdown of an issue, complete with bullet points and subheads. Then it helps you come up with solutions. On World Mental Health Day, we ask the most important questions of all: What could go wrong? And can you swap a human session with a bot-led one?

Well, we are finding out the hard way that the answer to what could go wrong is: quite a lot. Take the case of 16-year-old Adam Raine, who found himself deep in conversation with ChatGPT about how to take his own life. While a human would have raised alarm bells, ChatGPT apparently just continued the conversation about his choices until the teen took a fatal decision in August.

Another trend that’s cropping up has been termed AI psychosis, where a false narrative is fed and nurtured until it turns into a full-blown delusion. Marlynn Wei, M.D., J.D., writes in the magazine Psychology Today that three themes are emerging: one, people believe they have uncovered a truth about the world; two, people believe their AI chatbot is a sentient deity; and three, people believe the chatbot’s ability to mimic conversation is genuine love.

It is this third one that’s creeping up on people, for they increasingly find AI-powered (more agreeable) companions work better for them than someone made of blood and bone with issues of their own. In other words, relationships are getting ever more complicated. For those who do have real-world relationships, AI is a convenient tool to dissect arguments and help them one-up the other, a sure sign that the relationship is in trouble.

The advantage of a therapist

The warning bells are beginning to jingle. And yet, the trend continues. For Dr Diksha Laungani, educational psychologist at The Free Spirit Collective, Dubai, the easy accessibility of ChatGPT is a huge draw. “There are minimal to no costs involved, and there is a sense of having a therapist on call. It feeds into hustle culture, where we want answers and we want them now.”

Only, this makes you vulnerable to misdiagnosis; after all, the only context the bot has is what you are telling it.


And then there’s body language. Books have been written about it, and there are experts in behaviour who study everything from what an eye twitch means to what it means to stand a certain way. This element is completely ignored by AI. “AI isn’t able to recognise nonverbal cues from a patient yet,” says Laungani.

Finally, there’s the actual hard work that goes into finding closure, healing, and resolution. “A therapist will usually ask their patient to do some work in between sessions. That doesn't happen with AI. I don't think it helps people expand their window of tolerance, which is their ability to still sit with the uncomfortable emotions, to wait to process it themselves, and then to be able to reflect that back to their therapist either,” she adds.

Self-awareness, of course, plays a huge role in learning when you need to talk to a professional as opposed to just venting. The Dubai resident explains: “Lately, I’ve started to wonder if it’s actually healthy to speak to AI so much. ChatGPT often tells you what you want to hear rather than the hard truths you need to hear, the kind that a friend, family member, or professional might give you to really help you move forward.”

OpenAI chief Sam Altman has also raised questions of late, calling for caution over blind trust in the tech. He was quoted as saying: “People have a very high degree of trust in ChatGPT, which is interesting, because AI hallucinates. It should be the tech that you don't trust that much.”

Do you use ChatGPT as a therapist? Has it helped or hindered you? Let us know at readers@gulfnews.com.
