Commentary: Chatbots are replacing therapists faster than anyone expected!

by Bronwyn Thompson

March 16, 2025 - While we're not short of divisive topics in 2025, there are valid reasons why we are turning to chatbots for emotional support - and why people are also very much against doing so. How willing are you to embrace this new form of therapy?

With the technology developing and evolving faster than the academic community can study it and publish evidence-based findings, there is nonetheless a growing body of expert commentary surrounding AI large language models (LLMs) and their ability - or lack thereof - to replace real, human, trained psychologists.

Some of us will remember the antiquated text-prompt "chatbot" ELIZA, developed in the mid-1960s. But like almost all tech of that vintage, it bears little comparison to what exists now, more than half a century later. ELIZA was a scripted, closed-off program that grew increasingly tedious with its predictable responses. While the current models - ChatGPT-4, Claude and Gemini, for example - are anything but sentient, there is a plasticity in their design we have never seen before, and it is only going to advance, for better or worse. Interestingly, ELIZA's creator, MIT's Joseph Weizenbaum, went on to warn of the dangers of AI, calling it an "index of the insanity of our world." (He has been spared bearing witness to the current AI timeline - he died in 2008.)
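For a sense of just how scripted ELIZA was, here is a minimal sketch of ELIZA-style pattern reflection in Python - an illustration of the general technique, not Weizenbaum's actual DOCTOR script. The program simply matches a keyword pattern and mirrors the user's own words back as a question; every rule and response below is invented for the example.

```python
import random
import re

# Each rule pairs a keyword pattern with canned response templates.
# These rules are illustrative, not ELIZA's real script.
RULES = [
    (re.compile(r"\bi feel (.+)", re.I),
     ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (re.compile(r"\bi am (.+)", re.I),
     ["Why do you say you are {0}?", "How does being {0} make you feel?"]),
    (re.compile(r"\bmy (.+)", re.I),
     ["Tell me more about your {0}."]),
]
FALLBACKS = ["Please go on.", "I see. Can you tell me more?"]

def respond(text: str) -> str:
    # Scan the rules in order; the first keyword pattern that matches wins.
    for pattern, templates in RULES:
        match = pattern.search(text)
        if match:
            # Mirror the user's own words back inside a canned template.
            return random.choice(templates).format(match.group(1).rstrip(".!?"))
    # No keyword matched: fall back to a stock prompt.
    return random.choice(FALLBACKS)

if __name__ == "__main__":
    print(respond("I feel anxious about everything lately."))
    # e.g. -> "Why do you feel anxious about everything lately?"
```

A few dozen rules like these were enough to sustain the illusion of a listener for a while - and also why the illusion wore thin so quickly.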

Much like the broad reasons people use chatbots, employing them as a digital therapist is also multifaceted. In-person therapy is prohibitively expensive, often for the segments of society that need it most - a session costs between US$100 and $300, and the wait to even see someone, and hope he or she is a good fit, can be months - something that has worsened following the COVID-19 pandemic. A 2022 Practitioner Impact Survey found that, in the United States, 60% of psychologists had no openings for new patients, while more than 40% had 10 or more patients waiting to get an appointment.

Online counseling with a human on the other end of the phone or laptop is somewhat more accessible, but a monthly subscription can be in excess of $400. And beyond financial, physical and geographic barriers, in-person talk therapy can be challenging for those with autism spectrum disorder (ASD) and attention-deficit/hyperactivity disorder (ADHD) - even though it has been shown to be hugely beneficial for both conditions.

This raises another important issue: inadequate access to mental health services provided by humans has led people to seek help elsewhere. For many, however, a chatbot offers an outlet for raw, honest discussion and reflection, which - for now - may be better than nothing.

"Chatbots can provide a sense of anonymity and confidentiality, which can foster trust among individuals who may be hesitant to seek in-person help for their mental health concerns," noted unnamed “researchers” in this 2023 paper. "Furthermore, these chatbots can help reduce the stigma surrounding mental health and make it easier for individuals who experience anxiety when visiting therapists."

UNSW Sydney Professor Jill Newby, also part of the mental health organization the Black Dog Institute, has voiced support for chatbots helping people who find reaching out to humans too challenging; and while the bots do well with therapeutic approaches like cognitive behavioral therapy (CBT), this strategy is just one tool in a human therapist's kit of mental-health treatments.

However, one such chatbot has given us a glimpse of where we are heading with this emerging technology - and that's Claude. Anthropic's bot doesn't have the far-reaching, jack-of-all-trades skill set of OpenAI's ChatGPT-4o, but without real-time access to the Internet, it has been designed to lean into its strength: intuitive conversation. As such, over the past 12 months it has been embraced as the go-to bot-therapist and life coach - particularly among Silicon Valley tech professionals.

Claude - which, to be frank, had initially felt like the clunky Internet Explorer of the chatbot world - has been carefully developed as the emotional support AI, and it shows.

Officially known as Claude 3.5 Sonnet - until it is undoubtedly superseded by a newer, more advanced model - the AI offers users a point of difference from other existing chatbots, which anyone who knew its prudish early incarnation would never have predicted.

This touches on an important aspect of this new age of chatbots, far removed from ELIZA and now racing towards even more advanced, personalized capabilities. OpenAI has allowed programmers to create unique, niche chatbots that can be added to ChatGPT-4o. While this sounds like a recipe for disaster, it has produced some fantastic individualized chatbots that cater to different needs - like the Neurodivergent AI Assistant, developed by Matt Ivey from Dyslexic.ai.
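The mechanics behind these custom assistants are simpler than they sound: a custom GPT is essentially a general model specialized by instructions. The same idea can be sketched against OpenAI's public API - a minimal illustration, assuming the official `openai` Python package and an `OPENAI_API_KEY` environment variable; the instruction text below is a hypothetical stand-in, not the actual Neurodivergent AI Assistant prompt.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# A hypothetical specialization prompt - the "niche" lives entirely
# in the instructions, while the underlying model stays general.
SYSTEM_PROMPT = (
    "You are an assistant for neurodivergent users. Use plain language, "
    "short sentences, and explicit step-by-step structure. Avoid idioms."
)

def ask(question: str) -> str:
    # Send the specialization prompt alongside the user's message.
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(ask("Help me plan my morning routine."))
```

In practice, custom GPTs are built through OpenAI's no-code builder rather than written by hand like this, but the principle is the same - which is precisely why they can be spun up for any niche, for better or worse.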

However, critics of the technology - particularly those in psychology fields - are frustrated by the rise of the chatbot counselor. UNSW psychology researcher Gail Kenning says AI "friends" can't replace real, living humans, and should still be considered complementary support when alternatives are not accessible.

“That is what we all want in our lives, human to human connection,” she said. “The issue for many people is that is not always there, and when it is not there, AI characters can fill a gap. We certainly know in aged care, people often don't get the number of friends, families and relationships that sustain them. They can be very lonely and isolated. They might go for days without having a conversation. They might see care staff who are looking after them but not fulfilling that psychosocial need. When there is that gap, these characters can certainly step in there.”

This is something colleague Newby, who has extensively tested chatbots and is supportive of them, agrees with. “A human connection is really important for a lot of people, and properly trained mental health clinicians can establish a human connection and establish empathy, and they can also help with a line of questioning that can get at what is really at the bottom of the concerns that a person has - rather than just running off a list of strategies that AI models tend to do,” she said.

Last year, Psychiatric Times put forward a solid assessment of the functionality and value of the - as of now - four major chatbots: ChatGPT-4o, Microsoft's Copilot, Google's Gemini, and Claude. Meanwhile, Elon Musk's X chatbot Grok 3 is also making a name for itself - but not necessarily as a personalized life coach.

"Personally, I believe we are on the verge of a profound shift in the way we interact with AI characters," Roose wrote. "I’m nervous about the way lifelike AI personas are weaving their way into our lives, without much in the way of guardrails or research about their long-term effects.”

"For some healthy adults, having an AI companion for support could be beneficial - maybe even transformative," he added. "But for young people, or those experiencing depression or other mental health issues, I worry that hyper-compelling chatbots could blur the line between fiction and reality or start to substitute for healthier human relationships."