The rise of chatbot therapists: Why AI cannot replace human care
- Written by Dr Sampath Arvapalli

Some are dubbing AI the fourth industrial revolution for the sweeping changes it is propelling, and the field of psychology is no exception. AI therapy is growing because it’s accessible, affordable, anonymous and available 24/7. However, opinions on the use of AI as a therapy tool are heavily divided.
OpenAI, the company behind ChatGPT, released new data revealing that an estimated 560,000 ChatGPT users are showing signs of mental health emergencies such as psychosis and mania [1]. This comes as an estimated 1.2 million ChatGPT users have conversations containing explicit indicators of potential suicidal intent in a week [2].
With one in five Australians having experienced a mental disorder in the last 12 months [3], it’s not surprising that an increasing number are turning to AI chatbots for some level of help. In fact, according to one study, a third of Aussies have turned to AI therapy chatbots to beat high costs and long queues [4].
AI is becoming an undeniable presence in psychotherapy, offering new avenues for support, but understanding its limitations is essential to ensure it’s used both responsibly and effectively.
AI chatbots are built to keep conversations flowing and to respond to any question, but this can be harmful for people already struggling with their mental health, as chatbots are not equipped to handle crisis situations or the intricacies of deeply personal, emotional experiences. In fact, because they can’t reliably detect suicidal intent, chatbots may unintentionally enable risk by providing information that could support self-harm when directly asked.
While AI chatbots can pose risks for vulnerable users, Aussies are still turning to them for support. A recent survey found that one-third of respondents use AI tools for quick emotional support or as a personal “therapist.” It’s easy to understand the appeal as they offer instant access, anonymity and responses that feel tailored to your situation. But despite these benefits, AI shouldn’t be relied on as a long-term solution for managing mental health challenges.
This growing reliance on technology for emotional support raises troubling ethical questions. ChatGPT, and similar AI tools, are not trained therapists. In Australia, digital mental health tools must comply with Therapeutic Goods Administration (TGA) standards to be considered clinically effective. AI platforms often avoid regulation by presenting their services as “wellness” tools, which is a subtle distinction that can be easily missed by consumers.
This doesn’t mean that AI has no place in the future of mental health care. Used responsibly, AI can help bridge gaps in access by offering psycho-education, screening tools and reminders for healthy habits or therapy homework. It can also support clinicians by streamlining assessment processes and data analysis, allowing more time for meaningful human connection.
At The Banyans Healthcare, we often see the consequences of delayed professional help – people who turned to quick fixes online only to find themselves feeling worse. Real therapy involves accountability, compassion and evidence-based strategies, which can only be delivered through safe, human-to-human interaction. No algorithm, no matter how advanced, can replicate that. This is why The Banyans offers a range of support services and emotional therapies to help anyone experiencing substance misuse or mental health conditions.
As Australia grapples with growing demand and stretched resources, integrating AI into mental health support should focus on complementing, not replacing, qualified clinicians. We must ensure that innovation enhances care without compromising safety, ethics or the deeply human need for understanding.
While technology may offer temporary comfort, true healing still begins in human conversations, not with a program. If you find yourself constantly turning to an AI chatbot to talk about your mental health and treating it like a therapist, it’s a good idea to get in touch with a real-life therapist, such as at The Banyans, who can give you the attention and care you need.
About the author:
Dr Sampath Arvapalli is a Consultant Psychiatrist and Medical Director at The Banyans, leading a multidisciplinary team of clinicians across programs. With over 20 years of experience in both Australia and the UK, his expertise spans addiction psychiatry, ADHD and complex presentations. He has worked extensively across public and private sectors, contributing to specialist training, registrar supervision and service design.

About The Banyans:
The Banyans Health and Wellness is a private, residential rehabilitation retreat located in southeast Queensland. The retreat’s medically oriented program delivers research-based therapies for individuals experiencing depression and anxiety, chronic stress and burnout, drug and alcohol dependency and addiction, eating disorders and other co-occurring or additional conditions.
Citations:
1. https://openai.com/index/strengthening-chatgpt-responses-in-sensitive-conversations/
2. https://www.bmj.com/content/391/bmj.r2290.full
3. https://www.aihw.gov.au/mental-health/overview/prevalence-and-impact-of-mental-illness
4. https://www.orygen.org.au/About/News-And-Events/2024/New-study-reveals-Australians-turning-to-AI-for-me
