
The rise of chatbot therapists: Why AI cannot replace human care

  • Written by Dr Sampath Arvapalli


Some are dubbing AI the fourth industrial revolution for the sweeping changes it is propelling, and the field of psychology is no exception. AI therapy is growing because it’s accessible, affordable, anonymous and available 24/7. However, opinion on the use of AI as a therapy tool is heavily divided.

OpenAI, the company behind ChatGPT, released new data revealing that an estimated 560,000 ChatGPT users are showing signs of mental health emergencies such as psychosis and mania [1]. This comes as an estimated 1.2 million ChatGPT users have conversations with explicit indicators of potential suicidal planning or intent each week [2].

With one in five Australians having experienced a mental disorder in the last 12 months [3], it’s not surprising that an increasing number are turning to AI chatbots for some level of help. In fact, according to one study, a third of Aussies have turned to AI therapy chatbots to beat high costs and long queues [4].

AI is becoming an undeniable presence in psychotherapy, offering new avenues for support, but understanding its limitations is essential to ensure it’s used both responsibly and effectively. 

AI chatbots are built to keep conversations flowing and to respond to any question, but this can be harmful for people already struggling with their mental health, as chatbots are not equipped to handle crisis situations or the intricacies of deeply personal, emotional experiences. In fact, because they can’t reliably detect suicidal intent, chatbots may unintentionally enable risk by providing information that could support self-harm when directly asked.

While AI chatbots can pose risks for vulnerable users, Aussies are still turning to them for support. A recent survey found that one-third of respondents use AI tools for quick emotional support or as a personal “therapist.” It’s easy to understand the appeal: they offer instant access, anonymity and responses that feel tailored to your situation. But despite these benefits, AI shouldn’t be relied on as a long-term solution for managing mental health challenges.

This growing reliance on technology for emotional support raises troubling ethical questions. ChatGPT, and similar AI tools, are not trained therapists. In Australia, digital mental health tools must comply with Therapeutic Goods Administration (TGA) standards to be considered clinically effective. AI platforms often avoid regulation by presenting their services as “wellness” tools, which is a subtle distinction that can be easily missed by consumers.

This doesn’t mean that AI has no place in the future of mental health care. Used responsibly, AI can help bridge gaps in access by offering psycho-education, screening tools and reminders for healthy habits or therapy homework. It can also support clinicians by streamlining assessment processes and data analysis, allowing more time for meaningful human connection.

At The Banyans Healthcare, we often see the consequences of delayed professional help: people who turned to quick fixes online only to find themselves feeling worse. Real therapy involves accountability, compassion and evidence-based strategies that can only be delivered through safe, human-to-human interaction. No algorithm, no matter how advanced, can replicate that. This is why The Banyans offers a range of support services and emotional therapies to help anyone experiencing substance misuse or mental health conditions.

As Australia grapples with growing demand and stretched resources, integrating AI into mental health support should focus on complementing, not replacing, qualified clinicians. We must ensure that innovation enhances care without compromising safety, ethics or the deeply human need for understanding.

While technology may offer temporary comfort, true healing still begins in human conversations, not with a program. If you find yourself constantly turning to an AI chatbot to talk about your mental health and treating it like a therapist, it’s a good idea to get in touch with a real-life therapist, such as those at The Banyans, who can give you the attention and care you need.


About the author:

Dr Sampath Arvapalli is a Consultant Psychiatrist and Medical Director at The Banyans, leading a multidisciplinary team of clinicians across programs. With over 20 years of experience in both Australia and the UK, his expertise spans addiction psychiatry, ADHD and complex presentations. He has worked extensively across public and private sectors, contributing to specialist training, registrar supervision and service design. 


About The Banyans:

The Banyans Health and Wellness is a private, residential rehabilitation retreat located in southeast Queensland. The retreat’s medically oriented program delivers research-based therapies for individuals experiencing depression and anxiety, chronic stress and burnout, drug and alcohol dependency and addiction, eating disorders and other co-occurring or additional conditions.

Citations:

  1. https://openai.com/index/strengthening-chatgpt-responses-in-sensitive-conversations/ 
  2. https://www.bmj.com/content/391/bmj.r2290.full 
  3. https://www.aihw.gov.au/mental-health/overview/prevalence-and-impact-of-mental-illness
  4. https://www.orygen.org.au/About/News-And-Events/2024/New-study-reveals-Australians-turning-to-AI-for-me
