AI as an Aid, NOT a Replacement for Human-Led Therapy: Why Human Judgment Will Always Lead in Mental Health Care

By Lyndsay Babcock

Director & Principal Psychologist – The Self Centre Australia

Artificial intelligence is rapidly changing the landscape of healthcare, and mental health care is no exception. From mood-tracking apps and online chatbots, to AI systems that can detect early signs of depression through voice or text, technology is increasingly being used to support psychological wellbeing. 

At The Self Centre, we see this AI revolution as both promising and a reason for caution.

AI can enhance access, assist in screening, and provide valuable tools for self-awareness. But AI cannot replace the depth, discernment, and empathy of a trained psychologist. Healing is not about information; it’s about transformation, and that depends on human clinical judgment.

Clinical judgment is the skilled, evidence-informed decision-making process psychologists use to assess, interpret, and respond to a client’s needs with empathy and professional reasoning. 

It differs from being critically judgmental, which involves forming personal opinions or criticisms about someone’s behaviour or character rather than making objective, compassionate, and therapeutic decisions. 

Psychologists are extensively trained to exercise clinical judgment, drawing on research, ethical frameworks, and empathy to guide treatment decisions rather than being judgmental (which has no place in a therapeutic setting). 

How AI Is Helping Mental Health Care 

AI-powered mental health tools can play a valuable supporting role in care. They offer: 

  • Accessibility: 24-hour support for people in remote or rural areas who may not have easy access to a psychologist. 
  • Early detection: Algorithms can flag patterns in speech, writing, or biometric data that may suggest rising distress or depressive symptoms. 
  • Self-management tools: Apps that prompt mindfulness, track mood, or offer psychoeducation can empower people to take small, consistent steps towards mental wellbeing. 

Tech futurist Sinead Bovell often describes AI as “an amplifier of human potential”, a tool that can make care more efficient, not a replacement for human expertise. In this sense, AI can extend the reach of mental health services, improving engagement between sessions or helping clients track their progress over time.

At The Self Centre, we utilise technology that complements evidence-based therapy and empowers our clients in their healing, not systems that attempt to replace the valuable human connection.

Understanding AI’s Limits: Data Isn’t Judgment 

AI thrives on data and pattern recognition. It analyses vast amounts of information, identifies correlations, and predicts outcomes based on probabilities. But it doesn’t understand why something is happening or how to respond accordingly, and that’s where human clinical judgment becomes essential.

Clinical judgment is the psychologist’s ability to interpret context, history, and emotion; to weigh risk; to integrate evidence with empathy; and to tailor treatment for each individual. As Gerd Gigerenzer’s work on human decision-making shows, humans excel at reasoning through uncertainty — a skill no algorithm possesses. 

In therapy, clinical judgment determines how and when to challenge a client’s beliefs, when to explore trauma, when to pause for regulation, or when to involve additional supports. It’s dynamic, relational, and profoundly human. 

AI, on the other hand, operates without context or moral awareness. As Joanna Bryson notes, AI can imitate intelligence, but it cannot replicate values. It lacks the moral reasoning and intuitive understanding that guide psychologists in navigating sensitive, ambiguous, and deeply personal experiences. 

AI and Empathy: The Missing Ingredient 

Despite rapid advances in affective computing (systems designed to detect and respond to human emotions), AI cannot truly experience empathy. Researcher Maja Pantić has shown that while algorithms can recognise facial expressions or tone of voice, they often misinterpret emotion, especially across cultures or in trauma survivors.

Empathy in therapy is not simply recognising sadness; it’s attuning to it, staying with it, responding to it appropriately, and creating a safe environment where healing becomes possible. Machines can mirror empathy linguistically, but not emotionally. This is where the danger lies: without human clinical judgment, vulnerability can quickly be exacerbated.

At The Self Centre, our psychologists bring the capacity for genuine attunement, reading not only words but body language, timing, and energy. It’s this depth of presence that creates the neural and emotional conditions for change.

The Future: Collaboration Between Humans and Machines 

The future of mental health care will likely be a hybrid: collaboration between human expertise and intelligent tools. AI can support psychologists in a growing number of ways, for example:

  • Streamlining assessment: Automating data collection and symptom tracking so therapy sessions can focus more on meaning and less on measurement. 
  • Enhancing insights: Analysing progress over time to help therapists adapt strategies more effectively. 
  • Improving reach: Offering preliminary guidance or educational content to those not yet ready to engage in therapy. 

As Max Louwerse notes in Understanding Artificial Minds Through Human Minds, the key is recognising AI’s strengths and its limits. Machines can extend our capabilities, but they cannot replace human consciousness, compassion, or ethical reasoning. 

At The Self Centre, we aim to integrate technology thoughtfully to support care, not substitute it. Whether through secure telehealth, guided programs, or digital tools that complement therapy, we use technology to make psychological support more accessible and effective, but it is always led by human judgment and evidence-based practice. 

AI Without Oversight: The Risks 

While AI has potential, it also carries risks when used without professional guidance. Research by Michal Kosinski shows how algorithms can infer personal traits and vulnerabilities from online behaviour, sometimes more accurately than humans. Without ethical oversight, this data can be misused or commercialised. 

Unregulated AI “therapy” bots also risk offering, and have been known to offer, unsafe or inaccurate advice. They cannot assess risk, identify suicidality, or provide the nuanced containment that trauma processing requires, and they likely never will, because while they can draw on data points, they cannot exercise clinical judgment.

At The Self Centre, we hold ourselves to the highest professional standards, aiming to provide privacy, safety, and clinical excellence in every human interaction.

Human Judgment: The Anchor of Healing 

In mental health care, clinical judgment is where science meets soul. It’s what allows a psychologist to recognise when symptoms are masking trauma, when resistance is actually self-protection, or when a client needs silence instead of strategy. 

Clinical judgment is not just expertise; it’s wisdom built through years of experience, supervision, and reflection. AI may guide, assist, or support, but it will never feel or discern in the way a human mind can.

At The Self Centre, that human judgment underpins every therapeutic interaction and decision. We integrate evidence-based modalities like EMDR, CBT, and schema therapy with empathy and professional insight, tailoring every session to the person in front of us, not the data behind them. 

Technology Can Support Healing — But Humans Create It 

As Sinead Bovell often reminds us, “The future of technology isn’t artificial, it’s human-centred.” 

At The Self Centre, we couldn’t agree more. AI can support mental health care, but healing requires warmth, intuition, and connection, things that cannot be programmed. 

If you’re looking for therapy that’s informed by science but grounded in humanity, our team of psychologists offers evidence-based therapy both in-person and online across Australia.