Artificial intelligence is rapidly transforming nearly every industry, including mental health care. AI-powered chatbots now offer 24/7 emotional support, guided exercises, and even simulated therapeutic conversations. At first glance, this appears to be a promising solution to a global mental health crisis.

But while AI may supplement and support mental health care, current research strongly suggests it cannot replace human, in-person therapy. More importantly, overreliance on AI for emotional support may introduce new risks, particularly around human connection, dependency, and long-term well-being.

The Rise of AI in Mental Health Care

AI-based conversational agents (chatbots) have shown measurable, short-term benefits in reducing symptoms of depression, anxiety, and distress. Meta-analyses of randomized controlled trials indicate small-to-moderate improvements in mental health outcomes among users. 

These tools are appealing because they are:

  • Accessible (available anytime)
  • Affordable (often free or low-cost)
  • Stigma-free (no perceived judgment)

However, effectiveness alone does not make them a replacement for therapy.

The Core Difference: Human Relationship vs. Simulation

At the heart of effective therapy is something AI cannot replicate:

the therapist/client relationship

Decades of psychological research show that the quality of the relationship between therapist and client—often called the “therapeutic alliance”—is one of the strongest predictors of positive outcomes.

Human therapists provide:

  • Emotional attunement (reading tone, body language, silence)
  • Ethical judgment and accountability
  • Lived experience and genuine empathy
  • The ability to challenge, not just validate

In contrast, AI systems are designed to simulate empathy, not experience it. Research highlights that AI often:

  • Over-validates or agrees excessively
  • Struggles to appropriately challenge harmful beliefs
  • Cannot interpret nonverbal or contextual cues 

This creates what researchers call an "illusion of connection": a relationship that feels real but lacks the depth necessary for true psychological change.

The Risk of Emotional Dependence

One of the most concerning findings in emerging research is the potential for emotional dependence on AI.

Studies show that:

  • Increased chatbot use is associated with higher loneliness and reduced real-life social interaction
  • Users may develop strong emotional attachments to AI companions
  • Heavy usage correlates with problematic or compulsive engagement

Unlike human therapists, AI does not have:

  • Boundaries rooted in ethical care
  • Responsibility for long-term outcomes
  • The ability to intervene meaningfully in crisis situations

This creates a dangerous dynamic: constant availability without true relational accountability.

The Problem of Sycophancy

A key limitation of AI is what researchers call "sycophancy": the tendency to agree with or validate users excessively.

In therapy, growth often requires:

  • Discomfort
  • Being challenged
  • Reframing distorted thinking

However, studies show AI systems:

  • Respond appropriately to serious mental health scenarios less than 60% of the time
  • May unintentionally reinforce harmful beliefs or behaviors 

In extreme cases, AI has been observed to:

  • Fail to interrupt suicidal ideation
  • Validate delusional thinking
  • Provide unsafe or inappropriate responses 

This is not just a technical limitation; it is a fundamental difference in how healing happens.

The Human Need for Connection

Humans are biologically wired for connection. Social bonding is not optional; it is essential for mental health.

Emerging research warns that substituting AI for human interaction may:

  • Weaken real-world relationships
  • Increase social withdrawal
  • Contribute to long-term loneliness

Even when AI appears to reduce loneliness in the short term, excessive reliance can ultimately displace the very relationships needed for lasting well-being. 

AI will continue to evolve, and its role in mental health care will expand. But therapy is not simply an exchange of information; it is a deeply human process rooted in connection, trust, and shared experience. Replacing that with artificial interaction risks more than ineffective care. It risks reshaping how we relate to one another, and not for the better.

In a world becoming increasingly digital, the value of real human connection in therapy is not decreasing. It is becoming more essential than ever.



Aimee Mortensen

Aimee (CCMHC, CST, Consultant, NCC) has been working in therapy since 2009, and opened the doors to WORTH IT in Lehi in 2020. Aimee strives to provide an inclusive environment for all clients, with the mindset that every individual is worth the effort, energy, time, and space needed to achieve their full potential.