
Can AI Replace Therapy? What the Current Research Actually Says

  • North Shore CBT Centre
  • 6 min read

Artificial intelligence (AI) is quickly becoming part of everyday life, including mental health care. AI-powered chatbots and mental health apps now offer support anytime and anywhere. For some people, these tools feel comforting, accessible, and genuinely helpful. For others, they raise understandable concerns about safety, emotional depth, and what might be lost without a human therapist involved.


It is natural to wonder: Can AI replace therapy? And if not, how should we be using it?

While no single study can answer these questions definitively, a growing body of research helps clarify where AI appears to help, where its limits remain, and how it may be most responsibly used alongside human therapy.


Common Concerns About AI in Therapy

Much of the conversation around AI in mental health centers on a few recurring worries. People wonder whether AI can respond appropriately in moments of crisis, whether it risks over-validating unhelpful thinking patterns, or whether relying on a chatbot might delay or replace seeking real human help. Others question whether meaningful psychological change is possible without a human relationship, or whether AI can truly understand the complexity of human emotion.


These concerns are not abstract. They reflect real questions about safety, effectiveness, and what therapy is meant to provide.


Are These Concerns Unique to AI?

Not entirely. Human therapy can also fall short. Not every therapist is effective, and not every therapeutic relationship leads to meaningful change. Over-reassurance, missed risk, or a focus on short-term coping can occur in human therapy as well. These challenges are not unique to AI.


So the issue is not whether AI is “perfect” compared to humans. Neither is. A more useful place to look is how AI tends to work in a therapy context, what that allows it to support well, and where important limitations naturally emerge.


How AI Tends to Work in a Therapy Context

AI-based mental health tools rely almost entirely on what is expressed explicitly. When someone types a concern, describes a symptom, or asks a question, the system responds based on the words that are provided. This makes AI especially well suited for tasks that are clear and structured, such as guiding someone through an exercise, offering consistent prompts, helping organize thoughts, or responding immediately to what is written.


For many people, this kind of support can be genuinely helpful. AI tools can make coping strategies easier to access, support reflection between sessions, or offer structure while someone is waiting for therapy. In these contexts, AI may reduce distress and help people stay engaged with their mental health rather than feeling stuck or unsupported. For many individuals, this kind of support can lower barriers to care that might otherwise prevent them from receiving help at all.


At the same time, this way of operating also sets natural limits on what AI can respond to.


How Human Therapy Works Differently

Human therapy is shaped not only by what is said, but also by what is not said. In conversation, people communicate through pauses, hesitations, changes in tone, shifts in posture, and emotional reactions that may be difficult to put into words. Therapists often notice when someone goes quiet, changes direction mid-sentence, avoids a topic, or reacts emotionally in ways they may not fully recognize themselves.


These moments are not side details. They actively guide the therapeutic process. They influence when a therapist slows down, softens an intervention, asks a different question, or gently names something that has not yet been spoken. Over time, this moment-to-moment responsiveness helps shape how therapy unfolds and how deeply it can reach.

In contrast, AI systems can only respond to what is explicitly provided. What is not typed or stated simply does not register in the same way. Humans often treat what is missing as meaningful information, while AI treats it as nonexistent. This difference is not a flaw or a moral failing of the technology. It reflects the current limits of systems that rely primarily on explicit input rather than relational and embodied feedback in real time.


Does This Difference Actually Matter?

While it may be easy to notice how AI and human therapy differ, an important question remains: do these differences actually influence therapeutic outcomes in meaningful ways? In other words, does the more personal, emotionally attuned nature of human therapy translate into real differences in how effective treatment is over time?


Across large reviews of dozens of studies, AI-based mental health tools consistently show short-term benefits comparable to human-delivered psychotherapy. People using AI chatbots often report reduced symptoms of anxiety and depression over periods of several weeks, particularly when the tools are used for psychoeducation, structured cognitive-behavioural strategies, symptom tracking, and emotional support.


While short-term improvements are fairly consistent across studies, the picture changes when researchers look at long-term outcomes.


Across reviews, the benefits of AI-based interventions tend to:

  • Peak around 6 to 8 weeks
  • Weaken or disappear by 3 months
  • Lack strong evidence for durability beyond that point


This pattern contrasts with human-delivered psychotherapy, particularly cognitive behavioural therapy (CBT), where improvements often persist or even strengthen over time.


Importantly, this does not inherently mean that AI “fails”. Rather, it reflects the current state of the evidence. Most studies are short in duration and long-term follow-up is rare. Many AI therapy tools are evaluated before being fully validated. Finally, much of the research relies on self-report measures from relatively narrow samples lacking in diversity.


Even so, while the evidence base is still emerging, the pattern across studies is surprisingly consistent: AI-based tools tend to help people feel better in the short term, but those gains are less likely to persist over time.


This raises an important question. If AI can reduce distress initially, why do those improvements often fade?


Coping Versus Lasting Change


Researchers cannot yet say why long-term effects are weaker, only that they appear to be. However, one possible explanation, supported indirectly by patterns in the data, is the difference between coping and lasting change.


AI tools are very good at helping people calm down, organize their thoughts, reframe distressing ideas, and feel supported in the moment. These skills matter. They can reduce suffering and help people get through difficult or even critical periods.


Long-term therapeutic change often depends on something deeper. Over time, therapy typically involves learning to recognize patterns as they occur, tolerating uncertainty and discomfort, revising deeply held beliefs about oneself and others, and developing insight through an ongoing interpersonal relationship.


Research across many forms of psychotherapy suggests that these kinds of changes do not develop through information or exercises alone. How skills are practiced, supported, and carried forward into daily life appears to matter. Ongoing corrective experiences within a relationship often play an important role in sustaining change over time. Importantly, change often unfolds not during moments of harmony in the therapeutic relationship, but when moments of strain occur and are later repaired.


This distinction may help explain why AI-based tools tend to show strong short-term benefits but less consistent long-term outcomes, though specific research on long-term effects is still emerging.


So Where Does This Leave Us?

Seen this way, the most useful question may not be whether AI can replace human therapists, but where AI can meaningfully support mental health and where human involvement appears to matter most.


The evidence so far suggests that AI-based tools can be especially helpful for short-term support, structured learning, and increasing access to care. They may be valuable while someone is waiting for therapy, between sessions, or as a supplement to ongoing treatment. In contrast, human therapy often involves building trust, navigating misunderstandings, repairing moments of disconnection, and gradually reshaping deeply held beliefs. These processes tend to unfold within a relationship and remain difficult to replicate without another person involved.


From this perspective, AI and human therapy are not interchangeable. They can address different aspects of mental health care and may be most effective when used in complementary ways. AI can offer structure, accessibility, and immediate support. Human therapy can provide relational depth, flexibility, and the conditions that support enduring psychological change.


As research continues to evolve, so will our understanding of how these tools fit into mental health care. For now, the evidence suggests a balanced conclusion. AI can play a meaningful role in supporting mental health, but it is best understood as a tool rather than a replacement for human therapy.


During her practicum at the North Shore CBT Centre, Lena is offering reduced-cost CBT for youth and adults.


Selected References

Gutierrez, G., Stephenson, C., Eadie, J., Asadpour, K., & Alavi, N. (2024). Examining the role of AI technology in online mental healthcare: Opportunities, challenges, and implications, a mixed-methods review. Frontiers in Psychiatry, 15, 1356773. https://doi.org/10.3389/fpsyt.2024.1356773


Li, H., Zhang, R., Lee, Y.-C., et al. (2023). Systematic review and meta-analysis of AI-based conversational agents for promoting mental health and well-being. npj Digital Medicine, 6, 236. https://doi.org/10.1038/s41746-023-00979-5


Wampold, B. E., & Imel, Z. E. (2015). The great psychotherapy debate: The evidence for what makes psychotherapy work (2nd ed.). Routledge/Taylor & Francis Group.


Zhang, Q., Zhang, R., Xiong, Y., Sui, Y., Tong, C., & Lin, F. H. (2025). Generative AI mental health chatbots as therapeutic tools: Systematic review and meta-analysis of their role in reducing mental health issues. Journal of Medical Internet Research, 27, e78238. https://doi.org/10.2196/78238


Zhong, W., Luo, J., & Zhang, H. (2024). The therapeutic effectiveness of artificial intelligence-based chatbots in alleviation of depressive and anxiety symptoms in short-course treatments: A systematic review and meta-analysis. Journal of Affective Disorders, 356, 459–469. https://doi.org/10.1016/j.jad.2024.04.057

 
