AI Can Talk Like a Therapist. But It’s Missing Something Huge.

Author: Greg Phelps, LPC-Associate in Private Practice
What if the therapist you talk to every week… were just a machine? For more and more people, that’s becoming reality.
People are turning to AI for therapy, and some even claim they get more out of ChatGPT than they do out of a therapist. And as a therapist myself, I have some concerns. Not about job security, but about what gets lost when we replace human connection with a machine. That claim reveals a fundamental misunderstanding about what good therapy actually does.
Therapy is not about getting advice or venting. It’s a relationship, first and foremost. And AI, for all its convenience, is missing some of the most important parts of human relationships.
In this article, I will argue that AI’s greatest strengths as a “therapist” are precisely what make it insufficient as one.
Boundaries
AI is always available. You can talk to it anytime, anywhere. A therapist, on the other hand, has limits. You have to wait until your next session. Sessions have a set length. As frustrating as this can be, learning to tolerate the discomfort of that isolation is part of the healing process.
Many people have dysfunctional relationships in which they experience being neglected and forgotten or, at the other extreme, micromanaged and overwhelmed. The boundaries of the therapeutic frame can teach you how to sit with emotions, how to trust that someone will be there when they said they would be, and how to accept that someone still cares about you even when they aren’t available 24/7. That frame can also help you learn how to be vulnerable in real time, how to speak honestly with someone who might get it wrong, and how to forgive anyway.
A properly trained therapist can model a healthy, boundaried relationship. Your therapist has limited time, energy, and resources, and so does every other person in your life. AI is always there, and therefore does not model a real human relationship with all its limitations.
Rupture and Repair
In real relationships, misunderstandings happen. Conflict is inevitable. Your therapist might misattune to you, or say something that doesn’t land. But in therapy, that’s not the end of the story—you talk about it. You can work through it and attempt to repair the rupture. This often provides a corrective experience for clients. It can help a person learn how to speak up for themselves when they are misunderstood, how to notice and manage difficult emotions, how to experience conflict and then repair it. Conflict can be reframed as a necessary and healthy part of relationships that actually deepens your bond when done properly.
AI? It never gets it wrong (and if it does, you don’t care, because it’s a machine). It never needs repair, just a quick correction. And that might feel safer, but it’s also missing a core part of the corrective experience of real human relationships.
What makes AI feel therapeutic—its instant access, perfect responses, and nonjudgmental tone—is exactly what keeps it from offering the kind of growth that real therapy can bring.
Mortality
AI isn’t human. It doesn’t age, it doesn’t die, it doesn’t actually know what it’s like to lose someone. It can generate deep-sounding responses about grief, but it’ll never feel it. Therapy is often about confronting life’s biggest realities with another person who can offer compassion and support. Sometimes the therapeutic alliance serves as scaffolding for a client who is learning how to build and maintain those kinds of supportive relationships outside of therapy with other humans.
Healing often happens through co-regulation: your nervous system responding to another real human’s presence, voice, and cues. AI can’t breathe with you. It can’t respond to your tears in real time, or feel your silence.
AI offering you simulated support is not a corrective experience. It can mimic one, but it’s not the same as sitting with a real-life person. It can only simulate grappling with the existential concerns of freedom, responsibility, isolation, and death. Its words may be comforting, but it cannot look you in the eyes as an equal and appreciate your temporary existence the way a real human can.
AI’s strengths as a therapeutic tool
I’m not saying AI is useless for therapeutic purposes. It can be a decent supplement, and perhaps even a great auxiliary tool when used correctly. Personally, I find it great for augmenting my own journaling process, and it has often led to genuine insights and personal growth. In that sense, I encourage people to use AI tactically to improve their lives.
Has AI ever helped me with relationship problems? Actually, yes, though not in the same way therapy helped. In a sense, AI can replace some of the things a therapist does: pointing to information and resources, reframing cognitive distortions, giving encouragement, and more. But some of the most important work I’ve done in therapy, like learning to tolerate discomfort in relationships, standing up for myself when I feel misunderstood, and voicing my feelings when I am angry with someone, is not something AI can currently replicate.
A Case Study
I’ll leave you with this case study:
Imagine you are in therapy. You are starting to grow frustrated with your therapist because you don’t think it’s helping fast enough. You start talking to ChatGPT between sessions and you’re blown away by the ideas it provides you. Suddenly you are learning about attachment theory, and it’s providing you with helpful questions about your childhood and how certain patterns started when you were young.
You start journaling directly into ChatGPT. You realize you resent your parents for not providing enough structure or guidance in your childhood. AI has helped you see that you’ve been carrying that resentment your whole life, and that it has expanded into a resentment of authority figures in general.
You are blown away by how much you’re learning about yourself, so you terminate therapy. You’ve got a personal pocket therapist who is available any time, for free.
But look closer. You might be acting out the exact pattern ChatGPT helped you start to uncover. Your resentment of authority figures has turned into an avoidance of repairing relationships. You expect people to be nearly perfect, and you are hyperfocused on signs of invalidation and flaws. You turn away from relationships at the first glimpse of someone’s human imperfection. Because of your wounds, you do not have a healthy tolerance for disconnection in adult relationships, and you do not have the felt experience of someone working with you to repair it. And now you’ve fired a real-life human who might have helped you grapple with these issues and provided a corrective experience.
If only you had been able to tell your therapist you felt you weren’t getting enough out of it, and open the door to the real vulnerability required in real human relationships.
Conclusion
AI can be an incredible mirror. But only humans can walk beside you, messy and mortal, offering presence instead of perfection. And sometimes, it’s in the imperfection that healing actually begins.
If you’re interested in talking to a human therapist, reach out today!