Using ChatGPT as Therapy? What It Can Do, and What It Can’t

If you’ve ever typed something into ChatGPT that you haven’t been able to say out loud to anyone, something about how you’ve really been feeling or what’s actually going on, you’re not alone. A lot of people are doing this. And I don’t think it’s something to be embarrassed about.

There are real reasons why talking to an AI feels easier sometimes. It’s available at 2am. It doesn’t react. It doesn’t get tired of hearing about it. It doesn’t have its own stuff going on that makes you feel guilty for bringing yours up. For people who grew up in environments where being honest about feelings wasn’t safe, or who carry cultural pressures around asking for help, the low stakes of a chat window can feel like a genuine relief.

I want to take that seriously before I say anything else.

What AI actually does well

AI tools like ChatGPT and Claude are genuinely useful for some things that overlap with what people want from therapy:

  • They’ll listen without judgment, or at least without the version of judgment we fear from other people

  • They can help you articulate something you’ve been struggling to put into words

  • They can offer information about what anxiety looks like, what a particular therapy approach involves, what’s normal to feel after a loss

  • They can reflect your words back to you in a way that sometimes creates useful distance

  • They’re available immediately, at any hour, with no appointment and no waiting list

These are not nothing. For someone who has never talked to anyone about how they’re feeling, an AI conversation can be a genuine first step, a way of finding language for something that has only existed as a vague, nameless weight. That has value.

Where it starts to fall short

The limitation isn’t that AI is cold or unempathetic. Modern AI tools can produce responses that feel remarkably warm. The limitation is something more fundamental: what AI produces is a very sophisticated pattern-match to language. It doesn’t understand you. It doesn’t know what you’re not saying. It can’t notice that your voice changed when you mentioned your mother, or that you deflected with a joke at exactly the moment something real was about to surface.

Therapy isn’t primarily about the words exchanged. It’s about what happens in the relationship between two people over time: the experience of being genuinely seen and held by another human being. That experience is not something AI can replicate, because it requires a witness who actually exists, who is affected by what you share, who remembers and builds on it session after session, and who brings their own humanity to the room.

There’s also a clinical dimension. A trained therapist isn’t just listening and reflecting. They’re assessing, tracking patterns across sessions, noticing what isn’t being said, and adjusting their approach based on a real understanding of your history and nervous system. AI can approximate some of the surface features of that process. It cannot do the actual work.

The risks worth naming

I want to be direct about a few things that genuinely concern me as a clinician:

AI gives the feeling of support without the substance of it. This matters because the feeling of being heard can temporarily reduce the urgency we feel to actually address something. If someone is using AI conversations to feel a little better about a situation that genuinely needs attention (depression, trauma, a relationship in serious trouble), that temporary relief might be delaying real help.

AI doesn’t flag risk. A trained therapist is listening for signs of self-harm, suicidal ideation, crisis, or escalating patterns that require a different level of care. AI tools are not equipped to do this reliably, and they cannot call for help or coordinate a safety response if something serious emerges.

AI confirms rather than challenges. It’s very good at validating. It’s much less good at the kind of honest, caring challenge that often produces real change: the moment when a therapist gently reflects back a pattern you’ve been unable to see, or holds space for something you’re avoiding. Consistent validation without challenge can reinforce stuck patterns rather than shifting them.

AI has no continuity in the way that matters. Even when a chat history is saved, AI doesn’t accumulate a real understanding of you the way a therapist does over months or years. Each conversation is, in a meaningful sense, starting over.

Why people choose AI over therapy

When I think about why someone reaches for ChatGPT or Claude instead of a therapist, I don’t think the primary reason is laziness or avoidance. I think it’s usually one of a few things:

  • Cost: therapy is expensive, and not everyone has extended health benefits that cover it

  • Access: finding a therapist with availability, the right fit, and the right approach can take time

  • Fear: of being judged, of what might come up, of being a burden, of not being ‘bad enough’ to deserve it

  • Cultural pressure: for many people, especially in communities where seeking help carries stigma, AI feels less exposing

These are real barriers, not character flaws. And I think it’s worth acknowledging that for some people, in some moments, AI is genuinely the most accessible option available. That’s a problem with the mental health system’s accessibility, not a problem with the person.

A way to think about it

AI is probably best understood as a bridge, not a destination. It can be a place to begin when beginning feels impossible. It can help you find language for something you don’t yet know how to say to a person. It can be a middle-of-the-night companion when the alternative is lying alone with your thoughts.

What it’s not, and can’t be, is a substitute for the kind of sustained, human, relational work that actually changes things at the level they need to change. The research on what makes therapy effective points consistently to the therapeutic relationship as the primary mechanism of change. That relationship requires a human being on the other end of it.

If you’ve been using AI and wondering if it’s enough

The fact that you’re asking that question is probably meaningful. It suggests you’ve been getting something from it, and that you sense there’s more available.

There usually is.

A free 20-minute consultation is a low-pressure way to find out whether working with an actual therapist might offer something different. You don’t have to stop using AI. You don’t have to have it figured out. You just have to be willing to try one conversation that’s a little more real.

A note from me

I know what happens in a therapy room, or on a therapy call, that doesn’t happen anywhere else. The moment when someone finally says the thing they’ve been circling for months. The shift that happens when they realise they’ve been heard in a way they didn’t know was possible. That’s not something I’ve seen a chat window produce. And I don’t think it can.

If you’re ready to try the real version, I’m here.

Virtual, across BC, in English or Spanish. No pressure, no commitment.

Yenny Paez, RCC

Yenny Paez is a Registered Clinical Counsellor (RCC) based in British Columbia, offering virtual therapy in English and Spanish across BC. She works with people navigating anxiety, depression, life transitions, and identity. Her approach is grounded in ACT and CBT, and shaped by a belief that good therapy starts with feeling genuinely understood.
