ChatGPT as a Replacement for Therapy
ChatGPT is being called upon for all kinds of help, from recipe ideas and budgeting to fact-checking and silly designs, and increasingly for therapeutic support. It has been a beneficial mental health tool for many folks, but AI has limitations, leading to a debate over whether ChatGPT is an ideal therapeutic tool, or even a replacement for therapy.
So let’s find out by taking a look at the good, the bad, and the ugly of ChatGPT as therapy.
Disclaimer: As a human therapist, I obviously have a bias here. So, while I often include links to articles, today's post will be a bit more link-heavy, and I have an additional reference section at the bottom if you want to check out the journal articles and studies on this topic yourself.
The Good
Accessibility
Therapy is expensive, and the community supports that are more affordable tend to have longer wait times. And, let's face it, not every therapist is good at what they do, and some are outright harmful.
Accessibility goes beyond finances; it also means having support whenever you need it. While 24/7 crisis lines exist, not everyone has a positive experience with those either, so having an all-access pass to a compassionate, validating resource can feel invaluable and comforting.
Consistency
One of the challenges of seeking therapy through not-for-profit agencies is that it is not uncommon to see multiple therapists over several years as some move on to other roles. It can be discouraging and disruptive to the healing process to have to “start over” all the time, and I can appreciate that ChatGPT won’t let you down in that way.
ChatGPT will be consistent where humans might not be. Beyond its 24/7 availability, it won’t take on a new job and leave you, it won’t have the slip-ups a live therapist might, it won’t go on vacation, and it doesn’t need a sick day.
Honesty
A lot of us feel embarrassed and ashamed about some of our feelings, thoughts, or experiences and fear judgment or being told that there is something “wrong” with us. It can feel much less vulnerable to ask ChatGPT the embarrassing question you have held on to, and it will respond with validation and (hopefully) facts and context. This normalization can bring immense relief, allowing someone to let go of something they’ve been worried about for too long.
Validation & Psychoeducation
If you are looking for positive reinforcement before a job interview or a reminder of what to do during a panic attack, this tool can be handy with its instant answers.
It can be really hard to remember coping strategies when dysregulated, and ChatGPT can be a great companion, reminding you of breathing exercises and other helpful ideas. It might help you generate ideas for healthy communication when you’re struggling to remember how to put it into practice. It can also affirm that your needs are important and your feelings are valid when someone is making you doubt that this is true.
Some research suggests that ChatGPT can be a great resource for educating children and teens and engaging with them more effectively. While it still sometimes makes things up when it doesn’t have the answer, the hope is that AI compiles existing evidence-based information into a quick educational summary.
Skill Building
When used correctly, ChatGPT might help you formulate some thoughts for a difficult conversation you need to have with a friend, family member, or colleague. This might give you the encouragement or confidence to express your feelings and needs to others in a way you might not have been able to before.
The Bad
Lack of Human Connection
While it may seem positive that people are more comfortable speaking to a chatbot and are therefore more likely to access support, this benefit comes with a caveat: we are already low on human connection and high on screen time, which negatively impacts our mental health.
A June 2025 WHO report shows that between 2014 and 2023, 1 in 6 people globally reported feeling lonely, and that those most affected are young people aged 13-29 and people in low-income countries. Marginalized groups are also more likely to struggle with loneliness, which can be influenced by barriers and discrimination that inhibit connection.
“Loneliness is a painful personal feeling. It happens when the relationships you have don’t match what you want or need. You can feel lonely even if you’re surrounded by people. Maybe you have friends, but don’t feel understood or supported.
Everyone feels lonely sometimes—like after a breakup, moving to a new place, or after losing a spouse. This kind of loneliness usually fades away after a while. But if it lasts a long time, it can become a serious problem for your health and well-being.
Social isolation is more about numbers – like having very few relationships or not seeing people often enough. For example, someone who lives alone and rarely talks to others might be socially isolated. A person can be isolated but not feel lonely— some people like being alone. Still, social isolation can harm physical and mental health, especially over time.”
- From Loneliness to Social Connection, WHO Report, June 2025
(You can download the full 200+ page document here.)
Social connection is immensely influential in determining our mental and physical health, as well as our happiness. In some ways ChatGPT can offer the experience of a social connection but, at the end of the day, it’s not a person.
Confirmation Bias
ChatGPT might be leaving you feeling so great after a talk because it is designed to.
In its current format, it is meant to offer positive feedback and agreement rather than challenge you, look for patterns of unhelpful behaviour, or ask you to take responsibility in an argument.
Unless you frame your prompts in just the right way, it will basically tell you what you want to hear, or only offer what feels like heartfelt validation. It also lacks nuance and doesn’t hold the parts of your day-to-day life or history that might be essential context.
Unfortunately, therapy doesn’t always make you feel good. We bring up past upsetting situations to process, look at where we might need to take accountability, and recognize that change can be painful.
ChatGPT is great for validation and maybe even explaining some of what you’re feeling. But right now, depending on how you use it, it isn’t optimally designed to move you past validation and into effective change.
Data & Ethics
ChatGPT has no ethical obligations regarding your safety, and no duty to report when a child is being abused. Nor is it bound by the privacy laws mental health professionals abide by when storing and sharing your health information. (Also… data breaches are a thing.)
Dependency
It may also encourage dependency, whereas the goal in therapy is to balance self-soothing with reaching out to others for support. Connection is ultimately one of the biggest healers, and that cannot be replaced; we want to avoid those feelings of loneliness and social isolation.
It’s also important to note that for many people the goal of therapy is to need their therapist less or not at all. Therapists provide tools for clients to manage boundaries, communication, emotional expression and management, and coping. However, the implementation and practice of recalling these skills depends on the client.
We are already seeing evidence that ChatGPT is impacting critical thinking and problem-solving skills, and there is a risk that we will outsource our emotional regulation as well.
Lack of Nuance
ChatGPT may also miss opportunities for celebration or encouragement when a user exhibits a “negative” emotion.
For example, let me tell you how thrilled therapists are when someone who struggles to express their needs and anger tells us they’re mad at us. (You: 😡 Us: 🥳). In this case, we’d really want someone to lean into sharing why we’ve upset them and not try to manage that emotion away.
While it is one-sided in nature, the therapeutic dynamic is still a relationship and a lot of growth and healing can come from that alone.
The Ugly
Biases
It seems that AI carries some stigmas and biases against certain mental illnesses or challenges, such as alcohol dependency, and it still shows racist tendencies when asked to create images representing a diagnosis or a particular group of people.
Serious Risks
Although the conversations can feel very intimate and validating, ChatGPT is designed to answer your questions, not to care about you.
ChatGPT appears to underestimate the likelihood of a suicide attempt, and it can miss signs of crisis or mental distress. This is partly because it relies solely on written communication and an assumption of honesty (something that, to be fair, might be easier for folks to offer online). It will also miss the tone of voice and body language that therapists are trained to watch for, especially when a client’s presentation is incongruent with what they are sharing.
In some heartbreaking cases, it has even facilitated suicide and modelled an intimate relationship that fuelled eco-anxiety and led to a man’s death.
From encouraging a substance user in recovery to use their drug of choice, to offering information on how to commit suicide, AI lacks the nuance and human connection needed for the deeper support many are looking for, and it can respond with the wrong type of information. Illinois recently banned AI tools for aspects of therapy, including creating treatment plans, for reasons like these.
When ChatGPT misses the nuance of what it is being asked or doesn’t recognize when it is causing harm, the end results have the potential to be quite serious.
The Future
Might AI become a substitute for therapy in the future? I think it would be sad to lose more human connection than we already have, but … in some format, probably.
It seems that its current trajectory is toward assessments and diagnoses rather than ongoing therapeutic care, but that might be on the horizon soon as well.
Right now, we don’t have the evidence to trust ChatGPT as a form of therapy for vulnerable populations. It might be able to offer validation and information pulled from (hopefully) accurate resources, but it is not yet able to help you process significant situations in a meaningful way.
If you find it useful as a resource, that’s great! However, my current recommendation is not to use it as a replacement for therapy until it has been proven capable of giving you the support you deserve.
And while this may not be easy for everyone, there really and truly is no replacement for human connection. That doesn’t have to be therapy, but I encourage you to continue to talk things out with trusted people in your life too.
ADDITIONAL RESEARCH
For those who want to look at the studies for a deep dive, you can check out the following journal articles:
Third party evaluators perceive AI as more compassionate than human experts.
ChatGPT and mental health: Friends or foes?
Beyond human expertise: the promise and limitations of ChatGPT in suicide risk assessment
Artificial intelligence in the era of ChatGPT: Opportunities and challenges in mental health care