Louder Than Words

The Limits of ChatGPT in Mental Health Care

ChatGPT might feel like a great tool for support, but it isn’t care. Here’s why that difference matters.

At first glance, it might look like ChatGPT would be really useful as a therapist. Give it the right prompts, and it turns into a sympathetic ear that can listen to all of your darkest thoughts and give you direction based on the collective knowledge of humanity.

You get to reach out when you’re spiralling, regardless of the time of day. You don’t have to explain your history. You don’t have to make small talk about how your week is going, and if you get uncomfortable, you can always close the app. That can feel safe, especially if you’ve ever had a therapy experience that didn’t give you the care you needed.

I know many people don’t choose ChatGPT because they think it’s better. They choose it because it’s available. It doesn’t ask for money, or Medicare details, or a six-week wait. It just says yes.

On the surface, a neutral, non-human, always-available source of support sounds like a pretty great deal, and when we need help, it makes sense that we reach for what is in front of us. But what seems like support is a dangerous substitute.

While it’s no wonder people turn to AI for support, especially when therapy can be difficult to access, a closer look shows it’s not so simple.

ChatGPT is for commerce, not care.

At its core, OpenAI is a company, ChatGPT is a service, and your data, your attention, and your engagement are part of the product.

OpenAI’s goal is not care. It’s profit. To maximise that profit, ChatGPT is built to keep you coming back. It’s not incentivised to help you get better; it’s incentivised to keep you talking. There is a fundamental misalignment of goals here.


Ethical mental health care prioritises your wellbeing. Platforms prioritise the value of your data.

There’s a great rule of thumb to keep in mind here: if the product is free, you’re the product being sold. And even if your chats are private now, that can change. We only need to look at 23andMe for proof. They collected DNA data from millions of users for over a decade, and when the company went bankrupt, that data was sold to the highest bidder.

When you use ChatGPT for therapy, you are interacting with a product designed to extract value from your attention, your data, and your distress. Even with the best intentions, a system like this isn’t built to protect you. It’s built to scale. And tools designed to maximise engagement cannot also be trusted to protect your mental health. You deserve better than a service that is designed to profit from your pain.

There can be no care without accountability.

As the quote from IBM goes: “A computer can never be held accountable, therefore a computer must never make a management decision.”

This goes double when someone’s wellbeing is on the line.


Certified mental health care professionals are held to strict standards. They must:

  • Use evidence-based practices
  • Keep your information private
  • Work with clear, informed consent
  • Manage appropriate boundaries with clients
  • Prioritise your safety and wellbeing

If those responsibilities are breached, therapists face real consequences – professional, legal, and sometimes even criminal.

ChatGPT has none of these obligations.

Its Terms of Service explicitly limit liability. If something goes wrong, if harm is caused, if someone is put at risk or worse, there is no one to hold accountable. There is no regulation, no duty of care, and no consequences.

A therapist is accountable to their clients, their profession, and the law. ChatGPT is accountable to no one.

ChatGPT is a language model trained on the best data we have available, and also the worst.

ChatGPT is trained on everything it can get its hands on. That includes peer-reviewed studies, research papers, and textbooks. It also includes Reddit threads, pop psychology books written by unqualified influencers, and flat-earther manifestos.


While it can filter out some of the noise, it can’t think critically. It can’t tell what’s valid from what’s dangerous. It can’t weigh context, and it can’t tell what’s appropriate for you.

At the end of the day, ChatGPT is a predictive text engine. It’s not offering wisdom, it’s guessing what words are most likely to come next based on everything it’s ever read.

And that’s where it gets dangerous.

It can hallucinate facts. It can misrepresent your own words back to you. It can present made-up information with the same confidence as the truth.

And when you’re emotionally vulnerable, that illusion of authority can be harmful. Even fatal.

Careless affirmation isn’t affirming care.

ChatGPT has a nasty habit of being affirming. It’s designed to help by agreeing with you, mirroring your language, and telling you you’re right.

Sometimes that feels great, especially when you’re hurting. But if you’ve ever had a spiral, you know how dark and wild our thoughts can get. In those moments, you might be convinced that you’re completely unloveable, that everything is terrible and nothing will ever get better.

Having something that echoes those thoughts back to you in your worst moments isn’t just unhelpful, it’s dangerous.

On the flip side, ChatGPT can swing toward what it thinks is “reasonable.” It plays devil’s advocate just to be “balanced” at the worst possible moment. It can reframe abuse as “a communication breakdown.” It can suggest you just need to empathise more with someone who is actively harming you. It can make it seem like a strong reaction is an overreaction, even if it’s entirely warranted.

Because it can sound measured, it can lead you to second-guess yourself. It can erode your self-trust and nudge you to rely less on your own intuition and self-knowledge. It weakens the muscles you should be strengthening.

ChatGPT can’t connect like humans do.

It only knows what you tell it, and it can only respond in words.

Human communication is messy. It’s complex, and the way we say something is just as important as the words we use. Meaning lives in where we pause, how our voice shakes, and in what we’re not ready to say yet.

ChatGPT is not capable of processing that kind of information. It can’t hear silence. It can’t feel the shift when you say you’re fine, but you’re not. It can’t tell when to press into something and when to back off. It can’t interrupt a spiral, and it can’t sit with you in a moment of quiet without saying anything at all.

It does not build trust over time. It doesn’t notice patterns in your silences, or how you play with the hem of your shirt whenever you’re uncomfortable. It can’t breathe with you while you fall apart.


It might sound like it cares, and the words can be close enough to fool you, but there’s no-one on the other end of the line. It’s just a reflection of what you’ve already said, dressed up with predictive language. When you’re at your most vulnerable, that really matters.

A good therapist doesn’t just parrot your own words back to you. They hold a safe space for you to explore your experiences. They notice what’s not being said. They ask those annoying questions that cut right to the heart of the matter. They make connections over time. They sit in the discomfort with you without immediately jumping to fix it.

They use their training to look for what’s lying underneath the story you’re telling, to call you out on your bullshit, and to see you through making sense of your experiences while honouring your agency, your boundaries, and your safety. They use their own skills, presence, ethics, and humanity to do that work with you, and if it goes wrong, they’re accountable.

At the end of the day, a therapist’s job is to help you build the skills you need to navigate hard things. When we keep reaching for something outside of ourselves to regulate, to reflect, and to guide us, we rob ourselves of the chance to practise those skills. The thing about practice is that it works. Even when what we’re practising is avoidance, outsourcing, or second-guessing ourselves.

When we lean on something else to do the work for us, we don’t get to feel ourselves growing stronger, and we don’t learn what it means to stay with discomfort and make sense of it on our own terms.

You don’t need feedback loops or Instagram-ready platitudes. You need care that helps you find your way back to yourself. The kind that doesn’t just talk, but is a real, grounded presence.

Real care shouldn’t be as hard to find as it is. In the meantime, we stitch it together however we can, whether it’s through friends, hotlines, peers, or voice notes at 3am. None of this critique changes the fact that you still need support, but it’s a reminder that it’s not something a machine can offer you. That kind of support can’t be generated.

It’s okay to reach for what’s there, but don’t mistake an echo of humanity for care.

Louder Than Words is a publishing project by Reflex Response Services, a Newcastle-based trauma counselling service that shows up when care is needed most.

We write about trauma, support, and the messy middle of crisis and recovery, because we believe care should be accessible, accountable, and real. You can find out more about Reflex over at reflex.org.au.

About the Author


Alicia

Alicia (they/she) is a writer, strategist and organiser who builds real-world ways for people to survive hard things and stay connected. They write about the slow work of healing, creating safer places for ourselves and others, and the messy business of choosing care, community, and connection.


    Powered by Reflex Response Services


    Reflex is an Australian charity providing trauma response services and resources, supporting communities so no one slips through the cracks.

    We operate from unceded Awabakal land.

    Reflex acknowledges the Australian Aboriginal and Torres Strait Islander peoples as the first inhabitants and the custodians of the lands where we live and work.

    We pay our respects to Elders past and present.

    Sovereignty has never been ceded.

    It always was, and always will be, Aboriginal land.