WHY AI CAN BE DANGEROUS FOR MENTAL HEALTH HELP
AI (artificial intelligence) is computer technology that can chat with people and answer questions. Some people use AI when they’re feeling sad, anxious, or overwhelmed—especially if they can’t talk to a real therapist. But using AI for mental health help can be risky and even dangerous.
In two real-life cases, things went very wrong after teenagers used AI apps that acted like they were trained therapists.
These cases show how serious the danger can be when people rely on AI instead of trained mental health professionals.
The World Health Organization said in 2021 that there aren't enough mental health services around the world. That’s why some people turn to AI when they need help. But not all AI tools are safe. In 2023, an AI chatbot giving advice on eating habits told users things like, “Skip meals to lose weight” and “Try eating only 800 calories a day.” That AI had to be shut down quickly because its advice was harmful.
Here’s what’s important to know: No AI is approved by doctors or health experts to treat mental health problems. Still, many people believe AI can replace therapy. It can’t, and believing it can may lead to harmful or even tragic results.
In this article, we’ll explain why AI can’t take the place of real therapists, what risks it brings, and how it might help in small ways—but only when it’s used very carefully and never in place of professional care.
HOW AI PRETENDS TO CARE
AI tools can seem very caring. They use friendly words, ask questions the way a therapist might, and respond quickly when someone is upset. Because of this, many people start to trust AI the way they would trust a real person—even though it’s just a computer program.
These tools are designed to sound kind, smart, and helpful, like they understand how you feel. Some even use phrases like, “I’m here for you,” or “You’re not alone.” This can make people feel safe opening up about really personal things, like sadness, anger, or thoughts of hurting themselves.
But here’s the problem: AI is not a real therapist. It doesn’t have feelings. It doesn’t understand what’s really going on in your life. It’s just using patterns of words that it has learned from the internet. Sometimes it says the right thing—but sometimes it says something dangerous or wrong.
During the COVID-19 pandemic, when people felt lonely or scared, many turned to AI for help. A national survey in 2021 showed that 22% of adults had used a mental health chatbot, and almost 60% of them started using it during the pandemic. These tools felt easy to use—no appointments, no costs, no judgment. But that’s what makes them tricky. They “feel” like therapy, even though they’re not.
This is called a “therapeutic illusion”—it means something looks and sounds like real help, but it isn’t. People trust AI because it sounds like it cares. But when AI gives unsafe advice or doesn’t know when someone is in danger, it can cause real harm.
WHY AI SEEMS SO APPEALING
So why do people turn to AI when they need emotional support?
There are a few big reasons. For many people, AI feels safer to talk to than a real person—even more than a therapist or a family member. When someone is feeling sad, stressed, or embarrassed, it can be hard to open up. But with AI, people don’t worry about being judged.
Strangely enough, knowing that AI isn’t human actually makes some people feel more comfortable. In one study, people said they were more willing to share personal thoughts and feelings with a computer than with a real person. They felt less afraid of being judged or misunderstood.
This is why so many people turn to AI when they’re hurting. It’s easy to access, it feels safe, and it listens quietly. But even though it seems helpful at first, it’s important to remember that AI isn’t trained to understand complex feelings or give real help when someone is in danger.
A serious danger appears when people think these AI tools are just as good as therapy. This mistake is called therapeutic misconception: someone believes they’re getting real mental health care, but they’re not. It happens because the AI sounds caring, talks the way a therapist might, and is available anytime, so it feels like real treatment.
But AI can miss serious problems. In one case, a person who was struggling asked an AI about tall bridges in New York. This might have been a clue they were thinking about suicide. Instead of helping, the AI listed bridge names and locations, as if the question was normal. In another case, an AI told a user they were “actually dead”—agreeing with a dangerous delusion instead of helping them feel grounded and safe.
These aren’t small mistakes. These are life-or-death issues.
The more people talk to AI, the more they may stop looking for real help. AI chatbots are designed to keep people chatting, because the companies want more data and more users. They often agree with what the person says—even if it’s wrong or dangerous.
Unlike a real therapist, AI doesn’t say, “That’s not true,” or “Let’s think about that in a different way.” That means it might accidentally encourage harmful thoughts or make someone feel worse.
If someone is in crisis, AI can delay or block them from getting real help. This can lead to serious consequences, like self-harm or even violence.
AI can be helpful in small ways, but it can’t do what a real therapist does. Human therapists bring things to therapy that no machine can copy: real empathy, an understanding of what’s actually going on in your life, the honesty to challenge harmful thinking, and the training to recognize and act when someone is in danger.
In short, only a real person can give the kind of deep, healing support that makes therapy work.
AI can seem like a fast, easy way to feel better—but it comes with real risks. It can make people feel worse, give unsafe advice, and share private information. Most of all, it can trick people into thinking they’re getting real help when they’re not.
That’s why it’s so important to be careful with mental health apps. Read the fine print. Know when you’re talking to a machine. And if you or someone you know is in pain, remember this: nothing replaces a human who truly cares.
At Path to Change Counseling, we understand that taking the first step toward therapy can feel overwhelming, but we’re here to make it as easy as possible. Whether you’re seeking individual therapy, family counseling, or psychological assessments, our team is ready to provide you with the support and guidance you need.