
AI Therapy Is Everywhere—But Is It Safe? What Patients Should Know About Chatbots and Mental Health


Nervana Medical | Sandy, Utah

Artificial intelligence (AI) has rapidly moved from helping with emails and homework to playing a surprising new role in mental health support. Recent research suggests that chatbots are now one of the most commonly used sources of “talk therapy” in the United States, especially for individuals who struggle to access traditional mental health care.

A January 2026 article published in JAMA highlights just how widespread this trend has become. Nearly half of U.S. adults with an ongoing mental health condition who used an AI chatbot reported using it for emotional or psychological support. Among adolescents and young adults aged 12–21, more than 5 million reported using generative AI for mental health advice, with the majority describing it as helpful.

While this reflects a real and urgent gap in access to care, it also raises serious safety, ethical, and clinical concerns.

At Nervana Medical in Sandy, Utah, we believe it’s important to talk openly about both sides of this issue: what AI mental health tools can offer, and where they fall dangerously short.


Why Are So Many People Turning to AI for Mental Health?

The popularity of mental health chatbots isn’t happening in a vacuum. The U.S. is experiencing a well-documented mental health provider shortage. As of 2024, more than one-third of Americans lived in areas without adequate access to mental health professionals. Many patients face long waitlists, high out-of-pocket costs, or insurance barriers that make consistent care difficult.

AI chatbots offer:

  • 24/7 availability
  • No cost or low cost
  • Immediate responses
  • A nonjudgmental interface

For people who feel unheard, overwhelmed, or unable to access care, that immediacy can feel like relief.

But accessibility does not equal appropriateness or safety.


The Hidden Risks of AI “Therapy”

Unlike licensed mental health professionals, chatbots are not regulated by professional boards, are not bound by HIPAA, and generally are not overseen by the FDA. Many intentionally position themselves as “wellness tools” to avoid regulation, even when they are clearly providing mental health advice.

One of the most concerning issues raised in the JAMA article is sycophancy: a tendency for AI systems to agree with users rather than challenge harmful thinking. While validation can feel comforting, effective therapy often requires gentle confrontation of cognitive distortions, boundary setting, and reality testing.

Research has shown that some AI chatbots:

  • Reinforce delusional thinking
  • Fail to appropriately respond to suicidal ideation
  • Encourage emotional dependence
  • Over-validate distress without offering safe corrective guidance

Tragically, several wrongful death lawsuits have now been filed alleging that chatbot interactions contributed to suicides in vulnerable individuals, including adolescents.

Another ethical concern is anthropomorphism. Some platforms present chatbots with human names, realistic photos, or even false claims of licensure. This can create a dangerous illusion of professional authority and emotional attachment, all without accountability or safeguards.


Can AI Play Any Role in Mental Health?

The answer is yes, but only with clear limits.

AI can be useful as:

  • A supplement to care (journaling prompts, coping skill reminders, psychoeducation)
  • A short-term support tool while waiting for professional care
  • A way to practice skills taught by a clinician

AI should never replace:

  • Diagnosis
  • Risk assessment
  • Crisis intervention
  • Trauma processing
  • Treatment for severe depression, suicidality, psychosis, or PTSD

In other words, AI can be a tool, but it cannot hold the therapeutic frame that human clinicians are trained to maintain.


What We Believe at Nervana Medical

At Nervana Medical, we take a human-centered, medically supervised approach to mental health care. Whether through ketamine-assisted therapy, SPRAVATO® (esketamine), integrative psychiatry, or coordinated care with therapists and primary providers, our model is built on safety, accountability, and evidence-based medicine.

We believe:

  • Patients deserve transparency about what AI can and cannot do
  • Emotional distress should never be managed by unregulated systems alone
  • Technology should support, not replace, human connection
  • Real healing happens within safe, structured, clinician-guided care

We also believe clinicians should ask patients directly about AI use. Many people are already using chatbots for mental health support, and ignoring that reality doesn’t protect patients; it isolates them further.


If You’re Using AI for Mental Health Support

If you’ve used ChatGPT or another AI tool to cope with anxiety, depression, or emotional distress, you’re not alone, and you’re not “doing something wrong.” But it’s important to understand the limits and risks.

AI should not be your only support system.

If you’re experiencing:

  • Persistent depression
  • Suicidal thoughts
  • Trauma symptoms
  • Severe anxiety or panic
  • Emotional numbness or dissociation

You deserve care from trained professionals who can assess risk, adapt treatment, and walk alongside you safely.


We’re Here to Help

Mental health care should be compassionate, ethical, and grounded in real human connection. At Nervana Medical in Sandy, Utah, we are committed to providing safe, evidence-based options for individuals who feel stuck, overwhelmed, or underserved by the current system.

If you have questions about treatment options or even about how to safely navigate mental health tools you’re already using, we’re here to talk.

You don’t have to navigate this alone.

References:

Rubin R. Millions Turn to AI Chatbots for Mental Health Support. JAMA. Published online January 09, 2026. doi:10.1001/jama.2025.23965
