Key Takeaways (TL;DR):
- Benefits: AI chatbots are helpful add-ons for learning coping skills, managing mild stress, and getting 24/7 mental health education.
- Risks: Bots lack human empathy and cannot safely handle mental health crises or complex trauma.
- Privacy Gap: Most AI tools are not HIPAA-protected, meaning your data may not be as private as it is with a human psychotherapist.
- Bottom Line: AI is a useful helper for mental health information and skill-building, but it cannot replace the safety and expertise of a licensed human therapist.
In the United States and around the world, the need for mental health care is growing, and many people’s needs go unmet for a variety of reasons.
Mental health in the US: Increasing need and limited resources
Mental Health America’s “The State of Mental Health in America 2025” report reveals that:
- In 2024, almost 1 in 4 U.S. adults experienced a mental illness (also known as a mental health disorder).
- In 2022 and 2023, 1 in 4 adults with mental illness were unable to access treatment.
- Across the U.S., there is one mental health provider for every 320 residents.
- A shortage of licensed mental health providers, along with cost, location, and stigma, is among the many reasons that mental illness often goes untreated.
This is where AI-generated mental health support comes in.
Benefits of using AI for therapy
While there are plenty of reasons for caution, there are some encouraging findings as researchers begin to study the potential benefits of using AI alongside therapy.
Despite many clearly identified risks, the accessibility of AI and our increasingly technology-driven society suggest that AI could play a meaningful role in mental health treatment in the coming years.
According to the American Psychological Association (APA), “The responsible development of AI technology is critical for public well-being. By ensuring these powerful consumer products are designed with safety and psychological science in mind, we can help realize their benefits while mitigating their risks.”
Some ways that AI can be helpful to individuals needing mental health support are:
- Providing education about mental health problems
- Teaching skills to reduce stress and anxiety
- Recommending treatment options for specific mental health concerns
The risks of AI-assisted therapy
The escalating need for mental health care in recent years has led more people to seek advice and support from AI platforms because they’re low-cost and relatively easy to access.
While some studies have shown that mental wellness apps and other forms of AI chatbots can offer some support to people managing mental health problems, these resources are limited at best and potentially quite dangerous. At present, they are poorly regulated and lack guardrails and universal safety standards.
Arthur Evans Jr., PhD, CEO of the American Psychological Association, explained, “We are in the midst of a major mental health crisis that requires systemic solutions, not just technological stopgaps. While chatbots seem readily available to offer users support and validation, the ability of these tools to safely guide someone experiencing a crisis is limited and unpredictable.”
How AI chatbots compare to traditional therapy
In one study cited by the APA, AI-based therapy did not match the effectiveness of human-delivered treatment, though it may serve as a useful adjunct to therapy.
This study, which looked specifically at ChatGPT-3.5 in comparison to human-delivered psychotherapy, found that ChatGPT “shows potential” though it lacks the “nuanced empathy and therapeutic alliance that characterize effective human therapy.”
Stanford University researchers similarly found that AI’s inability to detect the subtext of users’ prompts can lead to dangerous outcomes, including suicide.
These dangers, along with the absence of the interpersonal relationship that is central to traditional human-delivered psychotherapy, leave many professionals concerned about patients using chatbots for mental health support.
What types of patients can benefit from AI chatbot therapy
AI has the potential to provide useful assistance to licensed therapists, and to increase access to mental health resources for many patients.
AI tools are not recommended for people in crisis or those with serious mental health disorders, for whom support from a licensed mental health provider is essential. But for individuals looking for coaching, tools, and education on mental health, a chatbot may be a useful resource.
The safety and effectiveness of AI chatbots
Although many people have turned to AI chatbots and mental health apps for emotional support, skill building, and other mental health needs, evidence to confirm the actual effectiveness and safety of these tools remains inadequate. AI cannot replace the professional training and skills of a human therapist.
Moreover, AI technologies are being developed at a rapid pace, and our ability to fully understand their capabilities has not caught up. As a result, reports of significant harm, particularly to adolescents and other vulnerable populations, are mounting.
Dr. Evans warned, “For some, this can be life threatening, underscoring the need for psychologists and psychological science to be involved at every stage of the [technology] development process.”
Specifically, the APA has identified safety concerns, including worsening mental health symptoms and self-harm, resulting from individuals using chatbots for therapy.
In a paper on mental health apps, The Psychotherapy Action Network wrote, “We liken the situation to a hypothetical antibiotic shortage. Imagine if the response were merely to sell diluted antibiotics or untested remedies. That’s what we have today. Too many mental health technology offerings are either watered down versions of safe, effective treatments or some form of digital snake oil.”
It’s also important to keep in mind that AI chatbots are not licensed therapists, and many are designed to keep users engaged so that the companies behind them can profit from user data.
Privacy concerns related to AI therapy
While the FDA has authorized many AI-enabled medical devices in recent years, none of them addresses patients’ psychological health. In fact, there is currently no federal framework designed to regulate AI therapy chatbots, despite a long list of concerns about individuals’ safety and privacy.
Some states have begun to address this gap by passing laws that govern AI in mental and behavioral health interventions. Nevada and Illinois have banned the use of AI for mental health treatment, and Utah has adopted disclosure and data-protection requirements.
However, some of these regulations target AI therapy-specific chatbots and do not cover general-purpose chatbots, like ChatGPT.
HIPAA and AI chatbots
HIPAA (the Health Insurance Portability and Accountability Act) protects a patient’s personal health information, yet its provisions apply only to “covered entities.” These include health plans and healthcare providers, such as psychotherapists, which are required by law to protect patient privacy.
AI chatbots are not covered entities at this time, and despite the privacy assurances in their terms of service, no oversight or transparency currently exists to ensure the confidentiality of users’ conversations and data.
So, while some states are working to establish laws, policies, and requirements to ensure patient safety and privacy, these efforts are still in the early stages.
Additionally, the Associated Press reported in September that “Regulators struggle to keep up with the fast-moving and complicated landscape of AI therapy apps.”
Experts continue to raise questions about the safety and efficacy of AI in mental health care, while policymakers express the need for clear standards regarding transparency, data management, and consumer safety.
The bottom line on therapy chatbots
As technology evolves, we will see what AI can offer patients in the future and how it can safely and effectively support their mental health care. For now, though, many valid concerns and limitations remain around patient safety and privacy, and experts caution the public against relying on AI chatbots alone for psychological therapy.
Technology is changing many aspects of how we live and how we care for our health. These changes come quickly; some may prove beneficial, while others require careful study and cautious use.
If you have concerns about your mental health, WWMG’s Psychology department is here to help you navigate the challenges you face, and our psychologists are open to discussing how technology can be part of your journey to mental wellness.
Request an in-clinic or telehealth appointment with WWMG Psychology today by calling (425) 259-1366.
If you are in crisis
If you or a loved one is in crisis, call or text 9-8-8 or call the Crisis Services Line at (800) 584-3578. This service is available 24/7/365.
If you’re in immediate danger, call 911.
