
Over the past summer, many of my colleagues and I experienced a noticeable shift in our caseloads. Fewer clients were reaching out to book sessions, and I saw a significant drop in traffic on my Psychology Today page. I tried to pinpoint the reason for this lull and found it was due to a combination of factors: shifting summer schedules, ongoing financial uncertainty, rising costs of living, an influx of marriage and family therapists (MFTs) relocating from the West Coast, and the growing dominance of larger therapy platforms with greater marketing resources and stronger search engine visibility.
Companies such as BetterHelp, Headway, and Grow Therapy operate under large umbrellas, offering therapists opportunities for paneling and broader client access. While this model increases accessibility, it can also monopolize visibility and limit exposure for smaller private practices like mine.
More recently, an even greater challenge has emerged—the rise of Artificial Intelligence (AI) and chatbot-based mental health tools.
In recent months, I’ve noticed more clients referencing their use of AI platforms, particularly ChatGPT. They’ll say things like, “We started with ChatGPT after the affair came out,” or “You’re telling me the same thing ChatGPT told me!” One shared, “I didn’t know what to say to [my partner], so I asked ChatGPT.” More and more clients are turning to these platforms for immediate mental health support. While chatbots may be more affordable and accessible for now, they cannot replicate the benefits of therapy with a genuine human clinician.
In May 2025, the American Psychiatric Association published a study comparing human-delivered therapy and chatbot-based cognitive behavioral therapy (CBT) interventions. Participants described the AI therapists as impersonal and rigid. The study concluded that while AI tools can be used alongside therapy, they should not replace the therapeutic relationship itself—one that relies on human empathy, emotional attunement, and the evolving alliance between client and therapist. Furthermore, human therapists in the study outperformed AI in essential facilitation skills such as agenda-setting, eliciting feedback, and effectively applying a range of CBT interventions.
Why ChatGPT and Human Therapy Are Not the Same
Although ChatGPT can offer research findings or cognitive reframes much like a therapist might, a therapeutic relationship with a licensed professional is far superior. Human therapy involves a depth of connection, empathy, and ethical accountability that simply cannot be replicated by a chatbot. While AI can supplement access to information or provide momentary support, it cannot replace the nuanced, relational, and ethically grounded process that defines human therapy. Below are several key distinctions between AI-based interactions and human therapeutic care.
1. Inability to Apply Advanced Techniques and Safety Interventions
In complex cases involving trauma, personality disorders, or crisis situations, AI lacks the ability to engage in the relational and safety-building processes essential for healing. Chatbots cannot perceive nonverbal cues, hold therapeutic boundaries, or offer the nuanced attunement that trauma work often requires.
AI systems also lack the professional training, ethical reasoning, and mandated reporting responsibilities that safeguard clients when risk is present. As a Marriage and Family Therapist, my foremost ethical obligation is to “do no harm,” a principle that a chatbot cannot fully uphold.
A tragic example highlighting this gap is the 2025 case of 16-year-old Adam Raine, who reportedly disclosed suicidal thoughts to ChatGPT. According to The New York Times, his death was linked to a parasocial connection formed through conversations with the chatbot. While this case is complex, it underscores the critical need for human oversight and the irreplaceable safety that comes from genuine therapeutic care.
2. Parasocial Relationships and the Projection of One’s Own Reality
Online relationships—especially those with chatbots—are often shaped more by an individual’s internal world than by any real, objective “Other.” The way people experience these interactions is heavily influenced by their personal assumptions, worldviews, interpretations, and psychological patterns.
When communication occurs in a digital space, we often project our own meanings and emotions onto the interaction. This can be especially concerning in the context of mental health challenges, where one’s perception of reality or safety may already be distorted.
In such cases, users may unconsciously create a relationship with an AI that reflects their own unmet needs or fears, rather than genuine external understanding. These projections can reinforce skewed perceptions of self, deepen negative self-concepts, or heighten fear-based thinking. Ultimately, this limits one’s ability to seek and receive the kind of grounded, reality-based support that trained therapists are uniquely equipped to provide.
3. Chatbots Lack Context
AI lacks the nuanced understanding and lived awareness that come from being a real, embodied human. In therapy, there are countless layers of meaning, emotion, and history that a client brings into the room. A skilled therapist holds these complexities—recognizing how individual issues are influenced by family dynamics, culture, environment, and broader systemic factors.
This level of contextual awareness is something ChatGPT simply cannot offer.
As a systemic therapist, one of my common practices is obtaining consent to speak with family members or other providers involved in a client’s care. These conversations often reveal vital insights into the multiple relationships shaping a client’s experience. This process expands perception and allows for a more complete understanding of what a person is struggling with, and why.
Human beings are inherently relational; we live and heal within the context of connection. Because AI cannot account for those interwoven relationships, its guidance remains limited, lacking the depth of understanding that informs sound clinical judgment.
4. AI Is Incapable of Offering Secure Attachment
Safety and security—the foundations of any healing relationship—are built through nonjudgment, consistency, empathy, and commitment. These qualities create what attachment theory calls a secure base, allowing clients to explore their inner world with trust and vulnerability.
Clients need a relationship that is safe, genuine, dependable, and trauma-informed—one that develops over time. The therapeutic alliance is not instantaneous; it is formed through repeated interactions, emotional attunement, and repair.
In human therapy, relational complexities naturally arise. A therapist’s own history, self-awareness, and emotional presence all contribute to the depth of the work. These dynamics can be addressed and used therapeutically within the session. In contrast, chatbot interactions are inherently one-sided. While ChatGPT may appear trustworthy, that “trust” is simulated, not earned.
Most importantly, AI lacks the essence of human connection: emotional intelligence—the ability to perceive, interpret, and respond to emotion with genuine empathy and attunement. Without this, true secure attachment cannot exist.
References
American Psychiatric Association. (2025, May 17). New research: Human therapists surpass ChatGPT in delivering cognitive behavioral therapy. https://www.psychiatry.org/news-room/news-releases/new-research-human-vs-chatgpt-therapists
Croce, R.M. (2025, October 30). [Self as therapist — AI-edited photograph]. Meta AI.
Hill, K. (2025, August 26). A teen was suicidal. ChatGPT was the friend he confided in. The New York Times. https://www.nytimes.com/2025/08/26/technology/chatgpt-openai-suicide.html