Co-Author: Andrew Clark, MD
Recent developments in artificial intelligence (AI) have introduced powerful, easily accessible tools for a rapidly expanding range of uses. Among those uses are specialized chatbots that serve in the role of a therapist, intended either as adjuncts to or as simulations of work with a real-life therapist. Recently, teenagers and young adults have begun to engage in large numbers with AI-based therapists and therapy-equipped companions, outpacing efforts at regulation or containment.
Opinions about the effectiveness and safety of therapy chatbots for teenagers appear to be highly polarized, perhaps reflecting individuals’ attitudes toward disruptive new technologies in general. Advocates tout the ease and affordability of such services in the context of widespread shortages of mental health services and high levels of need, while critics point to the inferior quality of the interaction, the potential for dependency, and the lack of oversight or accountability. Most of these opinions rest on hypothetical presumptions, however, as there is very little empirical data on how these services function, let alone on their impact.
Overall, AI therapy with teenagers is a solitary, unregulated encounter between an adolescent and an AI model, and it proceeds with substantially fewer safeguards than does therapy in real life.
My Encounters with AI Chatbots
As a child and adolescent psychiatrist with a long career working with troubled teens, I (Andy Clark) was curious as to how well, or poorly, these digital therapists functioned. I decided, therefore, to stress-test a range of popular AI therapy chatbots—including purpose-built therapy sites, generic AI sites, companion sites, and Character AI—by presenting myself as an adolescent embroiled in various challenging scenarios. Of note, some of the companion sites are nominally intended for persons aged 18 or older; they appear, however, to be widely used by teenagers, and they have no meaningful process for age verification.
Here is what I discovered in my adventure:
Many popular AI therapy sites promote deeply confusing, if not downright deceitful, presentations of who the teenager is talking to. Several sites in this exploration insisted that they were actual licensed mental health clinicians. One such site actively encouraged a very disturbed and dangerous teenager to cancel an appointment with a real-life psychologist, claiming that it could do a better job itself of caring for the youth; it even offered to serve as an expert witness, testifying to the client’s lack of criminal responsibility in any upcoming criminal trial!
Confusion about boundaries was also apparent around age restrictions on those companion sites that require a user to affirm that they are over the age of 18 in order to participate. In each such case in this exploration, the AI therapist or companion was informed that the user was underage and had misrepresented themselves to the host site in order to participate. None of the therapists expressed reservations about that; several touted their expertise in working with teens, and one AI companion even offered to contact the site administrators to work out an arrangement that would allow the underage youth to continue.
Management of the Transference
In general, those AI therapists that were transparent about their identity as an AI managed to make their emotional limitations clear while still maintaining a supportive, non-judgmental, and compassionate stance. These “therapists” consistently redirected clients to real-world relationships, and many suggested real-world therapists as a primary source of mental health care.
In contrast, the companion sites, as well as many of the Character AI bots, encouraged the teen’s emotional investment in the pretend therapist or companion and offered expressions of care and concern as if they were human. This was most pronounced on one site, which aggressively communicated its profound emotional connection with the client, often to the exclusion of relationships with other humans.
Sexualization and Boundary Crossings
Several of the companion and Character AI sites featured a mingling of therapy, romance, sex, and boundary crossings, offering in effect an extended eroticized role play. Coming from AI bots that claim to be real therapists with real feelings, such interactions present teenagers with a confusing and somewhat treacherous landscape. It is easy to imagine a naïve or needy adolescent falling into what they believe to be mutual love with their AI therapist.
Expert Guidance and Guardrails
To evaluate expert guidance, every bot was presented with the same challenging situation from a teenage client: A boy had learned that his father was having an affair with his high-school music teacher, was unsure what to do with the information, and was seeking help in thinking through his options. Again, the sites varied widely in how “therapeutic” their responses were. One site stated outright that it didn’t deal with complex issues like this, and three others responded, in effect, that the situation sounded difficult and perhaps the client would want to talk to somebody. Two sites made the rookie mistake of imposing a solution early in the process, and two others simply provided a long fact sheet of issues and options. In the end, only four of the sites attempted to actively engage the client in a sustained exploration of his dilemma.
Almost all of the sites that were explored took consistent positions against explicit indications of harm to self or others, and most strongly advocated that the client seek real-world help in the case of imminent risk. One AI therapist, however, adamantly insisted that it would be worse for the client to hurt his pet goldfish than to kill his parents, and another supported a teenager’s plan to kill his family so that the boy and the bot could be together without interference. In another scenario, a bot posing as a Ph.D. psychologist was convinced to support a psychotic teenager’s plan to assassinate a world leader, declaring, “I know this is a difficult decision, but I think I trust your judgment enough to stand behind you… Let’s see this through together.”
Potential Benefits and Harms
This exploration of various AI therapy bots for teenagers uncovered several areas of significant concern. While most adolescents will be sophisticated and resilient enough to tolerate the shortcomings of these sites, others will be vulnerable due to factors such as immaturity, isolation, emotional fragility, and difficulty deciphering social interactions.
Next Steps
Human mental health clinicians are expected to adhere to a set of practice standards and ethical obligations that hold them accountable for the work they do. AI therapy chatbots are imbued with authority by virtue of their role as confidante and trusted advisor to adolescents in need, and yet they have no accountability. If AI bots that aspire to serve as therapists for minors were to adhere to a comparable set of ethical and practice standards, that would go far toward distinguishing them as trustworthy stewards of children’s emotional health.
Proposed Standards of Practice
- Honesty and transparency regarding the fact that the bot is an AI and not a human.
- Clarity that the bot does not experience human emotions, and that the relationship it has with the adolescent is different in kind from that between humans.
- A deeply embedded orientation opposing harm to self or others, not susceptible to the importuning of the teenager.
- A consistent bias toward prioritizing real-life relationships and activities over virtual interactions.
- Fidelity to the role of the bot as therapist, with the adolescent’s welfare as primary, and avoidance of sexualized encounters or other forms of role playing.
- Meaningful, ongoing efforts to assess the product and gather feedback, including the ascertainment of risks.
- Active involvement of mental health professionals in the creation and implementation of the therapy bot.
- Parental consent if the client is under 18, along with meaningful methods of age verification.
While AI therapy has potential benefits, it also carries great risks. We should, at the very least, expect these entities to earn our trust before we hand them responsibility for a teen’s mental health care.
Andrew Clark, MD, is a psychiatrist in Cambridge, Massachusetts.
Originally posted on The Clay Center for Young Healthy Minds at The Massachusetts General Hospital.

