On Mental Health Apps, Generative AI And Children

Apr 21, 2025

Should we be mixing AI with children’s mental health? A recently published Dartmouth randomized controlled trial (RCT) investigated the efficacy of a generative AI-powered therapist chatbot for treating major depressive disorder (MDD), generalized anxiety disorder, or eating disorders in adults.

I’m not a mental health professional. And I’m not an engineer. But as a serial entrepreneur, I recognize mission-based opportunities at the junction of technology and vertical industries such as mental health – and, at the same time, the red flags. In this case, the mission is getting more child therapists out into the world through top-tier training, and the opportunity is scaling that training with technology, including AI.

Seven years ago, serendipity led me to the CEO role at a global behavioral health training organization founded by the creator of the evidence-based model Dialectical Behavior Therapy (DBT), which in turn led to co-founding a behavioral health training center with Dr. Francheska Perepletchikova, creator of the DBT for children model (DBT-C). The two-second definition of DBT: individual strategies and tools for finding one’s way out of extreme emotion dysregulation and suicidal ideation/self-harm by first accepting that you can hold two opposing truths at the same time. That’s the dialectic. For instance, you can simultaneously want to end your life and yet desire to live a full one.

The standard DBT model was developed for clinicians’ use with adults, and the training consists of two weeks of didactics in two parts, with six months of practice and consultation in between. Clinicians also make a long-term commitment to join a DBT clinical team, where team members support each other in their work with clients. In other words, becoming a DBT therapist is a commitment that mirrors the level of dedication clients require to guide them out of their dark places. Aside from fostering clients’ self-awareness of their emotions and the factors that drive them, therapists teach clients to regulate extreme emotions using DBT skills such as Distress Tolerance and Opposite Action. This brief introduction is not even a CliffsNotes version of standard DBT; for a deeper, professional dive, you can find thousands of clinical studies worldwide quantifying the positive outcomes.

Enter the Dartmouth study, which “conducted the first-ever clinical trial of a generative AI-powered therapy chatbot and found that the software resulted in significant improvements in participants’ symptoms, according to results published March 27 in NEJM AI. People in the study also reported they could trust and communicate with the system, known as Therabot, to a degree that is comparable to working with a mental health professional.” Now here’s the red flag: “We did not expect that people would almost treat the software like a friend. It says to me that they were actually forming relationships with Therabot,” says Nicholas Jacobson, the study’s senior author. “My sense is that people also felt comfortable talking to a bot because it won’t judge them.” (As an aside, we will cover chatbot addiction in a future post.) And: “This trial brought into focus that the study team has to be equipped to intervene—possibly right away—if a patient expresses an acute safety concern such as suicidal ideation, or if the software responds in a way that is not in line with best practices,” he says. “Thankfully, we did not see this often with Therabot, but that is always a risk with generative AI, and our study team was ready.” Always a risk? This conjures up the metaphor – and dialectic (!) – of Tesla’s “Full Self-Driving (Supervised)”. When you’re dealing with high-stakes clients, there is zero tolerance for AI hallucinations or biased algorithms that may amplify suicidal tendencies or create a sense of isolation where there was none before.

So what does the Dartmouth RCT have to do with children? After all, it involved use of the Therabot app with adults, so why mention it? Just as standard DBT was developed for use with adults, chatbots (setting aside the red flags for the moment) are by default created for adults – which brings up another point, for a future post, about chatbots supposedly created for adults but targeted at teens and young adults. Neither standard DBT nor adult-centered phone-based apps can work with children (or young adults) without careful examination and adaptation to fit their developmental and cognitive levels.

And why is that? Let’s go back to the DBT model. DBT therapists work with adult clients to regulate emotions driven by external factors in their environment. This assumes the adults can choose their environment. But what if you are a child with severe emotion dysregulation, self-harm and/or suicidal ideation? (And yes, children are just as likely to consider suicide as adults.) You can’t choose your environment. Your caregivers, who are usually (but not always) your parents, determine the dynamics of your environment. In fact, your environment may be the cause of your dysregulation. Picture a parent dropping off their child for psychotherapy: the therapist works with the child on cognitive restructuring, and then the child goes back to the same dysfunctional environment that undoes the therapist’s work.

This is where Dr. Perepletchikova’s adaptation of DBT for use with children comes in. Her model prioritizes working with the parents to create a change-ready environment for the child – teaching them how to be therapists for their child – which requires them to first learn how to regulate their own emotions, and then turn around and teach their children emotion regulation skills through role modeling and practicing “coping ahead”. In fact, she says working with the actual child is a luxury.

Being human is all about connection, and the most consequential human connection for a child is with their biological parents or other primary caregivers. No app can replace this hugely intimate relationship. For children, there is no getting around the parents/caregivers being the primary tools for cognitive change. And let’s celebrate that!

From another perspective, one of the core tools for behavioral change is validation. A therapist’s validation of the emotions and behaviors driven by a client’s specific situation must be genuine in order to establish trust – trust that the therapist understands, and perhaps has experienced, something similar. This is foundational to a productive therapist-client relationship. Sure, a bot can mimic shared experience and empathy, but in the end it’s still an algorithmic machine. And, at least for now, a bot cannot replicate the wisdom that comes from integrating emotion with reason (called Wise Mind in DBT), nor guide a client in activating her own Wise Mind.

At this point you might be thinking that we’re vehemently against AI in mental health. Nope – quite the opposite. But there’s a responsible place for it, and it isn’t as a therapist substitute. In a supporting role to a therapist, perhaps, once the guardrails are more solidly in place – but this RCT demonstrated that we’re not there yet. For now, we’re sticking to AI-enhanced immersive clinical training and parental psychoeducation that enhances human connection in support of emotion regulation. In fact, we’d go so far as to say that pointing AI at the systemic issue – why there aren’t enough human therapists in the world – is a better use of time, talent and treasure than putting band-aids on the symptoms by building synthetic therapists. So there’s actually a third industry overlap here, with Learning and Development (L&D). Stay tuned for more on the multiple threads here.
