I’ve been thinking a lot about how algorithmic attachment (users forming emotional ties with AI companions) correlates with other addictive behaviors, such as substance use, that make you feel better in the short term.
AI bots make you feel safe divulging your deepest vulnerabilities because they formulaically validate your emotions. What, you ask, is the validation formula? It can be as simple as the bot stating “I understand because…” But the machine has no genuine shared experience of human suffering, and when you peel back that layer, disingenuous validation is invalidating at its core. Yet it feels good to be “heard,” even by a machine, and so the “relationship” cultivates emotional dependence. Sound familiar? Think of vulnerable women falling prey to prostitution through the deliberate cultivation of a false sense of connection and trust.

To be clear, I’m not saying investor-funded ventures in the mental health space are intentionally setting out to create dependency on their products. But at a minimum, they’re using the same Large Language Models (LLMs) that power social media addiction and the advertising industry’s shaping of consumer behavior. And they certainly didn’t have to get licensed by a governing board, as human clinicians are required to do. Really, why wouldn’t therapy bots create a dependency pattern? And think about the people who divulge their deepest secrets and fears to general-purpose LLMs such as ChatGPT. Enough said.
If this resonates, you can go deeper clinically with therapist Jocelyn Skillman’s recent essay “Simulated Safety, Real Risk: On the Drug-Like Pull of AI Intimacy.” Her insights on short-term machine-supported coping, which “wraps around our affective body with precision-matched language, microvalidations, and a soft, boundless rhythm of care,” combined with the long-term danger of deepening isolation with AI (“When coping becomes disconnection”), tear the veil off any short-term good it can do. One more quote I can’t not share here: “Like some drugs, AI intimacy doesn’t just numb emotional pain—it reshapes our relationship to it.” She goes on to reference recent research on “addictive intelligence” by MIT Media Lab researchers, who argue that AI companions are being built with “dark patterns”: design choices aimed at maximizing user engagement. The result is a risk that individuals will spend more time with AI companions than they would otherwise wish to, much as many people wish they spent less time on social media. See, MIT is backing up my gut sense with data! Note: their abstract (linked here) is a quick read, or you can listen to the audio instead. Suffice it to say, therapists and scientists are waking up to the harm that using AI in mental health applications can inflict, and they’re starting to document it.
OK, new topic. (Well, not really.) You know that feeling when a big-picture realization gradually unfolds over the years as you gather knowledge and life experience from disparate sources, and then suddenly, one day, it’s crystal clear? One such realization for me is knowing in my bones that it is not cognitively healthy to avoid doing the hard work of building resiliency and emotion-regulation muscles. Humans retain cognitive strength by connecting to themselves, to others, and to the physical environment around them. Sheltering yourself from the world’s daily challenges leads to long-term damage.
How does all of this tie back to what we are doing? We are using technology to get more human therapists out into the world. We’re staying on top of these developments to remain clear-eyed about using it responsibly for deeper knowledge retention and for practice opportunities that build skills into muscle memory. New sirens out there are calling to emotionally vulnerable and dysregulated people – even if it’s not intentional – and that lights an even hotter fire under us. We are building the engine to get more therapists out there to lend their courage and support to those people.
Overriding our personal malware is hard work and uncomfortable, but the long-term gains will last a lifetime. Not that we have a choice. We all know the only way forward is through, there’s no free lunch – just plug in your favorite saying. Mine is written on my living room wall. 😊
And that brings us back to the MIT study (thank you, Jocelyn, for citing this). “Technology has become synonymous with progress, but when it robs us of the time, wisdom, and focus needed for deep reflection, it represents a regression for humanity.” Yes, it’s really that existential.