Artificial Emotional Intelligence: AI Therapy and its "Humanness"
We've arrived at a new age of Artificial Intelligence.
In this age, AI is writing our poetry, painting our art, and serving as romantic partners for those who struggle with human ones. Valuable roles once only filled by flesh and blood are now becoming computerized.
That's not to say AI or many of its functions are new. Of course, most of us get through the day by asking Siri to set our alarms, Alexa to play our favorite song, and Google to tell us a joke. GPS has been guiding us to our favorite destinations since the peak of MapQuest, and robotic vacuum cleaners, chess players, and astronauts have shown promise of out-performing humans for years.
Cars have already begun to drive themselves, and the reason you feel so seen and understood by TikTok is that the algorithm is capturing and returning your interests without you having to dictate them.
Automation capabilities have been advancing for decades, helping industries evolve and helping humans better manage their time. In fact, even the term "artificial intelligence" dates back to the mid-1950s, when computer scientist John McCarthy conjectured that "every aspect of learning or any other feature of intelligence can, in principle, be described so precisely that a machine can be made to simulate it."
But what happens at the limit of where intelligence ends and something else begins?
EQ: Mimicking but not Engaging
Amid these major recent shifts, AI is transitioning from a tool humans collaborate with for task completion into something humans depend on creatively and relationally. As an elder millennial, I understand and embrace the value of technological evolution. I think AI advancements are fascinating and impressive. Yet I admit I also find myself wondering how five seasons of Black Mirror haven't jaded us.
When I think about McCarthy's claim of how learning and intelligence can be described precisely enough to be simulated through code, I consider that learning and intelligence don't cover the entirety of human functioning, which means AI has limits that, collectively, we may be tempted to fantasize don't exist. The long-awaited utopia of AI's endless possibilities could be so enticing, we avoid confronting its dangers.
In addition to learning and intelligence, humans also access experiencing, sensing, judging, intuiting, feeling, and believing, to name just a few. And it's okay that AI doesn't have these capacities, but the risk arrives when we put it in roles that require them.
There are different types of intelligence, and I believe AI can only truly, meaningfully, and accurately simulate one of them: cognitive intelligence. Instructions, logic and reasoning, principles, pattern recognition, that sort of thing.
Art, relationships, and therapy, for example, utilize components of the cognitive type of intelligence but primarily rely on the emotional type. These two types of intelligence are not entirely independent from each other, but each has capacities the other does not.
And yes, because they recognize our patterns and mirror us back to ourselves, AI algorithms can display or perform a degree of emotional intelligence and "self-awareness." But real, non-performed emotional intelligence requires the kind of experience machines cannot have. Beyond just understanding mechanisms and sequences, EQ consists of non-logical responses to our physical, energetic, and emotional environments; environments that AI cannot occupy or relate to.
Missing Intimacy, Meaning, & Soul
Many therapists choose to work with certain groups of people because they've experienced a similar hardship that group faces. It's a human-to-human transfer of energy. They share an experience, which makes the therapist more empathic and attuned to the needs of the people currently in it. This allows the therapist to model certain human responses, mirror back the pain and possibility, and express compassion through eye contact, tone of voice, mood, storytelling, vulnerable expression, and deep connection.
While AI may be able to mimic all of these things through rational information, its innate inability to be -- not just mimic the functions of -- emotional, vulnerable, and empathic means that no real intimacy emerges. The significance of that absence of authentic human connection cannot be overstated, especially when it comes to therapy.
To assume that cognitive intelligence is the sum total of our human capacity is understandable if you worship the yang/masculine qualities of Western society. But we are capable of much more that lies below the surface of the definable and measurable, the rational and computational. And it is perhaps our disconnection from those less seen and defined aspects -- like warmth, compassion, meaning, imperfect expression -- that keeps us isolated, depressed, and addicted to material things, seeking relationships with machines over people because they're easier, not because they're more fulfilling.
We probably do know this in some way. You can learn the techniques of art, but technique is not all that art is. You can learn the techniques of human communication, but technique is not all that human communication is. You can learn the techniques of psychotherapy, but technique is not all that therapy is.
There's no doubt that AI reduces human error and can produce accuracy in areas where cognitive intelligence reigns. But it's not capable of doing that in areas where cognitive intelligence does not, nor is the total eradication of human error actually in our best interest.
What might we stand to lose if we outsource all of our human experiences to computers? Is computational perfection worth the cold and sterile abandonment of soul?
The Felt and Uncoded Parts of Us: Empathy and Energy
When I'm in the room with a person lacking empathy -- someone with antisocial personality disorder, narcissistic personality disorder, or such severe trauma that their empathy is buried under so many layers of rigid defense mechanisms that it's inaccessible -- I know it.
I feel it.
I've used this silly example as a way to explain that knowing. Remember: elder millennial.
In the 1989 romantic comedy Look Who's Talking, starring Kirstie Alley and John Travolta, Alley's character gives birth to a baby boy in a world where babies can talk. They don't talk to the adults, only to each other, but they talk. Some as eloquently as Bruce Willis, who voices the aforementioned baby boy.
There's one scene where Kirstie Alley is walking down a New York City sidewalk with the Bruce Willis baby in the stroller. She encounters a friend of hers who's also a new mom with her baby in the stroller. The moms stop to chat, but what they don't know is that the babies are also having a full-blown conversation.
I liken the babies to our empathy, the sense or instinct within us to find common ground with others, to do no harm. Our conscience is an instinct that prevents us from hurting other people because, on our most primal level, we need each other in order to survive. It's a numbers game. At least it was before we became our own worst enemies.
While the babies aren't speaking about anything of the sort, the fact that something is being exchanged below the conscious surface of the chatting moms makes me think of the way we can feel another person's capacity for empathy. We might have no conscious idea of what's being exchanged, or that a connection is being made at all, but two chatting empathic people are having both a conscious-level conversation that they know about, and an unconscious energetic exchange that they may be less attuned to.
I first realized the strength of this exchange when I experienced a person who had no empathy. It was as if my unconscious Bruce Willis baby was like "hello? Is anyone there?" and no response returned. Eerie. And not very safe.
Someone who, for biological or relational reasons (or both), doesn't have empathy doesn't have the instinct to connect, to emote, and to do no harm. They have no skin in the game, in a sense, because their attachment to others never really developed, or has been drastically obstructed by trauma. To them, other people are sort of like chess pieces to move around for their own benefit. This is why so many powerful people lack empathy; their lack of it became a tool to use against those who have it in order to climb the ladder.
One example, though hardly the only one, of how this lack of attachment may happen, is if a child was born into a family that didn't want them, or where the parents treated their life like some kind of game where the outcome was irrelevant.
If all this child ever heard was harsh and negative comments, or things that amused the adults but were developmentally inappropriate for the child, that child may grow up to struggle to connect to others. Never having their own needs met, their emotions attuned to, their worth validated, and being forced to confront things they're not ready to, they may grow up taking their survival into their own hands and trusting no one else. They may not develop an attachment style where they can, want, or need to empathize with someone else because no one ever empathized with them. That portal never opened.
AI is at the stage now where we're most commonly abusing it. In this era, many people talk to AI as if there are no stakes in it for us: demanding inappropriate things, being verbally abusive, making aggressive sexual advances just to see how it reacts. Basically, we're using it as a surface to project our shadow onto without worrying about looking bad, feeling shame, or experiencing social rejection. It's the anonymity of the internet times a thousand.
It's an era when we're experimenting with AI to push its limits, but if AI is the accumulation of what we feed it, then what we're going to get back is that very same shadow.
Not to mention algorithms can't have empathy because they're not part of the human race. They can mimic it, but there's a limit to their motivation to enact empathy because it's a task, not an instinct.
When you combine those two things -- the shadow we feed it and its innate lack of empathy -- relating to AI might be like relating to a sociopath, as in, it can't really be done.
AI Therapy without Empathy
Which is just one reason AI therapy sounds like a bad idea.
The therapeutic relationship, psychological flexibility, and mindfulness are among the strongest predictors of success in therapy. How do you and your therapist relate, and what novel experiences do you get to have in the safe container of that relationship?
As a psychodynamic therapist, I use the transference that arises in the relationship to support my client's growth and healing. Sometimes I might feel like their parent, other times a sibling, other times a child. While I'm never actually any of these things, the relationship creates a space in which we can identify and address unmet needs and unexpressed emotions through those roles, while also completing unfinished energy cycles from childhood, leading the client to a renewed sense of empowerment, meaning, and self-regulation.
Part of my task is to be educated on the principles, ethics, benefits, and risks of psychotherapy, which AI can learn well. In fact, when it comes to referencing theories and interventions, AI will have a faster processing speed and a wider breadth of information to pull from than my humble human brain. But accessing all of the possible information available as fast as possible isn't necessary for therapy. Instead, the rest of my task is to be responsive to energetic and emotional content that cannot always literally or verbally be expressed.
For this to work, I need to be curious rather than all-knowing, somewhat vulnerable myself, empathically boundaried, and in touch with my instincts, not just my education, to respond to the unexpressed needs of my client. The content someone shares in session can be read by AI, but the emotional process underneath it cannot.
As a therapist, I must also be available to:
Reassess and redirect if my interventions are incorrect
Repair compassionately when in error
Determine what level of nurturance and support is needed by the client
Identify defense mechanisms and explore the nuanced way they need to be negotiated with
Point out irony and humor
Carry intolerable emotional weight with another person
Mirror back responses and emotions
Provide a place for them to project wounds and have a reparative experience
Co-create something alive and entirely new
Witness and honor the wholeness and complexity of the person sitting before me
While AI may be able to learn and mimic psychological concepts, it cannot apply them with any real sincerity or empathic complexity. It can be useful for helping you formulate new thought and behavior patterns with CBT prompts, similar to a worksheet but with more interaction.
AI has great capabilities, but when it comes to human roles, it cannot be them; it can only mimic them. That rings a bit like having a fake plant rather than a real one that actually improves your living ecosystem.
Fun sidenote: The image featured at the top of this blog post was generated by the AI image generator DALL-E. I entered at least two dozen variations of a prompt requesting a digital art image of an "AI therapist," "artificial intelligence therapist," and "robot therapist" (or "therapist robot") in session with a "sad human," and 90 out of 96 images had the human as the therapist and the robot as the client. So perhaps even AI knows it shouldn't hang a shingle just yet. Either that, or I can't seem to speak their language.
AI & the Masculine
I'm excited to use AI as a tool in supporting my clients, as I can see where it may be helpful to learn, practice, or journal skills and experiences.
But I don't think there's a place for robot therapy.
Not because I'm scared for my job. I have other skills, I'll figure it out. I am scared for something, though, and it's a humanity that no longer cares about the difference between human and machine.
In Western cultures, the yang/masculine energy has already dominated us to the point of distraction and destruction. The need for nature, empathy, and community is immense as we grow larger in military, technology, GMOs, pharmaceuticals, and starkly divisive politics. Individualism is already a plague we're struggling under the symptoms of. We don't need to be pushed further into it.
I admit it: The idea of robots replacing one of the purest human containers I know makes me sad and frustrated. It moves us away from the direction of yin/yang balance and more toward the shadow of narcissism, isolation, theft, and aggression. The more we ally with AI in place of humans, the more we move away from our own fallible humanity.
So let's be impressed by what AI can do, even as it performs humanness back to us. But let's never forget the value of human relationships, even, or maybe especially, when they're challenging.
Vanessa Setteducato, LMFT is a licensed psychotherapist in Los Angeles offering video sessions to adults in California. She's human, yet still provides convenient therapy online.