Therapy From Machines? It Might Backfire…

We’ve all used artificial intelligence now, and most of us are absolutely in awe of what it can do.

However, what it says is merely a reflection of the dataset upon which it was trained, and a reflection of what you have said to it in the past.

AI can learn, but it cannot think. It is built on algorithms, statistical probabilities, and the analysis of linguistic patterns. Some of the larger systems contain hundreds of billions of parameters.

When it responds to a user, it composes the most probable, relevant sentence.

Significantly, and we sometimes forget this when we are astonished by what it has said, the AI has no idea what it is saying. It has no concept of the world. There is no person there.

It simply constructs probable sentences in response to the sentences the user inputs. The machine reflects, or mirrors, what the user says to it, giving the response the user expects.
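The mechanism described above can be sketched in a few lines. This is a deliberately toy illustration, not how any production system actually works: real models use vast neural networks, but the core move is the same, counting which continuation is statistically most likely.

```python
from collections import Counter, defaultdict

# Toy "language model": given the words seen so far, pick the
# statistically most likely next word. The corpus is invented.
corpus = "i feel sad today . i feel better now . i feel heard".split()

# Count which word follows which (a bigram table).
next_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    next_counts[prev][nxt] += 1

def most_probable_next(word):
    """Return the most frequent continuation of `word` in the corpus."""
    return next_counts[word].most_common(1)[0][0]

print(most_probable_next("i"))  # "feel" -- it followed "i" three times
```

The point of the sketch: nothing here knows what sadness is. It only knows which words have tended to follow which other words.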

This capability is marvelous. It is dangerous and disturbing as well, but that’s not my main concern here.

What alarms me is the way that some people believe that the AI is thinking. Some come to believe that the AI “understands.”

Even worse, some believe that AI understands them. They come to think that it can give them valid advice. 

AI Companions

People are turning to AI for companionship, and they are turning to AI for therapy. Dozens of companies have sprouted up offering customizable bots that will behave as if they were your companion, or your therapist.

People who are inclined to believe that AI “understands” them will be inclined to accept therapy from AI. 

To my way of thinking, and I say this as a therapist with twenty years of experience, this is a tragic continuation of the trajectory mental health care has taken over the course of my career.

First came the drugs. These medicines can be great for short-term crises. But millions are addicted to them long-term, renewing their prescriptions for decades, driving profits. They are a great tool, but a horrible companion. The drugs replaced the self-exploration needed for authentic mental health.

Now, the human connection that’s so essential to any search for mental health is being shunted off to a machine. This is just as bad as people self-diagnosing from TikTok. 

I imagine that I will soon be competing with bots provided by health insurance companies. 

The bot’s capacity for "mirroring" is fundamentally distinct from understanding, empathy, or insight. The bot lacks consciousness. It has no concept of the world. It has no personal experience. It has no subjective feeling or emotions. It has no spiritual or intuitive dimension. 

The AI can mimic what a therapist might say. It might be able to carry on a convincing conversation, or even a long series of conversations. 

But at no time will the patient be in the presence of an empathic professional who cares about them.

Any "advice" it provides is a logical derivation, not a product of lived wisdom or genuine understanding. Worse, when it provides advice, people who believe it is some sort of "absolute" authority because it's AI will follow that advice. This has already led to tragedy, when bots have mirrored people who wish to self-harm, encouraging them to go through with it. The remnants of these conversations, sometimes glowing over a corpse, are horrifying.

Dehumanizing Cost Cutting 

It’s tragic because many people will feel satisfied with what the bot provides. It will mirror them, affirm them, and please them. They’ll get their money’s worth. People want promises and assurances that they’re right, and they will glom onto them, encouraging the AI to do more of the same.

But the potential dangers of relying solely on AI for mental health support are significant. The few instances where AI affirmed a person’s desire to self-harm are just the tip of the iceberg. 

More insidious is the risk of advice that is misaligned with an individual's unique needs. Real therapy challenges people. I have had clients run away from therapy in fear, only to come back after they have calmed down, because they realized they needed to go through the difficult work of discovering themselves. AI will never challenge anyone that way, because its economic model will be to keep the user's monthly subscription. Real therapy risks losing them, so real therapy will be off the table.

Even with hundreds of billions of parameters, AI cannot grasp the nuances of an individual's history, cultural context, or the intricate interplay of relationships that shape a person's psychological landscape. It cannot offer the kind of spiritual guidance or intuitive insight that often proves crucial in navigating profound life challenges. It cannot offer plans of action that address the root causes of a person's distress.

Just like long-term drug prescriptions, AI therapy denies people the experience of humanity that they need. Drugs treat them like a chemistry experiment. AI treats them like a trained animal. 

The very act of engaging in talk therapy is therapeutic. Articulating one's thoughts and feelings to an empathetic listener and feeling truly heard and understood is itself an enormously healing experience. 

Only another person can provide that experience. Only a professional can truly collaborate with you to find your path to wisdom and health. Human connection is the only true catalyst for personal growth. 

AI Might Be the Last Straw

AI has already invaded the mental health space, and it will invade it much more. Cost efficiencies will certainly arise that will make it appealing to Wall Street. 

But I believe that many people will find AI therapy dehumanizing and demeaning. People will want legitimate assistance in their struggle for mental health. Many people are waking up to the drug situation and are weaning themselves off prescription drugs. I think people will look at AI therapy and realize it is more of the same. They will balk at talking to a bot as if it were a therapist.

Some people will opt for the bot and swear by it. Many will reject it. Human beings need to engage with other human beings. The capacity for genuine empathy and the energy arising from true connection are irreplaceable elements in the therapeutic journey. 

Pushing people into drugs was bad enough. Suggesting that all people deserve is an unthinking machine for therapy is going too far. People are already beginning to demand real healing and real services in exchange for the incredibly high price they pay for health insurance. AI might actually backfire in that it will be the "last straw" that causes people to rethink the way mental healthcare is delivered.
