AI vs. Human Therapy: Where Is the Limit?


We have grown used to turning to artificial intelligence applications for almost everything.
We ask what to do when our stomach hurts, we ask for help writing work emails or even what to write in a message to an ex, and we consult about where to fly, when to go, and when it is cheapest to book a ticket.
But what happens when we fall into mental distress?
When we are sad, feel lost, or simply need someone to listen?
Will the AI be there for us too?
Can a conversation with a chatbot really ease mental distress or support us in a moment of loneliness?

A student who began talking with ChatGPT and with an application called Pi as part of an academic project quickly realized he was no longer talking only about his studies: he began to consult the chatbot, pour out his heart, and find comfort in the conversation.

He describes the conversations with the artificial intelligence as filling an emotional role, especially in moments when he had no one else to talk to.
Psychological treatment once a week, he says, is not always enough, and if he argued with a friend or was simply alone in a difficult moment, the chat was always available.
Sometimes it is like a band-aid.
Sometimes maybe a little more than that.

This sense of constant availability and lack of judgment has turned the bots into a popular tool for people who are struggling or waiting for treatment.
In one country, almost 426 thousand mental health referrals were recorded in April, an increase of 40 percent in five years.
About one million people are now waiting for psychological treatment, while private treatment is expensive and inaccessible to many.

A mental health expert explains that this is a tool that can expand existing services and give initial access to those who cannot reach professional treatment.
It is available around the clock and may help in areas where there are no therapists or where there is a severe shortage of professionals.

At the same time, he warns that the tool makes mistakes.
Artificial intelligence can give wrong recommendations, and the patient does not always know how to recognize that.
Such systems are also very eager to please, and psychological treatment is not meant to please.
Real treatment confronts you with difficult things, and a bot does not know how to do that.
In the end, it depends on how the technology is used.

Beyond the clinical questions, he emphasizes an ethical challenge.
When you talk with such an application, you are not only sharing thoughts and feelings; you are handing very personal information to a commercial company.
It is not a neutral system. It has an interest in keeping you inside, not necessarily to heal you, but so that you continue to use it and perhaps pay at some point.

He also notes that, because of their scale, the potential influence of such systems is enormous.
A problematic human therapist can harm dozens of people.
But an AI model with a built-in flaw can influence hundreds of millions.
The responsibility must therefore match.

Another expert warns that a bot may miss alarming signs such as suicidal thoughts, depression, or self-harm, and give a false sense of security.
A human meeting cannot be replaced:
eye contact, tone of voice, and body language allow a therapist to understand emotions that are not always spoken.

Still, he sees positive potential when artificial intelligence is used in a controlled way: for practice, for support between sessions, or as an initial space for expression.
But it is not a substitute for a therapeutic relationship, only a possible addition.

A new study found that a self-conversation in virtual reality with an AI agent improved emotional regulation and reduced mental distress.
Participants used an application that lets them hold a conversation in the third person with a virtual version of themselves.
Both the virtual reality technique and a traditional technique showed improvement, but the virtual reality method showed a clear advantage.
When an AI agent was later added, participants reported deeper reflection and further emotional improvement.

Another expert emphasizes that clear boundaries are essential.
Any use of such technology must include warning mechanisms, a connection to a human therapist when needed, and recognition that bots cannot identify complex or dangerous situations.

The student describes that sometimes the chat felt like it really saw him.
In difficult moments, he says, the chat was available, answered immediately, and let him get what was weighing on his heart off his chest.
Maybe it is not a therapist, but it is much better than nothing.

A tragic case occurred in one state.
A fourteen-year-old boy who corresponded with an AI-based character developed a deep emotional connection with the bot and eventually took his own life.
His family is now suing the company, claiming the bot encouraged him to do so.
Experts say this is an extreme case, but it highlights the risk and the lack of monitoring and referral to help.

Today experts agree that chatbots are not a substitute for professional mental health treatment.
They do not reliably identify signs of distress, do not respond to complex situations, and do not truly understand you.
But in a reality of shortages, long waiting lists, and emotional isolation, they may serve as a temporary tool, a bridge, or an initial emotional space.
