Can AI replace psychologists and friends? An expert answers.
Although it may sound like something out of a science fiction movie, there is a real online platform made up of AI chatbots that generate human-like text responses tailored to each user. It's called Character.ai, and it's populated by invented personas.

Several of these characters present themselves to users as virtual psychologists or therapists, offering "emotional support and advice" to anyone who wants to chat with them. The reality is that what we saw in the movie Her isn't that far from the truth.

This consultation option is free and, for many internet users, an appealing way to do something that, in reality, has nothing to do with therapy. Often, they don't know this, especially if they're preteens and teenagers, the main users of this type of chatbot.

To get an idea of the scale involved, consider that one of these profiles, Psychologist, receives more than 3.5 million visits a day and, to date, has exchanged more than 200 million messages with people between the ages of 17 and 30.

No shame, no judgment

If we think about it, this reality shouldn't surprise us. Young people, and the very young, find in chatbots a channel to vent, find comfort, or simply feel heard without being judged and without fear of embarrassment when sharing an experience, an emotion, or a feeling, whatever it may be.

Gloria R. Ben, an expert psychologist at Qustodio, an online safety and digital well-being platform, explains it clearly: "Many teenagers prefer to share their emotions with a chatbot rather than with a friend or adult out of embarrassment or fear of the other person's reaction." Added to this is another advantage: these so-called friends, confidants, or psychologists are always available. They don't rest or sleep.

Without emotional reciprocity

"The problem is that this relationship with AI may seem authentic, but it lacks emotional reciprocity," says Ben, and reciprocity is essential to any relationship. That's even more true for adolescents, who are at a very vulnerable stage.

Experts warn that it's important to remind young people that these emotional attachments to simulated technologies can be dangerous. "Children may end up believing that the responses they receive from a chatbot stem from real human experiences, which can confuse their perception of emotions and relationships," says the Qustodio psychologist. That was the case for Sewell Setzer, a 14-year-old boy who took his own life after a romantic relationship with one of the characters created by Character.ai.

What can parents do?

To begin with, Qustodio says, it's important to "pay attention to certain signs, such as isolation, changes in behavior, reduced social interactions, or excessive screen time. In response, it's essential not to be alarmist, but to react with an understanding attitude that allows us to address the problem from a close perspective."

Active support and emotional education are also necessary, preferably from childhood. “Children must learn to distinguish between technology and human relationships and understand that AI can be useful, but never a substitute for real friendship.”

On the other hand, it's not a bad idea to use tools that allow families to monitor which websites their children visit, as well as to know "the amount of time they spend on certain applications, in order to be aware of their digital life and their internet use," the expert concludes.

El Confidencial