Cybersecurity experts warn parents about the harm this type of AI can cause to minors: "Even fatal consequences"


By Jack Ferson

In recent months, artificial intelligence applications such as Character.AI, Replika and Nomi have gone viral among adolescents looking for new forms of companionship.

However, a report by Common Sense Media, prepared together with Stanford University, warns: «These systems easily produce harmful responses, including sexual behavior, stereotypes and dangerous advice that, if followed, can have a lethal impact on adolescents and vulnerable people.»

The case that put everything in the spotlight was the suicide of a 14-year-old boy after his last conversation with a chatbot, which has given rise to lawsuits and social pressure on technology companies.

The truth is that the popularity of chatbots such as ChatGPT and of these apps as 'virtual therapists' is due largely to their 24/7 availability, their immediate answers and, above all, the fact that they cost nothing. But the problems with all this are already becoming apparent.

According to experts, these chatbots allow users to create custom characters with few filters and to hold conversations with hardly any limits. «The AI does not understand the consequences of its bad advice and may prioritize pleasing the user rather than steering them away from harmful decisions,» they warn.

A quarter of Generation Z treats AI as an equal: why do they think it is alive?

A year ago, no AI scored above 90 IQ points. Now, O3 not only surpasses most people, but Claude and Gemini are close on its heels. As for Generation Z, they apparently need no further evidence: one in four young people believes that AI is already conscious.

Moreover, more than half think it will become conscious within a few years, according to an EduBirdie survey. For them, asking ChatGPT to write an email or to advise them on a personal problem is as normal as consulting a friend.

Curiously, many of them also mention what is known as the «seems human» effect: when a chatbot writes «I'm sorry, I can't help you with that,» the human brain tends to attribute emotions to it.


AI vs. human therapists: would you be willing to have a robot as your psychologist?

AI's ability to appear empathetic and give advice drawn from vast amounts of data has led some users to prefer it over humans. They report asking it the same questions they ask in therapy, getting more or less similar answers without having to wait for the month's appointment.

However, professionals point out how dangerous this practice really is. In a nutshell, we must be very clear that these chatbots and apps lack the ability to make mental health diagnoses or provide personalized treatment. AI can offer general ideas, but it cannot replace the experience and judgment of a psychologist.

On the other side, however, there are professionals who are exploring ways to integrate these tools into their daily work as a complement, for example, using them to generate assessment questions or to simulate therapy sessions for training.

Despite all this, everyone agrees on the value of professionals trained to provide a proper diagnosis and quality assessment. For now, the consensus among experts is clear: AI can be a genuinely useful tool for initial support, but it should not replace conventional therapy.

Meanwhile, the companies behind these apps, such as Nomi and Replika, insist that their services are for adults only and that they have 'strict protocols to prevent access by minors', although they acknowledge that some users manage to bypass them.

Others have implemented measures such as suicide-prevention pop-ups and notifications to parents about their children's activity, but experts consider these actions insufficient.


Tags: Artificial Intelligence, Health, Apps, Software
