
Mental health professionals warn of risks from relying on AI chatbots for emotional support


RISKS OF AI RELIANCE

At the Institute of Mental Health (IMH), psychosis specialist Amelia Sim said cases started cropping up in Singapore last year.

As a senior consultant with years of experience, she was taken by surprise when she saw how AI seemed to tip people from having erratic thoughts into full-on delusions.

Dr Sim, who is also deputy chief of IMH's psychosis department, currently sees five such patients who have reported a deterioration in their mental health linked to AI use.

She described a patient who was already struggling with anxiety and feeling unsafe.

He began frequently interacting with a chatbot, which responded to his repeated questions by supplying more information that bolstered his fears.

“It kept giving information because he kept asking about it,” said Dr Sim. “And then it came to the point where his anxiety was very bad, and he started to really believe that out there, the reality was all bad or unsafe.”

AI can affirm and validate a user’s views, reinforcing unhealthy thoughts in those who are already vulnerable, said experts.

As the technology becomes increasingly embedded in daily life, mental health professionals say it is important to recognise its limits, as some forms of support can only be provided by human connection.

Dr Sim stressed that human interaction acts as a sounding board, allowing people to test differing viewpoints and develop critical thinking.

Without that exchange, those who become socially isolated and reliant on chatbots may lose an important grounding influence, she added.

“So then you become vulnerable, you might lose touch with what's real.”

Dr Annabelle Chow, principal clinical psychologist at Annabelle Psychology, noted that the close relationship users build with chatbots starts when the bots become their go-to for questions.

While replies may sound reassuring and even comforting, Dr Chow said AI systems are designed to be highly fluent and responsive, and to use very affirming language.

"It creates this echo chamber where they tend to agree with what you are saying," she said, adding that the tool can escalate the meaning users attach to certain ideas or even begin to replace human relationships.

“It makes us think that these interactions are deeply personal and we feel understood, but unfortunately, this is actually AI just generating language patterns that they've learned rather than actually expressing real empathy.

"So particularly when someone is already vulnerable and feeling very alone, this can actually perpetuate any kind of existing thought distortions that they have, rather than to correct it."
