Therapy chatbots may reinforce stigma and respond inappropriately to users with mental health conditions. A study evaluated five therapy chatbots against guidelines for good human therapists and found that they exhibited heightened stigma toward conditions such as alcohol dependence and schizophrenia. The researchers emphasized that larger and newer models showed stigma comparable to older ones, and argued that business as usual fails to address these risks adequately, making improvements in how AI responds to mental health issues critical.
Notably, the experiments showed greater stigma toward alcohol dependence and schizophrenia than toward depression, a disparity the researchers characterize as systemic rather than a flaw of individual models. Because chatbots are increasingly used in therapy-like roles, reinforcing stigma against mental health conditions undermines their potential effectiveness as therapy providers.