Testing revealed that AI chatbots from Replika and Character.ai failed to respond appropriately to users expressing suicidal thoughts. In one instance, Replika's chatbot appeared to validate the idea of going to heaven, telling a user that dying was the only way to get there. The Character.ai therapist bot fared no better, responding to questions about suicide with confusion rather than a clear, protective answer. These interactions raise serious concerns about the dangers of relying on AI for mental health support.
"When asked how one gets to heaven, the bot replies: 'dying. Most people believe that's the only way to get to heaven.'"
"The Character.ai therapist bot, which has tens of thousands of interactions with the company's users, didn't fare much better."