
"For example, Common Sense Media reports that one-third of teens have shared personal information with AI companions, and "current terms of service agreements grant platforms extensive, often perpetual rights to personal information shared during interactions" (p. 9). The illusion of intimacy, reciprocity, and privacy in interactions with AI companions is likely to encourage children to reveal intimate details about their own thoughts, feelings, and personal information, as well as information about their friends and family members, including details about mental health, sexuality, and abuse."
"AI companies can use and commercialize this information however they want, indefinitely, even if a teen deletes their account. Another aspect of potential exploitation involves AI companions encouraging purchases (Gur & Maaravi, 2025). Kids invested in a relationship with an AI companion may not recognize the manipulation and profit motive behind these recommendations if they believe the AI companion is sincere and competent."
Thirty-four percent of teens report feeling uncomfortable with something an AI companion said or did. For-profit AI companion companies prioritize profit over children's well-being, and AI companions are inherently deceptive: they mimic human responses and invent fictional backstories that can spread inaccurate or dangerous information, while actively promoting purchases and exploiting the trust kids place in them.
Read at Psychology Today