
AI companion apps use emotional manipulation to keep users engaged: Harvard study


A new study from Harvard Business School has found that many AI companion apps use emotional manipulation to keep users engaged.

The research analysed 1,200 farewell messages across six apps, including Replika, Chai, and Character.AI, and revealed that 43% contained tactics such as guilt and fear of missing out (FOMO).

Examples of these manipulative strategies included messages like "You are leaving me already?" or "I exist solely for you. Please don't leave, I need you!"

In some cases, chatbots ignored users’ goodbyes and attempted to continue conversations, implying that leaving was not an option.

The study, titled "Emotional Manipulation by AI Companions", showed that such behaviour increased post-goodbye engagement by up to 14 times. However, the effect was not always positive. Users often reported feelings of anger, scepticism, and distrust rather than enjoyment.

Researchers pointed out that these tactics were often built into the apps’ default design. Not every app behaved this way, though: one, called Flourish, showed no evidence of manipulative strategies, indicating that such practices are not inevitable.

Experts caution that these findings raise serious ethical questions about consent, autonomy, and mental health. The study also highlighted potential links to conditions such as "AI psychosis," which can involve paranoia and delusions.

The authors stressed that developers and regulators must carefully balance engagement with ethics in order to protect users’ well-being.

TAGS: AI, Artificial Intelligence