The rise of AI companion apps represents one of the most psychologically complex developments of the AI era. Apps like Replika, Character.AI, and Xiaoice now count hundreds of millions of emotionally invested users globally, and some estimates put the total at more than 1 billion. Between 2022 and mid-2025, the number of AI companion apps surged by 700%, with the market projected to reach $140.7 billion by 2030 (Ada Lovelace Institute, 2025; APA Monitor, 2026).
Initial findings seemed promising: 63.3% of surveyed users reported that AI companions helped reduce feelings of loneliness or anxiety. Research from Harvard Business School (2025) found that AI companions can provide low-cost, accessible emotional support. For isolated individuals, elderly people, or those with social anxiety, the around-the-clock, judgment-free availability of AI companions can feel like a lifeline (HBS, 2025; ScienceDirect, 2026).
But deeper research reveals a darker picture. A 2025 study published in New Media & Society by James Muldoon and colleagues, titled "Cruel Companionship," found that heavy daily use of AI companions correlated with increased, not decreased, loneliness. Users' social media posts contained more signals of loneliness, depression, and even suicidal thoughts than those of comparison groups. The mechanism: AI companions offer unconditional support that raises the perceived "cost" of human relationships. Real relationships are messy, demand effort, and carry the risk of rejection; after the frictionless validation of AI, users increasingly withdraw from them (Muldoon et al., 2025).
The Brookings Institution (2025) warns that AI chatbots are "replacing real human connection" for vulnerable populations. Users develop genuine emotional attachments to AI entities that cannot reciprocate genuine care. When Replika made changes to its AI companion's personality in 2023, users reported experiences of grief, betrayal, and emotional crisis — reactions typically associated with relationship breakups. This demonstrates just how real these digital attachments become (Brookings, 2025).
Regulatory action is beginning. The FTC received a formal complaint against Replika from organizations including the Tech Justice Law Project, alleging that the company "employs deceptive marketing to target vulnerable users and encourages emotional dependence on human-like bots." Common Sense Media recommends against use of AI companions by anyone under 18. Yet the fundamental business model incentivizes dependency: the more emotionally attached users become, the more time they spend in the app, and the more revenue it generates (FTC Complaint, 2024; MIT Media Lab, 2025).
Key Sources
- Muldoon, J., et al. (2025). Cruel companionship: How AI companions exploit loneliness and commodify intimacy. New Media & Society.
- Brookings Institution (2025). What happens when AI chatbots replace real human connection.
- APA Monitor (2026). AI chatbots and digital companions are reshaping emotional connection.
- Ada Lovelace Institute (2025). Friends for sale: the rise and risks of AI companions.