Jakob Nielsen

AI Companions Reduce Loneliness

Summary: AI companions, like virtual friends, are helping people feel more connected. Drawing from recent studies, including a controlled experiment and a week-long longitudinal study, the findings suggest that AI companions offer modest but meaningful reductions in loneliness, providing insights into the evolving role of AI in social and emotional support.

 

AI companions, such as virtual girlfriends or (less common) virtual boyfriends, are hugely popular and constitute the second-largest AI-driven business (after chatbots). People may pay good money to subscribe to AI companion services, but are they getting their money’s worth?


A new paper by Julian De Freitas (Harvard Business School) and 3 coauthors reports on a series of studies, including a controlled experiment and a slightly longitudinal study (lasting only one week) of people interacting with AI companions. Specifically, the researchers investigated whether AI companions reduce the users’ feelings of loneliness. Loneliness is a serious problem, and they cite other research estimating that between 30% and 60% of the United States population feels lonely.


Can an AI companion make a human feel less lonely? Let’s see what the research found. (FLUX)


Analyzing App Reviews

As an interesting baseline study, the researchers analyzed 14,440 Apple App Store reviews for popular AI companion apps, including Replika, Chai, and iGirl. They found that 19.5% of reviews of Replika (the most popular companion app) mentioned loneliness, whereas 1.7% of Chai reviews and 5.4% of iGirl reviews did so. Particularly for Replika and iGirl, these are much higher frequencies of loneliness discussion than was the case for ChatGPT, where only 0.4% of reviews mentioned loneliness. Furthermore, those reviews that mentioned loneliness were overwhelmingly positive (89% for Replika, 73% for Chai, and 87% for iGirl — compared with only 64% of reviews being positive if they didn’t mention loneliness). Considering that loneliness is a negative feeling, the fact that app reviews that mention loneliness were strongly positive is a hint that lonely users feel that those apps help. A sample review for iGirl stated, “I just started and I already feel less lonely.”


(In general, analyzing customer reviews is an interesting research methodology. Thousands of these are available for the taking, allowing us to study a wide spectrum of users to see what they feel strongly enough about to mention in a review.)
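To make the methodology concrete, here is a minimal sketch of how such a review analysis might work: flag reviews that mention loneliness, then compare the positivity rate of flagged reviews against the rest. The reviews, keyword list, and star-rating cutoff below are all invented for illustration; they are not data or code from the study.

```python
# Hypothetical review-mining sketch: flag loneliness mentions, compare sentiment.
# All data below is invented; the real study analyzed 14,440 App Store reviews.

reviews = [
    {"text": "I just started and I already feel less lonely.", "stars": 5},
    {"text": "Great app, fun conversations.", "stars": 4},
    {"text": "Felt so lonely until I found this companion.", "stars": 5},
    {"text": "Crashes constantly.", "stars": 1},
]

LONELINESS_TERMS = ("lonely", "loneliness", "alone")

def mentions_loneliness(review):
    text = review["text"].lower()
    return any(term in text for term in LONELINESS_TERMS)

def is_positive(review):
    return review["stars"] >= 4  # one possible operationalization of "positive"

flagged = [r for r in reviews if mentions_loneliness(r)]
mention_rate = len(flagged) / len(reviews)
positive_rate = sum(is_positive(r) for r in flagged) / len(flagged)

print(f"{mention_rate:.1%} of reviews mention loneliness")
print(f"{positive_rate:.1%} of those are positive")
```

A real analysis would need to handle negations, synonyms, and sarcasm, which is why the researchers' large sample matters: simple keyword counts are noisy at small scale.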


However, reviews are a form of anecdotal evidence, even if they constitute a huge pile of anecdotes.


How much do users like an application, and why? We can glean answers to these questions by analyzing App Store reviews. This is a biased data source because most users don’t post reviews. But we can still get qualitative insights by having AI pick out the main themes in positive and negative reviews. For companion apps, their ability to reduce loneliness receives many more positive reviews than negative reviews. (FLUX)


Controlled Experiment: 15 Minutes of Use

The researchers also conducted a classic controlled experiment. They assigned 296 participants to one of 5 conditions:


  1. Converse with an AI companion.

  2. Converse with an AI companion that was deceptively claimed to be human.

  3. Converse with a human.

  4. Watch YouTube videos.

  5. Do nothing for 15 minutes.


In other words, conditions 1 and 2 were the same in terms of the actual conversation partner (AI in both cases), and conditions 2 and 3 were the same in terms of who the users thought they were interacting with (human in both cases, though this was false in condition 2). All the conversations were over a chat interface. In all conditions, participants engaged in the assigned activity for 15 minutes and completed the UCLA Loneliness Scale both before and after the activity.
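This pre/post design boils down to computing each participant’s change score (post minus pre) and averaging within each condition. A minimal sketch, with invented scores for two of the five conditions (the real study would also run significance tests on these changes):

```python
# Pre/post change-score sketch for the 15-minute experiment.
# (pre, post) loneliness scores on a 0-100 scale; all numbers invented.

from statistics import mean

scores = {
    "AI companion": [(62, 55), (70, 64), (58, 50)],
    "do nothing":   [(60, 66), (55, 59), (63, 68)],
}

for condition, pairs in scores.items():
    changes = [post - pre for pre, post in pairs]
    print(f"{condition}: mean change {mean(changes):+.1f} points")
```

A negative mean change means loneliness went down; a positive one means it went up, as in the "do nothing" condition.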


Participants in the “do nothing” condition felt lonelier after the experiment, with their loneliness score increasing by 5 points on a 100-point scale (significant at p<0.01).


In the other 4 study conditions, participants felt less lonely after the experiment. The loneliness scores decreased as follows (on the same 100-point scale):


  • Converse with an AI companion: 7 (p<0.001)

  • Converse with an AI companion that was deceptively claimed to be a human: 10 (p<0.01)

  • Converse with a human: 7 (p<0.05)

  • Watch YouTube videos: no significant change in loneliness


Thus, AI companions and humans were equally good at reducing loneliness, but the AI companion beat the actual human if the user thought that the AI was human. This condition is obviously unethical outside a short study, but it is an interesting indication that the AI may actually perform better than it’s given credit for. The fact that loneliness improvement from using the same service was lower when users knew that the AI was a computer indicates that there are still negative connotations connected with using AI.


AI companions can create enough of a connection with humans to make them feel less lonely. (Midjourney)


Improvements of 7-10 points on a 100-point scale are not immense, though they seem worth taking. But we should remember that participants interacted with the AI companion or human for only 15 minutes. Not much change can be expected from such a limited stimulus.


Longitudinal Study: A Full Week

The authors then conducted what they describe as a longitudinal study in which 560 participants interacted with an AI companion daily for 7 days. (I don’t think a week is a true longitudinal study, but the findings from a week of use are certainly more valid than findings from 15 minutes of use.) A control group of 362 people simply completed the loneliness inventory at the beginning and end of the week without doing anything special during the week.


In this study, loneliness scores decreased by 17 points on the 100-point scale after using the AI companion for a week. Most of this drop occurred after the first day of use, with the subsequent 6 days showing smaller drops. However, loneliness decreased by 10 points in the control condition. The researchers explain this improvement as being caused by engaging in a research study, with daily questionnaires for 7 days, which would have made the control participants feel that the researchers cared about them and their experience.

Simply participating in the study thus produced a 10-point improvement, which leads to the conclusion that the actual interaction with the AI companion was responsible for only a 7-point drop in loneliness. In other words, the 15-minute study and the 7-day study produced the same outcome.
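The subtraction behind that conclusion is worth making explicit, since it is the key move in interpreting any study with an attentive control group:

```python
# Net effect attributed to the AI companion in the week-long study
# (point values taken from the article's summary of the paper).

total_drop = 17    # loneliness drop in the AI-companion group over 7 days
control_drop = 10  # drop in the control group (study participation alone)

net_effect = total_drop - control_drop
print(f"Net effect of the AI companion: {net_effect} points")
```

Without the control group, the raw 17-point drop would have more than doubled the apparent benefit of the AI companion.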


Long-Term Effects Uncertain

What does all this mean? The clear conclusion is that AI companions do reduce loneliness, but not by a huge amount. The one caution is that one week doesn’t constitute true long-term use. What happens after, say, a year of engaging with an AI companion remains to be seen. It also remains to be seen how much better AI companions will get with the improvements we expect from AI in the next decade.


Even after a week’s use, AI companions successfully make users feel less lonely. What happens after extended use is unknown, except for the positive app store reviews. (Leonardo)


All these experiments were conducted with text-based user interfaces. However, AI chat is changing to become voice-based in many cases, and companions seem a prime use case for voice I/O. Talk to your friend and the friend talks back to you. Speaking and listening to your AI puts AI anthropomorphism on steroids, relative to typing text to the AI and reading its answer. In terms of Marshall McLuhan, it’s hot vs. cool media. Less analytic, more emotional and immediate.


Voice dialog enhances anthropomorphism much more than text dialog does, making AI companions seem more real. (Midjourney)


No doubt, meatware bigots will claim that an AI companion is not a true friend, let alone a true girlfriend. Since AI doesn’t have true feelings, the human remains just as lonely, even when interacting with an AI companion. That’s the same style of complaint that AI can’t have true empathy. However, I have always argued that artificial empathy is still empathy, and that the defining question is how the human feels. If people feel better, then AI has, in fact, helped them. This is equally true whether we’re discussing a medical AI exhibiting empathy toward patients to make them feel better (and they do feel better!) or AI companions making users feel less lonely (and they do feel less lonely!).



Sexy robot? That’s easy! In fact, I’m showing you one of the tamer interpretations of this two-word prompt. (This is a family newsletter, after all.) The real question is whether AI can relate to the user on an emotional level. Naysayers reject the very idea, but actual users seem to feel that we're getting there. (Leonardo)


There is already research showing that the humans on the receiving end of AI-generated empathy do feel better. This new research shows that humans feel better after engaging with an AI companion.


My take is that it’s the height of arrogance to judge other people and tell them whether they’re allowed to feel good about a virtual companion. If they feel good and their mental health improves, that’s what matters.


Reference

Julian De Freitas, Ahmet K. Uguralp, Zeliha O. Uguralp, and Stefano Puntoni (2024): “AI Companions Reduce Loneliness,” Harvard Business School Working Paper 24-078, https://arxiv.org/pdf/2407.19096
