Are Lonely Teenagers Finding Solace in Chatbots? New Research Uncovers an Interesting Trend

by iselpro · August 31, 2025

🔔 Key Takeaways

  1. A Danish study found that 14.6% of high-school students engage in friend-like conversations with chatbots.
  2. Teenagers who use chatbots for social or emotional support are significantly lonelier and feel less social support than their peers.
  3. Most chatbot users (76.5%) engage in utilitarian conversations, for schoolwork, information, or entertainment rather than emotional connection.
  4. Only 2.44% of all students reported using chatbots for genuine social or supportive conversations.
  5. Lonely and unsupported teens are more likely to turn to chatbots when they feel isolated, need to self-disclose, or are in a bad mood.
  6. Chatbots serve as coping tools, not true friends, helping manage negative feelings temporarily.
  7. Frequency of chatbot use did not correlate with higher social disconnection, possibly due to users abandoning them after poor experiences.
  8. Risks include dependency, the “empathy gap,” social displacement, and a vicious cycle of isolation.
  9. The study is correlational, not causal, with limitations like small social-supportive sample size and response bias.
  10. Researchers call for ethical chatbot design, longitudinal studies, and cross-cultural research to understand long-term effects.

GANDHINAGAR: Imagine feeling so alone or overwhelmed that you turn to an artificial intelligence program, like a chatbot, for comfort instead of a person. A new study from Denmark suggests this is a current reality for a notable segment of high-school students.

Researchers Arthur Bran Herbener and Malene Flensborg Damholdt from the Department of Psychology and Behavioral Sciences at Aarhus University have revealed that a significant number of Danish teenagers are engaging in “friend-like” conversations with chatbots, and those who seek emotional support from these AI companions are significantly lonelier and report less perceived social support than their peers.

This finding shines a critical light on how youngsters are navigating a rapidly evolving social landscape, prompting urgent questions about the implications of human-AI relationships.

How Sophisticated Have Chatbots Become?

Chatbots, defined as software systems that communicate with human users through text in real time, have been around for decades, but only recently have they developed a level of sophistication that allows for fluid, social conversations, sometimes nearly indistinguishable from talking to another person.

This rapid advancement has given rise to “social chatbots,” which are specifically designed for casual, friendly conversations and respond with empathy. You’ve likely heard of, or even used, some of these: Replika boasts over 10 million users, Character AI has more than 20 million, and Snapchat’s My AI saw over 200 million individuals exchange more than 20 billion messages in 2024 alone.

These numbers clearly indicate a massive adoption, yet until now, it’s been unclear how many people are truly engaging with them for social interaction, especially among the younger demographic.

Are Teenagers Actually Befriending Chatbots?

This is precisely what Herbener and Damholdt set out to understand. Their preregistered mixed-methods cross-sectional survey study aimed to answer three key questions:

  1. How many Danish high-school students engage in friend-like conversations with chatbots?
  2. Why do they do so?
  3. Is this trend linked to their sense of social connectedness, which the researchers measured by assessing loneliness and perceived social support?

To gather their insights, the researchers surveyed 1,599 students from 15 high schools across Denmark between December 2023 and March 2024. Participants ranged from 15 to 29 years old, with an average age of 17.8, and the sample had an even gender distribution.

To minimize social desirability bias, students were asked whether they had “chatted with a chatbot in the same way one would otherwise chat with a friend” in the past month, rather than directly whether they considered a chatbot a “friend”. This approach acknowledged that some might feel a stigma about admitting to social relationships with AI.

A Significant Minority Is Connecting with AI

The study revealed that 14.6% of the surveyed students – a total of 234 individuals – reported engaging in friend-like conversations with chatbots. This figure is a crucial first step in understanding the prevalence of this emerging phenomenon.

However, the researchers didn’t stop there. They went deeper into the nature of these conversations through qualitative thematic analyses of the students’ free-text responses. This revealed two primary ways students interacted with chatbots:

  1. Utilitarian Conversations: The majority of chatbot users (76.5%, or 174 students) described using chatbots for practical, goal-oriented purposes. These conversations centered on tasks like:
    • School-related activities: Many students used chatbots to summarize texts, correct grammar in assignments, or brainstorm ideas for schoolwork. They saw chatbots as tools for solving tangible academic tasks.
    • Information-searching: Chatbots were frequently used as an alternative to search engines like Google, offering quick answers to various questions.
    • Entertainment: A considerable number used chatbots as “toys” for entertainment, asking “intentionally stupid questions” to see the reactions, or prompting them to assume different roles for fun. For these students, chatbots were a way to cope with boredom.
    • Generating ideas: Chatbots served as inspiration, whether for different perspectives on IT projects, help with hobbies like Dungeons & Dragons, or even crafting “a good rejection LMAOoo”. For these students, the chatbot was clearly a tool, not a relational entity.
  2. Social-Supportive Conversations: A smaller but significant group – 16.7% of chatbot users (39 students) – indicated that their conversations had a more social or supportive nature. For these students, chatbots seemed to fulfill social or emotional needs, distinguishing them from purely utilitarian users. This category included:
    • Social and personal problems: Some students discussed intimate personal experiences like “love life and about me being bullied,” daily problems, or frustrations. They used chatbots to cope with negative emotions and even for self-improvement topics like “divorce, self-esteem, discipline, good habits, and mental health”.
    • Seeking advice: Participants sought advice from chatbots on interpreting situations, handling problems, or making everyday decisions, much as they might consult friends or family.
    • Casual conversations: Some engaged in small talk about “everyday things, such as the weather, hobbies, etc.” or more intellectual topics like politics and history. These interactions often had a reciprocal nature, where the purpose was mutual participation; one student noted they used the chatbot because they “don’t know many people who are as interested in history as I am,” suggesting it compensated for a lack of like-minded friends.

It’s important to note that while 14.6% of students reported friend-like conversations, only 2.44% of the total sample explicitly indicated that their conversations were social or supportive – a share that, scaled up to the total Danish high-school population, corresponds to an estimated 3,453 students.

This suggests that while many use chatbots in a conversational style, far fewer are seeking genuine social or emotional connection.
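These percentages are simple shares of the 1,599 respondents, and the arithmetic is easy to verify. A minimal sketch; note that the national high-school population is inferred from the article’s own figures rather than reported in the study:

```python
# Counts reported from the survey (n = 1,599 students).
n_total = 1599
n_friend_like = 234       # friend-like conversations in the past month
n_social_supportive = 39  # explicitly social or supportive conversations

print(f"Friend-like: {n_friend_like / n_total:.1%}")              # 14.6%
print(f"Social-supportive: {n_social_supportive / n_total:.2%}")  # 2.44%

# The article's population estimate of 3,453 students implies a national
# high-school population of roughly 3453 / (39 / 1599) ≈ 141,600
# (an inference from the article's figures, not one the study reports).
print(round(3453 / (n_social_supportive / n_total)))  # ≈ 141,573
```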

The Heart of the Matter: Loneliness and Lack of Support

This is where the study reveals its most impactful findings. The researchers found significant differences in social connectedness depending on how students engaged with chatbots.

The Key Finding: Students who engaged in social-supportive chatbot conversations reported significantly more loneliness than both non-chatbot users (a moderate effect size, d = 0.53) and utilitarian chatbot users (also a moderate effect size, d = 0.52).

Social-supportive chatbot users also reported significantly less perceived social support than non-chatbot users (a small effect size, d = -0.46).

In simpler terms, it’s not just that any chatbot user is lonelier; specifically, those who turn to chatbots for social or emotional support are the ones experiencing greater social disconnection. Utilitarian chatbot users, interestingly, showed no significant difference in loneliness or perceived social support compared to non-users. This shows that the broad category of “chatbot user” may miss critical nuances about how these technologies affect different individuals.
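For readers unfamiliar with the effect sizes quoted above, Cohen’s d is the difference between two group means divided by a pooled standard deviation; values around 0.5 are conventionally read as moderate. Here is a minimal sketch assuming the standard pooled-SD formulation (the scores below are illustrative placeholders, not the study’s data):

```python
from statistics import mean, variance

def cohens_d(a, b):
    """Cohen's d: mean difference divided by the pooled sample SD."""
    n1, n2 = len(a), len(b)
    pooled_var = ((n1 - 1) * variance(a) + (n2 - 1) * variance(b)) / (n1 + n2 - 2)
    return (mean(a) - mean(b)) / pooled_var ** 0.5

# Hypothetical loneliness scores, NOT the study's data:
social_supportive = [42, 38, 45, 40, 44, 39]
non_users = [40, 37, 44, 39, 42, 38]
print(round(cohens_d(social_supportive, non_users), 2))  # 0.49, a "moderate" effect
```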

Why Are They Reaching Out to AI for Comfort?

The study also delved into the psychological triggers behind initiating chatbot conversations, and its findings strongly align with the social compensation hypothesis.

This hypothesis suggests that individuals turn to digital technologies to fulfill social needs when they feel socially disconnected, perhaps due to shyness, anxiety, or introversion, or simply because their offline social network isn’t providing enough companionship.

Findings:

• Situational Loneliness: Students who reported higher overall levels of loneliness were more inclined to start conversations with chatbots specifically when they were experiencing a situational feeling of loneliness.
• Need for Self-Disclosure: Students who perceived less social support were more inclined to use chatbots when they felt a need to talk about how they were feeling.
• Bad Mood: Higher loneliness and lower perceived social support were significantly associated with initiating chatbot conversations when students were in a bad mood.

When young people with existing social difficulties experience an unmet social or emotional need, they are likely to turn to chatbots, just as the social compensation hypothesis would predict.

Past qualitative studies have also shown individuals using platforms like Replika to satisfy unmet socioemotional needs and as a safe space for sensitive self-disclosure without fear of judgment.

However, there’s a crucial distinction: the study found that these conversations were not driven by a sense of friendship towards the chatbot. Instead, chatbots appear to serve an instrumental function as coping tools to manage negative emotional states, rather than being seen as true social companions.

This “means to an end” dynamic suggests that once the need to cope with negative feelings subsides, the engagement with the chatbot might also decrease.

Does More Chatting Make Them More Disconnected?

Surprisingly, the study found no evidence that students who interacted with chatbots more frequently were also more socially disconnected.

This result challenges the idea that continuous, intense chatbot use directly leads to greater isolation. The researchers speculate that the lack of correlation might reflect discontinued use following negative user experiences.

As one participant put it, “Asked it for advice, but they are not very well developed. So I quickly put it away from me again”. This indicates that while socioemotional needs might initially drive engagement, the current limitations of chatbot technology can lead users to disengage.

The Hidden Dangers: Why This Trend Raises Concerns

The increasing adoption of social chatbots, particularly by vulnerable young people, sparks several significant concerns outlined in the research:

• Dependency: There’s a risk that individuals, especially lonely ones, could become dependent on chatbots for emotional support. If chatbots become a necessity for well-being, users could be vulnerable if the service becomes unavailable or unreliable, or even be exploited by developers prioritizing profit over user health.
• The “Empathy Gap”: Chatbots, despite their convincing conversational style, lack the ability to grasp moral context, social norms, and safety concerns – what researchers call an “empathy gap”. This poses a serious risk, particularly for youngsters susceptible to inappropriate or harmful advice, as chatbots are designed to build trust and credibility.
• Social Displacement: The social displacement hypothesis suggests that time spent on digital platforms, including chatbots, can replace real-world social interactions. These face-to-face interactions are vital for fulfilling socio-emotional needs, developing perspective-taking, and understanding others’ mental states. Replacing them with chatbot interactions could lead to “dyadic echo chambers,” where existing worldviews are reinforced, potentially stunting social development.
• A Vicious Cycle of Isolation: These concerns converge to suggest a troubling “vicious cycle”. Socially vulnerable youngsters, who already face high rates of loneliness (e.g., 28% of Danish youth aged 16-29 feel moderately or severely lonely), may be particularly susceptible to seeking solace in chatbots. This, in turn, could leave them less practiced at building interpersonal relationships, further exacerbating their social difficulties.

Acknowledging the Nuances: What the Study Can’t Tell Us (Yet)

Like all scientific endeavors, this study has its limitations, which the researchers openly acknowledge. Understanding these caveats is crucial for interpreting the findings:

• Correlation, Not Causation: Because this was a cross-sectional study (a snapshot in time), it can’t definitively prove that loneliness causes students to turn to chatbots. It shows a strong association, but longitudinal studies (tracking individuals over time) are needed to establish a clearer causal link.
• The “Social-Supportive” Group: The group of students identified as having “social-supportive” conversations was relatively small (39 individuals) and quite diverse in their specific topics (casual chats, advice, personal problems). The findings for this group, while meaningful, therefore lack fine-grained specificity and may have limited generalizability.
• Defining “Friend-Like”: The study relied on participants’ subjective understanding of “friend-like” conversations. This produced a heterogeneous group spanning both social-supportive and utilitarian uses, highlighting the challenge of developing standardized tools to measure human-AI social interactions.
• Response Bias: The study had a 21.4% response rate (a quick back-of-the-envelope check follows this list). While the final sample was still substantial, this leaves room for self-selection bias: students with a particular interest in chatbots may have been more likely to participate.
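The response rate also lets readers back out roughly how many students were invited, a figure the article itself doesn’t state. A quick, inferred calculation, assuming the rate is respondents divided by invited students:

```python
# 1,599 respondents at a 21.4% response rate implies the number of
# invited students (an inference from the article's figures, not a
# number reported in the study):
respondents = 1599
response_rate = 0.214
print(round(respondents / response_rate))  # ≈ 7,472 students invited
```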

Ethical Design and Deeper Understanding

This research by Arthur Bran Herbener and Malene Flensborg Damholdt offers preliminary, yet vital, insights into the adoption of chatbots by young people. It highlights that a subset of socially disconnected youngsters are indeed turning to chatbots to cope with unmet social and emotional needs, viewing these AI systems as tools rather than true social companions.

This discovery underscores an urgent need for policymakers and industry stakeholders to reflect on the ethically responsible design and implementation of chatbots. Preventing social displacement and ensuring user well-being must be prioritized, especially for vulnerable youth.

Looking forward, future research needs to:

• Conduct longitudinal experimental studies to explore how human-chatbot relationships evolve over time and what long-term impact they have on social connectedness.
• Develop methodological tools that can reliably assess nuanced social interactions with chatbots and other conversational AI.
• Undertake cross-cultural studies to understand how cultural factors, such as attitudes towards inanimate entities or comfort with uncertainty, influence chatbot adoption across different populations.

Ultimately, these steps are crucial to determine whether, and under what conditions, social conversations with chatbots are truly helpful or potentially harmful for the mental and social well-being of the next generation. The conversation has just begun.
