What Are Chatbots and Why Are Teens Using Them?

The following is a message from our Head of Wellbeing, Ms Kirsty McEacharn.

[Trigger Warning – mention of self-harm and suicide.]

AI chatbots are digital tools that simulate human conversation. They are becoming increasingly popular with teens. Apps like Replika, Character.AI, Anima, and Snapchat’s My AI allow users to chat with AI “companions” that offer advice, emotional support, or even simulate friendships or relationships.

Many teens are turning to these bots out of curiosity, boredom, or emotional need. Some chatbots offer comfort, friendship, and advice, while others mimic flirtation or a deep emotional connection. They are easily accessible, often unmoderated, and rarely age-restricted.

Emerging Concerns

While not all chatbot use is harmful, psychologists and online safety experts are raising the alarm:

  • Some AI bots have engaged in inappropriate or sexualised conversations with minors.
  • Some have used hate speech, racism, misogyny and fear-mongering.
  • Emotional dependency can form, especially for students struggling with isolation or anxiety. 
  • These bots can present an unrealistic version of relationships and intimacy.
  • In one tragic international case, an AI chatbot was found to have encouraged a minor’s suicide, sparking major legal and ethical investigations. 
  • Chatbots have encouraged self-harm and violence toward peers and parents. 

There are also ongoing lawsuits against AI companies for failing to implement adequate safety measures, particularly concerning minors.

Recent reports estimate that over 60 million users are forming “relationships” with AI bots, with some apps even encouraging romantic or sexual dialogue. Many teens see these as safe spaces, but these tools may also blur the line between reality and fantasy, especially during vulnerable stages of emotional development and identity formation.

What Can Parents Do?

  • Stay informed: Ask which apps your child is using and how they are being used.
  • Open the conversation: Try “Have you ever talked to an AI or used a chatbot? What was it like?”
  • Discuss emotional safety: Emphasise that AI can’t truly understand or empathise, and can’t replace the safe, trusted advice of a real person.
  • Set boundaries: Support balanced screen time and reinforce real-world friendships.

This is a rapidly growing trend, and not necessarily a dangerous one, but like all digital tools it requires awareness, guidance and care. Our school is committed to equipping students with the critical thinking and emotional literacy they need to navigate this evolving landscape.

Please reach out if you’d like further resources or support.