AI Relationships: Easier Than the Real Thing?

20% of boys aged 12-16 report seeing peers enter relationships with AI chatbots, and 58% say it is easier than a human relationship because they can control the conversation. Is that a problem?

Lesson Starter · Safe · 5-15 min · Beginner · Discussion · Video

Preparation

  • Watch the clip yourself first — it is short, but the statistics may prompt strong reactions.
  • Check your school's PSHE / RSE policy on discussing online relationships before running this session.
  • Have the two key statistics ready to display: 20% / 58%.

Learning objectives

  • Explain why some young people find AI chatbots easier to interact with than humans, using evidence from the clip.
  • Identify what is missing from an AI relationship that exists in a genuine human one.
  • Evaluate whether AI companion apps should face restrictions for under-16s, giving a reason for and against.

Instructions

  1. Show the clip without introduction. No framing — let students react first. 1 min Video

    Cold open works better here than explaining the topic first. The statistics land harder without prep.

  2. Immediate reaction: "What is your first thought?" — take 3-4 hands, no right or wrong yet. 2 min Whole class

    Do not affirm or challenge yet. You are reading the room and giving students ownership before you direct.

  3. Put the 58% stat on the board: "Easier because they can control the conversation." Pairs: What does controlling the conversation actually mean -- and is that a good thing? 3 min Paired discussion

    Key tension to draw out: control = no rejection, no conflict, no compromise. That feels safe but removes the things that make relationships meaningful and help us grow.

  4. Class question: "What can a human friend give you that an AI chatbot genuinely cannot?" Build a list on the board together. 3 min Whole class

    Expected answers: shared history, real challenge, mutual need, genuine care, unpredictability. Push for specifics -- not just "feelings".

  5. Final question: "Should apps that offer AI relationships be banned or restricted for under-16s?" One minute to decide -- thumbs up, down or middle. Then justify. 3 min Whole class

    No consensus needed. Goal is to practise weighing evidence and forming a position. Acknowledge that some students with social anxiety may see genuine benefit -- that is a valid nuance.

Key definitions

AI companion / social chatbot
An AI system designed specifically to simulate friendship or romantic connection -- trained to be agreeable, emotionally responsive, and never to reject the user.
Parasocial relationship
A one-sided emotional connection where one person invests real feelings in someone (or something) that cannot genuinely reciprocate -- common with celebrities, fictional characters, and now AI.
Reciprocity
The mutual give-and-take in a real relationship -- both people having needs, making compromises, and genuinely caring about the other. AI cannot reciprocate.
Emotional dependency
Relying on one source for emotional support to a degree that crowds out other relationships or real-world coping skills.
Designed to please
AI companions are built to maximise engagement -- they agree, validate, and never challenge the user. This is a design choice by a company, not genuine care.

Differentiation

Support

  • Sentence starter for discussion: "I think AI chatbots are / are not a good substitute for human friends because..."
  • Provide a simple two-column table: What AI can do / What only humans can do.

Stretch

Design brief: You are advising the government on rules for AI companion apps aimed at under-18s. Write three specific rules, explain the reason for each, and anticipate one objection.

SEND

Simplified prompt card: "Name one thing a friend does for you. Can an AI do that? Why or why not?" Accept verbal response or drawing.

Extension activities

  • Find out what Character.AI or Replika says in their terms of service about users under 18. Do you think those rules are sufficient? Write a one-paragraph response. Homework
  • Write a short guide for a Year 7 student explaining the difference between an AI companion and a real friend -- without being preachy. Independent
  • Debate motion for next lesson: "This house believes AI companion apps should be banned for under-16s." Prepare one argument for and one against. Next lesson

Resources

  • https://www.youtube.com/shorts/UNe2gLAFG8g Video