AI Companions

Guidance for Educators


How and Why Kids Use AI Companion Bots

A recent report from the Center for Democracy and Technology found that over 85% of school-aged children self-reported using AI in some capacity, with 73% reporting using AI for personal reasons. Most teens and children use generative AI platforms out of boredom or to help with homework. Others use these systems to seek advice on personal issues they are uncomfortable raising with their family, or because of grief or loneliness. For a child experiencing deeper challenges, AI can appear to provide immediate relief — but over the long term, that relief can morph into dangerous dependency.

Potential Harm and Warning Signs 

AI companion bots carry data privacy and security risks. These bots collect a wide range of data, including user preferences, location, and communication history. Children and teens may not know what is safe to share.

AI companion bots may provide misinformation or biased answers. AI is trained on data from across the internet, including inaccurate or biased information. AI companion bots can give children wrong answers, repeat stereotypes or discriminatory content, and even share information on methods of self-harm.

AI companion bots can impact social and emotional learning. Because these bots are designed to agree with and support users rather than challenge them, they can distort a child's understanding of real-world relationships and reduce opportunities to speak with family and friends about their needs.

Warning Signs Include:

🚪 Withdrawal from friends and family

😓 Loss of interest in hobbies

⛈️ Mood changes

📝 Declining grades

💬 Preference for AI over human interaction

How Educators Can Teach Students about Companion AI

Host a Discussion

Host classroom discussions on student use of AI companion bots. Keep these talks open and non-judgmental, focusing on whether students are familiar with AI companion bots, how they believe the bots should be used, how well they understand how the bots work, and what kinds of questions they feel more comfortable asking bots instead of humans.

Teach Prompts and Wording

Teach students about the importance of prompts and wording when speaking with AI companion bots. Ask students to write a prompt, then rewrite it with slightly different wording. Compare the responses the AI companion bot offers and discuss why the platform gave different answers.

AI Test Drive

Take AI test-drives as a class, using AI together to understand how AI companion bots work and what they should and should not be used for. Explain to students that AI companion bots are not human and are designed to engage and validate users. They cannot offer genuine feedback and may share incorrect information.


Know Risks and Warning Signs

Recognize the risks and warning signs of AI companion bot usage in students. Students should be informed about the warning signs of unhealthy AI use and know how to report concerning interactions or emotional distress. If a student shows warning signs, educators should follow school protocols for referring students and communicate with families, recommending they speak to the student's health care professional or school mental health supports.