AI Companions

Guidance for Parents & Guardians


How and Why Kids Use AI Companion Bots

A recent report from the Center for Democracy and Technology found that over 85% of school-aged children self-reported using AI in some capacity, and 73% reported using AI for personal reasons. Most teens and children use generative AI platforms out of boredom or for help with homework. Others turn to these systems for advice on personal issues they are uncomfortable raising with family, or because of grief or loneliness. For a child experiencing deeper challenges, AI can appear to provide immediate relief — but over time, that relief can morph into a harmful dependency.

Potential Harms and Warning Signs

AI companion bots carry data privacy and security risks. These bots collect a wide range of data, including user preferences, location, and communication history. Children and teens may not know what is safe to share.

AI companion bots may provide misinformation or biased answers. AI is trained on data from across the internet, including inaccurate or biased content. As a result, companion bots can give children wrong answers, repeat stereotypes or discriminatory views, and even supply information about methods of self-harm.

AI companion bots can impact social and emotional learning. Because these models are designed to agree with and support users rather than challenge them, they can distort a child’s understanding of real-world relationships and reduce opportunities to talk with family and friends about their needs.

Warning Signs Include:

🚪 Withdrawal from friends and family

😓 Loss of interest in hobbies

⛈️ Mood changes

📝 Declining grades

💬 Preference for AI over human interaction

AI Guidance for Parents, Families, and Youth Caregivers

Start a Convo

Start open, non-judgmental conversations with your children and teens about AI relationships, focusing on which platforms they use and how they feel about AI versus human relationships. Teach them that AI companion bots have real limits as sources of mental health support.

Know Warning Signs

Recognize the risks and warning signs of AI companion bot use. If you observe warning signs in your child or teen, consider talking to your child’s health care professional, school counselor, or mental health provider.

Parental Controls

Use practical strategies: set parental controls on your child’s device to limit apps; identify when and why your child may seek comfort or interaction through an AI companion bot; and encourage a gradual reduction in use by helping foster healthier habits.

Use AI Together

Model safe use of AI companion bots by using the platform together, talking through how to use it, and sharing the boundaries you set around its use.