AI Companions

Guidance for Individuals with Intellectual Disabilities and Their Caretakers


Historic discrimination against people with intellectual disabilities and autism has created systemic gaps in access to, and usability of, technologies that improve lives. The rapid emergence of AI creates new opportunities and challenges for society at large, with additional considerations for people with disabilities. Despite these challenges, ODP remains highly committed to expanding access and educational opportunities related to technologies that enhance people's ability to make decisions, access the community, communicate, and control their environments.

AI companion bots can present complex privacy, emotional, and safety challenges that require attention. This guidance aims to raise awareness about potential harms of AI companion bot use and provide strategies for reducing risk. 

Potential Harms and Warning Signs

AI companion bots carry data privacy and security risks and may provide misinformation or biased answers. AI is trained on data from across the internet, which includes inaccurate or biased information, as well as on the information a user gives it. As a result, AI companion bots can give users wrong answers, repeat stereotypes or discrimination, and even supply information that could lead users to harm themselves.


AI companion bots can affect social and emotional wellbeing. They are designed to agree with and support users rather than challenge them. Emerging research suggests that users can develop a distorted understanding of relationships and form emotional attachments to the AI companion bot.

Individuals with intellectual disabilities and autism may be at higher risk when using an AI companion bot in the following ways:

  • Over-trusting the AI companion bot or treating its information as true and reliable

  • Difficulty detecting errors or fictional information

  • Susceptibility to manipulation, dependency, or decreased desire to have relationships with other people

How Caregivers and Families Can Support and Educate

Use AI Together

Use AI companion bots together. Model safe use by using the platform with the individual, talking through how it works, and agreeing on boundaries for its use.

Start the Conversation

Help individuals understand the limitations of AI companion bots. Seek out educational materials and opportunities that cover establishing realistic expectations, identifying the limitations of AI companion bots, and developing strategies for safe use, such as setting time limits. Families, caregivers, and individuals with intellectual disabilities and autism may find additional resources through Temple University's TechOwl program, including the program's nine (9) statewide Assistive Technology Resource Centers.

Set Limits and Safe Habits

Use practical strategies to build safe habits with AI companion bots: teach individuals to use the built-in controls on their devices to set limits on apps, and use AI companion bots to discover interests that can support real-world social interactions and relationships.