AI Companions

Guidance for Older Adults and Families


How and Why Older Adults use AI Companion Bots

A 2025 AARP survey on AI of almost 700 adults age 60 and older found that 58% rely heavily on technology daily, using multiple apps, devices, and advanced features. Eighty-four percent indicated that they are just beginning to use AI. For an older adult experiencing loss and social isolation, AI can appear to provide immediate companionship, but over the long term that companionship can become a dangerous risk. 

Potential Harms and Warning Signs

AI companion bots carry data privacy and security risks. They collect a wide range of data, including user preferences, location, and communication history. Because of their limited experience with AI, older adults may not know what type of information is safe to share, or may be too trusting of the information shared by the AI companion bot. 

AI companion bots may expose an older adult’s personal information, making them targets for scams.
Companion bots can lead older adults to believe they are speaking with a human instead of a machine, unknowingly creating an opportunity to be scammed. 
   
AI scams can have a significant impact on an older adult’s finances and well-being.
Over the past several years in Pennsylvania, financial exploitation of older adults has become the number one type of alleged abuse reported to older adult protective services. Financial exploitation is significantly under-reported, with only 1 in 10 cases being reported, because older adults feel embarrassed, feel powerless, or blame themselves. An older adult could lose a significant percentage of their retirement savings to AI scams. 
  
Some examples of AI scams from the National Council on Aging include: 
  
Voice cloning – using short audio samples to impersonate authority figures or even loved ones. Voice cloning is often used in scams where the criminal pretends to be a family member who is in trouble and needs money sent to them urgently.  



Deepfake scams – using AI technology to generate convincing videos, photos, and audio clips that make it seem like someone said or did something they didn’t.  


Phishing emails – using email and other methods to steal personal information 
  
Fake websites – collecting personal information, leading to identity theft or financial fraud. 


Warning Signs Include:


🚪 Withdrawal from friends and family 

😓 Loss of interest in hobbies 

⛈️ Mood changes 

😔 Feelings of loneliness 

💬 Preference for AI over human interaction  

💰 Large cash withdrawals or payments to stores that sell gift cards 

💵 Repeated, small dollar transactions observed in the older adult’s bank or credit card statements 

🙌 Excitement about winning a prize or an investment opportunity 

👩‍❤️‍👨 Excitement about a new “relationship” with someone not known to the family 
 
💳 Use of wire transfers or money orders when not normally used 

🚩 Red flags that a communication may be an AI scam include: 


  • Unusual requests for personal information: Be cautious if someone contacts you out of the blue and asks for sensitive details, like your Social Security number or bank account information. 
  • Outdated content: Many AI tools and bots are trained on limited data sets, which causes them to generate text that contains stale information. 
  • Unnatural language: Technology that uses generative AI often produces language that sounds generic or just slightly "off." You might notice things like odd paragraph structures, nonsensical sentences, and repetitive use of certain words and phrases. 
  • High-pressure pitches: Fraudulent ads and scam emails are designed to create a sense of urgency, pressuring you to make quick decisions without thinking it through. 
  • Requests for payment in unconventional forms: Be suspicious if someone asks for payment in gift cards, cryptocurrency, or wire transfers. These payment methods are usually untraceable, which is why they’re often used by scammers. 
  • Visual or audio inconsistencies: Look for subtle oddities in language, tone, or visual quality that may point to the use of deepfake technology—especially in videos, phone calls, and recorded messages. 

How to Prevent Future Harm to You or Your Loved One 

Start the Conversation

Start open, non-judgmental conversations about AI, focusing on which platforms are being used, and educate yourself and your loved one on best practices for using AI safely.    


Use AI Together


Use AI companion bots together. Model safe use by exploring the platform side by side, talking through the potential risks and how to use it safely. Use practical strategies, such as never providing personal information, especially your Social Security number.

Educate Yourself

Educate yourself or your loved one about the safety concerns related to AI companion bots. AI companion bots are machines designed to provide constant validation, not the genuine feedback you would receive from a human. “Conversations” with AI companion bots can be manipulated and turned into opportunities for scammers.  

Know the Warning Signs

Recognize the risks and warning signs of AI companion bot use. If you observe warning signs, consider contacting an Area Agency on Aging for support and resources.