Office of School Climate and Well-Being

Deep Fake Images and Sextortion

Acts 125 of 2024 and 35 of 2025 and Artificial Intelligence (AI) Generated Abuse 

In October 2024 and July 2025, respectively, Governor Josh Shapiro signed Act 125 and Act 35 into law, giving law enforcement new tools to combat the misuse of artificial intelligence.

With AI tools, individuals can easily and quickly create images, video, and audio of real people in situations that never occurred. Misusing these tools, however, can result in criminal charges with real consequences.

Under Act 125, it is a crime to disseminate a visual depiction of a current or former sexual or intimate partner in a state of nudity or engaged in sexual conduct, or an AI-generated sexual depiction of an individual, with intent to harass, annoy, or alarm another person.  18 Pa.C.S. § 3131.  It is also a crime to intentionally view, possess, or control child sexual abuse material, including AI-generated child sexual abuse material.  18 Pa.C.S. § 6312.

The Pennsylvania Office of Attorney General has already charged multiple individuals with felony offenses under this law.

“Our office will continue to use the tools the legislature provided in Act 125 that allow law enforcement to pursue investigations involving artificially generated materials, and hold these offenders accountable for their actions.”

- Pennsylvania Office of the Attorney General

Under Act 35, it is a crime to create a forged digital likeness (a computer-generated image, video, or audio recording of a person) with the intent to defraud or injure anyone.  18 Pa.C.S. § 4101.1.

What to do if You or Your Child is a Victim

If you discover AI-generated sexual images, deepfakes, or sextortion involving you or your child, take these steps immediately:

Report to Law Enforcement First

You can report these incidents through the following:

  • Local Police Department (non-emergency or 911 if immediate danger)

  • Pennsylvania Office of Attorney General – Safe2Say Something Program: submit a tip (24/7, anonymous)

Please provide as much detailed information as possible so law enforcement can effectively follow up. This may include:
  • The app or platform used

  • Where the image or content was found

  • When the image or content was posted

  • The name of the individual involved

  • The username or profile associated with the account

After law enforcement is informed, and if the victim is a minor, it is also important to report the incident to your school so it can address the impact within the school community and provide student support.

Preserve Evidence

Important: Digital evidence can disappear quickly. Preserving it early is critical.

  • Take screenshots of images, messages, usernames, and profiles 

  • Save links (URLs) to where the content appears

  • Keep copies of messages and do not delete accounts right away 

  • Write down dates, times, and platforms involved

Limit Continued Sharing

  • Do not forward, download, or repost the images

  • Ask others not to share the content 

Request Removal of Content

Report the images directly to the platform where they appear as soon as possible. Most platforms have built-in tools to report non-consensual intimate images or sexual content involving minors.

When reporting, choose the most serious and accurate category available (such as nudity, sexual exploitation, or involves a minor).

On Instagram:

  • Tap the three dots on the post, story, or profile

  • Select “Report”

  • Choose “Nudity or sexual activity” or “Involves a child”

  • Submit the report and follow any additional prompts

On Snapchat:

  • Press and hold on the Snap, Story, or message

  • Tap “Report”

  • Select the reason (such as sexual content or harassment/abuse)

  • Block the user if necessary

On TikTok:

  • Tap the arrow (share icon) on the video

  • Select “Report”

  • Choose “Nudity and sexual activity” or “Minor safety”

  • Submit the report

For minors: 

This free, secure service, Take It Down, run by the National Center for Missing & Exploited Children (NCMEC), helps remove or prevent the spread of online sexual images involving minors.

You (or your child) can submit images or videos anonymously. The service creates a unique digital fingerprint (called a “hash”) of the image without uploading or storing the actual content. Participating platforms then use this hash to detect and remove the image if it appears online.
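For the technically curious, the "hash" described above works much like a checksum: the image is reduced to a short, fixed-length code that identifies it without revealing its contents. The service's actual matching technology is proprietary and more sophisticated (it can recognize edited or re-encoded copies), but a simple cryptographic file hash in Python illustrates the basic idea:

```python
import hashlib

def fingerprint(path: str) -> str:
    """Return a fixed-length hex "fingerprint" of a file.

    The hash identifies the exact file contents, but the original
    image cannot be reconstructed from it. That is why only the
    hash, never the image itself, needs to leave the device.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large files never need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()
```

A platform that stores only such fingerprints can check whether an uploaded file matches a reported one by comparing hashes. Note that real systems use perceptual hashing so that resized or recompressed copies still match, which a plain SHA-256 like this does not do.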

This tool can be used for:

  • Real images or videos

  • AI-generated or edited images

  • Content that has already been shared or may be shared

Seek Support

Experiencing this type of harm is overwhelming. 

Support is available: 

  • Call or text 988 (Suicide & Crisis Lifeline)

  • Speak with a school counselor, if the victim is a minor

  • Contact a local mental health provider

What is Sextortion?

Sextortion happens when someone threatens to share sexual images or videos to manipulate another person. In a new policy statement, the American Academy of Pediatrics identifies sextortion as a form of image-based sexual abuse and exploitation. While it can affect individuals of all ages, children are particularly vulnerable. 

Learn more about what sextortion is and what to do if you or your child is a victim with resources from the American Academy of Pediatrics.