
Two Boys. 60 Girls. 347 AI-Generated Images. And a School Caught Off Guard.



A student at Lancaster Country Day School in Pennsylvania received a pornographic deepfake of a female classmate on Discord. He did the right thing: deleted it, left the group chat, and filed an anonymous report to a state tip line, which forwarded it to the school.


The school, unfortunately, failed to act.


According to an article in USA Today, “Over the next six months, two boys at the school continued to make AI-generated content of other girls.” By the time a criminal investigation finally began — prompted not by the school, but by parents who went directly to law enforcement — the boys had created 347 pornographic deepfakes of 60 girls. Forty-eight of them were students at the school.


Both boys have now pleaded guilty to 59 felony counts of manufacturing child sexual abuse material. They'll be sentenced this week.


This Is Not an Isolated Incident


If you think this can't happen at your child's school, think again.


The National Center for Missing and Exploited Children reported that AI-generated child sexual abuse images reported to its tip line skyrocketed from 4,700 in 2023 to 440,000 in just the first six months of 2025. Homeland Security says reports of child exploitation involving generative AI increased over 600% in early 2025 compared to the previous two years combined. And a UNICEF study across 11 countries found that at least 1.2 million children had their images manipulated into sexually explicit deepfakes in the past year alone.


This isn't a tomorrow problem. It's a right-now problem.


"Nudify" apps—tools that use AI to digitally strip clothing from photos—require zero technical skills. A middle schooler with a phone can use one. And they are.


The Real Scandal? Schools Aren't Teaching Kids About This


Here's what should alarm every parent and educator: A 2024 survey by the Center for Democracy & Technology found that only 13% of students said their school had explained that sharing AI-generated sexual images is harmful. Six in 10 teachers weren't even aware of their own school's policies for handling these incidents. And when CDT updated its research in 2025, a staggering 89% of students still didn't know who to tell if they encountered one.


Schools are still treating deepfake abuse as a discipline problem rather than a prevention problem. They're waiting for incidents to happen, and then scrambling to respond.


As one of the lawyers representing Lancaster families put it, “They have to broaden the lens and include the perils of AI for adolescents, just like they talk about perils of drug use or of promiscuity.”


It has to be part of the curriculum, not an afterthought.


Prevention Is the Only Strategy That Scales


Laws are starting to catch up. The TAKE IT DOWN Act now requires platforms to remove nonconsensual intimate images within 48 hours. States are expanding definitions of child sexual abuse material to include AI-generated content. But legislation alone won't protect kids who are encountering these tools right now, today, on their phones and laptops.


What will? Education. Specifically, teaching students before they encounter these situations — what deepfakes are, why creating or sharing them is abuse, how to respond if they become a target, and who to tell.

That's exactly why we built Digital Health and Wellbeing, a six-lesson curriculum designed for late middle and high school students. It covers the threats students are actually facing—AI-manipulated images, sexting and boundaries, sextortion, digital blackmail, and risky chatbot interactions—through the kind of teacher-led discussion that changes how kids think and act.


Because the goal isn't just awareness. It's giving students the skills to protect themselves and each other.


The student at Lancaster who reported that first image did the right thing. He knew it was wrong. Imagine if every student in that school had been equipped with that same understanding — and if the adults had been, too.


 

Watch a student video from the curriculum:


Diana Graber
 
 