They’ve turned tech into a weapon — and no one’s safe from the scandal.
Teens are using artificial intelligence to whip up disturbingly realistic nude images of their classmates, then spreading them like wildfire, sending shockwaves through schools and leaving experts fearing the worst.
The AI-powered tools, often dubbed “nudify” apps, are as sinister as they sound. With just a headshot — often lifted from a yearbook photo or social media profile — these apps can fabricate explicit deepfake images that appear scarily real.
And yes, it’s already happening in schools.
These hyper-realistic images — forged with AI tools — are turning bullying into a high-tech nightmare.
“We’re at a place now where you can be doing nothing and stories and pictures about you are posted online,” Don Austin, superintendent of the Palo Alto Unified School District, told Fox News Digital.
“They’re fabricated. They’re completely made up through AI and it can have your voice or face. That’s a whole other world.”
This is a full-blown digital crisis. Last summer, the San Francisco City Attorney’s office sued 16 so-called “nudify” websites for allegedly violating laws around child exploitation and nonconsensual images.
Those sites alone racked up more than 200 million visits in the first six months of 2024, according to the suit.
But catching the tech companies behind these tools? That’s like playing a game of Whac-A-Mole.
Most have skated past current state laws, though some states, like Minnesota, are trying to pass legislation to hold the companies accountable for the havoc they're wreaking.
Still, the tech moves faster than the law — and kids are getting caught in the crossfire.
Josh Ochs, founder of SmartSocial — an organization that trains families on online safety — told Fox News Digital that AI-generated nudes are causing “extreme harm” to teens across the country.
“Kids these days will upload maybe a headshot of another kid at school and the app will recreate the body of the person as though they’re nude,” Ochs revealed to the outlet.
“This causes extreme harm to that kid that might be in the photo, and especially their friends as well and a whole family,” he noted.
He said parents need to stop tiptoeing around their children’s digital lives — and start laying down some boundaries.
“Before you give your kids a phone or social media, it’s time to have that discussion early and often. Hey, this is a loaner for you, and I can take it back at any time because you could really hurt our family,” Ochs said.
In February, the U.S. Senate unanimously passed a bill to criminalize publishing — or even threatening to publish — nonconsensual AI deepfake porn.
It now awaits a vote in the House.
Austin said the only way to get ahead of the curve is to keep talking — with parents, teachers, students, and anyone else who will listen.
“This isn’t going away,” he warned. “It’s evolving — and fast.”