Tech News Vision
What's On

The Deepfake Nudes Crisis in Schools Is Much Worse Than You Thought

By News Room | 15 April 2026

Nevertheless, clear patterns emerge. In nearly all cases, teenage boys are allegedly responsible for creating the images or videos. They are often shared with classmates on social media apps or via instant messaging. And they are hugely harmful to the victims. “I’m worried that every time they see me, they see those photos,” one victim in Iowa said earlier this year. “She’s been crying. She hasn’t been eating,” another’s family said.

In multiple instances, victims do not want to attend school or to face those who created explicit images or videos of them. “She feels hopeless because she knows that these images will likely make it onto the internet and reach pedophiles,” say lawyer Shane Vogt and three Yale Law School students, Catharine Strong, Tony Sjodin, and Suzanne Castillo, who are representing one unnamed New Jersey teenager in legal action against a nudifying service. “She is severely distressed by the knowledge that these images are out there, and she will have to monitor the internet for the rest of her life to keep them from spreading.”

In South Korea and Australia, schools have given pupils the option to be left out of yearbook photos, or have stopped posting images of students on their official social media accounts, citing the risk of those images being used for deepfake abuse. “Around the world, there have been cases where school images were taken from public social media pages, altered using AI, and turned into harmful deepfakes,” one school in Australia said. “Imagery will instead feature side profiles, silhouettes, backs of heads, distant group shots, creative filters, or approved stock photography.”

Sexual deepfakes created using AI have existed since around the end of 2017; however, as generative AI systems have emerged and become more powerful, they have led to a shadowy ecosystem of “nudification” or “undress” technologies. Dozens of apps, bots, and websites allow anyone to create sexualized images and videos of others with just a couple of clicks, often with no technical knowledge.

“What AI changes is scale, speed, and accessibility,” says Siddharth Pillai, cofounder and director of the RATI Foundation, a Mumbai-based organization working to prevent violence against women and children. “The technical barrier has dropped significantly, which means more people, including adolescents, can produce more convincing outputs with minimal effort. As with many AI-enabled harms, this results in a glut of content.”

Amanda Goharian, the director of research and insights at child safety group Thorn, says its research indicates that teenagers have varied motivations for creating deepfake abuse, ranging from sexual gratification and curiosity to revenge, or even teens daring each other to create the imagery. Studies of adults who have created deepfake sexual abuse similarly show a host of different reasons why the images may be made. “The goal is not always sexual gratification,” Pillai says. “Increasingly, the intent is humiliation, denigration, and social control.”

“It’s not just about the tech,” says Tanya Horeck, a feminist media studies professor at Anglia Ruskin University who researches gender-based violence and has studied sexualized deepfakes in UK schools. “It’s about the long-standing gender dynamics that facilitate these crimes.”

© 2026 Tech News Vision. All Rights Reserved.