Teenage target of sexualized deepfakes fights back with online tools

Elliston Berry was only 14 years old when a fellow high school student used artificial intelligence to create nude deepfakes of her and several classmates, and then posted them online.

At first, her shame and humiliation scared her into silence, but with the help and support of her parents, she chose to speak out and fight back. Her story inspired the Take It Down Act, passed in May 2025, which will require social media sites to remove non-consensual intimate imagery within 48 hours of being notified.

Companies were given until May 2026 to create the tools and infrastructure to do so. In the meantime, Elliston has teamed up with Adaptive Security CEO and founder Brian Long and the Pathos Consulting Group to launch a series of free trainings for parents and educators to prevent what Long calls “deepfake sexual abuse.”

Host Robin Young talks to Berry and Long about their project and the trauma these images cause victims.

This article was originally published on WBUR.org.

Copyright 2026 WBUR

Here & Now Newsroom