Teenage girls in the U.S. who are increasingly being targeted or threatened with fake nude photos created with artificial intelligence or other tools have limited ways to seek accountability or recourse, as schools and state legislatures struggle to catch up to the new technologies, according to legislators, legal experts and one victim who is now advocating for a federal bill.
Since the 2023 school year began, cases involving teen girls victimized by fake nude photos, also known as deepfakes, have proliferated worldwide, including at high schools in New Jersey and Washington state.
Local police departments are investigating the incidents, lawmakers are racing to enact new measures that would enforce punishments against the photos’ creators, and affected families are pushing for answers and solutions.
Unrealistic deepfakes can be made with simple photo-editing tools that have existed for years. But two school districts told NBC News that they believe fake photos of teens that have affected their students were AI-generated.
AI technology is becoming more widely available, including Stable Diffusion, an open-source model that can produce images from text prompts, and “face-swap” tools that can put a victim’s face in place of a pornographic performer’s face in a video or photo.
Apps that purport to “undress” clothed photos have also been identified as possible tools used in some cases and have been found available for free on app stores. These modern deepfakes can be more realistic-looking and harder to immediately identify as fake.
“I didn’t know how complex and scary AI technology is,” said Francesca Mani, 15, a sophomore at New Jersey’s Westfield High School, where more than 30 girls learned on Oct. 20 that they may have been depicted in explicit, AI-manipulated images.
“I was shocked because me and the other girls were betrayed by our classmates,” she said, “which means it could happen to anyone by anyone.”
Politicians and legal experts say there are few, if any, pathways to recourse for victims of AI-generated and deepfake pornography, which often attaches a victim’s face to a naked body.
The photos and videos can be surprisingly realistic, and according to Mary Anne Franks, a legal expert in nonconsensual sexually explicit media, the technology to make them has become more sophisticated and accessible.
A month after the incident at Westfield High School, Francesca and her mother, Dorota Mani, said they still do not know the identities or the number of people who created the images, how many were made, or if they still exist. It’s also unclear what punishment the school district doled out, if any.
The Town of Westfield directed comment to Westfield Public Schools, which declined to comment. Citing confidentiality, the school district previously told NBC New York that it “would not release any information about the students accused of creating the fake nude photos, or what discipline they are facing.”
Superintendent Raymond Gonzalez told the news outlet that the district would “continue to strengthen our efforts by educating our students and establishing clear guidelines to ensure that these new technologies are used responsibly in our schools and beyond.”
In an email obtained by NBC News, Mary Asfendis, the high school’s principal, told parents on Oct. 20 that it was investigating claims by students that some of their peers had used AI to create pornographic images from original photos.
At the time, school officials believed any created images had been deleted and were not being circulated, according to the memo.
“This is a very serious incident,” Asfendis wrote, as she urged parents to discuss their use of technology with their children. “New technologies have made it possible to falsify images and students need to know the impact and damage those actions can cause to others.”
While Francesca has not seen the images of herself or the other girls, her mother said she was told by Westfield’s principal that four people identified Francesca as a victim. Francesca has filed a police report, but neither the Westfield Police Department nor the prosecutor’s office responded to requests for comment.
New Jersey State Sen. Jon Bramnick said law enforcement expressed concerns to him that the incident would only rise to a “cyber-type harassment claim, even though it really should reach the level of a more serious crime.”
“If you attach a nude body to a child’s face, that to me is child pornography,” he said.
The Republican lawmaker said state laws currently fall short of punishing the content creators, even though the damage inflicted by real or manipulated images can be the same.
“It victimizes them the same way people who deal in child pornography do. It’s not only offensive to the young person, it defames the person. And you never know what’s going to happen to that photograph,” he said. “You don’t know where that is once it’s transmitted, when it’s going to come back and haunt the young girl.”
A pending state bill in New Jersey, Bramnick said, would ban deepfake pornography and impose criminal and civil penalties for nonconsensual disclosure. Under the bill, a person convicted of the crime would face three to five years in jail and/or a $15,000 fine, he said.