
AI nudes of teen spread at school. Now she’s fighting against deepfake porn

  • Fake AI nudes circulated at a New Jersey high school last year
  • One of the teens impacted is pushing for laws to protect others
  • Until now, there’s been no federal effort against deepfakes


(NewsNation) — Francesca Mani, a 14-year-old, was among several students at a New Jersey high school who say their pictures were manipulated using AI and turned into fake nude photos last year. Now, she’s helping lawmakers push for the first federal law targeting the creation of nonconsensual, deepfake pornography.

AI-made nudes circulated at New Jersey school

One or more boys at Westfield High School in New Jersey, roughly 25 miles west of Manhattan, have been accused of using artificial intelligence to generate pornographic pictures of female students at the school and sharing them on Snapchat.

It reportedly happened in summer 2023 but did not come to the attention of the school until October.

Mani’s mother, Dorota, said she received a call from the school letting her know nude pictures were created using the faces of some female students and circulated among a group of friends on Snapchat. She never thought anything like this would happen to her child.

The incident sparked outrage among parents and initially triggered a police investigation. But Dorota tells NewsNation’s Elizabeth Vargas there has been “little accountability” for what happened.

Dorota said more than 30 girls are believed to have been impacted, and one boy ended up getting suspended for three days over the incident.

What is deepfake porn?

A deepfake is a video or image of a person that has been digitally manipulated, using a form of artificial intelligence known as deep learning, to depict events that never happened.

According to Deeptrace Labs, 96% of deepfakes across the internet are of women whose images have been turned into pornographic content without their consent.

Deepfakes reportedly first surfaced in 2017, when a Reddit user posted explicit videos that swapped the faces of porn performers with those of celebrities Taylor Swift, Scarlett Johansson and Gal Gadot.

Since then, researchers and amateurs alike have experimented with deep learning technology, which is not always used for malicious purposes.

‘What happened to me was not OK’

Francesca said she first learned on Oct. 20 that classmates had made AI nude photos of her.

“You know, what happened to me was not OK,” she told Vargas.

While the images were fake, the impact they have had on Francesca is very real. She has felt a range of emotions about what happened.

“At first, I felt helpless,” Francesca said. “And after, I got mad about the lack of legislation and AI school policies.”

Calls for laws to protect against AI

Now, Francesca is demanding change. She made her way to Capitol Hill on Tuesday, where she met with Reps. Tom Kean Jr. of New Jersey and Joe Morelle of New York.

“Just because I’m a teenager doesn’t mean my voice isn’t powerful,” Francesca said at a news conference. “Staying silent? Not an option. We are given voices to challenge, to speak up against the injustices we face. What happened to me and my classmates was not cool, and there’s no way I’m just going to shrug and let it slide.”

It’s why she’s working with her representatives to advocate for HR 6466, the AI Labeling Act of 2023, and HR 3106, the Preventing Deepfakes of Intimate Images Act.

“Try to imagine the horror of receiving intimate images looking exactly like you — or your daughter, or your wife, or your sister — and you can’t prove it’s not,” Morelle said at the news conference. “Deepfake pornography is sexual exploitation, it’s abusive, and I’m astounded it is not already a federal crime.”

He continued: “I’m grateful we have a generation of young women like Francesca ready to stand up against systemic oppression and stand in their power by fighting for their right to hold these perpetrators accountable.”

The Preventing Deepfakes of Intimate Images Act aims to prohibit the nonconsensual disclosure of digitally altered intimate images. The legislation would make the sharing of the images a criminal offense and create a right of private action for victims.

Until now, there has been no federal effort to provide protection against deepfakes.


Copyright 2024 Nexstar Media, Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
