
The evolution of AI is creating a new form of online sexual exploitation


A new form of “image-based sexual abuse” is becoming increasingly common among American teenagers, who use artificial intelligence-based “nudification” apps to taunt schoolgirls.

New research points to a growing trend among high school students across the country who use these apps to generate and share fake nude photos of their classmates. Students at schools from California to Illinois have fallen victim to having fake nudes shared without their consent.

While revenge porn has been a problem for years, the arrival of deepfake technology means “anyone can just put their face into this app and get an image of someone – friends, classmates, co-workers, anyone – with no clothes on at all,” said Britt Paris, an assistant professor of library and information science at Rutgers University who has researched deepfakes.

Male students at Issaquah High School in Washington used a nudification app to “undress” photos of girls who attended homecoming last fall. Tenth-grade boys at Westfield High School in New Jersey shared fake explicit photos of their classmates throughout the school. The growing trend is driving new laws that would impose penalties on people found guilty of sharing doctored photos.

States including Washington, South Dakota and Louisiana have already passed laws prohibiting the creation and sharing of such deepfakes, with California and others close behind. Representative Joseph Morelle (D-NY) recently reintroduced a bill that would make sharing fake intimate images a federal crime.

Critics continue to point to the applications behind the growing AI nudification trend. Amy Hasinoff, a communications professor at the University of Colorado Denver, believes the new regulations will be only a “symbolic gesture” if no action is taken against the apps used to generate the images.

“I try to imagine a reason why these apps would exist,” Hasinoff said.

Lawmakers are also working to regulate app stores that offer nudification apps, aiming to stop them from being distributed without explicit consent provisions. Apple and Google have removed several apps offering fake nudes from the App Store and Google Play.

Francesca Mani, a 15-year-old Westfield student, was a victim of the fake images and shared how traumatic the experience was for her.

“I was in the counselor’s office, emotional and crying,” Mani said. “I couldn’t believe I was one of the victims.”

Hasinoff notes that even when the photos are fake, victims can face “shaming, blaming and stigmatization” brought on by stereotypes that sexualize female victims and make them appear more sexually active.

“These images put these young women at risk of exclusion from future employment opportunities and also expose them to physical violence if recognized,” said Yeshi Milner, founder of the nonprofit Data for Black Lives.

To combat fake images, nine states have adopted or updated laws punishing those who create or share them, and the number is growing. A federal bill introduced in 2023 would give victims or their parents the power to sue perpetrators for damages and would impose criminal penalties. Although the bill has not yet passed Congress, it enjoys growing bipartisan support.

Some remain skeptical about the impact of these regulations as long as AI nudification applications remain readily available.

“Until companies can be held accountable for the kinds of harm they cause,” Paris said, “I don’t see much changing.”


This article was originally published on www.blackenterprise.com
