Spanish teenagers were sent AI-generated nudes of themselves, but authorities can't arrest or prosecute anyone

Over 20 girls in Spain have reported a distressing incident in which they received AI-generated nude images of themselves on their mobile phones.

However, the question arises: can legal action be taken against the creation and distribution of such deepfake content?

Haunted by AI-generated deepfakes
More than twenty girls from Almendralejo, a town in southern Spain, were shocked when they received nude photos of themselves on their mobile devices. What made matters worse was that none of these girls had taken these pictures, yet they appeared frighteningly realistic.

These images had been illicitly taken from their Instagram accounts, manipulated using AI, and subsequently circulated in the school’s WhatsApp groups.

In the genuine photographs, the teenagers were fully clothed, but the application made them appear convincingly nude. Consequently, concerned parents and legal authorities are asking whether a crime has been committed, even though the images are fabricated. Could these images be deemed child pornography?

Miriam Al Adib, one of the girls’ mothers, expressed her distress on her Instagram account, stating, “The montages are super realistic, it’s alarming and a real outrage.” She added that her daughter was deeply disturbed by the ordeal.

Al Adib also raised concerns that these photos might have found their way onto platforms like OnlyFans or other adult websites. Meanwhile, the girls have endured hurtful comments from their peers; one was even told, “Don’t complain; girls upload pictures that almost show their private parts.” Notably, the youngest of the victims is just 11 years old and hasn’t even reached high school yet.

The mothers have come together to voice their concerns and address this troubling situation. The National Police have opened an investigation and have already identified several underage individuals allegedly involved in the incident, some of whom are classmates of the affected girls.

The case has been referred to the Juvenile Prosecutor’s Office, and the town’s mayor has issued a stern warning: “It may have started as a joke, but the implications are much greater and could have serious consequences for those who made these photos.”

Problematic app
The hyper-realistic deepfake images, created with the ClothOff app, have raised alarm. Marketed with the slogan “Undress anybody, undress girls for free,” the app lets users digitally remove clothing from people in their phone’s image gallery, charging a fee of €10 for 25 naked images.

Although the nudity depicted in these images is not accurate, the mothers emphasize that the girls’ emotional distress is genuine. Miriam Al Adib issued a stern message on her Instagram account, directed at those who shared the pictures: “You are not aware of the damage you have done to these girls, and you’re also unaware of the crime you have committed.”

EU’s toothless laws
Legal experts are grappling with whether this offense can be classified as distributing child pornography, which would entail severe penalties, or if a more cautious approach is warranted.

Leandro Nunez, a lawyer specializing in new technologies at the Audens law firm, emphasizes that the critical factor is not whether the photo is 100 percent authentic but whether it appears to be. He suggests the offense could be treated as child pornography, as a crime against moral integrity, or as the distribution of images containing non-consensual sexual content, the last of which would carry a lesser sentence of six months to two years in prison.

However, Eloi Font, a lawyer at Font Advocates, a law firm specializing in digital law, contends that it might be categorized as a crime akin to the reproduction of sexual images of minors, carrying a penalty of between five and nine years in prison.
