Spain: Minor girls are reporting AI-generated nude photos of them being circulated at school

Authorities in Spain are investigating a disturbing case that has sent shockwaves throughout the country: a scandal involving deepfake nude photographs of several teenage girls. The photos, created with an artificial intelligence (AI) program, have distressed the victims, some of whom have faced bullying and upsetting comments from their classmates. In the southwestern Spanish town of Almendralejo, local law enforcement is handling 11 complaints from victims, all of them minors.

Speaking to AFP, a spokesperson for the local police said those responsible employed a sinister method: they “manipulated photos of underage girls”, superimposing their faces onto the “bodies of other people” in other images.

These deepfake photographs were made with an AI tool called ClothOff

According to Euronews, the deepfake nude photographs were made with an AI tool called ClothOff, which lets users ‘strip clothing off’ anyone in a picture and can produce frighteningly lifelike photo montages. The service generates 25 naked images for €10 (about $11), and its slogan reads “Undress anybody, undress girls for free,” Euronews reports. Spanish media estimate that around 20 girls may have been victimized by these deepfake pornographic images. Miriam Al Adib, the mother of a 14-year-old victim, voiced her concern about the situation, telling the press: “This is very serious.”

Taking to Instagram, she recounted how her distraught daughter had shown her one such photo: “When I came home, one of my daughters, who was really upset, told me: ‘Look what they did.’ It turns out they took a photo of her and, with the aid of artificial intelligence, made it seem as if she was naked.” She added: “Girls, don’t be afraid to report such acts. Tell your mothers. Affected mothers, tell me so that you can be in the group that we created.”

According to another mother, someone tried to extort her daughter using these bogus nude photographs: the creators allegedly demanded money in exchange for their silence. It has also emerged that the manipulated images may have been distributed on a variety of platforms, including explicit websites and OnlyFans, an online subscription service known for adult content.
