Apps and websites that use artificial intelligence to undress women in photos are soaring in popularity, according to researchers. In September alone, 24 million people visited undressing websites, according to the social network analysis company Graphika.
Many of these undressing, or “nudify,” services market themselves through popular social networks, according to Graphika. The researchers found that the number of links advertising undressing apps on social media, including on X and Reddit, has increased by more than 2,400% since the beginning of this year. The services use AI to alter an existing image so that the subject appears nude. Many of them work only on images of women.
These apps are part of a troubling trend of non-consensual pornography being developed and distributed as artificial intelligence advances, a type of fabricated media known as deepfake pornography. Its spread raises serious legal and ethical problems, as the images are frequently taken from social media and distributed without the subject’s consent, control, or knowledge.
According to Graphika, the surge in popularity corresponds to the release of several open-source diffusion models, AI systems that can generate images far superior to those created just a few years ago. Because the models are open source, app developers can use them for free.
“You can create something that actually looks realistic,” said Santiago Lakatos, an analyst at Graphika, noting that previous deepfakes were often blurry.
Apps advertise on Google and YouTube
One image posted to X to promote an undressing app used language suggesting that customers could create nude images and then send them to the person who had been digitally undressed, inciting harassment. One of the apps has also paid for sponsored content on Google’s YouTube and appears first in search results for the term “nudify.”
According to a Google spokesperson, the company does not permit ads “that contain sexually explicit content.”
“We’ve reviewed the ads in question and are removing those that violate our policies,” the company said.
According to a Reddit spokesperson, the site prohibits the non-consensual sharing of faked sexually explicit content and has banned several domains as a result of the investigation. X did not respond to a request for comment.
Beyond the rise in traffic, the services, some of which charge $9.99 a month, claim on their websites to be attracting large numbers of customers. “They are doing a lot of business,” Lakatos said. Describing one of the undressing apps, he added, “If you take them at their word, their website advertises that it has more than a thousand users per day.”
Non-consensual pornography of public figures has long been a scourge of the internet, but privacy experts are increasingly concerned that advances in AI have made deepfake software easier to use and more effective.
“We are seeing more and more of this being done by ordinary people with ordinary targets,” said Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation. “You see it among high school children and people who are in college.”
Deepfake porn legal gaps persist as platforms block related keywords
Many victims never learn about the images, but even those who do may have difficulty getting law enforcement to investigate or find funds to pursue legal action, according to Galperin.
There is currently no federal law banning the creation of deepfake pornography, though the US government does outlaw such images of minors. In November, a North Carolina child psychiatrist was sentenced to 40 years in prison for using undressing apps on photos of his patients, the first prosecution of its kind under the law banning the deepfake generation of child sexual abuse material.
TikTok has blocked the keyword “undress,” a popular search term associated with the services, warning users that it “may be associated with behavior or content that violates our guidelines,” according to the app. A TikTok representative declined to comment further. Meta Platforms Inc. also began blocking keywords associated with searches for undressing apps in response to questions; a company spokesperson declined to comment.