Apps and websites that use AI to undress women in photos are surging in popularity, a disturbing trend that has reached alarming proportions.
Researchers from the social network analysis company Graphika revealed that a staggering 24 million people visited these undressing websites in September alone.
Rising Tide of ‘Nudify’ Apps
These services, often called “nudify” apps, rely on popular social networks for marketing: links advertising undressing apps on platforms such as X and Reddit have increased by more than 2,400% since the beginning of the year.
The underlying technology uses AI to alter images so that their subjects appear nude, and the apps overwhelmingly target women. Their rise is closely tied to the release of open-source diffusion models, which allow developers to generate highly realistic images.
Santiago Lakatos, an analyst at Graphika, noted how much image quality has improved, pointing out that these manipulated images are far more realistic than earlier, often blurry, deepfakes.
The implications of these undressing apps extend beyond the technology itself into serious legal and ethical territory. They produce non-consensual pornography enabled by advances in AI, commonly known as deepfake pornography.
Victims often find their images taken from social media and distributed without their consent, leading to harassment, privacy invasion, and potential legal ramifications.
The surge in popularity tracks the free availability of these advanced open-source models, which app developers can use to create sophisticated deepfakes, fueling the proliferation of non-consensual explicit content.
Undressing AI Apps Spark Legal and Ethical Concerns
Some of these apps market themselves in explicitly predatory terms. One advertisement posted on X suggested that customers could create nude images and send them to the person who had been digitally undressed, an open invitation to harassment.
Privacy experts worry about how easily ordinary people can now turn deepfake software on ordinary targets. Eva Galperin, Director of Cybersecurity at the Electronic Frontier Foundation, emphasizes that use is shifting toward the mainstream, including among high school and college students.
Despite the severity of the issue, no federal law explicitly prohibits the creation of deepfake pornography. In a significant development, however, a North Carolina child psychiatrist was recently sentenced to 40 years in prison for using undressing apps on photos of his patients, the first prosecution of its kind under laws addressing the generation of deepfake child sexual abuse material.
Major online platforms, including TikTok and Meta Platforms Inc., have taken steps to address the issue by blocking keywords associated with searching for undressing apps.
TikTok, in particular, warns users that the term “undress” may be associated with content that violates its guidelines.
As AI-powered undressing apps continue to spread, comprehensive legal frameworks, ethical guidelines, and technological safeguards become ever more urgent to protect individuals from the malicious use of deepfake technology.