
Have We Crossed The Line? AI's 'Nudify' Apps Spark Controversy

Amid rising concerns over trending deepfake videos in the country, a new social network analysis has reported a surge in the popularity of 'nudify' apps, AI-driven services that digitally undress women in photos.

Pavi Vyas
File Representative Image. (Credits: Google)

Graphika, a social network analysis company, revealed some concerning results in its recent survey. The survey found that in September alone, more than 24 million people visited 'nudify' sites, AI-driven apps that use algorithms to manipulate images, remove clothing, and depict individuals in compromising positions without their consent.


The survey highlighted the surge in toxic internet trends and the worrying proliferation of non-consensual pornography through artificial intelligence. The rise of deepfake videos and the popularity of nudify apps present a troubling intersection of technology and privacy invasion, raising ethical and legal concerns.

'Nudify' Apps That Use AI To Undress Women Gaining Popularity: Study:

There have been various AI trends on the internet this year; some have amused users, while others have disturbed many, raising concerns over individual privacy and safety and the authenticity of digital information as the technology advances.

Following the recent alarming deepfake video trend, another toxic trend is gaining popularity: 'nudify' apps and websites that use AI algorithms to undress women in photos. The trend has caught the attention of researchers and privacy advocates, according to a report from Bloomberg.

The social network analysis company Graphika revealed that these undressing websites drew 24 million visitors in September alone, highlighting the troubling rise of non-consensual pornography driven by advancements in artificial intelligence.

The report also stated that many of these 'nudify' services work on images of women only and rely on popular social media sites for advertising. According to Graphika, links advertising undressing apps on social media, including X and Reddit, have risen by 2,400%, as the services use artificial intelligence to recreate an image so that the person appears nude. The source images are often taken from social media.


Graphika also attributed the rise of these non-consensual pornography trends to AI services and open-source models being made freely available to the public, where they were previously used only by app developers who bore some responsibility for them.

Graphika analyst Santiago Lakatos said that earlier deepfakes were often blurry, whereas the nudify services can create manipulated images that look genuinely realistic.

Social Media's Role In Promoting Undressing Apps:

Because these images are often taken from social media, the worrying trend extends to the potential for harassment: one ad for a 'nudify' service on X suggested that customers create nude images and send them to the very person who had been digitally undressed, effectively inciting harassment.

One of these 'nudify' apps has also paid for sponsored advertisements on Google and YouTube and appears at the top of results when users search for 'nudify'.

In response, a Google spokesperson pointed to the company's policy against "sexually explicit content" in ads, saying it constantly reviews and removes violating material.


A spokesperson from Reddit said the website prohibits the sharing of non-consensual fake sexually explicit content and has banned several domains, while X and YouTube declined to comment on the situation.

In light of these alarming trends, TikTok and Meta Platforms Inc. have also started blocking keywords associated with undressing apps, and TikTok warns users that searching for "undress" may violate its guidelines. A TikTok spokesperson declined to give further details on the action, while Meta also declined to comment.

No Federal Laws Against Deepfakes And AI Undressing Apps:

The rise of non-consensual pornography through AI raises critical ethical and legal questions, as it strips away consent and uses advancements in artificial intelligence to manipulate and exploit individuals, demanding robust legal frameworks to address these emerging challenges.

There are still no federal laws in the United States explicitly prohibiting and criminalising the creation of non-consensual pornography and deepfake videos. However, in a recent North Carolina case, a child psychiatrist was sentenced to 40 years in prison for using undressing apps on photos of his patients, marking the first prosecution under a law that treats creating deepfakes and using undressing apps on children as child sexual abuse.

Non-Consensual Pornography: An Alarming Scourge On The Internet:


While non-consensual pornography of celebrities and public figures has long been a scourge, privacy experts have raised concerns that advancements making deepfake software easy, effective, and freely accessible may now threaten even ordinary people, who are less empowered to respond.

Eva Galperin, Director of Cybersecurity at the Electronic Frontier Foundation, said the organisation has observed these acts being committed by ordinary people against ordinary targets, with a growing number of high school and college students using these services.

Galperin also noted that while many victims never find out about the images, those who do face difficulty taking action and getting law enforcement to pursue the violation.

 

The blurred lines between freedom of expression and privacy rights underscore the need for legislation that keeps pace with technological advancements, and for responsible practices within the tech industry to prevent the misuse of tools that pose serious threats to privacy and societal well-being.

As technology continues to advance, legal, ethical, and technological countermeasures must be developed in tandem to protect individuals from the malicious use of AI.
