
Reddit Users Scammed With AI Model Nudes: Has AI Imaging Also Overtaken Porn?

Reddit users were recently duped out of their money after they purchased nude pictures online of what they believed was a human model, Claudia.

Bhana Bisht
Artificial Intelligence may have changed the way the world works, but it has also opened a path to deception in some cases. Recently, a Reddit user was duped out of his money after he purchased nude pictures online believing they were of a human model, Claudia. An investigation by a US publication, however, revealed that the model he believed was human was, in fact, an Artificial Intelligence-generated female model.

Reddit Users Scammed With AI-Generated Models

In the past few days, several Reddit users came across photos of a woman named Claudia. It was only after a few of these users purchased nude photos of her online, believing them to be real, that they realised they had been duped out of their money. The news spread quickly across social media platforms and prompted publications to dig into the issue; when The Washington Post investigated the story, it found that Artificial Intelligence has now found its way into even porn.

The Reddit user shared his experience on the platform, writing that he felt "a little cheated" after he came to terms with the full truth. Claudia's account, as per another report by Rolling Stone, was created by two computer science students using AI tools. The identity of the duo has not been revealed.

Although AI has improved efficiency across many fields, its reach also highlights the influence it can exert over social media users. The technology is now being used to generate fake nude images of non-existent women to make money off people, and the recent Reddit example shows how easily such fakes can take hold on these platforms.

While several people have been warning one another not to fall into the trap, many have already found themselves in trouble while making purchases and engaging in conversation with AI-generated fakes. The students who created Claudia's fake account claimed that they earned about $100 from the scam before the news went viral.

There is no easy way for social media users to verify the authenticity of such images because most of the fakes carry no watermark. One of the AI-porn creators shared his two cents with The Washington Post, saying that most people choose convenience over accuracy these days, which further blurs the line between what is real and what is not: "The average person looking at this stuff doesn't really care if they’re not real, who really cares?"


With the world advancing towards further tech development, what remains concerning is how far AI-imaging tools can be misused, not just to make money but also to facilitate crime. Recent incidents show that drawing a line between real and fake content remains a global challenge, and that is exactly where the trap lies.


Suggested reading: Small Town Wonder: Teen Develops Soil Analysis Tool Using Artificial Intelligence
