
An App Which Let Men Disrobe Women. For Entertainment!

An app called DeepNude let men undress women in their photographs using AI. While the app has been taken down, one needs to ask: how did this not strike its creators as a disturbing casualisation of perversion?

Yamini Pustake Bhalerao

An app called DeepNude allowed men to undress women in their photographs using Artificial Intelligence. When you read this, what is the first thing that comes to your mind? Well, if you are a woman, then fun is the last thing. With spycam and revenge pornography already a headache when it comes to protecting our dignity from perverted minds, did we really need an app which could let anyone with access to our photographs undress us? DeepNude has been shut down, but according to reports, critics worry that its technology remains available online and puts millions of women at risk of being subjected to photoshopped nudity, all in the name of fun.


SOME TAKEAWAYS:

  • An application called DeepNude allowed men to swap the clothes on women’s bodies in their photos for AI-generated nudity.
  • While the app has been taken down, many worry that the algorithm is still available online.
  • The creators of DeepNude claim that the software was launched for “entertainment.”
  • Just what will it take for men to stop seeing such casual violation of women’s consent as entertainment?


The creators of this app said that the software was launched for “entertainment” and that they “greatly underestimated” demand for it. “We never thought it would be viral and (that) we would not be able to control the traffic,” they wrote on Twitter. “Despite the safety measures adopted (watermarks), if 500,000 people use it, the probability that people will misuse it is too high. We don't want to make money this way.” Are we seriously expected to believe that the creators didn’t figure out that their app could be misused? Is there any other way of using this app at all? If men could download women’s photographs, even with their consent, would they seek consent again before virtually undressing them?

For too long, such perversions have been casualised in the name of ‘harmless fun’. This is yet another way of eroding women’s agency to consent. What is the harm if a man takes your photo and swaps your clothes for a nude body, breasts and vulva included? If you are asking that question, then you need to brush up your understanding of the concept of consent. Consent isn’t just about touching a woman with her permission; it is also about respecting her agency and autonomy in every respect. DeepNude basically turns our bodies into objects and lets grown-up men ‘play’ with them. It never intended to treat women with respect at all, even with all its so-called safety measures in place. Which is why one needs to ask: how did this app make it to the market in the first place?


Did no one find what it was offering alarming? Did no one think that, in the wrong hands, it would jeopardise the safety of innumerable women and girls? Did no one question how the makers were ensuring that women’s consent wasn’t violated here? Or the fact that it was being unapologetically marketed as entertainment? Both the free and premium versions of the app add watermarks to the AI-generated nudes that clearly identify them as “fake.” But that does very little to diminish the damage such images could do. Tampering with anyone’s photograph without their consent is a violation of their rights. Even when models sign up for assignments and allow their images to be used freely by anyone, there is a line everyone is expected not to cross.

As a woman and mother to a girl who’ll eventually grow into an adult in this world, it makes me feel very unsafe. How do we protect ourselves and our girls from such casual objectification? How do we empower them, when there are packs of entitled men all around us who refuse to respect our bodies? The mere existence of the DeepNude app calls for an in-depth conversation on objectification and consent with young boys and men. They must be sensitised to how such technologies aren’t entertaining, but a violation of women’s rights on more counts than one. Unless we have men on board, there’ll always be a creator who will try to normalise such perversions. So we need this practice plugged from the consumers’ end. Keep in mind that the app wasn’t pulled down because it wasn’t selling, but because it was selling too much. In their tweet, the creators of this app say that the world isn’t ready for an app like DeepNude. Let us hope that it never is.


Yamini Pustake Bhalerao is a writer with the SheThePeople team, in the Opinions section. The views expressed are the author’s own.
