
‘Terrifying’: App used to create fake nudes of women is shut down


By Jennifer McShane
29th Jun 2019

An app which claimed it could digitally remove clothes from photos of women to create fake nudes for “entertainment” has been shut down. But the very fact that it was created at all – and with the purpose of ‘entertaining’ – is deeply frightening.


A new AI-powered software tool made it easy for anyone to generate realistic nude images of women simply by feeding the program a picture of the intended target wearing clothes.

The now-defunct ‘DeepNude’ is the latest example of AI-generated deepfakes being used to create compromising images of unsuspecting women. The software was first spotted by Motherboard’s Samantha Cole, and was available to download free for Windows, with a premium version offering better-resolution output images for $99.

The program reportedly used AI-based neural networks to remove clothing from images of women to produce realistic naked shots.

Both the free and premium versions of the app added watermarks to the AI-generated nudes that clearly identified them as “fake.” But watermarked or not, the images could still have been used to target women.

The software was shut down hours after it was spotted available to buy; DeepNude will no longer be offered for sale and further versions won’t be released. The team also warned against sharing the software online, saying it would be against the app’s terms of service.

They acknowledged that “surely some copies” will get out, though.

And the app will still work for anyone who owns it.

Similar to revenge porn, these images can be used to shame, harass, intimidate, and silence women. And here, at the touch of a button, you can – or could – do it from a phone.

The term ‘revenge porn’ covers the online posting of sexually explicit visual material without the consent of the person portrayed. It typically includes photographs and video clips which were consensually generated – either jointly or by oneself (“sexting”) – as well as content covertly recorded by a partner or an unknown third party.

Speaking to Motherboard, Katelyn Bowden, founder of anti-revenge porn campaign group Badass, said she found the app “terrifying.”

“Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo,” she told the site, according to the BBC.

Its very creation is sickening.

It’s outrageous.

And still, there hasn’t been much by way of outrage.

Are we becoming de-sensitised to such a horrific issue?

And, more importantly, why aren’t more laws in place banning the creation of such software?

It doesn’t matter that the nudes are fake – should one go viral, it could see a woman’s life, career and reputation completely destroyed at the click of a button.

And all the creators had to say by way of response was that “the probability that people will misuse it is too high” – this, after they created it for “entertainment purposes” just a few months ago.

“Not that great”

Anyone who bought the app would get a refund, they said, adding that no other versions would be made available and that nobody else had the right to use it.

They fail to mention how they will control any variations of the software that make their way online.

In their statement, the developers added: “Honestly, the app is not that great, it only works with particular photos.”

Yet the website still managed to crash, such was the apparent demand for the software when it first went online.

We frequently hear of women in the public eye falling victim to the same thing – Jennifer Lawrence and, most recently, actress Bella Thorne, who released her own nudes after a hacker threatened to leak them, by way of “taking back her own power.”

The fact that she had to – or the fact that this is happening at all in 2019 – is truly disturbing.

If you have experienced this type of abuse and harassment, please contact the Women’s Aid National Freephone Helpline on 1800 341 900, available 24 hours a day, seven days a week, or speak to someone at your local Garda Victims Service Office.


Main photograph: Unsplash