(CNN Business) —  

On Wednesday afternoon, I clicked on a picture of a woman on a website called Deepnude.com. Suddenly, her outfit disappeared, and naked breasts were on my computer screen. It was transfixing and nauseating. I felt like I had just peeped through a stranger’s window, utterly violating her privacy.

A day later, that website had disappeared; its creator apparently had a crisis of conscience. If you type in the URL, you’ll see a blank, white page and the words “not found.” But before it disappeared, it offered visitors like me free previews of a horrific AI-enhanced world where photos of women — any woman, really — could be undressed via algorithms and shared with reckless abandon. Like the woman I saw, the resulting nudes weren’t real. But they certainly looked like it.

The website, first reported on Wednesday by Samantha Cole at Vice site Motherboard, had begun selling a $50 Windows and Linux application just a few days earlier. The app could take a photo of a clothed woman and, using artificial intelligence, replace it with a fairly realistic-looking naked image of her. There was also a free version, which placed a big watermark on the resulting images (the paid version, according to Vice, instead stamped “FAKE” in one corner).

DeepNude was meant to work on women, specifically. (Vice reported that it would insert a vulva, in place of pants, in a photo of a man.) It is the latest example of how technology is making it increasingly easy to shame and demean women online. The images it created in the online samples I saw didn’t look perfect, but they were good enough to make a casual observer gasp. And such AI-crafted images are likely to keep spreading: any copies of DeepNude already out there can easily be replicated, and similar programs are likely to pop up.

The anonymous person behind the app, who reportedly spoke with Vice, said DeepNude was trained on over 10,000 photos of naked women and used the AI technique behind many deepfake videos, known as generative adversarial networks, or GANs. GANs consist of two different neural networks pitted against each other in an effort to come up with new outputs — which could range from realistic-looking faces to paintings or, in this case, nude women — that mimic those in a mountain of training data.
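To make the idea concrete, here is a minimal sketch of that adversarial setup in PyTorch. Everything in it, from the tiny fully connected networks to the synthetic stand-in for the training data, is an illustrative assumption for this article; it bears no relation to DeepNude’s actual code, architecture, or data.

```python
# Minimal GAN training loop (illustrative only).
# Two networks are pitted against each other: the generator G maps random
# noise to fake samples, and the discriminator D tries to tell fakes from
# real training data.
import torch
import torch.nn as nn

latent_dim, data_dim = 16, 64  # sizes chosen arbitrarily for this sketch

G = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, data_dim))
D = nn.Sequential(nn.Linear(data_dim, 128), nn.ReLU(), nn.Linear(128, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(1000):
    real = torch.randn(32, data_dim) + 3.0  # synthetic stand-in for real data
    fake = G(torch.randn(32, latent_dim))   # generator's attempt at a sample

    # Discriminator update: label real samples 1 and generated samples 0.
    d_loss = loss_fn(D(real), torch.ones(32, 1)) + \
             loss_fn(D(fake.detach()), torch.zeros(32, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator update: try to make the discriminator label fakes as real.
    g_loss = loss_fn(D(fake), torch.ones(32, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

The essential dynamic is in the two updates: the discriminator is rewarded for separating real from fake, the generator for fooling it, so each improves by exploiting the other’s weaknesses until the fakes start to mimic the training data.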

“Digital tools, once they’re in the wild, are cheap and easy to replicate, and cheap and easy to use, increasingly,” Danielle Citron, a law professor at Boston University and author of the book “Hate Crimes in Cyberspace,” told CNN Business.

The swift rise and fall of DeepNude shows there is a demand for such software. In the hours after Vice’s story appeared on Wednesday, a Twitter account that appears to be linked to the DeepNude website indicated its server was down due to “unexpected traffic.” A few updates later, the account tweeted that DeepNude would be back online “in a few days.” Then, on Thursday afternoon, a new message appeared on Twitter, saying that DeepNude had been killed off due to concerns about potential misuse in the wake of “greatly underestimated” demand.

Though the free version of the application watermarked the images it created, the Twitter message read: “if 500,000 people use it, the probability that people misuse it is too high. We don’t want to make money this way. Surely some copies of DeepNude will be shared on the web, but we don’t want to be the ones that sell it.”

“The world,” it concluded, “is not yet ready for DeepNude.”

According to GoDaddy, the registrar for DeepNude.com, the site was registered by an organization in Estonia calling itself DeepInstruction. A message sent to the domain registrant via GoDaddy received no immediate response, nor did a tweet directed at the DeepNude Twitter account.

Though DeepNude could create images that look like realistic nudes, they’re not actual photos of naked bodies. So while a growing number of laws criminalize non-consensual pornography, such as revenge porn, the images churned out by such AI-assisted software aren’t covered by existing legislation, Citron explained.

Yet Citron, who said she has spoken to women who have been the subject of such ersatz images, stressed that those depicted in them feel the same kind of invasion of sexual privacy as those who have had actual naked photos spread around.

“It has the same impact,” she said. “You feel like you have 1,000 eyes on your body.”