- DeepNude, a web app that used deepfake artificial-intelligence tech to turn any photo of a woman into a realistic-looking nude image, went viral this week after widespread news coverage.
- The team behind DeepNude announced Thursday on Twitter that it was shutting down the app for good, saying “the world is not yet ready for DeepNude.”
- DeepNude had already been offline for some time because its servers couldn’t handle the huge amount of traffic the coverage brought to the app. The app’s creators said the decision to shut down permanently stemmed from concerns the technology would be misused.
A web app called DeepNude, which could turn any photo of a woman into a realistic-seeming nude image, is shutting down after a brief viral run.
DeepNude caught major attention from the public after Vice’s tech vertical, Motherboard, published a story about the web app on Wednesday evening. People raced to check out the software, which harnessed deepfake technology to let users generate fake, yet believable, nude photos of women in a one-step process.
But DeepNude, which was relatively unknown until the Motherboard story, was unable to handle the traffic. The team behind DeepNude quickly took the app offline, saying its servers “need reinforcement” and promising to have the app up and running “in a few days.”
But the team announced Thursday afternoon on Twitter that DeepNude was offline — for good. DeepNude said it “greatly underestimated” the amount of traffic it would get and decided to shut down the app because “the probability that people will misuse it is too high.”
“We don’t want to make money this way. Surely some copies of DeepNude will be shared on the web, but we don’t want to be the ones who sell it,” DeepNude wrote in a tweet. “The world is not yet ready for DeepNude.”
DeepNude is just the latest example of how techies have used artificial intelligence to create deepfakes: eerily realistic fake depictions of someone doing or saying something they never did. Some have used the technology to create computer-generated cats, fake Airbnb listings, and revised versions of famous Hollywood movies. But others have used it to effortlessly spread misinformation, as with a deepfake video of Alexandria Ocasio-Cortez that was altered to make the congresswoman appear unable to answer an interviewer’s questions.
Facebook CEO Mark Zuckerberg said at a conference on Wednesday that deepfake technology was such a unique new challenge that it would require special policies different from the ones in place for traditional misinformation.
And indeed, DeepNude shows how quickly the technology has evolved, making it ever easier for people without technical skills to create realistic-enough content that can be used for blackmail and bullying, especially of women. Deepfake technology has already been used for revenge porn targeting anyone from people’s friends to their classmates, in addition to fueling fake nude videos of celebrities like Scarlett Johansson.
DeepNude brought the ability to make believable revenge porn to the masses, something a revenge-porn activist told Motherboard was “absolutely terrifying” and should not be available for public use.
But Alberto, a developer behind DeepNude, defended himself to Motherboard: “I’m not a voyeur, I’m a technology enthusiast.”
Alberto told Motherboard his software is based on Pix2Pix, an open-source algorithm used for “image-to-image translation.” The team behind Pix2Pix, a group of computer-science researchers, called DeepNude’s use of their work “quite concerning.”
“We have seen some wonderful uses of our work, by doctors, artists, cartographers, musicians, and more,” the MIT professor Phillip Isola, who helped create Pix2Pix, told Business Insider in an email. “We as a scientific community should engage in serious discussion on how best to move our field forward while putting reasonable safeguards in place to better ensure that we can benefit from the positive use-cases while mitigating abuse.”
And if you’re wondering why DeepNude undressed only women and not men: according to the site, far more photos of naked women were available to train the AI on than photos of naked men.