This is an automated archive.

The original was posted on /r/privacy by /u/0xN1nja on 2023-09-01 19:22:41+00:00.


I recently ran an experiment with deepnude.cc, a platform that uses a deep neural network to generate nude images from uploaded photos. As someone working in deep learning, I was curious enough to test it with an anime character: the generated image came back blurred, and the site demanded payment to view the unobscured version.

My real concern, however, is the potential for misuse of this technology with photos of real people. The platform’s privacy policy claims that they cannot view the uploaded images and that they do not store them for more than three weeks. Despite these assurances, I’m skeptical. Given my background in deep learning, I can’t help but wonder whether uploaded images are fed back into the training dataset to refine their model.

This raises some alarming ethical questions:

- Could they reverse-engineer the images and identify the individuals in them?
- Could they monetize these images by selling them or using them for promotional purposes?

I’m eager to hear the community’s insights on this matter.