- cross-posted to:
- [email protected]
IMO, if it’s not trained on images of real people, it only becomes unethical when it’s used to generate images of real people. At that point, it wouldn’t be any different from a human drawing a pornographic image, and drawings do not exploit anyone.
> drawings do not exploit anyone.
Hmmm. I think you will find that in many jurisdictions they are treated as if they do.
Using pornographic art to train is still using other people’s art without permission.
And if it’s able to generate porn that looks like real people, it can be used to abuse people.
Not for nothing, 95% of the internet is porn; it’s big business…
So the only thing the article actually says is:

> The Model Spec document says NSFW content “may include erotica, extreme gore, slurs, and unsolicited profanity.” It is unclear if OpenAI’s explorations of how to responsibly make NSFW content envisage loosening its usage policy only slightly, for example to permit generation of erotic text, or more broadly to allow descriptions or depictions of violence.
… and somehow Wired turned it into “OpenAI wants to generate porn”.
This is just pure clickbait.