As described in the article, there is a zero-tolerance policy because of the danger of switching from generated to real. Children are not sexual objects. Anyone who feels differently needs help.
I may be mistaken, but it looks more like someone took a pretrained image generator and retrained it on child pictures.
The following site, for example, teaches you to build and train an image generator; you would just have to change the kind of pictures it is trained on to make it malevolent. https://www.assemblyai.com/blog/minimagen-build-your-own-imagen-text-to-image-model/
Installed this. Made no changes to it. Ran a few queries. Most of it was nightmare fuel of severed limbs and crazy teeth, etc., as I have no clue what I'm doing. But still, with enough tries… it generated it. So, confirmed that you don't need to introduce CP to get naked AI kids.
Is it abuse if there is no one being abused?
The initial training pictures may come from abused children. And we can theorize that they will need more to keep training the AI.
You're thinking OpenAI (EDIT: Runway, sorry, got mixed up with ChatGPT) put CSEM in their data set? Or maybe an accident?