• Todd Bonzalez@lemm.ee · 1 month ago

    The issue is not with all forms of pornographic AI, but specifically with deepfakes and nudifying apps that create nonconsensual pornography of real people. It is those people’s consent that is being violated.

    • quindraco@lemm.ee · 1 month ago

      No one cares if you consent to being drawn. The problem here isn’t the consent of the depicted person; it’s that the viewer is being misled. That’s why the moral quandary goes away entirely if the AI porn is clearly labeled as such.

      • CarbonIceDragon@pawb.social · 1 month ago (edited)

        I don’t think that’s really true. I strongly suspect that most people I know would consider someone drawing porn of them without consent a majorly icky thing to do, and would probably consider anyone who did that to someone else a creep. The reason such drawings are less of an issue is at least partly that their barrier to entry is higher: it takes a certain amount of skill and time investment to draw something like that well enough to be clearly recognizable as a specific real person, and AI removes that barrier.

    • DarkThoughts@fedia.io · 1 month ago (edited)

      I still don’t understand why this is an issue now when decades of photo editing didn’t bother anyone at all.

      • CarbonIceDragon@pawb.social · 1 month ago

        I mean, it did bother people; it just took enough skill and time with photo-manipulation software to make something look convincing that it was rare for anyone to both have the expertise and be willing to put in the time, so it didn’t come up often enough to become a point of discussion. AI just makes it quick and easy enough to be common.