• TheAnonymouseJoker@lemmy.ml · 9 days ago

    There is no AI yet that can do this. Also, is there real-world harm happening? This is a problem of defamation and libel, not “CSAM”. Reducing problems to absurdity is lethal to the liberty of citizens.

    All those who wanted AI so much, you will have the whole cake now. Fuck AI empowerment. I knew this would happen, but the people glazing AI would not stop. Enjoy this brainrot, and soon a flood of Sora AI-generated 720p deepfake porn/gore/murder videos.

    • papertowels@lemmy.one · 9 days ago

      Just passing through; no strong opinions on the matter, nor is it something I wish to do deep-dive research on.

      Just wanted to point out that your original comment was indeed just a threat that did nothing to address OP’s argument.

      • TheAnonymouseJoker@lemmy.ml · 9 days ago

        It was not a threat, but a hypothetical example to gauge the reaction of that reactionary baiter.

        The problem with treating AI-generated art as CSAM is that there is no possible way to create an objective definition of what “level” of realism is real and what is not. A drawing or imaginary creation is best not defined as real in any capacity whatsoever. If it is drawn or digitally created, it is not real, period. Those who imagined good uses of AI were too optimistic and failed to account for the extremely bad use cases that will spiral out of control as far as human society is concerned.

        Even though China is incredibly advanced and proactive in trying to control this AI deepfake issue, I do not trust any entity in any capacity with a problem that is impossible to solve at a national or international scale.

        I just had a déjà vu moment typing this comment, and I have no idea why.

          • TheAnonymouseJoker@lemmy.ml · 9 days ago

            So if I draw a stick figure with two circles and call it 8 years old, is it CSAM? Will I be arrested for it? Do you see how that dumb logic does not work too well?

            • ssj2marx@lemmy.ml · 8 days ago

              Hot take: yes. All art exists in a social context, and if the social context of your art is “this is a child and they are sexualized” then your art should be considered CSAM. Doesn’t matter if it’s in an anime style, a photorealistic style, or if it’s a movie where the children are fully clothed for the duration but are sexualized by the director as in Cuties - CSAM, CSAM, CSAM.

              • TheAnonymouseJoker@lemmy.ml · 8 days ago

                Glad that it will always remain a hot take.

                The problem with your argument is that there cannot be developed a scale or spectrum to judge where the fake stops and real starts for drawings or AI-generated media. And since they were not recorded with a camera in the real world, they cannot be real, no matter what your emotional response to such a deplorable act of defamation may be. It is libel of an extreme order.

                Cuties was shot with a camera in the real world. Do you see the difference between AI-generated media and what Cuties was?

                • ssj2marx@lemmy.ml · 8 days ago

                  there cannot be developed a scale or spectrum to judge where the fake stops and real starts

                  Ah, but my definition didn’t at all rely on whether or not the images were “real” or “fake”, did it? An image is not merely an arrangement of pixels in a jpeg, you understand - an image has a social context that tells us what it is and why it was created. It doesn’t matter if there were real actors or not; if it’s an image of a child and it’s being sexualized, it should be considered CSAM.

                  And yes I understand that that will always be a subjective judgement with a grey area, but not every law needs to have a perfectly defined line where the legal becomes the illegal. A justice system should not be a computer program that simply runs the numbers and delivers an output.

                  • TheAnonymouseJoker@lemmy.ml · 8 days ago

                    An image is not merely an arrangement of pixels in a jpeg,

                    I am not one of those “it’s just pixels on a screen” people. But if it was not recorded in the real world with a camera, it cannot be real.

                    Who will be the judge? If some automated AI system is created to do this, who will be the one creating it? Will it be perfect? No. We will end up in the situation Google caused for its users, where doctors, married parents and other legitimate people were labelled as pedophiles or CSAM users. It has already happened to me in this thread, and you said it yourself. The only accurate way to judge would be a very large team of forensic experts in image/video media, which is not feasible for the amount of data social media generates.

                    not every law needs to have a perfectly defined line

                    And this is where abuse by elites, politicians and the establishment starts. Activists and dissidents can easily be jailed by having CSAM planted on them, which in this case would be as simple as AI-generated pictures arriving as temporary drive-by downloads on a target’s device.