• 0x0001@sh.itjust.works
    4 months ago

    One thing to consider: if this were accepted, it would make it much harder to prosecute actual CSAM, since offenders could claim "AI generated" for actual images.

    • theherk@lemmy.world
      4 months ago

      I get this position, truly, but I struggle to reconcile it with the feeling that artwork of something and photos of it aren't equal. In a binary sense they are, but looked at more closely they're pretty far apart. I'm not arguing against it, though; I'm just not clear yet on how I feel about it.

      • Corkyskog@sh.itjust.works
        4 months ago

        It’s not a difficult test. If a person can’t reasonably distinguish it from an actual child, then it’s CSAM.

        • phoenixz@lemmy.ca
          4 months ago

          Just to play devil’s advocate:

          What about hentai where little girls get fondled by tentacles? (Please please please don't let this become my most upvoted post)

          • bitfucker@programming.dev
            4 months ago

            Yeah, no. The commenter said an actual child, not a cartoon one. That's a different discussion entirely, and a good one too, because artwork is a part of freedom of expression. An artwork CAN be made without hurting or abusing anyone. We know full well that humans have the creative capability to come up with something without that something having existed beforehand, which implies that a human could come up with CSAM without ever having seen any.

            • phoenixz@lemmy.ca
              3 months ago

              Yeah, but then it gets very messy and complicated fast. What about photo-perfect AI pornography of minors? When and where do you draw the line?

        • Madison420@lemmy.world
          4 months ago

          This would also outlaw "teen" porn, since those performers are explicitly trying to look more childlike, as well as models who merely appear to be minors.

          I get why people think it's a good thing, but all censorship has to be narrowly tailored to content, lest it be too vague or overly broad.

          • Corkyskog@sh.itjust.works
            4 months ago

            And nothing was lost…

            But in all seriousness, as you said, they are models who are in the industry, verified, etc. It's not impossible to have a whitelist of actors, and if anything there should be more scrutiny on the unknown "actresses" portraying teenagers…

            • Madison420@lemmy.world
              4 months ago

              Except jobs, dude. You may not like their work, but it's work. That law ignores verified age; that's a not-insignificant part of my point…

      • Madison420@lemmy.world
        4 months ago

        So long as the generation happens without actual minors as model examples, there's nothing technically illegal about having sexual material of what appears to be a child. You'd then have a mens rea question and a content question: what actually defines, in a visual sense, a child? Couldn't those same traits equally describe a person of smaller stature? And finally, could someone like Tiny Texie be charged with producing CSAM, since by all appearances, out of context, she looks like a child?

          • jeremyparker@programming.dev
            4 months ago

            This isn't true. AI can generate tan people if you show it the color tan and a pale person, or green people, or purple people. That's all AI does, whether it's image or text generation: it can create things it hasn't seen by smooshing together things it has seen.

            And this is borne out in reality: AI CAN generate CSAM even though it's trained on that huge image database, which is constantly scanned for illegal content.

          • Madison420@lemmy.world
            4 months ago

            The real images don't have to be CSAM, just images of children; it could theoretically train the sexual part on legal sexual content and let the AI connect the dots.

        • Fungah@lemmy.world
          4 months ago

          It is illegal in Canada to have sexual depictions of a child, whether it's a real image or you've just sat down and drawn it yourself. The rationale being that the behavior escalates: looking at images leads to wanting more.

          It borders on thought crime, which I feel kind of uneasy about, but only pedophiles suffer, which I feel great about. There's no legitimate reason to have sexualized images of a child, whether computer generated, hand drawn, or whatever.

          • Madison420@lemmy.world
            4 months ago

            This article isn't about Canada, homeboy.

            Also, that theory is not provable and never will be. Morality crime is thought crime, and thought crime is horseshit. We criminalize criminal acts, not criminal thoughts.

            Similarly, you didn’t actually offer a counterpoint to any of my points.