• Empricorn@feddit.nl · 4 months ago

    This is tough. If it were just a sicko who generated the images for himself locally… that's the definition of a victimless crime, no? And it might actually dissuade him from seeking out real CSAM…

    BUT, iirc, he was actually distributing the material, and even contacted minors, so… yeah, he definitely needed to be arrested.

    But, I’m still torn on the first scenario…

    • kromem@lemmy.world · 4 months ago

      But, I’m still torn on the first scenario…

      To me it comes down to a single question:

      “Does exposure and availability to CSAM for pedophiles correlate with increased or decreased likelihood of harming a child?”

      If there’s a reduction effect from providing an outlet for arousal that isn’t actually harming anyone, that sounds like a pretty big win.

      If there’s a force-multiplier effect, where exposure and availability feed the obsession and increase the likelihood of harming children, then society should make the AI-generated version illegal too.

        • ricecake@sh.itjust.works · 4 months ago

          How they’ve done it in the past is by tracking the criminal history of people caught with CSAM or arrested for abuse (or some combination thereof), or by tracking the outcomes of people seeking therapy for pedophilia.

          It’s not perfect due to sample biases, and the results are quite inconsistent, even amongst similar populations.

      • FaceDeer@fedia.io · 4 months ago

        Image-generating AI is capable of generating images that are not like anything that was in its training set.

          • GBU_28@lemm.ee · 4 months ago

            If it has images of construction equipment and houses, it can make images of houses that look like construction equipment. Swap out vocabulary as needed.

            • xmunk@sh.itjust.works · 4 months ago

              Cool, how would it know what a naked young person looks like? Naked adults look significantly different.

                • xmunk@sh.itjust.works · 4 months ago

                  Is a kid just a 60% reduction by volume of an adult? And these are generative algorithms… nobody really understands how they perceive the world or relate words to one another.

                  • FaceDeer@fedia.io · 4 months ago

                    It understands young and old. That means it knows a kid is not just a 60% reduction by volume of an adult.

                    We know it understands these sorts of things because of the very thing this whole kerfuffle is about: it’s able to generate images of things that weren’t explicitly in its training set.