• FaceDeer@fedia.io · 4 months ago

      Image-generating AI is capable of generating images that are not like anything that was in its training set.

        • GBU_28@lemm.ee · 4 months ago

          If it has images of construction equipment and houses, it can make images of houses that look like construction equipment. Swap out vocabulary as needed.

          • xmunk@sh.itjust.works · 4 months ago

            Cool, how would it know what a naked young person looks like? Naked adults look significantly different.

              • xmunk@sh.itjust.works · 4 months ago

                Is a kid just a 60% reduction by volume of an adult? And these are generative algorithms… nobody really understands how they perceive the world or how they relate words to concepts.

                • FaceDeer@fedia.io · 4 months ago

                  It understands young and old. That means it knows a kid is not just a 60% reduction by volume of an adult.

                  We know it understands these sorts of things because of the very things this whole kerfuffle is about - it’s able to generate images of things that weren’t explicitly in its training set.

                  • xmunk@sh.itjust.works · 4 months ago

                    But it doesn’t fully understand “young”, and a “naked young person” isn’t just a scaled-down “naked adult”. There are physiological changes that people go through during puberty, which is why “it understands young vs. old” is a vapid, low-effort comment. Yours has more meaning behind it, so I’d clarify that merely having a vague understanding of young and old doesn’t mean it can generate CSAM.