• FaceDeer@fedia.io
    4 months ago

    Image-generating AI is capable of generating images that are not like anything that was in its training set.

      • GBU_28@lemm.ee
        4 months ago

        If it has images of construction equipment and houses, it can make images of houses that look like construction equipment. Swap out vocabulary as needed.

        • xmunk@sh.itjust.works
          4 months ago

          Cool, how would it know what a naked young person looks like? Naked adults look significantly different.

            • xmunk@sh.itjust.works
              4 months ago

              Is a kid just a 60% reduction by volume of an adult? And these are generative algorithms… nobody really understands how these models perceive the world or how they relate words to each other.

              • FaceDeer@fedia.io
                4 months ago

                It understands young and old. That means it knows a kid is not just a 60% reduction by volume of an adult.

                We know it understands these sorts of things because of the very things this whole kerfuffle is about - it’s able to generate images of things that weren’t explicitly in its training set.

                • xmunk@sh.itjust.works
                  4 months ago

                  But it doesn’t fully understand “young,” and a “naked young person” isn’t just a scaled-down “naked adult.” There are physiological changes people go through during puberty, which is why “it understands young vs. old” is a vapid, low-effort comment. Yours has more substance behind it, so I’d clarify: merely having a vague understanding of young and old doesn’t mean it can generate CSAM.

                  • FaceDeer@fedia.io
                    4 months ago

                    But it doesn’t fully understand “young,” and a “naked young person” isn’t just a scaled-down “naked adult.”

                    Do you actually know that, or are you just assuming it?

                    Personally, I’m basing my assertions on experience with related situations, where I’ve asked image AIs to generate things that I’m quite sure weren’t in their training sets and that require conceptual understanding to create “hybrids.” They’ve done a decent job of those, so I’m assuming they can figure out this specific situation as well, since most of these models have plenty of examples of both naked people and young people in their training sets. But I haven’t actually asked any AIs to generate images of naked young people to test this one specific case.