A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting how readily generative AI can be put to nefarious use.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, and the footage is dramatic: it shows law enforcement leading McCorkle, still in his work uniform, out of the theater in handcuffs.

  • TallonMetroid@lemmy.world

    Well, the image generator had to be trained on something first in order to spit out child porn. While it may be that the training set was solely drawn or rendered images, we don’t know that, and even if the output is in that style, it might very well be photorealistic material generated from real child porn and run through a stylizing filter.

    • MagicShel@programming.dev

      An AI that is trained on children and nude adults can infer what a nude child looks like without ever being trained specifically with those images.

            • Cryophilia@lemmy.world

              No, I’m admitting they’re stupid for even bringing it up.

              Unless their argument is that all AI should be illegal, in which case they’re stupid in a different way.

              • LustyArgonianMana@lemmy.world

                Do you think regular child porn should be illegal? If so, why?

                Generally it’s because kids were harmed in the making of those images. Since we know that AI is using images of children being harmed to make these images, as the other posters have repeatedly sourced (and if you’ve looked up deepfakes, most deepfakes start from an existing porn video with a new face swapped over top; they do this with CP as well, and must use CP videos to seed it, because the adult model would be too large)… why does AI get a pass for using children’s bodies in this way? Why isn’t it immoral when AI is used as a middleman to abuse kids?

                • Cryophilia@lemmy.world

                  Since we know that AI is using images of children being harmed to make these images

                  As I keep saying, if this is your reasoning then all AI should be illegal. It only has CP in its training set incidentally, because the entire dataset of images on the internet contains some CP. It’s not being specifically trained on CP images.
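
                  As an aside, the standard mitigation for that incidental contamination is hash-based screening during dataset curation. Below is a minimal sketch of the idea, not any lab’s actual pipeline: it compares each image’s perceptual hash against a blocklist of known-bad hashes. The imagehash and Pillow libraries are real; “blocklist.txt”, the folder name, and the distance threshold are hypothetical placeholders.

                  ```python
                  # Sketch only: screening scraped images against a blocklist of known-bad
                  # perceptual hashes. "blocklist.txt" and "scraped_images/" are hypothetical.
                  from pathlib import Path

                  import imagehash       # pip install ImageHash
                  from PIL import Image  # pip install Pillow

                  # Hypothetical blocklist: one hex-encoded 64-bit pHash per line.
                  blocklist = {
                      imagehash.hex_to_hash(line)
                      for line in Path("blocklist.txt").read_text().split()
                  }

                  MAX_DISTANCE = 4  # Hamming distance under which we call it a near-duplicate

                  def is_flagged(path: Path) -> bool:
                      """True if this image is a near-duplicate of any blocklisted hash."""
                      h = imagehash.phash(Image.open(path))
                      return any(h - bad <= MAX_DISTANCE for bad in blocklist)

                  kept = [p for p in Path("scraped_images").glob("*.jpg") if not is_flagged(p)]
                  print(f"kept {len(kept)} of the scraped images")
                  ```

                  Production systems such as PhotoDNA-based scanners work on the same principle, just with more robust hashes.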

        • LustyArgonianMana@lemmy.world

          Yes, exactly. People who then excuse this with “well, it was trained on all public images” are just admitting you’re right, and that there is a level of harm here, since real materials are used. Even if they weren’t being used, or if it were just a cartoon, the morality is still shaky because of the role porn plays in advertising. We already have laws about advertising because it’s so effective, including around cigarettes and prescriptions. Most porn, ESPECIALLY FREE PORN, is an ad to get you to buy other services. CP is not excluded from this rule; no one gets a free lunch, so to speak. These materials are made and hosted for a reason.

          The role that CP plays in most countries is complicated. It is used for blackmail. It is also used to generate money for governments (intelligence groups around the world host illegal porn ostensibly “to catch a predator,” but then why is it morally okay for them to distribute these images and not anyone else?). And it’s used as advertising by actual human trafficking organizations. Similar organizations exist for snuff and gore, btw. And of course animals. And any combination of those three. Or did you all forget about the monkey torture videos, or the orangutan who was being sex trafficked? Or Daisy’s Destruction and Peter Scully?

          So it’s important not to let these advertisers take their most famous monkey torture video, run it through enough AI that they can claim it’s AI-generated, and use it as an ad for their monkey torture productions. Even if NONE of the footage came from illegal or similar events and was 100% invented by AI, it can still serve as an ad for these groups if they host it. Cartoons can be ads, of course.

      • Saledovil@sh.itjust.works

        Wild corn dogs are an outright plague where I live. When I was younger, me and my buddies would lay snares to catch corn dogs. When we caught one, we’d roast it over a fire to make popcorn. Corn dog cutlets served with popcorn from the same corn dog is a popular meal, especially among the less fortunate, even though some of the affluent consider it the equivalent of eating rat meat. When me pa got me first rifle when I turned 14, I spent a few days just shooting corn dogs.

      • emmy67@lemmy.world

        It didn’t generate what we expect and know a corn dog to be.

        Hence it missed, because it doesn’t know what a “corn dog” is.

        You have proven the point: it couldn’t generate CSAM without some being present in the training data.

        • ContrarianTrail@lemm.ee

          I hope you didn’t seriously think the prompt for that image was “corn dog,” because if your understanding of generative AI is at that level, you probably should refrain from commenting on it.

          Prompt: Photograph of a hybrid creature that is a cross between corn and a dog

          • emmy67@lemmy.world

            Then if your question is “how many photographs of ‘a hybrid creature that is a cross between corn and a dog’ were in the training data?”

            I’d honestly say: I don’t know.

            And if you’re honest, you’ll say the same.

            • ContrarianTrail@lemm.ee

              But you do know, because corn dogs as depicted in the picture do not exist, so there couldn’t have been photos of them in the training data, yet it was still able to create one when asked.

              This is because it doesn’t need to have seen one before. It knows what corn looks like and it knows what a dog looks like, so when you ask it to combine the two, it will gladly do so.
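
              To make that concrete: this kind of concept composition is just ordinary prompting against an off-the-shelf text-to-image model. A minimal sketch follows, assuming the Hugging Face diffusers library and a standard Stable Diffusion checkpoint; whatever model actually produced the picture upthread is unknown.

              ```python
              # Sketch: composing two concepts ("corn" + "dog") the model has seen
              # separately but almost certainly never together. The checkpoint below
              # is an assumption, not necessarily what produced the image upthread.
              import torch
              from diffusers import StableDiffusionPipeline

              pipe = StableDiffusionPipeline.from_pretrained(
                  "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
              ).to("cuda")

              # The prompt quoted upthread, verbatim.
              prompt = "Photograph of a hybrid creature that is a cross between corn and a dog"
              image = pipe(prompt).images[0]
              image.save("corn_dog_hybrid.png")
              ```

              The model interpolates between concepts it learned separately; nothing in this flow requires a real “corn dog creature” photo in the training set.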

              • emmy67@lemmy.world

                But you do know, because corn dogs as depicted in the picture do not exist, so there couldn’t have been photos of them in the training data, yet it was still able to create one when asked.

                Yeah, except Photoshop and artists exist. And a quick Google image search will find them. 🙄

                • ContrarianTrail@lemm.ee

                  And this proves that AI can’t generate simulated CSAM without first having seen actual CSAM how, exactly?

                  To me, the takeaway here is that you can take a shitty two-minute Photoshop doodle and, by feeding it through AI, improve its quality by orders of magnitude.

                  • emmy67@lemmy.world

                    I wasn’t the one attempting to prove that. Though I think it’s definitive.

                    You were attempting to prove it could generate things not in its data set, and I have disproved your theory.

                    To me, the takeaway here is that you can take a shitty two-minute Photoshop doodle and, by feeding it through AI, improve its quality by orders of magnitude.

                    To me, the takeaway is that you know less about AI than you claim. Much less. Because we have actual instances, and many of them, where CSAM is in the training data. Don’t believe me?

                    Here’s a link to it

    • lunarul@lemmy.world

      we don’t know that

      might

      Unless you’re operating under “guilty until proven innocent,” those are not reasons to accuse someone.