• Guy Dudeman@lemmy.world

    Is it a problem though? I mean, it just makes Rule34 pics that much easier to create. And you wouldn’t want to kink-shame anyone, now, would you? Why is it always heterosexual men who are kink-shamed? Why is liking naked women a bad thing?

    • denshirenji@lemmy.world

      Alright, I’ll bite. To start, this is a real person that we are talking about. A real person who did not consent. Does that mean anything to you? The fact that there is a very real person that exists in very real life that has had this happen to them?

      Otherwise, I agree. Nothing wrong with the male libido.

      • andrai@feddit.de

        I don’t need someone’s consent to draw lewd fanart of them.

        • just another dev@lemmy.my-box.dev

I wouldn’t be so sure. Depending on where you are in the world, there are dozens of laws that might interfere, ranging from publicity rights to slander, especially if the images are photorealistic (enough).

          • Grimy@lemmy.world

            100%, which is why the fault lies with the bad actors and the platforms that let this proliferate and not the tool itself.

            Can you imagine this headline but with Photoshop instead of AI? It would be utterly silly.

            This is orchestrated to create anger against AI. There’s a lot of money involved in it and that money triples if consumers aren’t allowed to run and distribute models on their own PC.

            • denshirenji@lemmy.world

I can agree with you in principle. As long as you aren’t distributing things like this, I don’t really see an issue: the fault lies with the distributor / platform, not the tool. I also agree that these articles are meant to ensure that this technology can be held behind locked doors. I fully support the idea of making AI something that is self-hostable.

That being said, there are people in this thread who see nothing wrong with distributing lewd pictures of real people (drawn, AI-generated, or even hacked and stolen). That is the only thing I was addressing.

              • Grimy@lemmy.world

Ya, I agree. There’s a very ugly side of things, and it’s a shame certain people are rolling themselves in it.

There’s also the fact that lewds of celebrities and similar material will grow exponentially, but the same is true for all other media, so hopefully it balances itself out.

I think that’s the main problem with a lot of these articles: they’re missing the forest for the trees, so to speak. We’re looking at an explosion of culture, and the bad stuff is just along for the ride. I’m personally excited for it.

          • BarbecueCowboy@kbin.social

            This is actually super super tricky.

            So, there’s an exemption for ‘Transformative’ art, and while this is obviously pretty shady, it feels like there’s a good chance this would qualify as transformative. Basically, you can’t copy an existing photograph you don’t own, but you can take an existing person and paint a new original picture of them.

We had a big lawsuit just last year where the Supreme Court clarified the line a bit. In that case, the art was found not to be transformative, but they did a lot to explain why, and based on that reasoning, this would be super likely to fall on the side of “legally allowed.”

            • denshirenji@lemmy.world

I’m not a lawyer and can’t even begin to answer that question. I was merely trying to get the conversation started down that logical track, because I personally think it is at the heart of the matter.

              Looking this up, it seems that, at least in California, it probably would be considered illegal, at least according to this site.

    • Candelestine@lemmy.world

      So, this line of thought is not going to get AI fakes permitted, it’s going to get rule 34 banned.

We’ve been censoring sex stuff for a long-ass time, in case you haven’t noticed. The recent trend of information freedom is not going to defeat that old religious puritanical BS, plus people’s wishes for privacy on top of it.

      Trump would shut that shit down fast. Conservatives want people reproducing, not masturbating, and he has christian supporters to keep in line, who do not like porn.