Westfield is but one example of an issue all school districts are grappling with as the omnipresence of technology — including artificial intelligence — impacts students’ lives, the district’s superintendent Raymond González said in a statement.

    • paraphrand@lemmy.world · 1 year ago

      Photoshop always required skill, a computer to run it, and a copy of a paid program.

      This stuff has far fewer of those hurdles. It’s all about ease: it runs on the pocket computer that you and all your classmates carry with you all the time.

      Your thought is still a fair one to have. But there are big differences between what was and this new stuff. In the past you would have needed a ton more skill, and the alignment of a bunch of things, to casually generate fake nudes like the ones covered by this article.

        • paraphrand@lemmy.world · 1 year ago

          Sweet. Me too. In the 90s. This is partially where I draw my understanding of the situation from.

          Specifically:

          • The idea that piracy of professional software isn’t as casual as phone apps or web apps.
          • The fact that it’s paid, professional software with a learning curve.
        • JohnEdwa@sopuli.xyz · 1 year ago

          While true, you can generate AI images with a potato; it just takes longer. On my setup, Stable Diffusion on an RTX 3060 generates a basic image in around 10 seconds, while running on the CPU alone takes around five minutes, but the result is exactly the same.

          • rar@discuss.online · 1 year ago

            Do the files’ hashes match exactly? I wonder if there’s a fundamental difference introduced by running on different hardware.
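For what it’s worth, an exact-match check like this is just a byte-level hash comparison; a minimal Python sketch (the “image” bytes here are invented for illustration):

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Return the SHA-256 hex digest of a byte string."""
    return hashlib.sha256(data).hexdigest()

# Two byte-identical files always hash the same...
img_a = b"\x89PNG fake image payload"
img_b = b"\x89PNG fake image payload"
same = sha256_of(img_a) == sha256_of(img_b)

# ...while flipping a single bit changes the digest completely.
img_c = bytearray(img_a)
img_c[5] ^= 0x01
changed = sha256_of(bytes(img_c)) != sha256_of(img_a)
```

Note that a hash only answers “bit-for-bit identical or not”; it says nothing about whether two images merely look the same.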

            • Echo Dot@feddit.uk · 1 year ago

              An AI generator will usually produce different outputs for the same prompt, because each run starts from a fresh random seed unless you pin it, so it would be hard to confirm an exact match that way.

              But I suppose what they mean is they appear to be of the same quality. Taking a longer time does not appear to decrease the quality of the output.

              I suppose you could give an AI the same input, resetting it after each run, and then use statistical models to identify common traits. Then do the same thing on different hardware, run the same statistical analysis, and see whether there is a difference between group A and group B; but as far as I’m aware, no one has done this.
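A toy version of that experiment, with made-up numbers standing in for some per-image statistic — nothing here measures real Stable Diffusion output:

```python
import random
import statistics

def sample_stat(bias: float, rng: random.Random) -> float:
    # Stand-in for a per-image statistic (e.g. mean brightness):
    # a shared signal plus a tiny, "hardware-specific" systematic bias.
    return 0.5 + bias + rng.gauss(0, 0.01)

rng = random.Random(42)
group_a = [sample_stat(0.000, rng) for _ in range(200)]  # "hardware A"
group_b = [sample_stat(0.005, rng) for _ in range(200)]  # "hardware B"

# Crude check: is the gap between the group means large relative to
# the standard error? If so, the "hardware" leaves a detectable trace.
gap = abs(statistics.mean(group_a) - statistics.mean(group_b))
stderr = statistics.stdev(group_a + group_b) / len(group_a) ** 0.5
detectable = gap > 2 * stderr
```

Whether real generation pipelines actually exhibit biases this clean is exactly the open question.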

              In theory hardware shouldn’t matter, since it’s all mathematics and one plus one always equals two. In practice, though, floating-point arithmetic can round differently depending on the hardware and the order of operations, so tiny fluctuations are possible.
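One concrete source of such fluctuations: floating-point addition is not associative, so two backends that merely accumulate the same values in a different order can disagree in the last bits.

```python
# The same three numbers, summed in two different orders.
a = (0.1 + 0.2) + 0.3
b = 0.1 + (0.2 + 0.3)

print(a == b)  # False: 0.6000000000000001 vs 0.6
```

The difference is tiny, but enough to make byte-identical outputs across different accumulation orders impossible to guarantee.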

              • rar@discuss.online · 1 year ago

                Yes, I suppose that given equal input (model, prompt, seed, etc.) two Stable Diffusion installs should output the same images; what I am curious about is whether the hardware configuration (e.g. the GPU manufacturer) could introduce traceable variations. As abuse of this tech gains prominence, tracing a piece of synthetic media back to the producer’s specific hardware combination could become a thing.
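The seed point is the key one, and any pseudo-random generator illustrates it. A sketch with Python’s random module standing in for the sampler (not actual Stable Diffusion code):

```python
import random

def fake_generate(seed: int, n: int = 8) -> list:
    # Stand-in for an image sampler: the seed fully determines the output.
    rng = random.Random(seed)
    return [rng.randrange(256) for _ in range(n)]

# Same "model" and seed -> identical output on any machine...
identical = fake_generate(1234) == fake_generate(1234)
# ...while a different seed gives a different result.
different = fake_generate(1234) != fake_generate(4321)
```

Whether real GPU kernels preserve that bit-exactness across vendors and drivers is precisely the open question here.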

                • JohnEdwa@sopuli.xyz · 1 year ago

                  It could work like bullet forensics: given access to the gun, you can fire it and compare the result to the original bullet. But there is no way to look at a generated image and work out exactly what made it; there are simply too many variables and random influences. Well, unless the creator is careless enough to leave the metadata enabled: by default, the automatic1111 Stable Diffusion web UI embeds all of the generation parameters in the file itself as a PNG text chunk.
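The PNG text-chunk mechanism itself is easy to read with the standard library alone. A minimal sketch that builds a tiny 1x1 PNG carrying a “parameters” comment and parses it back (the key name follows automatic1111’s convention, but the payload text is invented):

```python
import struct
import zlib

def png_chunk(ctype: bytes, data: bytes) -> bytes:
    """Build one PNG chunk: length, type, data, CRC-32."""
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data)))

def read_text_chunks(png: bytes) -> dict:
    """Walk the chunk list and collect tEXt keyword/value pairs."""
    pos, out = 8, {}  # skip the 8-byte PNG signature
    while pos < len(png):
        (length,) = struct.unpack(">I", png[pos:pos + 4])
        ctype = png[pos + 4:pos + 8]
        data = png[pos + 8:pos + 8 + length]
        if ctype == b"tEXt":
            key, _, value = data.partition(b"\x00")
            out[key.decode("latin-1")] = value.decode("latin-1")
        pos += 12 + length  # 4 length + 4 type + data + 4 CRC
    return out

# A tiny 1x1 grayscale PNG with generation settings in a tEXt chunk.
ihdr = struct.pack(">IIBBBBB", 1, 1, 8, 0, 0, 0, 0)
idat = zlib.compress(b"\x00\x00")  # filter byte + one pixel
png = (b"\x89PNG\r\n\x1a\n"
       + png_chunk(b"IHDR", ihdr)
       + png_chunk(b"tEXt", b"parameters\x00a photo, seed: 1234")
       + png_chunk(b"IDAT", idat)
       + png_chunk(b"IEND", b""))

found = read_text_chunks(png)
```

Because the metadata rides along as an ordinary chunk, stripping it (or re-saving the image through any tool that discards text chunks) removes the trail entirely.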