• xylogx@lemmy.world · 1 month ago

    So you’re saying the ad driven internet will die? And we will be left with what? Wikipedia and Lemmy? I for one welcome our AI overlords!

    • venusaur@lemmy.world · 1 month ago (edited)

      Nah, it’s saying that the ad- and AI-driven internet will prevail. People only use Google to find an answer and don’t dig deeper, and if they do, it’s often because the links are sponsored. People using GPTs are even less likely to click a link. Currently no ads, but just wait.

      Apologies if you were joking.

      • kadup@lemmy.world · 1 month ago

        “what should I do if I’m going through severe emotional distress? How to choose a good psychiatrist?”

        ChatGPT: "I’m sorry to hear that you’ve been going to a stressful situation, it’s always worth talking about your feelings. I’ve come up with a plan to help you:

        1 Purchase an ice cold Pepsi Black™ from a Pepsi official supplier"

      • sunzu2@thebrainbin.org · 1 month ago

        Normies get AI slop, prosumers use local LLMs…

        Not sure about social media… Normie is allergic to reading anything beyond daddy’s propaganda slop. If it ain’t rage bait, he ain’t got time for it

        • TheOneCurly@lemm.ee · 1 month ago

          Home grown slop is still slop. The lying machine can’t make anything else.

            • sunzu2@thebrainbin.org · 1 month ago

            At least my idiocy ain’t training the enemy.

            Also, AI ain’t there to be correct. AI is there to help you get something done if you already mostly know the outcome.

            It can really turbocharge a Linux experience, for example.

            Also local is way less censored and can be tweaked ;)

            • sunzu2@thebrainbin.org · 1 month ago

            https://ollama.org/

            You can pick something that fits your GPU size. Works well on Apple silicon too. My favs right now are the qwen3 series, probably the best performance for a local single-GPU setup.

            Will work on CPU/RAM, but slower.

            If you’ve got Linux, I would put it into a Docker container. Might be too much for the first try; there are easier options, I think.
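A rough sketch of that Docker route, assuming the official `ollama/ollama` image from Docker Hub and an NVIDIA GPU (the model tag `qwen3:8b` is just an example, pick one that fits your VRAM):

```shell
# Start the Ollama server in a container; models persist in a named volume
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# Pull and chat with a model inside the running container
docker exec -it ollama ollama run qwen3:8b
```

Drop `--gpus=all` to run on CPU only; it still works, just slower.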

              • Jakeroxs@sh.itjust.works · 1 month ago

              I use oobabooga; a few more options in the GGUF space than ollama, but not as easy to use IMO. It does support OpenAI API connections though, so you can plug other services into it.
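For anyone wondering what that OpenAI-compatible connection looks like, here is a sketch of the request body an OpenAI-style client would POST to a local server's `/v1/chat/completions` endpoint (the model name is a placeholder; local backends often ignore it):

```python
import json

# Chat request in the OpenAI-compatible format that local servers
# like oobabooga or ollama accept on their /v1/chat/completions route.
payload = {
    "model": "local-model",  # placeholder; the local backend picks the loaded model
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    "temperature": 0.7,
}

# This JSON string is what actually goes over the wire.
body = json.dumps(payload)
print(body)
```

Any client library that speaks the OpenAI API can be pointed at the local base URL, so swapping between a hosted service and your own box is just a config change.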

              • tormeh@discuss.tchncs.de · 1 month ago

              Ollama is apparently going for lock-in and incompatibility. They’re forking llama.cpp for some reason, too. I’d use GPT4All or llama.cpp directly. They support Vulkan, too, so your GPU will just work.

              • venusaur@lemmy.world · 1 month ago

              Hm, I’ll see if my laptop can handle it. Probably don’t have the patience or processing power.

        • jim3692@discuss.online · 1 month ago

          So, prosumers, leveraging computers that are not optimized for AI workloads, being limited to models that are typically inferior to commercial ones, are wasting more energy for even more slop?

            • sunzu2@thebrainbin.org · 1 month ago

            That’s the price of privacy that I am willing to pay. With respect to electricity, I pay my bills at consumer rate while subsidizing corporate parasites who pay lower rates and get state aid on top of it.

              • jim3692@discuss.online · 1 month ago

              That’s the price of privacy I am currently paying.

              There was, however, a video from The Hated One that presents a different perspective on this. Maybe privacy is more environmentally friendly than we think.

              A lot of energy is wasted on data collection and analysis for advertising. Devices with modified firmwares, like LineageOS and GrapheneOS, do not collect such data, reducing the load on analysis servers.

    • BestBouclettes@jlai.lu · 1 month ago

      It would be very naïve to think they won’t go after Wikipedia and the fediverse at some point, unfortunately…