Welcome to today’s daily kōrero!

Anyone can make the thread, first in first served. If you are here on a day and there’s no daily thread, feel free to create it!

Anyway, it’s just a chance to talk about your day, what you have planned, what you have done, etc.

So, how’s it going?

  • absGeekNZ@lemmy.nz · 11 months ago

    What you are referring to is an AI superintelligence; the exponential growth is part of it.

    As for science fiction, AI superintelligence makes humans irrelevant.

    • Dave@lemmy.nzOPM · 11 months ago

      Haha yeah I guess in a world with AI superintelligence you don’t need any humans, and then the show isn’t very interesting.

      Which raises the next question. In a galaxy with 100 billion stars, why hasn’t life on some planet that evolved a billion or half a billion years before us managed to build a self-replicating AI explorer that explores the galaxy? Maybe a superintelligent AI has no need to explore the galaxy?

      • absGeekNZ@lemmy.nz · 11 months ago

        That is an interesting question in its own right.

        There are lots of theories on this:

        • From the mundane: maybe we are the first.
        • To the exotic: the “zoo” hypothesis holds that at least one group is keeping us “blind” to the real universe by manipulating our measurements / ability to measure.

        Even if no “far future tech” is available, using just fusion-based propulsion (near-future tech), the galaxy could be colonized in a few million years. Considering the age of the universe/galaxy, that is an extremely short time.
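        A rough back-of-envelope for that timescale (all numbers below are assumptions for illustration, not from the comment):

```python
# Colonization-wave estimate: a settlement wave hops between star systems,
# pausing at each stop to build the next generation of ships.
# All numbers below are assumed for illustration.
galaxy_diameter_ly = 100_000   # rough diameter of the Milky Way
cruise_speed_c = 0.05          # assumed fusion-drive cruise speed, 5% of c
hop_ly = 10                    # assumed distance between settled systems
settle_time_y = 500            # assumed pause per stop, in years

hop_travel_y = hop_ly / cruise_speed_c            # 200 years per hop
hops = galaxy_diameter_ly / hop_ly                # 10,000 hops to cross
total_y = hops * (hop_travel_y + settle_time_y)   # wavefront crossing time
print(f"~{total_y / 1e6:.0f} million years to cross the galaxy")
```

        Even with a slow cruise speed and long settlement pauses, the wavefront crosses the galaxy in single-digit millions of years, which is tiny against its roughly 13-billion-year age.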

        • Dave@lemmy.nzOPM · 11 months ago

          Yeah, I love the Fermi paradox. What if AI is the great filter? Civilizations eventually build AI that can build better versions of itself, and the result is always that the AI kills the civilization (or some equivalent - say, people stop knowing how things work, the AI eventually breaks in some way, and then people can’t survive in the world built for AI).

          I also like the Dark Forest idea, from the book The Dark Forest, the sequel to The Three-Body Problem. But knowing about it might spoil the book, so I don’t want to explain it here.

          • absGeekNZ@lemmy.nz · 11 months ago

            I know about the dark forest idea, but it is a little flawed (along with most of the solutions): it is predicated on the assumption that ALL civilizations follow the script.

            • Dave@lemmy.nzOPM · 11 months ago

              No, it doesn’t require that, because as soon as one doesn’t follow the script, they poke their metaphorical head up and BAM, get them before they get you. No one will have their head up for long, because they get wiped out.

              • absGeekNZ@lemmy.nz · 11 months ago

                This is the case, but waging interstellar war will necessarily reveal your position to the greater galaxy, in which case you have “poked your head up”. If you assume that there are more than two such civilizations, the war continues until there is only one.

                At some point there are only two; it is unlikely that both will be eliminated to the point that neither can ever rise again.

                Run the thought experiment: assume you are civilization C, and you detect that somewhere “nearby” a civilization (Civ A) sends out a signal. A short time later a great war breaks out. Civ A is utterly eliminated; you detect slagged planets and exploded moons. Whilst you could not directly detect Civ B, you know that it is out there somewhere; you know the time delay and thus can estimate the probable radius within which Civ B could exist.

                You step up your passive detection efforts, and eventually (hundreds of years later) you find Civ B. Now you know that they are doomed, but you need to ensure you don’t meet the same fate. You also know that another civilization may have done exactly the same thing, and is watching Civ B for any sudden change. But you can’t let a known civilization exist when you know that they may find you at any time, and that they will eliminate you as soon as they find you.
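                The radius estimate in that thought experiment can be sketched directly: A’s signal takes d/c years to reach B, and B’s strike takes d/v more, so the delay C observes between the signal and the destruction is dt = d/c + d/v. The delay and strike speeds below are assumed numbers:

```python
# Solve dt = d/c + d/v for d: the distance from Civ A within which Civ B
# must lie. At v -> c this tops out at c*dt/2. All numbers are assumed.
c = 1.0        # speed of light, in light years per year
dt = 600.0     # assumed observed delay between signal and destruction, years

for v in (0.1, 0.5, 1.0):   # assumed strike speeds, as fractions of c
    d = dt * c * v / (c + v)
    print(f"strike at {v:.1f}c -> Civ B within {d:.0f} ly of Civ A")
```

                The slower the assumed strike, the tighter the bound, so even one observed exchange narrows the search volume considerably.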

                • Dave@lemmy.nzOPM · 11 months ago

                  Well, one interesting thing to add to the mix is that signals (like, say, wifi) don’t actually last that long. They dissipate to below detectable levels pretty quickly, appearing as just part of the cosmic microwave background.

                  Therefore if that’s your method of detection, you can’t really detect very far across the galaxy. Maybe a few hundred or a few thousand light years, even with the best technology we could assume would exist in the near-mid future. And the Milky Way is about 90k light years across. And we are on the end of an arm, in a sparse area of the galaxy. Probably the bulk of life is with the bulk of stars towards the centre, and we have pretty much no way of seeing what happens that far away (at a planet level).

                  So probably the inability to know what others are doing would be a big reason why dark forest doesn’t really work.

                  • absGeekNZ@lemmy.nz · 11 months ago

                    This is a problem; but we also don’t know what sensing technologies are available in the future.

                    There is a very interesting sensor that I was reading about; it uses entangled microwave photons. One is sent toward whatever is to be detected in an electromagnetically noisy environment, but the measurement is made on its entangled partner, which sits in a low-noise environment. This allows measurement precision way beyond the noise floor of the measured environment.

                    I’m not sure what breakout tech will come along; I know that beyond about 300 ly, even powerful signals like the original Olympics broadcast in 1936 would be below the noise floor of the CMB. But that is not really a hard limit on detection: GPS signals are below the noise floor, yet they are used every day by billions of people.
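                    A figure like ~300 ly can be sanity-checked with an inverse-square law against a kTB thermal noise floor. The transmitter and receiver below are assumptions (a radar-class transmitter and an Arecibo-sized dish), chosen only to illustrate the shape of the calculation:

```python
import math

# Distance at which received power falls to the thermal noise floor (~kTB).
# All transmitter/receiver numbers are assumed for illustration.
k_B = 1.381e-23    # Boltzmann constant, J/K
T_cmb = 2.725      # CMB temperature, K
B = 1.0            # assumed receiver bandwidth, Hz
eirp = 5e10        # assumed effective isotropic radiated power, W (radar-class)
area = 7.3e4       # assumed collecting area, m^2 (Arecibo-sized dish)
ly = 9.461e15      # metres per light year

noise_w = k_B * T_cmb * B   # thermal noise power at the receiver
# Received power is eirp / (4 pi d^2) * area; set it equal to the noise
# floor and solve for d.
d_max_m = math.sqrt(eirp * area / (4 * math.pi * noise_w))
print(f"signal meets the noise floor at ~{d_max_m / ly:.0f} ly")
```

                    Narrower bandwidths, longer integration times, or bigger dishes all push the range out, which is why a number like 300 ly is soft rather than a hard wall.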

        • deadbeef79000@lemmy.nz · 11 months ago

        This is part of The Fermi Paradox.

        TL;DR: If the universe is ancient and infinite, where is everybody?

          Super AI is one possible answer: everyone creates it, and is subsumed by it.

          If you’ve got effectively unlimited computational power, you just simulate everything rather than having to explore it.

          • Dave@lemmy.nzOPM · 11 months ago

            Whoops, I sorta responded to the wrong post with my thoughts, but yes, maybe life loses all meaning if you have super AI. Or maybe they eventually put you in a version of the matrix after reasoning that happiness is the only meaning of life, so they can optimise our happiness this way.

          Maybe it has already happened.