• pewpew@feddit.it · 1 year ago

    but the I in AI is actually a lowercase L, so it’s short for Algorithm

    • ShortFuse@lemmy.world · 1 year ago

      It kinda annoys me that the lowercase L glyph is taller than capital A. I don’t mind there being a difference, but cap-height should be taller than lowercase letters.

      Illuminati

    • thisisnotgoingwell@programming.dev · 1 year ago

      I’m guessing he’s saying companies are still using the same human-written code, but since AI is sexy right now and gets used to describe even simple programming logic, everything is “powered by AI”.

          • Xylight (Photon dev)@lemmy.xylight.dev · 1 year ago

            (This isn’t my opinion, just saying what I think they mean)

            They are saying it’s not intelligent in any way though. It sees a bunch of words as numbers and spits out some new numbers that the prediction algorithm creates.
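That “words as numbers” picture can be sketched in a few lines. The tiny vocabulary and score table below are made up purely for illustration; real models learn billions of such numbers:

```c
#include <assert.h>

/* Hypothetical 4-word vocabulary; each word is just an integer id. */
enum { THE = 0, CAT = 1, SAT = 2, DOWN = 3, VOCAB = 4 };

/* Made-up "model": scores[prev][next] says how strongly `next` follows `prev`. */
static const double scores[VOCAB][VOCAB] = {
    /* THE   CAT   SAT   DOWN */
    { 0.0,  0.9,  0.05, 0.05 }, /* after THE  */
    { 0.1,  0.0,  0.8,  0.1  }, /* after CAT  */
    { 0.1,  0.1,  0.0,  0.8  }, /* after SAT  */
    { 0.7,  0.1,  0.1,  0.1  }, /* after DOWN */
};

/* "Prediction" is just picking the highest-scoring next id. */
int predict_next(int prev) {
    int best = 0;
    for (int i = 1; i < VOCAB; i++)
        if (scores[prev][i] > scores[prev][best]) best = i;
    return best;
}
```

The point stands either way: the model never “knows” what a cat is; it only emits the id with the highest score.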

            • LoafyLemon@kbin.social · 1 year ago

              What you’re thinking of as AI is actually a narrower version, while true intelligence is termed AGI.

              Explanation:
              The term ‘AI’ (Artificial Intelligence) refers to computer systems that can perform tasks that would typically require human intelligence, like recognizing patterns or making decisions. However, most AI systems are specialized and focused on specific tasks.

              On the other hand, ‘AGI’ (Artificial General Intelligence) refers to a higher level of AI that possesses human-like cognitive abilities. AGI systems would be capable of understanding, learning, and applying knowledge across a wide range of tasks, much like us.

              So, the distinction lies in the breadth of capabilities: AI refers to more specialized, task-focused systems, while AGI represents a more versatile and human-like intelligence.

              • BlinkAndItsGone@lemm.ee · 1 year ago

                The term ‘AI’ (Artificial Intelligence) refers to computer systems that can perform tasks that would typically require human intelligence,

                That’s everything computers do, though, isn’t it? Pocket calculators would have fit this definition of AI in the 1970s. In the '60s, “computer” was a human job title.

                • LoafyLemon@kbin.social · 1 year ago

                  Unless your pocket calculator can recognise patterns or make decisions, it doesn’t fit the description.

            • jadero@programming.dev · 1 year ago

              Fair enough. What evidence have you got that it’s any different than what humans do? Have you looked around? How many people can you point to that are not just regurgitating or iterating or recombining or rearranging or taking the next step?

              As far as I can tell, much of what we call intelligent activity can be performed by computer software and the gaps get smaller every year.

            • Yendor@sh.itjust.works · 1 year ago

              That’s not how ChatGPT works.

              GPT is an LLM that uses an RNN. An RNN (recurrent neural network) is not an algorithm.

                • Yendor@sh.itjust.works · 1 year ago

                  Yeah, but not really. The algorithms are available for free, but they don’t do anything useful by themselves. The RNN is built by training the neural net, which uses grading/classification of training data to increase or decrease millions of coefficients in a multi-layer filter. It’s the training data, the classification feedback, and the processing power that actually create the AI.
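A toy version of that grade-and-nudge training loop, shrunk from millions of coefficients to a single one fitting y = 2x (the data and learning rate are illustrative, not from any real model):

```c
#include <assert.h>
#include <math.h>

/* Train one "coefficient" w to fit y = 2*x from graded examples. */
double train(int steps, double lr) {
    double w = 0.0;                             /* start from an arbitrary weight */
    double xs[] = {1.0, 2.0, 3.0};
    double ys[] = {2.0, 4.0, 6.0};              /* the "training data" */
    for (int s = 0; s < steps; s++) {
        for (int i = 0; i < 3; i++) {
            double err = w * xs[i] - ys[i];     /* grade the prediction */
            w -= lr * err * xs[i];              /* nudge the coefficient */
        }
    }
    return w;
}
```

The same shape of loop, repeated over millions of coefficients and billions of examples, is where all the compute goes.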

    • obosob@feddit.uk · 1 year ago

      Yeah, just use a char for the card and test

      if (card < '7') count++;      /* 2-6 are the low cards */
      else if (card > '9') count--; /* T, J, Q, K, A are the high cards */
      /* 7-9 are neutral in Hi-Lo */
      

      Or something, don’t mix types.

        • obosob@feddit.uk · 1 year ago

          Chars are just numbers, but yeah, an enum would work fine too, sure. The only advantage of using a char is that no conversion is needed when outputting them into strings, so it’s a little easier. Less code, very readable, etc. Though, thinking about it, J/Q/K/A wouldn’t be numerically in the right order as chars, which could cause issues if the program did more than just implement Hi-Lo.
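One way to sidestep the J/Q/K/A ordering wrinkle is to spell out the Hi-Lo buckets explicitly instead of relying on char order. A sketch, with 'T' standing in for ten:

```c
#include <assert.h>
#include <string.h>

/* Hi-Lo value of one card rank ('2'-'9', 'T', 'J', 'Q', 'K', 'A'). */
int hilo(char card) {
    if (card >= '2' && card <= '6') return +1; /* low cards */
    if (card >= '7' && card <= '9') return  0; /* neutral */
    return -1;                                 /* T, J, Q, K, A */
}

/* Running count over a string of dealt card ranks. */
int running_count(const char *cards) {
    int count = 0;
    for (size_t i = 0; i < strlen(cards); i++)
        count += hilo(cards[i]);
    return count;
}
```

Because face cards and the ace all land in the same -1 bucket, their numeric char order never matters here.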

  • Lil' Bobby Tables@programming.dev · 1 year ago

    It’s the litmus test for programmers: when they start referring to “AI” as a clearly defined concept, you know they’re artlessly making shit up for a quick buck.

    • rho_@lemmy.world · 1 year ago

      Half of it. This gives you the running count. You also need to keep track of (number of decks in the shoe) − (number of cards dealt since the last shuffle) / 52, which tells you how many decks are left in the shoe; then divide the running count by the decks left to get the true count.

      True count higher than 1? Start increasing your bet accordingly.
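The arithmetic in the comment above, sketched as code (the shoe size and card counts in the comments are just example numbers):

```c
#include <assert.h>

/* Decks left in the shoe: decks in the shoe minus cards dealt / 52. */
double decks_left(int decks_in_shoe, int cards_dealt) {
    return decks_in_shoe - cards_dealt / 52.0;
}

/* True count = running count divided by decks remaining. */
double true_count(int running, int decks_in_shoe, int cards_dealt) {
    return running / decks_left(decks_in_shoe, cards_dealt);
}
```

For example, a running count of 8 with 104 cards dealt from a 6-deck shoe leaves 4 decks, so the true count is 2: time to bet bigger.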