• peopleproblems@lemmy.world · ↑16 ↓1 · 4 hours ago

    10 to 30? Yeah I think it might be a lot longer than that.

    Somehow everyone keeps glossing over the fact that you have to have enormous amounts of highly curated data to feed the trainer in order to develop a model.

    Curating data for general purposes is incredibly difficult. The big medical research universities have been working on it for at least a decade, and the tools they have developed, while cool, are only useful as tools to a doctor who has learned how to use them. They can speed up diagnostics and they can improve patient outcomes. But they cannot replace anything in the medical setting.

    The AI we have is like fancy signal processing, at best.

    • RBG@discuss.tchncs.de · ↑7 · 1 hour ago

      Not an expert so I might be wrong, but as far as I understand it, those specialised tools you describe are not even AI. It is all machine learning. Maybe to the end user it doesn't matter, but people have this idea of an intelligent machine when it's more like brute-force feeding of information into a model.

      • RecluseRamble@lemmy.dbzer0.com · ↑4 · 53 minutes ago

        Don’t say AI when you mean AGI.

        By definition AI (artificial intelligence) is any algorithm by which a computer system automatically adapts to and learns from its input. That definition also covers conventional algorithms that aren’t even based on neural nets. Machine learning is a subset of that.

        AGI (artificial general intelligence) is the thing you see in movies, the thing people project onto their LLM responses, and what's driving this bubble. It is the final goal: a system able to perform any task a human can, at or above human level. Pretty much all the actual experts agree we're a long way from such a system.

    • ContrarianTrail@lemm.ee · ↑3 ↓1 · 2 hours ago

      LLMs are not the only type of AI out there. ChatGPT appeared seemingly out of nowhere. Who's to say the next AI system won't do the same?

      • Vritrahan@lemmy.zip · ↑3 ↓1 · 1 hour ago

        Anything can happen. We can discover time travel tomorrow. The economy cannot run on wishful thinking.

        • lennivelkant@discuss.tchncs.de · ↑1 · 11 minutes ago

          It can! For a while. Isn’t that the nature of speculation and speculative bubbles? Sure, they may pop some day, because we don’t know for sure what’s a bubble and what is a promising market disruption. But a bunch of people make a bunch of money until then, and that’s all that matters.

  • poo@lemmy.world · ↑136 ↓9 · 9 hours ago

    No bubble has deserved to pop as much as AI deserves to

    • misk@sopuli.xyz (OP) · ↑112 ↓8 · 9 hours ago

      Blockchain and crypto were worse. “AI” has some actual use even if it's way overblown.

      • Flying Squid@lemmy.world · ↑33 ↓1 · 7 hours ago

        I’m glad you didn’t say NFTs because my Bored Ape will regain and triple its value any day now!

        • protist@mander.xyz · ↑3 · 3 hours ago

          Bro the GME short squeeze is going to hit any day now. We’re going to be millionaires bro, you just wait

      • SkyezOpen@lemmy.world · ↑39 · 8 hours ago

        Creating a specialized neural net to perform a specific function is cool. Slapping GPT into customer support because you like money is horse shit and I hope your company collapses. But yeah you’re right. Blockchain was a solution with basically no problems to fix. Neural nets are a tool that can do a ton of things, but everyone is content to use them as a hammer.

        • astronaut_sloth@mander.xyz · ↑17 · 6 hours ago

          Yes! “AI” defined as only LLMs and the party trick applications is a bubble. AI in general has been around for decades and will only continue to grow.

        • Graphy@lemmy.world · ↑13 · 7 hours ago

          Honestly kinda miss when the drugs I did were illegal. I used to buy weed from this online seller that was really into designer drugs. The amount of time I used to spend on Erowid just to figure out wtf I was about to take.

      • _bcron_@lemmy.world · ↑11 ↓1 · 8 hours ago

        I’m not even understanding what AI is at this point because there’s no delineation between moderately sophisticated algorithms and things that are orders of magnitude more complex.

        I mean, if something like multisampling came out today, we'd all know how it'd be marketed.

        • SlopppyEngineer@lemmy.world · ↑7 · 8 hours ago

          AI is a ridiculously broad term these days. Everybody has been slapping the label on anything. It's kinda like saying "transportation" and meaning anything from babies crawling up to warp drive and teleportation.

        • catloaf@lemm.ee · ↑4 ↓1 · 7 hours ago

          The AI buzzword means machine learning. You give it a massive dataset and it identifies correlations.

          Regular hand-coded AI is mostly simple state machines.
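
          A toy illustration of that split (purely hypothetical code, not from any real product): the first function below is the hand-coded, state-machine kind of "AI", where a person wrote every rule; the second "learns" in the simplest possible sense, by fitting a correlation from example data.

          ```python
          # Hand-coded "AI": a fixed state machine. Every behaviour was written by a person.
          PATROL, CHASE, FLEE = "patrol", "chase", "flee"

          def npc_next_state(sees_player, low_health):
              if low_health:
                  return FLEE
              if sees_player:
                  return CHASE
              return PATROL

          # Machine learning, minimal form: fit y = a*x + b from example data by
          # ordinary least squares, i.e. let the data choose the parameters.
          def fit_line(xs, ys):
              n = len(xs)
              mean_x, mean_y = sum(xs) / n, sum(ys) / n
              a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
                  / sum((x - mean_x) ** 2 for x in xs)
              return a, mean_y - a * mean_x

          print(npc_next_state(sees_player=True, low_health=False))    # chase
          print(fit_line([1, 2, 3, 4], [2.1, 3.9, 6.2, 8.1]))          # ~ (2.0, 0.0)
          ```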

      • confusedbytheBasics@lemm.ee · ↑19 ↓13 · 8 hours ago

        Blockchain has many valuable uses. A distributed zero-trust ledger is useful. Sadly, the finance scammers and the digital beanie baby collectors attracted all the marketing money.

        • Voroxpete@sh.itjust.works · ↑37 ↓3 · 8 hours ago

          And yet, every single company that has ever tried to implement a distributed zero-trust ledger into their products and processes has inevitably ditched the idea after realizing that it does not, in fact, provide any useful benefit.

          • WhatAmLemmy@lemmy.world · ↑12 ↓4 · edited · 2 hours ago

            It is exceptionally useful for the auditing of damn near everything in digital space, as long as shared resources and 3rd parties have access to the blockchain … which is probably the major reason corporations and politicians don’t want anything to do with it.

            It’d be a lot harder to hide crimes, fraud, grey business dealings, bribery and illegal donations, sanction violations, secret police slush funds, etc, etc if every event in the entire financial system and supply chain was logged and cryptographically verifiable.

            EDIT: NOTE I'm not talking about everyone's transactions being in a public ledger (bad). Only enhancing the current system between businesses and orgs so it's exceptionally difficult for any of them to falsify data without the others knowing, as well as having near-instant visibility and analytics of the entire market (great for regulators, academics, etc.).

            A supply-chain-wide blockchain could let individuals view every raw material that went into every product they consume, down to the location, the date, and in many cases the exact time each was mined, refined, harvested, transported, picked, traded, and so on, in a way that no individual corporation could hide or dramatically falsify. Each corporation's and individual's true (embodied) energy consumption would be visible to every buyer; developed-world politicians and corporations couldn't simply blame China and other developing countries for their own consumption.
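
            To make "cryptographically verifiable" concrete, here's a toy sketch of the hash chaining such a ledger relies on (hypothetical Python, no real blockchain stack, and it deliberately leaves out the distributed-consensus part, which is the genuinely hard bit):

            ```python
            import hashlib
            import json

            def entry_hash(prev_hash, record):
                # Each entry's hash covers the previous hash plus its own contents.
                payload = json.dumps(record, sort_keys=True).encode()
                return hashlib.sha256(prev_hash.encode() + payload).hexdigest()

            def append(ledger, record):
                prev = ledger[-1]["hash"] if ledger else "genesis"
                ledger.append({"record": record, "prev": prev,
                               "hash": entry_hash(prev, record)})

            def verify(ledger):
                prev = "genesis"
                for entry in ledger:
                    if entry["prev"] != prev or entry["hash"] != entry_hash(prev, entry["record"]):
                        return False
                    prev = entry["hash"]
                return True

            ledger = []
            append(ledger, {"event": "ore mined", "site": "A", "tonnes": 12})
            append(ledger, {"event": "shipped", "carrier": "B", "tonnes": 12})
            print(verify(ledger))                 # True
            ledger[0]["record"]["tonnes"] = 10    # someone quietly edits history...
            print(verify(ledger))                 # False: the tampered entry no longer matches its hash
            ```

            Recomputing the tampered entry's hash would in turn break the next entry's link, and so on down the chain; it's the copies held by independent parties that stop a single org from just recomputing everything, which is exactly the part that needs buy-in.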

            • Voroxpete@sh.itjust.works · ↑13 · edited · 4 hours ago

              The reason major businesses haven’t bothered using distributed blockchains for auditing is because they fundamentally do not actually help in any way with auditing.

              At the end of the day, the blockchain is just a ledger. At some point a person has to enter the information into that ledger.

              Now, hear me out here, because this is going to be some totally out there craziness that is going to blow your mind… What happens if that person lies?

              Like, you’ve built your huge, complicated system to track every banana you buy from the farm to the grocery store… But what happens if the shipper just sends you a different crate of bananas with the wrong label on them? How does your system solve that? What happens if the company growing your bananas claims to use only ethical practices but in reality their workers are effectively slaves? How does a blockchain help fix that?

              The data in a system is only as good as your ability to verify it. Verifying the integrity of the data within systems was largely a solved problem long before distributed blockchains came along, and was rarely if ever the primary avenue for fraud. It’s the human components of these systems where fraud can most easily occur. And distributed blockchains do absolutely nothing to solve that.

        • Thrashy@lemmy.world · ↑10 · 7 hours ago

          The idea has merit, in theory – but in practice, in the vast majority of cases, having a trusted regulator managing the system, who can proactively step in to block or unwind suspicious activity, turns out to be vastly preferable to the “code is law” status quo of most blockchain implementations. Not to mention most potential applications really need a mechanism for transactions to clear in seconds, rather than minutes to days, and it’d be preferable if they didn’t need to boil the oceans dry in the process of doing so.

          If I were really reaching, I could maybe imagine a valid use case for, say, a hypothetical federated open-source game that needed a trusted way for every node to validate the creation and trading of loot and items, as a layer of protection against cheating nodes duping them. But that's insanely niche, and for nearly every other use case a database held by a trusted entity is faster, simpler, safer, more efficient, and easier to manage.

      • brucethemoose@lemmy.world · ↑4 · 8 hours ago

        Yes. But companies have bought into AI far more than they ever bought into crypto, in many outlandish and stupid ways. And many AI companies sell it in ways they shouldn't.

    • DarkCloud@lemmy.world · ↑5 ↓18 · 8 hours ago

      Try Venice AI: free to use, and it won't try to censor your topics. Still just a chatbot, though (although I think it does image generation too).

        • DarkCloud@lemmy.world · ↑3 ↓2 · edited · 3 hours ago

          The part where they were saying they don’t like the current AIs they know about. Showing disapproval of the trend.

  • brucethemoose@lemmy.world · ↑38 ↓2 · edited · 9 hours ago

    As a major locally-hosted AI proponent, aka a kind of AI fan, absolutely. I’d wager it’s even worse than crypto, and I hate crypto.

    What I'm kinda hoping happens is that BitNet takes off in the next few months or years, and that running a very smart model on a phone or desktop ends up taking milliwatts… Who's gonna buy into Sam Altman's $7 trillion cloud scheme to burn the Earth when anyone can run models offline on their phone, instead of hitting APIs running on multi-kilowatt servers?

    And ironically it may be a Chinese company like Alibaba that pops the bubble, lol.

  • MyOpinion@lemm.ee · ↑16 ↓1 · 9 hours ago

    Not shocked. It seems the tech bros like to troll us every few years.

    • NaibofTabr@infosec.pub · ↑24 · 8 hours ago

      The tech bros are selling, but it’s the VCs that are fueling this whole thing. They’re grasping for the next big thing. Mostly they don’t care if any of it actually works, as long as they can pump share value and then sell before it collapses.

    • sunzu2@thebrainbin.org · ↑5 ↓3 · 9 hours ago

      They've been trying to repeat the big-tech rise…

      But each generation is more limp dick:

      Uber/Airbnb > crypto > AI

  • LemmyBe@lemmy.world · ↑8 ↓1 · edited · 8 hours ago

    Checks to see if Baidu is doing AI…yes, they are. How shocking.

    • tal@lemmy.today · ↑3 · 5 hours ago

      “probably 1% of the companies will stand out and become huge and will create a lot of value, or will create tremendous value for the people, for society. And I think we are just going through this kind of process.”

      Baidu is huge. Sounds like good news for Baidu!

  • DarkCloud@lemmy.world · ↑4 ↓7 · edited · 8 hours ago

    I think less restrictive, free AI services like Venice AI (you can ask it pretty much anything and it won't stop you) will be around longer than the ones that went with restrictive subscription models, and that eventually those others will become niche.

    New technology always propagates further the freer it is to use and experiment with, and ChatGPT and OpenAI are quite restrictive and money-hungry.