• deegeese@sopuli.xyz · +278/-2 · 23 days ago

    Guy who buys programmers and sells AI thinks he can sell more AI and stop buying programmers.

    This is up there with Uber pretending self driving cars will make them rich.

    • MeekerThanBeaker@lemmy.world · +47/-5 · 23 days ago

      I mean… self driving cars probably will. Just not as soon as they think. My guess, at least another decade.

      • IphtashuFitz@lemmy.world · +70/-3 · 23 days ago

        Not until a self driving car can safely handle all manner of edge cases thrown at it, and I don’t see that happening any time soon. The cars would need to be able to recognize situations that may not be explicitly programmed into them, and figure out a safe way to deal with them.

        • massive_bereavement@fedia.io · +50 · 23 days ago

          As someone said on this thread: as soon as they can convince legislators, even if they are murder machines, capital will go for it.

          Borrowing from my favorite movie: “it’s just a glitch”.

          • IphtashuFitz@lemmy.world · +10 · 23 days ago

            I doubt it. The liability would be far too great. Ambulance chasing lawyers would salivate at the chance to represent the families of pedestrians struck and killed by buggy self driving cars. Those capitalists don’t want endless years of class action cases tying up their profits.

            • Nommer@sh.itjust.works · +21 · 23 days ago

              When was the last time a corporation got anything other than a slap on the wrist and a small donation to the government just so they could keep doing what they’re doing?

              • atrielienz@lemmy.world · +10 · edited · 23 days ago

                Like Boeing. As much as I hate people saying dumb shit about a company they don’t know much of anything about, Boeing is the epitome of what you said: a company getting a small slap on the wrist for gross negligence in the name of profit, especially because of all the goodies they develop for the US federal government. And since they are a worldwide company, ours isn’t the only government involved. They know they sit in a place of power because they fill a hole in an industry that basically has to be filled. And people want to try to bankrupt them with some weird ideas about voting with their dollar. But that’s nonsense.

                People don’t understand how they build planes not to sell but to lease, and how these leases keep their customers paying out the nose for an asset they don’t own, while staying responsible for the maintenance of that asset until it’s time to upgrade. They cornered the market on enshittification long before the likes of Microsoft and Google, and they have mastered the art of it.

                Tesla or Uber or whoever wish they could do what Boeing has been doing for decades. People have this rose-tinted view of what Boeing “used to be” when it was “run by engineers” etc. That’s hilarious to me. Back in the day they raced to develop a two-engined plane that wouldn’t catastrophically fail and fall out of the sky if it lost an engine, so they could skirt international regulations that required planes to have more than two engines. Those extra engines added upkeep and fuel costs, making air travel incredibly expensive. Their engineers managed it, and that long game let them build planes that were more fuel efficient and cheaper to maintain, meaning their customers could afford to buy more of them and provide air travel to more people.

                You know what we got from that? Shittier seating arrangements, poorly manufactured planes, and baggage fees out the wazoo, in addition to ever-rising ticket prices for air travel.

          • meco03211@lemmy.world · +5 · 23 days ago

            Alternatively, measures could be put in place to eliminate certain edge cases. You can see similar concepts in places with separate infrastructure for things like buses or HOV lanes: places where you could still ostensibly allow “regular” vehicles to travel but limit or eliminate pedestrians and merging.

        • Num10ck@lemmy.world · +37/-1 · 23 days ago

          there will be a massive building in like india with many thousands of atrociously paid workers donning VR goggles, who spend their long hours constantly Quantum Leap-ing into traumatizing last-second emergency situations that the AI gives up on. Instantly they slam on the brakes as hard as they can. They drink tea. there’s suicide netting everywhere. they were the lowest bidder this quarter.

          • kronisk @lemmy.world · +2 · 23 days ago

            I wish I could give this comment more than a simple upvote. I want to mail you a freshly baked cinnamon bun.

        • KevonLooney@lemm.ee · +7 · 23 days ago

          Plus, as soon as the cars can drive themselves people will stop needing Uber in many cases.

          No parking? Just tell your car to go park on a street 10 blocks away.

          Drunk? Car drives itself while you sleep.

          Going to the airport? Car drops you off and returns home. Car also picks you up when you are back.

          This is combined with the fact that people will do more disgusting things in an Uber without the driver there. If you have ever driven for Uber, you know that 10% of people are trying to eat or drink in the car. They are going to spill and it’s going to end up like the back of a bus.

          • yildolw@lemmy.world · +5 · edited · 23 days ago

            Not sure if we’re agreeing and saying exactly the same thing here, but Uber’s business model is to get suckers who are bad at math to own the cars. Uber’s business model does not work if they have to own their own cars. Self-driving Uber doesn’t work because Uber would have to own the cars and therefore has to cover vehicle insurance, vehicle depreciation, and so on out of its own margin.

        • cogitase@lemmy.dbzer0.com · +8/-2 · 23 days ago

          Their accident rate continues to decrease and things like quorum sensing and platooning are going to push them to be better than humans. You’re never going to have a perfect system that never has accidents, but if you’re substantially better than humans in accidents per mile driven and you’re dramatically improving throughput and reducing traffic through V2X, it’s going to make sense to fully transition.

          I imagine some east Asian countries will be the first to transition and then the rest of the world will begrudgingly accept it once the advantages become clear and the traditional car driving zealots die off.

          • AtomicTacoSauce@lemmy.world · +7/-1 · 23 days ago

            The robot taxi from Total Recall came to mind while reading your reply. Our future is almost assuredly dystopian.

        • Nollij@sopuli.xyz · +4/-1 · 23 days ago

          “handle” is doing a lot of heavy lifting there. The signs are already there that all of these edge cases will just be programmed as “safely pull over and stop until conditions change or a human takes control”. Which isn’t a small task in itself, but it’s a lot easier than figuring out how to continue (e.g.) on ice.

        • pbbananaman@lemmy.world · +3/-2 · 23 days ago

          Just like all humans can do right now, right?

          I never see any humans on the road staring at their phone and driving like shit.

          • Wilzax@lemmy.world · +3 · 23 days ago

            The problem with self-driving cars isn’t that it’s worse than human drivers on average, it’s that it’s SO INCREDIBLY BAD when it’s wrong that no company would ever assume the liability for the worst of its mistakes.

            • pbbananaman@lemmy.world · +2 · 23 days ago

              But if the average is better, then we will clearly win by using it. I’m not following the logic of tracking the worst case scenarios as opposed to the average.

              • Wilzax@lemmy.world · +1 · 23 days ago

                Average is better means fewer incidents overall. But when there are incidents, the damages for those incidents tend to be much worse. This means the victims are more likely to lawyer up and go after the company responsible for the AI that was driving, and that means that the company who makes the self-driving software better be prepared to pay for those worst case scenarios, which will now be 100% their fault.

                Uber can avoid liability for crashes caused by their human drivers. They won’t be able to do the same when their fleet is AI. And when that happens, AI sensibilities will be measured by human metrics, because courts are run by humans. The mistakes they make will be VERY expensive ones, because a minor glitch can turn an autonomous vehicle from the safest driving experience possible into a rogue machine with zero sense of self-preservation. That liability is not worth the cost savings of getting rid of human drivers yet, and it won’t be for a very long time.

        • rottingleaf@lemmy.world · +2/-1 · 23 days ago

          Those self-driving cars are called trains. They already can be self-driving. In a situation where the computational complexity and required precision are somewhat controlled, that is, on train tracks.

      • NOT_RICK@lemmy.world · +8/-1 · 23 days ago

        Maybe, or maybe like harnessing fusion it will always be “just a few more years away!”

      • deegeese@sopuli.xyz · +5 · 23 days ago

        Self driving taxis are definitely happening, but the people getting rich in a gold rush are the people selling shovels.

        Uber has no structural advantage because their unique value proposition is the army of cheap drivers.

        • yildolw@lemmy.world · +4 · 23 days ago

          We’re a century away from self-driving cars that can handle snowfall

          Just this year, farmers with self-driving tractors got screwed because a solar flare made GPS inaccurate, and the tractors went wild because they were programmed on the assumption that GPS is 100% reliable and accurate, with no way to override.

      • med@sh.itjust.works · +2 · 22 days ago

        I’m right there with you, but I also remember hearing that this time last decade.

  • trolololol@lemmy.world · +137/-1 · 23 days ago

    I hope this helps people understand that you don’t get to be CEO by being smart or working hard. It’s all influence and gossip all the way up.

      • trolololol@lemmy.world · +17/-2 · 23 days ago

        Yep, if I had that kind of money and were surrounded by like-minded people, I’d agree. Unfortunately I’m cursed with a rational mind 🙃🙃🙃

  • Hackworth@lemmy.world · +104/-1 · 23 days ago

    “Coding” was never the source of value, and people shouldn’t get overly attached to it. Problem solving is the core skill. The discipline and precision demanded by traditional programming will remain valuable transferable attributes, but they won’t be a barrier to entry. - John Carmack

    • ASDraptor@lemmy.autism.place · +60 · 23 days ago

      This right here.

      Problem is not coding. Anybody can learn that with a couple of well focused courses.

      I’d love to see an AI find the cause of a catastrophic crash of a machine that isn’t caused by a software bug.

    • shalafi@lemmy.world · +2 · 22 days ago

      Agreed! Problem solving is core to any sort of success. Whether you’re moving up or on for more pay, growing tomatoes or nurturing a relationship, you’re problem solving. But I can see AI putting the screws to those of us in tech.

      Haven’t used it much so far, last job didn’t afford much coding opportunity, but I wrote a Google Apps script to populate my calendar given changes to an Excel sheet. Pretty neat!

      With zero experience App scripting, I tried going the usual way, searching web pages. Got it half-ass working, got stuck. Asked ChatGPT to write it and boom, solved with an hour’s additional work.

      You could say, “Yeah, but you at least had a clue as to general scripting and still had to problem solve. Plus, you came up with the idea in the first place, not the AI!” Yes! But the point is, AI made the task shockingly easier. That was at a software outfit, so I had the opportunity to chat with my dev friends and see what they were up to. They were properly skeptical/realistic as to what AI can do, but they still used it to great effect.

      Another example: Struggled like hell to teach myself database scripting, so ignorant I didn’t know the words to search and the solutions I found were more advanced answers than my beginner work required (or understood!). First script was 8 short lines, took 8 hours. Had AI been available to jump start me, I could have done that in an hour, maybe two. That’s a wild productivity boost. So while AI will never make programmers obsolete, we’ll surely need fewer of them.

  • L0rdMathias@sh.itjust.works · +79/-1 · 23 days ago

    “Guy who was fed a pay-to-win degree at a nepotism practicing school with a silver spoon shares fantasy, to his fan base that own large publications, about replacing hard working and intelligent employees with machines he is unable to comprehend the most basic features of”

  • Boozilla@lemmy.world · +77 · 23 days ago

    They’ve been saying this kind of bullshit since the early 90s. Employers hate programmers because they are expensive employees with ideas of their own. The half-dozen elite lizard people running the world really don’t like that kind of thing.

    Unfortunately, I don’t think any job is truly safe forever. For myriad reasons. Of course there will always be a need for programmers, engineers, designers, testers, and many other human-performed jobs. However, that will be a rapidly changing landscape and the number of positions will be reduced as much as the owning class can get away with. We currently have large teams of people creating digital content, websites, apps, etc. Those teams will get smaller and smaller as AI can do more and more of the tedious / repetitive / well-solved stuff.

    • abcd@feddit.org · +6 · 23 days ago

      I’m relaxed. IMHO this is just another trend.

      In all my career I haven’t seen a single customer who could tell me out of the box what they need. A big part of my job is talking to all parties to get the big picture: gathering information about software and hardware interfaces, visiting places to see PHYSICAL things like sub-processes or machines.

      My focus may be shifted to less coding in an IDE and more of generating code with prompts to use AI as what it is: a TOOL.

      I’m annoyed of this mentality of get rich quick, earn a lot of money with no work, develop software without earning the skills and experience. It’s like using libraries for every little problem you have to solve. Worst case you land in dependency/debug hell and waste much more time debugging stuff other people wrote than coding it by yourself and understanding how the things work under the hood.

    • SlopppyEngineer@lemmy.world · +6/-2 · 23 days ago

      And by that time, processors and open source AI are good enough that any noob can ask his phone to generate a new app from scratch. You’d only need big corpo for cloud storage and then only when distributed systems written by AI don’t work.

    • shalafi@lemmy.world · +2 · 22 days ago

      the number of positions will be reduced as much as the owning class can get away with

      Well, after all, you don’t hire people to do nothing. It’s simply a late-stage capitalism thing. Hopefully one day we can take the benefits of that extra productivity and share the wealth. The younger generations seem like they might move us that way in the coming decades.

      • Boozilla@lemmy.world · +1 · 22 days ago

        I really hope so. Sometimes I think the kids are alright. Like the 12 year old owning the My Pillow idiot. Then I hear the horror stories from my school teacher friends.

  • EnderMB@lemmy.world · +52 · 23 days ago

    It’s worth noting that the new CEO is one of the few people at Amazon to have worked their way up from PM and sales to CEO.

    With that in mind, while it’s a hilariously stupid comment to make, he’s in the business of selling AWS and its role in AI. Take it with the same level of credibility as that crypto scammer you know telling you that Bitcoin is the future of banking.

    • mycodesucks@lemmy.world · +16 · 23 days ago

      PM and sales, eh?

      So you’re saying his lack of respect for programmers isn’t new, but has spanned his whole career?

    • Squizzy@lemmy.world · +11/-1 · 23 days ago

      As a wage slave with no bitcoin or crypto, I’d say the technology has been hijacked by these types; it could otherwise have been useful.

      • EnderMB@lemmy.world · +3/-1 · 23 days ago

        I’m not entirely sold on the technology, especially since immutable ledgers existed long before the blockchain, and also due to potential attack vectors and the natural push towards centralisation for many applications. But I’m just one man, and if people find uses for it, good for them.

        • khaleer@sopuli.xyz · +2 · 23 days ago

          I guess an additional bonus for crypto would be not burning the planet, and actually having real value, not an imagined one.

        • eleitl@lemm.ee · +3/-1 · 23 days ago

          What other solutions to double spending were there in financial cryptography before?

          • EnderMB@lemmy.world · +3/-1 · 23 days ago

            No idea, I don’t work in fintech, but was it a fundamental problem that required a solution?

            I’ve worked with blockchain in the past, and the uses where it excelled were in immutable bidding contracts for shared resources between specific owners (e.g. who uses this cable at x time).

            • eleitl@lemm.ee · +4/-1 · edited · 23 days ago

              Fully decentralized p2p cryptocurrency transactions without double spending, using proof of work (an improvement upon Hashcash), were first done with Bitcoin. The term fintech did not exist at the time. EDIT: looked it up, apparently the first use as Fin-Tech was 1967 https://en.wikipedia.org/wiki/Fintech – it’s not the current use of the term though.

    • Eril@feddit.org · +5/-1 · 23 days ago

      When I last tried to let some AI write actual code, it didn’t even compile 🙂 And another time when it actually compiled it was trash anyway and I had to spend as much time fixing it, as I would have spent writing it myself in the first place.

      So far I can only use AI as a glorified search engine 😅

  • jubilationtcornpone@sh.itjust.works · +47 · 23 days ago

    Just the other day, the Mixtral chatbot insisted that PostgreSQL v16 doesn’t exist.

    A few weeks ago, Chat GPT gave me a DAX measure for an Excel pivot table that used several DAX functions in ways that they could not be used.

    The funny thing was, it knew and could explain why those functions couldn’t be used when I corrected it. But it wasn’t able to correlate and use that information to generate a proper function. In fact, I had to correct it for the same mistakes multiple times and it never did get it quite right.

    Generative AI is very good at confidently spitting out inaccurate information in ways that make it sound like it knows what it’s talking about to the average person.

    Basically, AI is currently functioning at the same level as the average tech CEO.

  • mojo_raisin@lemmy.world · +46 · 22 days ago

    The job of CEO seems the far easier to replace with AI. A fairly basic algorithm with weighted goals and parameters (chosen by the board) + LLM + character avatar would probably perform better than most CEOs. Leave out the LLM if you want it to spout nonsense like this Amazon Cloud CEO.

  • yesman@lemmy.world · +36/-2 · 23 days ago

    I just want to remind everyone that capital won’t wait until AI is “as good” as humans, just when it’s minimally viable.

    They didn’t wait for self-checkout to be as good as a cashier; They didn’t wait for chat-bots to be as good as human support; and they won’t wait for AI to be as good as programmers.

      • SlopppyEngineer@lemmy.world · +9/-1 · edited · 23 days ago

        They’ll try the opposite. It’s what the movie producers did to the writers. They gave them AI generated junk and told them to fix it. It was basically rewriting the whole thing but because now it was “just touching up an existing script” it was half price.

        • xtr0n@sh.itjust.works · +2 · 22 days ago

          They can try. But cleaning up a mess takes a while, and there’s no magic wand to make it go faster.

        • Eager Eagle@lemmy.world · +1 · 23 days ago

          Yeah they’ll try. Surely that can’t cascade into a snowball of issues. Good luck for them 😎

          • SlopppyEngineer@lemmy.world · +2 · 23 days ago

            A strike with tech workers would be something else. Curious what would happen if the ones maintaining the servers for entertainment, the stock market, or factories just walked out. On the other hand, tech doesn’t have unions.

      • peopleproblems@lemmy.world · +7 · 23 days ago

        You better fucking believe it.

        AI is going to be the new outsourcing, only cheaper than outsourcing and probably less confusing for us to fix.

    • AmbiguousProps@lemmy.today · +17 · edited · 23 days ago

      They won’t, and they’ll suffer because of it and want to immediately hire back programmers (who can actually do problem solving for difficult issues). We’ve already seen this happen with customer service reps - some companies have resumed hiring customer service reps because they realized AI isn’t able to do their jobs.

    • SlopppyEngineer@lemmy.world · +9 · 23 days ago

      And because of all the theft and malfunctions, the nearby supermarkets replaced self-checkout with normal cashiers again.

      If it’s AI doing all the work, the responsibility falls on the remaining humans. There’ll be interesting lawsuits when the inevitable bug appears that the AI itself can’t figure out.

      • atrielienz@lemmy.world · +5 · 23 days ago

        We saw this happen with Amazon’s cashier-less stores. They were actively trying to use a computer-based AI system, but it didn’t work without thousands of man-hours from real humans, which is why those stores are going away. Companies will try this repeatedly until they get something that works or they run out of money. The problem is, some companies have cash to burn.

        I doubt the vast majority of tech workers will be replaced by AI any time soon. But they’ll probably keep trying because they really really don’t want to pay human beings a liveable wage.

    • shalafi@lemmy.world · +1 · edited · 22 days ago

      Already happening. Cisco just smoked another 4,000 employees. And anecdotally, my tech job hunt is, for the first time, not going so hot.

  • Feyd@programming.dev · +31 · 22 days ago

    Meanwhile, LLMs are less useful at helping me write code than IntelliJ was a decade ago.

    • tzrlk@lemmy.world · +13 · 22 days ago

      I’m actually really impressed with the autocomplete IntelliJ is packaged with now. It’s really good with golang (probably because golang has a ton of code duplication).

  • werefreeatlast@lemmy.world · +26 · 23 days ago

    AI is terrible at solving real problems thru programming. As soon as the problem is not technical in nature and needs a decision to be made based on experience, it falls flat on its face.

  • SparrowRanjitScaur@lemmy.world · +27/-2 · edited · 23 days ago

    Extremely misleading title. He didn’t say programmers would be a thing of the past, he said they’ll be doing higher level design and not writing code.

    • Tyfud@lemmy.world · +34/-1 · 23 days ago

      Even so, he’s wrong. This is the kind of stupid thing someone without any first hand experience programming would say.

      • rottingleaf@lemmy.world · +6 · 23 days ago

        Yeah, there are people who can “in general” imagine how this will happen, but programming is exactly 99% not about “in general” and all about specific “dumb” conflicts with objective reality.

        People think that what they generally imagine as the task is the most important part, and since they don’t actually do programming or anything requiring them to deal with those small details, they simply ignore them, because those conversations and opinions exist in a subjective, bendable reality.

        But objective reality doesn’t bend. Their general ideas without every little bloody detail simply won’t work.

      • SparrowRanjitScaur@lemmy.world · +3/-10 · 23 days ago

        Not really, it’s doable with chatgpt right now for programs that have a relatively small scope. If you set very clear requirements and decompose the problem well it can generate fairly high quality solutions.

        • Tyfud@lemmy.world · +18/-2 · 23 days ago

          This is incorrect. And I’m in the industry. In this specific field. Nobody in my industry, in my field, at my level, seriously considers this effective enough to replace their day to day coding beyond generating some boiler plate ELT/ETL type scripts that it is semi-effective at. It still contains multiple errors 9 times out of 10.

          I cannot be more clear. The people who are claiming that this is possible are not tenured or effective coders, much less 10x devs in any capacity.

          People who think it generates quality enough code to be effective are hobbyists, people who dabble with coding, who understand some rudimentary coding patterns/practices, but are not career devs, or not serious career devs.

          If you don’t know what you’re doing, LLMs can get you close, some of the time. But there’s no way it generates anything close to quality enough code for me to use without the effort of rewriting, simplifying, and verifying.

          Why would I want to voluntarily spend my day trying to decipher someone else’s code? I don’t need ChatGPT to solve a coding problem. I can do it, and I will. My code will always be more readable to me than someone else’s. This is true by orders of magnitude for AI code gen today.

          So I don’t consider anyone who treats LLM code gen as a viable path forward to be a serious person in the engineering field.

          • SparrowRanjitScaur@lemmy.world
            link
            fedilink
            English
            arrow-up
            3
            arrow-down
            5
            ·
            edit-2
            23 days ago

            It’s just a tool like any other. An experienced developer knows that you can’t apply every tool to every situation. Just like you should know the difference between threads and coroutines and know when to apply them. Or know which design pattern is relevant to a given situation. It’s a tool, and a useful one if you know how to use it.
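            On the threads-versus-coroutines point, roughly: threads suit blocking or CPU-bound work, while coroutines suit many concurrent I/O waits. A minimal Python sketch of the coroutine side (the `fetch` stand-in for a network call is illustrative):

```python
import asyncio
import time

async def fetch(name, delay):
    # Stand-in for a network call; awaiting yields control to the event loop.
    await asyncio.sleep(delay)
    return name

async def main():
    # Three "requests" run concurrently, so total time tracks the longest
    # single delay rather than the sum of all three.
    start = time.monotonic()
    results = await asyncio.gather(fetch("a", 0.1), fetch("b", 0.1), fetch("c", 0.1))
    return results, time.monotonic() - start

results, elapsed = asyncio.run(main())
print(results, elapsed)
```

With threads you would pay per-thread stack and scheduling costs for the same wait-heavy workload; here a single thread interleaves all three waits.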

            • rottingleaf@lemmy.world
              link
              fedilink
              English
              arrow-up
              6
              arrow-down
              1
              ·
              23 days ago

              This is like using a tambourine made of optical discs as a storage solution. A bit better, actually, since punctured discs are no good.

              A full description of what a program does is the program itself, have you heard that? (Except for UB, libraries, …, where an LLM is no better than a human anyway.)

        • OmnislashIsACloudApp@lemmy.world
          link
          fedilink
          English
          arrow-up
          8
          arrow-down
          1
          ·
          23 days ago

          right now not a chance. it’s okay-ish at simple scripts. it’s alright as an assistant to get a buggy draft of anything even vaguely complex.

          ai doing any actual programming is a long way off.

      • Eyck_of_denesle@lemmy.zip
        link
        fedilink
        English
        arrow-up
        4
        arrow-down
        20
        ·
        edit-2
        23 days ago

        I heard a lot of programmers say it

        Edit: why is everyone downvoting me lol. I’m not agreeing with them but I’ve seen and met a lot that do.

        • Tyfud@lemmy.world
          link
          fedilink
          English
          arrow-up
          31
          arrow-down
          1
          ·
          23 days ago

          They’re falling for a hype train then.

          I work in the industry. With several thousand of my peers every day that also code. I lead a team of extremely talented, tenured engineers across the company to take on some of the most difficult challenges it can offer us. I’ve been coding and working in tech for over 25 years.

          The people who say this are people who either do not understand how AI (LLMs in this case) work, or do not understand programming, or are easily plied by the hype train.

          We’re so far off from this existing with the current tech, that it’s not worth seriously discussing.

          There are scripts, snippets of code, that vscode’s LLM or VS2022’s LLM plugin can help with or bring up. But 9 times out of 10 there are multiple bugs in them.

          If you’re doing anything semi-complex it’s a crapshoot if it gets close at all.

          It’s not bad for generating pseudo-code or templates, but it’s designed to generate code that looks right, not code that is right; and there’s a huge difference.

          AI-generated code is exceedingly buggy, and if you don’t understand what it’s trying to do, it’s impossible to debug, because what it generates is trash-tier code quality.
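          A toy illustration of “looks right vs. is right” (a hypothetical example, not from any real model output): a common generated binary search sets `hi = len(items) - 1` with `while lo < hi`, which reads plausibly but silently misses the last element. The corrected version:

```python
def binary_search(items, target):
    """Return the index of target in sorted items, or -1 if absent.

    The plausible-but-buggy variant uses a half-open bound inconsistently:
    `hi = len(items) - 1` together with `while lo < hi` never examines
    the final element.
    """
    lo, hi = 0, len(items)          # half-open interval [lo, hi)
    while lo < hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    return -1

assert binary_search([1, 3, 5, 7], 7) == 3   # the edge case the buggy variant misses
assert binary_search([1, 3, 5, 7], 4) == -1
```

Both versions “look right” at a glance; only one survives the edge case, which is exactly why generated code needs verification.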

          The tech may get there eventually, but there’s no way I trust it, or anyone I work with trusts it, or considers it a serious threat or even resource beyond the novelty.

          It’s useful for non-engineers to get an idea of what they’re trying to do, but it can just as easily send them down a bad path.

          • rottingleaf@lemmy.world
            link
            fedilink
            English
            arrow-up
            1
            ·
            23 days ago

            People use visual environments to draw systems and then generate code for specific controllers; that’s done in control systems design and the like.

            In that sense there are already situations where they don’t write code directly.

            But this has nothing to do with LLMs.

            For just designing systems in one place, visual environments with blocks might simply be more optimal.

            • Miaou@jlai.lu
              link
              fedilink
              English
              arrow-up
              3
              ·
              23 days ago

              And often you still have actual developers reimplementing this shit because EE majors don’t understand dereferencing null pointers is bad

          • magic_smoke@links.hackliberty.org
            link
            fedilink
            English
            arrow-up
            1
            ·
            23 days ago

            Had to do some bullshit ai training for work. Tried to get the thing to remake cmatrix in python.

            Yeah no, that’s not replacing us anytime soon, lmao.
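            For the curious, the non-interactive core of a cmatrix-style effect is just a per-column falling “drop” position updated each frame (a minimal sketch; the real thing adds curses, color, trails, and timing):

```python
import random

def step(drops, height):
    """Advance each column's drop one row, wrapping past the bottom."""
    return [(d + 1) % (height + 1) for d in drops]

def render(drops, width, height, glyphs="01"):
    """Render one frame as a list of strings; a random glyph marks each drop."""
    rows = []
    for y in range(height):
        rows.append("".join(
            random.choice(glyphs) if drops[x] == y else " "
            for x in range(width)))
    return rows

random.seed(0)
drops = [random.randrange(10) for _ in range(8)]  # one drop per column
frame = render(step(drops, 10), width=8, height=10)
print("\n".join(frame))
```

Wrapping this in a curses loop with a short sleep per frame gives the falling-rain look; the state update above is the whole algorithm.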

    • cheddar@programming.dev
      link
      fedilink
      English
      arrow-up
      6
      arrow-down
      1
      ·
      23 days ago

      So they would be doing engineering and not programming? To me that sounds like programmers would be a thing of the past.

    • realharo@lemm.ee
      link
      fedilink
      English
      arrow-up
      3
      ·
      edit-2
      23 days ago

      Sounds like he’s just repeating a common meme. I don’t see anything about higher level design that would make it more difficult for an AI (hypothetical future AI, not the stuff that’s available now) compared to lower level tasks.

    • Todd Bonzalez@lemm.ee
      link
      fedilink
      English
      arrow-up
      1
      ·
      22 days ago

      How is “not writing code” different from programmers being a thing of the past?

      What do you think programmers do?

    • Zip2@feddit.uk
      link
      fedilink
      English
      arrow-up
      1
      ·
      23 days ago

      We’ll be able to use the newly found time to realise our dream of making PMs redundant by automating them.