- cross-posted to:
- technology@beehaw.org
(let me preach a little, I have to listen to my boss gushing about AI every meeting)
Compare AI tools: now vs 3 years ago. All those 2022 “Prompt engineer” courses are totally useless in 2025.
Extrapolate into the future and realize that you’re not losing anything valuable by not learning AI tools today. The whole point of them is that they don’t require any proficiency. It “just works”.
Instead focus on what makes you a good developer: understanding how things work, which solution is good for what problem, centering your divs.
The key skill is being able to communicate your problem and requirements, which turns out to be really hard.
It’s also a damn useful skill whether you’re working with AI or humans. Probably worth investing some effort into that regardless of what the future holds.
Though it’s more work with current AI than with another team member, at least for now - the AI can’t be given access to a lot of context due to data security rules.
As an old fart you can’t imagine how often I heard or read that.
You should click the link.
Hehe. Damn, absolutely fell for it. Nice 😂
Yeah but it’s different this time!
I do wonder about inventions that actually changed the world or the way people do things, and if there is a noticeable pattern that distinguishes them from inventions that came and went and got lost to history, or that did get adopted but do not have mass adoption. Hindsight is 20/20, but we live in the present and have to make our guesses about what will succeed and what will fail, and it would be nice to have better guesses.
Quality work will always need human craftsmanship
I’d wager that most revolutionary technologies are either those that expand human knowledge and understanding, or (to a lesser extent) those that increase replicability (like assembly lines).
It’s tricky, because there’s no hard definition for what it means to “change the world”, either. To me, it brings to mind technologies like the Internet, the telephone, aviation, or the steam engine. In those cases, it seems like the common thread is to enable us to do something that simply wasn’t possible before, and is also reliably useful.
To me, AI fails on both those points. It doesn’t really enable us to do anything new. We already had chat bots, we already had Photoshop, we already had search algorithms and auto complete. It can do some of those things a lot more quickly than older technologies, but until they solve the hallucination problems it doesn’t seem reliable enough to be consistently useful.
These things make it come off more as a potential incremental improvement that’s still in its infancy than as something truly revolutionary.
Well it’ll change the world by consuming a shit ton of electricity and using even more precious water to fill the data centres. So changing the world is correct in that regard.
It needs to be more trustworthy. If I have to double check everything, I still have to figure out how to do whatever it’s doing, then figure out how it’s doing the thing, then verify if it did it right. By then, I could have just done it in step 1.5 probably.
I do wonder about inventions that actually changed the world or the way people do things, and if there is a noticeable pattern that distinguishes them from inventions that came and went and got lost to history,
Cool thought experiment.
Comparing the first iPhone with the release of BlockChain is a pretty solid way to consider the differences.
We all knew that modern phones were going to be huge. We didn’t need tech bros to tell us to trust them about it. The usefulness was obvious.
After I got my first iPhone, I learned a new thing I could do with it - by word-of-mouth - pretty much every week for the first year.
Even so, Google supposedly under-estimated the demand for the first Android phones by almost a factor of 10x.
BlockChain works fine, but it’s not changing my daily routine every week.
AI is somewhere in between. I do frequently learn something new and cool that AI can do for me, from a peer. It’s not as impactful as my first pocket computer phone, but it’s still useful.
Even with the iPhone release, I was told “learn iPhone programming or I won’t have a job.” I actually did not learn iPhone programming, and I do still have a job. But I did need to learn some things about making code run on phones.
I’d love to read a list of those instances/claims/tech
I imagine one of them was low-code/no-code?
/edit: I see such a list is what the posted link is about.
I’m surprised there’s not low-code/no-code in that list.
“We’re gonna make a fully functioning e-commerce website with only this WYSIWYG site builder. See? No need to hire any devs!”
Several months later…
“Well that was a complete waste of time.”
You’re right. It belongs on the list.
I was told several times that my programming career was ending, when the first low-code/no-code platforms released.
At my work we explored a low-code platform. It was not low on code at all. Beyond the simplest demos you had to code everything in javascript, but in a convoluted, opaque, undocumented environment with a horrendous editing UI. Of course their marketing told a different story.
That was not the early days of low-code, mind you. It was fairly recent; maybe three or four years ago.
Remember when “The Cloud” was going to put everyone in IT out of a job?
I don’t think it was supposed to replace everyone in IT, but every company had system administrators or IT administrators who worked with physical servers, and now they’re gone. You can say the new SREs are their replacement, but it’s a different set of skills, more similar to SDEs than to system administrators.
And some companies (like mine) just have their SDEs do the SRE job as well. Apparently it incentivizes us to write more stable code or something
I just think this is patently false. Or at least there are/were orgs where the cloud costs far more than running their own servers, which are tended by maybe 1 FTE spread across a bunch of admins who mostly do other tasks.
Let me just point out one recent comparison - we were considering cloud backup for a couple petabytes of data, with a few hundred GB changing, being added, or being restored every week or less. I think the best deal, holding the software costs equal, was $5/TB/month.
This is catastrophically more expensive over the 10-year lifespan of a server or two plus a small/mid-sized LTO-9 tape library and tapes. For one thing, we’d have paid more than the server etc. in about a year. After that, tape prices have always tended down over time, and the storage cost for tape is basically $0 once it’s in archive storage. We put it in a cabinet in another building - and you can fit A LOT of data on these tapes in a small room. That’ll cost basically $0 additional for 20 years, forget about 10. So let’s add in electricity etc. - I doubt that comes to more than ~$100k over the lifetime of the project. Labor is about a wash because you still need people to manage the backups to the cloud, and I think actually moving tapes might be ~0.05 FTE in our situation. Literally anyone can be taught how to do it once the backup admin puts the tapes in the hopper or tells them which serial # to put in the hopper.
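Just to make that concrete, here’s a rough back-of-the-envelope sketch (Python, purely illustrative). It assumes “a couple petabytes” means roughly 2 PB and guesses a ~$120k one-time cost for the tape hardware; neither figure is stated above, so treat them as placeholders, not our actual numbers.

```python
# Back-of-the-envelope: cloud backup vs. a local LTO-9 tape setup.
# Figures from the comment above plus assumptions:
#   - "a couple petabytes" assumed to be ~2 PB = 2000 TB
#   - cloud quote: $5 per TB per month
#   - tape hardware (server or two, small library, tapes): assumed ~$120k one-time
#   - electricity/misc for tape: the ~$100k upper bound mentioned above

CLOUD_RATE_PER_TB_MONTH = 5
DATA_TB = 2000
YEARS = 10

cloud_total = CLOUD_RATE_PER_TB_MONTH * DATA_TB * 12 * YEARS   # $1,200,000
tape_total = 120_000 + 100_000                                  # hardware + overhead

# Months until cumulative cloud spend exceeds the assumed tape hardware cost
break_even_months = 120_000 / (CLOUD_RATE_PER_TB_MONTH * DATA_TB)

print(f"Cloud backup over {YEARS} years: ${cloud_total:,}")
print(f"Tape setup + overhead:          ${tape_total:,}")
print(f"Cloud spend passes tape hardware cost after ~{break_even_months:.0f} months")
```

Even with generous error bars on the assumed hardware cost, the cloud option comes out at several times the price over a decade, which is the point being made here.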
I also think that many companies are finding something similar for straight servers - at least it was in the news quite a bit for a while. Now, if you can be entirely cloud native, maybe it washes out, but for large groups of people that’s still not possible, whether because they need to control hardware (think factory, scientific, etc.) or because they rely on existing desktop software for which the cloud isn’t really a replacement and throughput isn’t great (think Adobe products, video, scientific, financial, etc. data).
Naming it “The Cloud” and not “Someone else’s old computer running in their basement” was a smart move though.
It just sounds better.
Many of our customers store their backups in our “cloud storage solution”.
I think they’d be rather less impressed to see the cloud is in fact a jumble of PCs scattered all around our office.
There is still a difference.
The cloud was FOR the IT people. Machine learning is for predicting patterns from data.
Maybe stock predictors will adapt or be replaced, but the average programmer didn’t have to switch to Replit just because it’s a “cloud IDE”.
I mean, isn’t that what “get on or get left behind” means?
It does not necessarily mean you’ll lose your job. Nor does “get on” mean you have to become a specialist on it.
The post specifically picks on things that didn’t catch on (or that only caught on for a period of time and were eventually superseded), but it doesn’t apply the same test to successful technologies.
touchscreen phones are a fad
Blackberry? I was like 10 at the time, so this is based on my memory of who had what phone, but that seems like the right guess.
This technology solves every development problem we have had. I can teach you how with my $5000 course.
Yes, I would like to book the $5000 Silverlight course, please.
glorified autocomplete
Which is honestly its best use case. That and occasionally asking it to generate a one-liner for a library call I don’t feel like looking up. Any significant generation tends to go off the rails fast.
Getting it to format documentation for you seems to work a treat. Nothing too complex, just “move this bit here, split that into points”.
You sir haven’t railed an entire ui out of your vibes up asshole
I’ve been using it to write unit tests, I still need to edit them to mock out some things and change a bit of logic here and there, but it saves me probably 50-75% of the time it used to take, just from not having to hand-write all that code.
If you use it basically like you’d use an intern or junior dev, it could be useful.
You wouldn’t allow them to check anything in themselves. You wouldn’t trust anything they did without carefully reading it over. You’d have to expect that they’d occasionally completely misunderstand the request. You’d treat them as someone completely lacking in common sense.
If, with all those caveats, you can get this assistance for free or nearly free, it might be worth it. But, right now, all the AI companies are basically setting money on fire to try to drive demand. If people had to pay enough that the AI companies were able to break even, it might be so expensive it was no longer worth it.
I love how it fucks up closing braces/parentheses, some advanced tech right there.
I use it to discuss the pros and cons of potential refactorings, then laugh as it botches the implementation.
I use it to find easy to miss errors.
I still think PWAs are a good idea instead of needing to download an app on your phone for every website. For example, PWAs can easily replace most banking apps, which are already just PWAs with added tracking.
They’re great for users, which is why Google and Apple are letting them die from lack of development so apps can make them money.
Thanks for summing it up so succinctly. As an aging dev, I’ve seen quite a lot of tech come and go. I wish more people interested in technology would spend more time learning the basics and the history of things.
it’s funny, but also holy moly do I not trust a “sign in with github” button
Might I ask why? There are some pages where I see that as the least evil option, I.e. duckdns
Basically because my Github account has an important job, and I don’t want to increase its attack surface by using it as a pseudo-Facebook
I’m skeptical of the author’s credibility and vision of the future if he hasn’t even reached blink-tag technology in his progress.
<blink>How dare they!</blink>
The future of web development is Angelfire.
Good thing I hate web development
No one can predict the future. One way or the other.
The best way to not be let behind is to be flexible about whatever may come.
Can’t predict the future, but I can see the past. Specifically the part of the past that used standards-based implementations and boring technology. Love that I can pull up HTML with elements in ALL CAPS and table-aligned content. It looks like a hot mess but it still works, even on mobile. Plain text keeps trucking along. SQLite will outlive me. Exciting things are exciting, but the world is made of boring.
10/10. No notes.
It pains me so much when I see my colleagues pay OpenAI to do programming assignments… they see it as faster to ask GPT than to learn it properly. Sadly, I can say nothing to them, or I would risk worsening relations with them.
I’m glad they do. This is going to generate so much work opportunities to undo their messes.
Except that they are PhD research students, so it would just exacerbate the code messiness in research paper codebases.
Or open source projects…
You should probably click the link
If you’re not using Notepad, I don’t even know what to tell you.
JEdit 4 life!
What the fuck is Silverlight
Microsoft Flash. Netflix used it for a while. I don’t remember anything else using it.
A bunch of Disney movie sites did for a while, back in the day when every movie had its own website with trailers, promo, and a link to buy tickets and/or the DVD release.
Ahh good times
The League of Legends launcher used it at one point. Not sure if it still does.
I was going to say there’s no way they still are, since Silverlight was discontinued by Microsoft in 2013, but it is Riot Games, so ¯\_(ツ)_/¯
EA Tiburon in Orlando used flash for a while to do the menus in Madden and other sports games.
Be glad you never had to interact with that ‘technology’. I once did, at an internship in 2016.