Thinking AI is an upgrade from pencil to pen gives the impression that you spent zero effort incorporating it into your workflow while still assuming you've seen the whole payoff. It feels like watching my dad use Eclipse for 20 years without ever learning anything more complicated than having multiple tabs.
With that in mind, work on your prompting skills and give it a shot. Here are some things I’ve had immense success using GPT for:
Refactoring code
Turning code “pure” so it can be unit-testable
Transpiling code between languages
Slapping together frontends and backends in frameworks I’m only somewhat familiar with in days instead of weeks
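To make the "turning code pure" point concrete, here's a minimal before/after sketch (the function names are hypothetical, just for illustration): the impure version reaches out to the system clock and prints, so a test can't pin down its behaviour; the pure version takes every input as a parameter and returns its result.

```python
import datetime

# Before: impure -- reads the system clock and prints, so a unit test
# can't control the inputs or observe the output directly.
def greet_user(name):
    hour = datetime.datetime.now().hour  # hidden dependency
    print(f"Good {'morning' if hour < 12 else 'afternoon'}, {name}!")

# After: pure -- all inputs are parameters, the result is returned,
# and every case is trivial to assert on.
def greeting(name, hour):
    period = "morning" if hour < 12 else "afternoon"
    return f"Good {period}, {name}!"

print(greeting("Ada", 9))   # Good morning, Ada!
```

The caller (or a thin wrapper) supplies the clock value, so the logic itself stays deterministic and unit-testable.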
I know in advance someone will tunnel-vision on that last point and say "this is why AI bad", so I will kindly remind you that the alternative is doing the same thing by hand… in weeks instead of days. No, you don't learn significantly more doing it by hand (in fact, accounting for speed, I would argue you learn less).
In general, the biggest tips I have for using LLMs are: 1. They're only as smart as you are, so get them to do simple tasks that are time-consuming but easy for you to verify; 2. They forget and hallucinate a lot, so don't give them more than ~100 lines of code per chat session if you require high reliability.
Things I've had immense success using Copilot for (I cancelled my Copilot subscription last year, but I'm going to switch to this when it comes out: https://github.com/carlrobertoh/CodeGPT/pull/333):
Adding tonnes of unit tests
Making helper functions instantly
Basically anything autocomplete does, but on steroids
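As a concrete (hypothetical) example of the "tonnes of unit tests" point: write one small helper and a single test case, and an assistant will typically autocomplete the rest of the battery in this style.

```python
import unittest

def chunk(items, size):
    """Split a list into consecutive sublists of at most `size` items."""
    if size < 1:
        raise ValueError("size must be >= 1")
    return [items[i:i + size] for i in range(0, len(items), size)]

# The kind of test battery an assistant will happily flesh out
# after seeing the first case.
class ChunkTests(unittest.TestCase):
    def test_even_split(self):
        self.assertEqual(chunk([1, 2, 3, 4], 2), [[1, 2], [3, 4]])

    def test_ragged_tail(self):
        self.assertEqual(chunk([1, 2, 3], 2), [[1, 2], [3]])

    def test_empty_input(self):
        self.assertEqual(chunk([], 3), [])

    def test_bad_size(self):
        with self.assertRaises(ValueError):
            chunk([1], 0)
```

Run with `python -m unittest` to execute the suite.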
One thing I'm not getting into in this comment is licensing/morals, because it's not relevant to the OP. If you have any questions or points of debate on this info, though, I'll read and reply in the morning.
Your original post referred to wanting to hire people based on the tools they use to do a task, not their ability to do the task; in fact, you talked down to people for using certain tools by calling them elitist. That's why my pen/pencil comparison is accurate.
I don’t get the downvotes. I’ve hired probably 30+ engineers over the last 5 or so years, and have been writing code professionally for over 20, and I fully agree with your sentiment.
It's just the general AI hate. It's not surprising, because machine learning is yet another area riddled with hype and scams. But for programming you would be a complete fool to ignore Copilot mastery, since paper after paper shows it has completely revolutionised productivity. And it's unrealistic to think you'll be better than everyone else while not using an assistant; it's just the new paradigm. For starters, it has made Stack Overflow almost obsolete, and that was the next most important tool…
For anyone who wants to augment their coding ability, I recommend reading how GPT (and other LLMs) work: https://writings.stephenwolfram.com/2023/02/what-is-chatgpt-doing-and-why-does-it-work/
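For a loose intuition of what that article covers (this toy sketch is mine, not Wolfram's): an LLM repeatedly predicts a likely next token given the text so far. A bigram frequency table captures the flavour at a comically small scale.

```python
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate".split()

# Count which word follows which -- a crude stand-in for what an LLM
# learns with vastly more data, context, and attention.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_token(word):
    """Greedy 'generation': return the most frequent follower of `word`."""
    return follows[word].most_common(1)[0][0]

print(next_token("the"))  # "cat" -- it follows "the" twice vs "mat" once
```

Real models predict over tens of thousands of tokens with deep context instead of one preceding word, but the "pick a plausible continuation" loop is the same idea.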
Personally, I think caring about which tools someone uses to do the task is silly.
I edited the comment to provide actual info; it was originally just the first paragraph.