I feel like every person stating things of this nature is literally unable to communicate effectively (though this is no longer a barrier: you can get a dog to vibe code games with the right workflow, which to me seems like quite an intellectual thing to be able to do).
Despite that, you will make this argument while trying to get something done with Copilot, the worst model in the entire industry.
If an AI can replace you at your job, you are not a very good programmer.
Copilot isn't a model. Currently it's giving me a choice of 15 different models. By all evidence, AI is nowhere close to replacing me, but to hear other people tell it, it is weeks or maybe months away.
Remember when Copilot was released? It was running some OpenAI model at the time. Sure, now you can choose from many models, but if you want a BMW, buy a BMW; don't buy a Nissan with badly strapped-on BMW decals.
I don't want a Nissan or a BMW. This was provided by my employer, and I've been asked to use it. To be honest, I don't even understand how your car analogy applies to any of this.
It does generate word salad (and usefulness depends on the person reading it). If both the writer and the reader share a common context, a lot can be left out (the extreme version is military signalling). An SOS over the radio says the same thing as "I'm in a dangerous situation, please help me if you can," but the former is far more efficient. LLMs tend to prefer the latter.
> If an AI can replace you at your job, you are not a very good programmer.
Me and millions of other local yokel programmers who work in regional cities at small shops, in house at businesses, etc. are absolutely COOKED. No I can't leetcode, no I didn't go to MIT, no I don't know how O(n) is calculated when reading a function. I can scrape together a lot of useful business stuff, but no, I am not a very good programmer.
>no I dont know how O(n) is calculated when reading a function
This is really, honestly not hard. Spend a few minutes reading about it, or even better, ask an LLM to explain it to you and clear up your misconceptions if regular blog posts don't do it for you. This is one of those concepts that sounds scarier than it is.
edit: To be clear, there are tough academic cases where complexity is harder to compute, with weird functions in O(sqrt(n)) or O(log(log(n))) or worse, but most real-world code complexity is really easy to tell at a glance.
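To illustrate the "at a glance" point with a few toy functions (my own examples, not from the thread): the loop structure usually tells you the complexity directly.

```python
def find_max(items):
    # One pass over n items -> O(n)
    best = items[0]
    for x in items:
        if x > best:
            best = x
    return best

def has_duplicate(items):
    # A loop nested inside a loop, each over the input -> O(n^2)
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def binary_search(sorted_items, target):
    # The search range halves every iteration -> O(log n)
    lo, hi = 0, len(sorted_items)
    while lo < hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    return -1
```

One flat loop over the input is linear, a nested pair is quadratic, and anything that repeatedly halves the problem is logarithmic; that heuristic covers most code you'll read day to day.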
Do you mean you aren't able to use AI to make software?
The thing you fear is the thing that you could just use to improve yourself?
Why fear a shovel?
Also, I never claimed to be a good programmer either.
I just don't see the point in fearing something that makes it infinitely easier and faster to get work done.
I suspect the value you bring to the table is that you are good enough a programmer to translate the problems of the people you work with into working code.
LLMs can do it somewhat, but they can probably leetcode better than even most of the people who went to MIT.
It's a strange kind of concept creep that "scam" came to mean "bad deal", or "suboptimal deal", or "worse deal" to some users, instead of deception. Not clear what is fraudulent about the listed practice, just sucky.
If you can't trust yourself, you will never be able to trust anyone else.
If you believe the AI is out to get you, that's certainly the reality you will manifest.