once you realize you are 30-40-50% slower than your peers you will not only want it but need it to keep being employed.
saying “i am learning less, growing less…” is just so amazingly shortsighted, like farmers when first tractors arrived going “fuck this, imma walk this field in 100 degree weather - doing it myself”(sound familiar?).
But even if I didn't, I'd still choose the team that delivers a month later with high-quality code over the team that cranks out dozens of half-broken features they have no idea how to support, then mandates I spend the same amount on a second team that can barely maintain the first team's mistakes. I know I'm just making the "buy once, cry once" argument, and I understand that's "not how business works," but I'm an engineer, not a manager trading company interest for a promo.
To extend the analogy, I'm reminded of the pets vs. cattle debate in the context of managing computer systems, and how that distinction continues to shake out. I suspect that as personal computing follows business computing into the cloud, we will see tension between local-first AI solutions and cloud AI products, since the same megacorps that build personal-computer "pets" and pet products are building cloud "cattle" AI products that promise the benefits of pets along with some of the upsides of cattle.
Local compute vs. cloud compute is not the same distinction as pets vs. cattle, but it involves many of the same market participants, and it is interesting that both Microsoft and Apple are moving to bundle AI capabilities with their other cloud products as subscription value-adds. These features entice users into the vendors' walled gardens, nudging them subtly away from local-first pets.
That's where experience comes in. I can check someone's work way faster than I can write it. LLMs right now are like junior assistants who need supervision: they are wrong sometimes, but they can help you as long as you know their limits.
How does someone gain experience if they are using AI from the start? They never develop the skills by putting in the reps.
I also find it much harder to go through someone else’s code than to work with my own. If I’m just glancing at it, it seems to technically work, and it needs a rubber stamp… sure, that’s easy. If it doesn’t work, especially if it’s a logic issue rather than a hard error, that takes time to read and learn the context of everything that’s going on. Will someone in school today even have the skills to do that if they only ever use AI?
Sorry, but I don't believe that. You can check for the obvious mistakes faster, sure. But you will not catch anything beyond the obvious mistakes even if you spend twice as long analyzing the code as it took to write.
Poor analogy. The farmer doesn’t learn new skills or get better at walking every day.
When automation actually does equivalent work in less time or at lower cost it will replace human labor. Right now LLMs cannot deliver equivalent work compared to an experienced and skilled human programmer. If you experience an LLM able to do your job that says more about your skills than the LLM.
I have lived through several waves of technology and change that would supposedly make me obsolete as a programmer and system admin. Over forty years in the business now, I have learned to take those predictions with a lot of skepticism.
If LLMs get close to doing any part of my work I will incorporate them, like I have previous “threats” like cloud computing. I won’t panic too soon or give up because I have enough experience and expertise to know better.
I can't predict what might happen, but right now LLMs don't do much of value in the domains I work in. They seem mainly to exist as excuses for stock pumping and managers cutting their labor costs to the bone. The people who don't get laid off cling to their jobs out of fear. The threat to (mostly junior) programmers comes from turning our work into a commodity, not from automation itself.
Maybe your analogy does apply. A person who can drive a tractor will indeed outperform a farmer in one important but limited way: covering more area in a day than a farmer on foot. But the person who can only drive a tractor likely won't have all of the skills and expertise of a farmer who knows the land, soil, crops, weather, etc. from direct experience. And the person who can only drive a tractor -- but cannot actually grow anything or manage a farm -- becomes more interchangeable and replaceable. Tractor driving already gets replaced by robots, but knowing how to run a farm does not. Likewise a good programmer offers a lot more value than just the code they write; in fact code itself has no value without the context of a business domain and a theory [1] of the system.
By the way, I'm one of those old guys who still use vim and the command line, as I have since the late 1970s, and will continue to do so. I can use IDEs and GUIs if I have to; I didn't get left behind. But those don't improve my efficiency or quality. They can slow me down and distract me with futzing with tools rather than getting real work done, and they don't work in all of the environments I need them to. I haven't had any problem keeping up with the IDE users.
old as well, in the industry since 1996… I get what you are saying, but I can honestly tell you I have spent the last six-ish months analyzing these tools, and comparing vim vs. an IDE to using (or not using) LLMs is not even a remotely reasonable comparison…
given your age/time in the industry you may be able to coast to retirement, finish your career, and say “vim got me through it all…” - someone with 10 years in will never be able to experience that
I have played with LLMs (and many IDEs). I won't say they suck, but they don't give me any huge boost either. Other programmers may get more mileage out of them, but so far no one I have worked with reports any difference that seems more significant than novelty. We all feel more "productive" when we challenge ourselves to learn a new language, tool, programming technique, platform, etc. When that wears off we have to find the next shiny thing. LLMs have the added feature of FOMO and actual fear of unemployment that, for example, VSCode did not have.
All tools take time to master, and then once mastered they can maintain an advantage over competitors that seem objectively better in some ways. I used VSCode for a year and ended up going back to vim. Nothing really wrong with VSCode but it wasn't a 10X or even 2X improvement for me. But I have internalized vim and the Unix/Linux command line tools to a degree that -- as you point out -- younger programmers don't have the runway to catch up with.
As a freelancer (and for a long time a f/t employee in various companies) I gradually figured out that the best value I can offer (and the best way to stay employed) is not how fast I can produce code, but how well I can understand the business domain and then tease out and translate requirements into software solutions. Optimizing for producing more or "better" code (however we measure that) I think sends a lot of programmers down the wrong path, one that leads to over-specialization, commodification of skills, and eventual replacement by outsourcing or AI. I think it will take a lot longer for LLMs to do the real work of software development, which has about as much to do with producing code as word processing has to do with writing an interesting book.
Wake me when AI is useful for more than novices in a field, in a small codebase, which is where I've seen the people most bullish on LLMs improving their productivity.
Did you make the same predictions about dBASE/Paradox, MS Access, 4GLs, that weird Sun thing where you made Java applets by dragging and dropping logic diagrams, etc., at the time? What makes LLMs different from the last ten times the industry tried to convince us that programming would be obsolete any day now?