I really don't get this point of view at all. I acknowledge that two years into what is now a quarter century of experience, most of what I knew could easily be replaced by the AI of today. After two decades of experience, however, syntax and knowledge of specific algorithms and languages was perhaps 10% of my value, nowhere near the vast majority.
The idea that low-paid LLM wranglers are going to push out the experienced engineers just doesn't wash. What I think is much more likely to happen is that the number of software engineers greatly reduces, but the remaining ones actually get paid more: writing code is no longer the long pole, and having fewer minds designing the system at a high level allows for more cohesive higher-level design and less focus on local artisanal code quality.
To be honest, AI is just the catalyst and the excuse; the real cause is the overhiring that happened during the gold rush of the last 20 years driven by the internet and smartphone revolutions, zero interest rates, and the pandemic effect.
> What I think is much more likely to happen is the number of software engineers greatly reduces
So you just believe you'll be one of the ones left standing?
Best of luck to you
> What I think is much more likely to happen is the number of software engineers greatly reduces, but the remaining ones actually get paid more.
You realize that this is contradictory, right? If the number of competitors remains the same yet there are far fewer jobs, it's a buyer's market: companies have to offer very little to find someone desperate enough to take it.
> It will allow for more cohesive higher-level design, and less focus on local artesenal code quality.
I don't buy this. LLM code is extremely bloated. It never reuses abstractions or comes up with novel designs to simplify systems. It can't say no; it just keeps bolting on code. In a very, very abstract sense you might be right, but that's outside the realm of engineering; that's product design.
You raise some good points about the economics; that's where I feel the least confident, but let me explain my reasoning.
Software has eaten the world, and thus the value of maintaining software has never been higher. Engineers are the people who understand how software works. Therefore, unless we move away from software, the value of software engineering remains high.
AI does not reduce software; it increases the amount of software, makes messier software, and generally increases the surface area of what needs to be maintained. I could be wrong, but as impressive as LLMs' language and code processing capabilities are, I believe there is a huge chasm between the human intent of systems and their implementation that will likely never be crossed, and that only human engineers can actually bridge. And even if I'm wrong, there's another headwind: as Simon Willison has pointed out, you can't hold an LLM accountable, and therefore corporate leaders are very unlikely to put AI in any position of power. All the experience and levers of control they have are based on millennia of evolution and a shared understanding of human experience; in short, they want a throat to choke.
The other factor is that while AI can clearly replace rote coding today, I think the demos oversell the utility of that software. Sure, it's fine for getting started, but you quickly paint yourself into a corner if you attempt to run a business on that code over time, where UX cohesion, operational stability, and long-term data integrity are paramount and not something that can be solved for without a lot of knowledge and guardrails.
So net of all this, where I think we land is that a lot of jobs based purely on knowledge of one slow-changing system and specific code syntax will go away, but there will be engineers who maintain all the same code; they'll just cover more scope with LLM-assisted tools. You put your finger on something: I do believe this moves engineering closer to product design. But I still think there's a huge amount on the engineering side that LLMs won't be able to do any time soon (for both the technical and the social reasons stated above), and ultimately I don't see the boundary the same way you do; as software engineers we have always had to justify our systems by their real-world interaction.
> Software is everywhere and thus the value of maintaining software and the value of software engineering remains high.
This is an unfinished argument. What if we get coding agents to maintain software? What if frequent rewriting becomes cheap enough? Something that costs a tenth or a hundredth of your salary doesn't have to be good to make for a good business decision. Why do you think every native application has been replaced by slop made up of 10 layers of JS frameworks on top of Electron? Nothing matters as long as the product is cheap and fast to pump out, barely works on modern hardware, and makes dough.
> AI does not reduce software, it increases the amount of software.
There's not infinite demand for software. If AI inference costs take 50% of the prior payroll expenses while making a company twice as efficient, then we need roughly four times as much demand for software engineering, at the same salary, for everyone to keep their job. What new or improved subscription, app, website, device, or other software product does the world need right now? 99.9% of people use the same 5 apps. Most of their free time, attention, and disposable income has already been captured by trash that is unbeatable due to network effects. Are we all going to sell shitty LLM frontends to businesses until they notice they could have done the same thing themselves? There might be an explosion in new software, but no one left to care about using it.
> I believe there is a huge chasm that will likely never be crossed between the human intent of systems and their implementation that only human engineers can actually bridge.
Maybe, or the AI might just be missing context. Think of all the unwritten culture, practices, and conversations the LLM hasn't been made aware of.
> In short they want a throat to choke.
You're responsible for those under you anyway, so this doesn't help. Banking on those in charge being irrational forever, in a way that is bad for business and without them ever noticing, is a bad gamble.
> The other factor is that while AI can clearly replace rote coding today [...], X is not something that can be solved for without a lot of knowledge and guardrails.
I'm talking about the world the AI-maximalists predict is rapidly approaching, not where we are today. None of that knowledge and none of those guardrails are hard to grasp intellectually, compared to advanced mathematics for example. Put your institutional knowledge in a .md file and add another agent that enforces guardrails in a loop. The only way out I see is a situation where there are complex patterns that we intuitively grasp, but can't articulate. Patterns that somehow span too much data or don't have enough examples for LLMs to pick up on.
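The "knowledge in a .md file plus a guardrail-enforcing agent in a loop" pattern described above could be sketched roughly like this. This is a minimal, hypothetical sketch: the two agent functions are stubs standing in for real LLM calls, and every name here is made up for illustration.

```python
# Sketch of the guardrails-agent loop: a coder "agent" drafts, a reviewer
# "agent" checks the draft against rules parsed from an institutional
# knowledge markdown file, and feedback loops back until the rules pass.

def load_guardrails(md_text: str) -> list[str]:
    """Extract one guardrail per markdown bullet from the knowledge file."""
    return [line.lstrip("- ").strip()
            for line in md_text.splitlines()
            if line.strip().startswith("-")]

def coder_agent(task: str, feedback: list[str]) -> str:
    # Stub for an LLM call that drafts a solution; on rework it simply
    # appends what the reviewer flagged as missing.
    draft = f"solution for {task}"
    if feedback:
        draft += " with tests and logging"
    return draft

def guardrail_agent(draft: str, guardrails: list[str]) -> list[str]:
    # Stub for a reviewing LLM: flags each rule whose key term is absent
    # from the draft (here, naively, the last word of the rule).
    return [rule for rule in guardrails if rule.split()[-1] not in draft]

def agent_loop(task: str, md_text: str, max_rounds: int = 3) -> str:
    guardrails = load_guardrails(md_text)
    feedback: list[str] = []
    for _ in range(max_rounds):
        draft = coder_agent(task, feedback)
        feedback = guardrail_agent(draft, guardrails)
        if not feedback:
            return draft  # all guardrails satisfied
    raise RuntimeError("guardrails never satisfied within budget")

KNOWLEDGE = """\
# Institutional knowledge
- every change ships with tests
- production code needs logging
"""

print(agent_loop("rate limiter", KNOWLEDGE))
```

The loop structure, not the stubs, is the point: whether such a reviewer can actually encode the tacit knowledge at issue is exactly what's in dispute.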
> There will be engineers who maintain all the same code, they'll just cover more scope with LLM assisted tools.
So fewer jobs with lesser qualifications?
> Ultimately I don't see the boundary the same way you do, as software engineers we have always had to justify our systems by their real world interaction.
I've seen the way engineers design products, and I like products designed by engineers, but no layperson does. Laypeople don't want power, privacy, or agency. They don't care how things work, and they lie to themselves and others about what they really want. They don't want a native desktop app that streams high-quality audio from a self-hosted collection; they want a subscription that autoplays algorithmic slop through a React Native app on their iPhone. Do you really think you're better at appealing to (or fleecing) customers than people with actual UX, marketing, and behavioral psychology experience? This example only applies to mass-market software, but I'm sure it's not much different in other fields. Engineers keep thinking they could do everyone else's job, but they don't do so well in practice.
I'm sort of shocked at how little of my argument seemed to land with you in any way. I wonder how many cycles of software hype you've been through. Were you here for the PC revolution, the dot-com era, smartphone mass adoption?
There are a lot of what-ifs and worst-case scenarios in your reply that I simply don't find likely. I am not drinking the Kool-Aid of either the AI maximalists or the doomers. I could be wrong, of course; no one can predict the future. But to me, the very real, novel, and broad utility of LLMs that we are just learning to harness, combined with the investment outlook, is leading to a mania that has people overestimating where things will land when the dust settles. If I'm wrong, then I guess I'll join the disenfranchised masses picking up pitchforks, but I'm not going to waste time worrying about that until I see more evidence that it's actually going that badly.
So far, what I see is that software engineers are the ones getting the most actual utility out of AI tooling. The reason is that it still requires precision of thought and specificity to get anything sustainable out of AI coding tools. Note this doesn't mean that engineers can design better apps than proper designers; rather, my point is that designers and other disciplines still cannot go much further than prototypes. They still need engineers to write the prompts, test the output, maintain the system, and debug things when they go wrong. I have worked long enough with large cross-functional teams to know that the vast majority of folks in non-engineering functions simply cannot get enough specificity and clarity into their requests to allow an LLM to turn them into a system that will keep working over time. They will hit a wall very quickly where new features add bugs faster than they improve things, and the whole thing collapses under its own weight like a mansion of popsicle sticks.

And by the way, I don't consider AI-assisted coding to require less qualification than regular coding. Sure, you don't need to know as much syntax or as many algorithms, but you absolutely need data modeling, performance, reliability, debugging, consistency, and migration knowledge in order to use AI to contribute to any software that powers a real business. And yeah, you might need to develop your product and business sensibilities, but to me that's what's been happening throughout the history of computing. Wiring up ENIAC certainly required qualifications that were not needed for assembly programming, which in turn required things that C programmers did not need, and so forth; harnessing the increasing compute power and complexity required new qualifications each time. I don't think AI will ultimately be that different: it will change the way we work, but it won't replace what senior engineers do.
> language knowledge was perhaps 10% of my value, nowhere near the vast majority.
Do you not see LLMs catching up with your experience fast?
You might not lose your job, but you'll definitely have to take a pay cut.