I hope I don’t come across as too harsh here, but I think a lot of developers are finally being forced to understand that their high salaries and above-average job security were fundamentally predicated on business models that largely didn’t have a ton of competition. In that kind of environment, there is space for a focus on the actual fundamentals, the things-in-themselves, the theory behind the action. Most of this stuff is good and it was a beneficial situation to have that buffer space to allow it.
But ultimately business reality has changed, largely because achieving business goals is dramatically easier with AI tools. This undercuts a lot of the focus on building solid fundamentals, and in a lot of cases that’ll come back to bite the business. But in many scenarios it won’t, and the industry will rumble on.
Those of us working in marketing or journalism or education were forced to accept this new reality decades ago, largely because of inventions by software developers. Now devs are just late to their own party.
>I hope I don’t come across as too harsh here, but I think a lot of developers are finally being forced to understand that their high salaries and above-average job security were fundamentally predicated on business models that largely didn’t have a ton of competition.
Would love to see the business and manager types manage software and infrastructure. What's the worst that could happen? Go on, do it. Every time a foot gun goes off it'll be followed by a condescending chuckle.
I used to see 'passion' as the defining factor in staying in the field and doing well, and that was the advice given to people who wanted to join the industry -- anyone who showed at least a minimum of interest. Now we're going to have these non-technical people, who definitely aren't interested and definitely don't have passion for it, try to make and manage quality software?
There's a lot of value to be extracted in the period between "we fired all the qualified staff" and "oops, we lost all our customers due to unreliability". In physical industries that may happen sooner or in a more alarming way - you discover the loss of your safety personnel in the form of, say, a refinery explosion. But in software you can just... break stuff, and leak personal data, and deliver a service that is down quite a lot (see GitHub discussion passim, or the endless complaining about Windows 11), and nobody goes away. Partly because software switching costs are so high, partly because the alternatives have the same problems.
This sort of thing happened to, for example, Maplin.
The big poster child is sadly Twitter. A lot of people said it would collapse without 90% of the staff, and that hasn't materialized. I suspect they can't deploy huge changes to the backend, but they never did that much anyway.
(also, those of us not in the US and not in FAANG always wondered how such a steep salary differential could be sustained forever; more than doctors and lawyers? Comparable to finance bros or the fabled quants? All of those are much more onerous jobs with much harder entrance criteria!)
In many European countries there aren't high salaries and above-average job security for developers; you are considered an office worker like everyone else. This isn't Silicon Valley over here, especially in Southern European countries.
I'm in Southern Europe and developer salaries here are definitely above average. Sure, much less exaggerated than in the US, but still above the average salary in the country. Even if you limit the comparison to office workers in the same city, dev is still in the upper half, at least for now.
There is a huge difference between above average and multiples of the average.
I've worked 10+ years as a developer in France, where salaries weren't too high to begin with, but I certainly noticed the added competition as it became harder to find a job. I stopped "fighting" for a high-paying role, so my experience didn't provide net gains, but it still protected me from inflation. The net "gains" rather came from spending less by moving from rent to a mortgage and then paying it down.
I'm OK with this now, it is what it is, but these years weren't smooth as there were ups and downs and a down after an up can be stressful if you're not ready for it.
nice first-principles analysis, but little connection to material reality; you're looking back at a mere 1-2 years. it is the outsourcing of dev labor that has killed the domestic market. engineering roles have been moving overseas (latin america, southern europe, india). any american dev now will have international colleagues, something that was rarely the case 15 years ago.
not to say AI tools don't contribute - they lower the bar of entry to the profession, after all - but any C-suite exec or hiring manager is arbitraging labor costs far more than AI subscriptions.
Ah the threat of no work in order to depress wages. A scenario we’ve seen play out time and again. Typical capitalism.
That doesn’t mean we should accept mediocrity. Businesses might not care. Few businesses have bought a product based on how many lines of code it has or how easy the code is to maintain.
Even after building software for businesses for nearly 3 decades, it was apparent early on that they don’t care. It has always been a point of contention: the struggle between shipping now, faster, and making sure we ship the right thing and do it well. We had to learn when to give ground and when to pull hard… because in the end there are times when it absolutely matters.
Just because business can’t recognize when it’s about to shoot itself in the foot doesn’t mean we should let them.
This has been the excuse of mediocre developers for decades too. It’s how we ended up with sloppy code in production. Terminals that can’t scroll without flickering or handle much data. Apps that show loading screens on supercomputers. Software that only sometimes works. Ship fast and break stuff.
“The actual fundamentals, the things-in-themselves, the theory behind the action” don’t go away, they change.
Programmers used to work with punch cards, then assembly, then low-level languages with odd quirks. Today few developers even think about first-party code size, micro-optimizations, register allocation, etc. LLMs are just another abstraction.
A developer with the ideal AI code writer (which we’re not at yet) must still think about idea, design, scope, etc. like a product owner or manager. And these concepts have theory, sometimes even math (e.g. time complexity).
EDIT to comment on the article: all abstractions are leaky, but often it barely matters. Today we do still need to understand code quality and architecture when working with LLMs, or the software will get bad enough that it affects the company. But maybe not next year. An analogy: stack vs heap, memory allocations, etc. still matter in high-performance software, which isn’t uncommon, but programmers almost never think about register allocation.
LLMs are not another abstraction. ALL OTHER LAYERS you named are fully deterministic, understood, debuggable, etc.
You cannot be serious.
Counter-point: most developers have neither the ability nor the eagerness to actually do that debugging, so it doesn't really matter.
It DOES matter, because the claim that LLMs are a layer of abstraction implies they're somehow more than random word generators. They do a great job of generating words in the right order, and often, given enough time, datacenter resources, money, and training, they can produce code that runs and does what's expected.
However, there is absolutely nothing stopping an LLM from "deciding" tomorrow that a fix it built a week ago is no longer real, because not only has that fix left its context, but also the bug was not obvious.
LLMs are one of the most general abstractions possible.
LLMs are also quite deterministic if you want them to be - generally, their final token selection is deliberately randomized, controlled by the model’s “temperature”. But the word you’re looking for here is probably not actually determinism; it’s probably something closer to predictability.
In any case, it’s perfectly possible to ensure that the output of LLMs is fully deterministic, debuggable, understandable, and testable.
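To make the temperature point concrete, here's a minimal, hypothetical sketch of the sampling step (not any particular vendor's implementation): at temperature zero, decoding collapses to a greedy argmax, so the same logits always yield the same token; any temperature above zero injects randomness.

    import math, random

    def sample_token(logits, temperature=1.0, rng=random):
        # temperature == 0: greedy argmax, fully deterministic.
        if temperature == 0:
            return max(range(len(logits)), key=lambda i: logits[i])
        # Otherwise: temperature-scaled softmax sampling (randomized).
        scaled = [x / temperature for x in logits]
        m = max(scaled)  # subtract the max for numerical stability
        exps = [math.exp(s - m) for s in scaled]
        total = sum(exps)
        r, cum = rng.random(), 0.0
        for i, e in enumerate(exps):
            cum += e / total
            if r < cum:
                return i
        return len(exps) - 1

    logits = [2.0, 1.0, 0.5]
    print(sample_token(logits, temperature=0))    # always 0
    print(sample_token(logits, temperature=1.0))  # varies run to run

(Hosted APIs expose the same knob, though other sources of nondeterminism, like batching and floating-point reduction order, can remain even at temperature 0.)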
> You cannot be serious.
I don’t think you’re thinking about this clearly.
With a sufficiently complex prompt and a sufficiently complex codebase, LLMs consistently fail and make mistakes, "forget" parts of the prompt, etc.
There's no comparison to be made between this and, for example, a compiler. It's an incompetent comparison.
> I don’t think you’re thinking about this clearly.
My literal job is dealing with layers of abstraction. I'm thinking pretty clearly when I tell you that, not only are LLMs a super leaky, terrible abstraction, they are also not comparable to any other layers of abstraction. All other layers of abstraction we use are well understood, predictable (as you put it), and DEBUGGABLE.
When Claude deletes a fix it made two weeks ago, while trying to fix some unrelated error, do you never stop and think "this is not quite the same as what GCC does"?
This isn’t harsh at all. As I’ve commented before (though this time as well I don’t have the receipts/links), it’s been reported that highly paid programmers in the US also brought in a ton of profit; it was not at all the case that their employers had thin profit margins because the labor was expensive to them. We’re talking one million USD in profit for a 100K USD salary.
They didn’t even earn anything close to what they were worth. According to Marx’s Labor Theory of Value, anyway.
However the dice fall now, one of the possible outcomes is that the tech billionaires take that 100K USD for themselves. The very deserving individuals whose job is to sit their arses on automation assets.
Meanwhile workers from other sectors can gloat about how they are now in the same boat as them. The boat of accepting your ever-meagre reality.
> that highly paid programmers in the US also brought in a ton of profit
In Germany for instance I've seen many a company that treated their programmers as a cost center and they actually were (probably a mutually reinforcing self-fulfilling prophecy).
Too many instances of programmers being deployed in such a way that I couldn't see how they would ever earn back even the meagre investment being made in them. Fully irrational dev teams doing useless busy work.
Most German "startups" used to be replaceable with Zapier and Pipedrive. That has probably only gotten worse with the advent of LLMs.
Or the margins shrink significantly as the space becomes more mature and competitive, and that surplus mostly goes away.