> People don't realize how much software engineering has improved.
It has, but we have gotten there by stacking turtles, by building so many layers of abstraction that things no longer make sense.
Think about the stack: hardware -> hypervisor -> VM -> container -> Python/Node/Ruby runtime, all so we can compile everything back down to bytecode that eventually runs on a CPU.
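To make the "compile it back down to bytecode" point concrete, here's a minimal Python sketch using the stdlib `dis` module. However many layers sit above it, CPython still lowers even a trivial function to bytecode for its own stack VM:

```python
import dis

# A trivial function: after every layer of abstraction above it,
# CPython still compiles this down to bytecode for its stack machine.
def add(a, b):
    return a + b

# Disassemble to see the instructions the interpreter actually runs.
dis.dis(add)

# The compiled code object (raw bytecode) lives right on the function.
print(add.__code__.co_code.hex())
```

The exact opcodes vary by Python version (e.g. `BINARY_ADD` in older interpreters, `BINARY_OP` in 3.11+), which is itself a reminder that the bytecode layer is an implementation detail most of us never look at.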
Some layers exist because of the push/pull between systems being single-user (PC) and multi-user (mainframe). We exacerbated the problem when "installable software" became a "hard problem" and we wanted to mix in "isolation".
And most of that software is written on another pile of abstractions. Most codebases have disgustingly large dependency trees. People keep talking about how "no one is reviewing all this AI generated code"... Well the majority of devs sure as shit aren't reviewing that dependency tree... Just yesterday there was yet another "supply chain attack".
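The scale of that unreviewed tree is easy to underestimate. Here's a sketch with a made-up dependency graph (the package names are invented for illustration; a real tree comes from your lockfile or package metadata and is usually far larger) showing how two "direct" dependencies quietly become eight packages you actually run:

```python
from collections import deque

# Hypothetical dependency graph: package -> direct dependencies.
# Names are invented for illustration only.
DEPS = {
    "myapp":        ["webframework", "httpclient"],
    "webframework": ["templating", "routing", "httpclient"],
    "httpclient":   ["urlparse", "tlswrap"],
    "templating":   ["markupsafe"],
    "routing":      [],
    "urlparse":     [],
    "tlswrap":      ["certbundle"],
    "markupsafe":   [],
    "certbundle":   [],
}

def transitive_deps(pkg, graph):
    """BFS the graph: everything you implicitly trust by installing pkg."""
    seen, queue = set(), deque(graph.get(pkg, []))
    while queue:
        dep = queue.popleft()
        if dep not in seen:
            seen.add(dep)
            queue.extend(graph.get(dep, []))
    return seen

print(len(DEPS["myapp"]))                   # 2 direct dependencies...
print(len(transitive_deps("myapp", DEPS)))  # ...8 packages you actually ship
```

Every node in that transitive set is code someone on your team is trusting without reading, and any one of them is a supply-chain entry point.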
How do you protect yourself from such a thing... stack on more software. You can't really use "sub repositories/modules" in git. It was never built that way because Linus didn't need that. The rest of us really do... so we add something like Artifactory to protect us from the massive pile of stuff that you're dependent on but NOT looking at. It's all just more turtles on more piles.
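The core idea behind that extra layer (Artifactory-style proxies, lockfile hash pinning) is simple: pin the exact bytes you vetted once, and refuse anything else. A minimal sketch, assuming the artifact contents and pinned hash shown are placeholders (in real life the hash comes from your lockfile at the moment the release was vetted):

```python
import hashlib

# Pinned hash of the release you (supposedly) reviewed. Computed here
# from sample bytes; normally it lives in a lockfile or proxy config.
PINNED_SHA256 = hashlib.sha256(b"release-1.2.3 contents").hexdigest()

def verify_artifact(data: bytes, expected_sha256: str) -> bool:
    """Return True only if the downloaded bytes match the pinned hash."""
    return hashlib.sha256(data).hexdigest() == expected_sha256

print(verify_artifact(b"release-1.2.3 contents", PINNED_SHA256))   # True
print(verify_artifact(b"release-1.2.3 + malware", PINNED_SHA256))  # False
```

Note what this does and doesn't buy you: it guarantees you get the same bytes every time, but says nothing about whether those bytes were ever safe in the first place. More turtles.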
Lots of corporate devs I know are really bad at reviewing code (open source devs much less so). The PR code review process in many orgs amounts to finding the person who rubber-stamps and avoiding the people who only bikeshed. I suspect it's because we have spent the last 20 years on the leetcode interview, where memorizing algorithms and answering brain teasers was the filter. Not reading, reviewing, debugging, and stepping through code... Our entire industry is "what is the new thing", "next framework"-pilled because of this.
You are right that it got better, but we got there by doing all the wrong things, and we're going to have to rip a lot of things apart and "do better".