If you haven't noticed a dramatic decline in average software quality, you're either not paying attention or willfully ignoring it. The article is right.
This is partly related to the explosion of new developers entering the industry, coupled with the classic "move fast and break things" mentality, and further exacerbated by the current "AI" wave. Junior developers no longer have a clear path to becoming senior developers. Most of them will over-rely on "AI" tools due to market pressure to deliver, stunting their growth. They will never learn how to troubleshoot, fix, and avoid introducing issues in the first place. They will never gain insight, instincts, understanding, and experience beyond what is acquired by running "AI" tools in a loop. Of course, some will use these tools to actually learn and become better developers, but I reckon most won't.
So the downward trend in quality will only continue, until the public is so dissatisfied with the state of the industry that it causes another crash similar to the video game crash of 1983. This might happen at the same time as the "AI" bubble popping, or they might be separate events.
Is this measurable? Like code readability scores on the GitHub corpus over time?
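Something like this crude, purely illustrative sketch is what I have in mind: Python's stdlib ast module, with a cyclomatic-complexity-style score standing in for "readability". The script name and usage are made up for the example, and a serious study would have to control for language, vendored or generated code, project age, and so on.

```python
# complexity_snapshot.py -- illustrative only: average a crude
# cyclomatic-complexity-style score per function over a local checkout.
# Run it against checkouts from different dates (git checkout at a past
# commit) to get a rough trend line for one repo.
import ast
import sys
from pathlib import Path
from statistics import mean

# AST node types that roughly correspond to branch points.
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.Try, ast.With,
                ast.BoolOp, ast.ExceptHandler, ast.comprehension)

def function_complexity(func: ast.AST) -> int:
    """1 + number of branch-like nodes inside the function."""
    return 1 + sum(isinstance(node, BRANCH_NODES) for node in ast.walk(func))

def average_complexity(root: Path):
    """Mean per-function score across all parseable .py files under root."""
    scores = []
    for path in root.rglob("*.py"):
        try:
            tree = ast.parse(path.read_text(encoding="utf-8", errors="ignore"))
        except SyntaxError:
            continue  # skip files that don't parse (old Python, templates, ...)
        for node in ast.walk(tree):
            if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
                scores.append(function_complexity(node))
    return mean(scores) if scores else None

if __name__ == "__main__":
    # Usage: python complexity_snapshot.py path/to/checkout
    root = Path(sys.argv[1])
    avg = average_complexity(root)
    if avg is None:
        print(f"{root}: no Python functions found")
    else:
        print(f"{root}: mean per-function complexity = {avg:.2f}")
```

Whether a number like that actually tracks what people mean by "quality" is the harder question.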
Maybe. Personally, I've observed an increase in major system and security failures over the past five years, especially failures that impact very large tech companies. You could measure these public failures and see whether their frequency or impact has increased.
The number of security failures now is nothing close to what it was during the golden age of malware in the '90s and early 2000s.
The #1 security exploit today is tricking the user into letting you in, because attacking the software is too hard.
You make a strong point, but now we also have smartphones, IoT devices, and cloud networks EVERYWHERE, and there is a ton of shared open-source code (supply-chain attacks), plus tons of open-source attacker tools, vuln databases, and exploits (see nuclei on GitHub).
Yes, many/most systems now offer some form of authentication, and many offer MFA, but look at the recent Redis vulns: there are thousands of Redis instances vulnerable to RCE just sitting on the public internet right now.
Bah.
It's #1 because it's easier than the alternative. But the alternative is also not hard. It's just not worth the effort.
The complaint is not about the readability of the code but about the quality and cost-effectiveness of the deployed software.
Code readability has nothing to do with it.
I suppose it could be quantified by the amount of financial damage to businesses. We can start with high-profile incidents like the CrowdStrike one that we actually know about.
But I'm merely speaking as a user. Bugs are a daily occurrence in operating systems, games, websites, and, increasingly, "smart" appliances. This is also more noticeable because software is everywhere these days compared to a decade or two ago, but based on averages alone, there's far more buggy software out there than robust and stable software.
Eh, after 20 years in the industry, I think the overall quality of software is roughly the same. As a matter of fact, my first job had by far the worst codebase I ever worked on. A masterclass in bad practices.