I don't want to sound too dismissive, but all these arguments have been brought up time and again. The move from assembler to high level languages. The introduction of OOP. Component architecture / COM / CORBA / etc. The development of the web browser. The introduction of Java.

2018 isn't "the start of the decline", it's just another data point on a line that leads from, y'know, Elite 8-bit on a single tape in a few Kb through to MS Flight Simulator 2020 on a suite of several DVDs. If you plot the line it's probably still curving up and I'm not clear at which point (if ever) it would start bending the other way.

We have always had, and always will have, the quality of software that people are willing to pay for.

That would be the case under market conditions where buyers are making rational decisions with perfect knowledge based on all available choices. Does that sound like the system we have? To me, reality seems more like a small set of oligopolies or effective monopolies, byzantine ownership structures, and a pursuit of short-term profits that pushes future costs elsewhere as externalities.

I didn't say we get the quality of software people would rationally pay for in a rational system, if the right people were paying for it. I said we get the quality of software that people pay for.

To me, in markets where the customer actually gets to choose what to buy or play, the weaker options have much less success. Gaming is one example. There are still sales, but they are a lot lower than expected, even for big players, if the products don't look good.

This. There are plenty of people trying to keep using Windows 10, and Microsoft is trying to force them to use Windows 11, which they do not want. The same goes for Mac OS 26. "Choice" doesn't matter.

No. Not "willing," that implies that the options meaningfully exist. They don't.

"Willing AND ABLE" works here though.

> If you plot the line it's probably still curving up and I'm not clear at which point (if ever) it would start bending the other way.

I suspect when Moore's law ends and we cannot build substantially faster machines anymore.

One interesting thing that most non-systems programmers don't know is that memory and CPU performance have improved at completely different rates. That's a large part of why we have x times faster CPUs but software is still slow.

The systems people worry more about memory usage for this reason, and prefer manual memory management.

> ... memory and cpu performance have improved at completely different rates.

This is overly simplified. To a first approximation, bandwidth has kept pace with CPU performance, while main memory latency is basically unchanged. My 1985 Amiga had 125ns main-memory latency, though the processor itself saw 250ns latency; current main memory latencies are in the 50-100ns range. Caches are what 'fix' this discrepancy.

You would need to clarify how manual memory management relates to this... (cache placement/control? copying GCs causing caching issues? something else?)
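
To make the latency-vs-bandwidth point concrete, here's a rough Go sketch (my own illustration, not a rigorous benchmark; exact numbers depend entirely on the machine). It reads the same data twice, once sequentially and once in a shuffled order that defeats the prefetcher, and the gap between the two timings is mostly raw main-memory latency:

```go
package main

import (
	"fmt"
	"math/rand"
	"time"
)

func main() {
	const n = 1 << 24 // 16M elements; both slices are far larger than any cache
	data := make([]int64, n)
	order := rand.Perm(n) // random visit order defeats the hardware prefetcher

	var sum int64

	start := time.Now()
	for i := 0; i < n; i++ {
		sum += data[i] // sequential: the prefetcher streams data from RAM
	}
	seq := time.Since(start)

	start = time.Now()
	for _, i := range order {
		sum += data[i] // random: roughly one full memory latency per access
	}
	rnd := time.Since(start)

	fmt.Println(sum, "sequential:", seq, "random:", rnd)
}
```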

Moore's Law has been dead for a long time. The doubling rate of transistors is now drastically below Moore's prediction.

We're adding transistors at ~18%/year. That's waaaaay below the ~41% needed to sustain Moore's law.

Even the "soft" version of Moore's law (a description of silicon performance vs. literally counting transistors) hasn't held up. We are absolutely not doubling performance every 24 months at this point.

Moore's law has kind of ended already, and maybe did a few years ago. Even if you can make a faster chip, there's a basic thermodynamics problem with running it at full tilt for any meaningful period of time. I would have expected that to have impacted software development, and I don't think it particularly has; there's also no obvious gain in e.g. compilers or other optimizations that would have countered the effect.

Probably architecture changes (x86 has a lot of historical baggage that makes newer designs difficult) and also more specialized hardware in the CPU. This might also be one of the reasons Apple went this way with its M-series silicon.

But the machines aren't really "faster" in clock speed— for a long time now the gains have been in better and more local caching + parallelism at both the core and instruction level.

> parallelism at both the core and instruction level

Which most programs don't take advantage of.

Neural networks do, which is part of why they’re taking off right now.
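
For illustration, here's a minimal Go sketch of the core-level kind of parallelism: a plain reduction written serially and then split across cores. The names and sizes are mine, and whether the parallel version actually wins depends on core count and memory bandwidth (see the latency discussion above); instruction-level parallelism and SIMD happen below this level and aren't shown.

```go
package main

import (
	"fmt"
	"runtime"
	"sync"
)

// sumSerial is the plain single-core loop.
func sumSerial(xs []float64) float64 {
	var s float64
	for _, x := range xs {
		s += x
	}
	return s
}

// sumParallel splits the same work across one goroutine per CPU core.
func sumParallel(xs []float64) float64 {
	workers := runtime.NumCPU()
	partials := make([]float64, workers)
	chunk := (len(xs) + workers - 1) / workers

	var wg sync.WaitGroup
	for w := 0; w < workers; w++ {
		lo := w * chunk
		hi := lo + chunk
		if lo >= len(xs) {
			break
		}
		if hi > len(xs) {
			hi = len(xs)
		}
		wg.Add(1)
		go func(w, lo, hi int) {
			defer wg.Done()
			var s float64
			for _, x := range xs[lo:hi] {
				s += x
			}
			partials[w] = s // each worker writes only its own slot, so no locking
		}(w, lo, hi)
	}
	wg.Wait()
	return sumSerial(partials)
}

func main() {
	xs := make([]float64, 1<<24)
	for i := range xs {
		xs[i] = 1
	}
	fmt.Println(sumSerial(xs), sumParallel(xs))
}
```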

I blame software updates. That's when software went from generally working on release to not at all.

Agile management methods set up a non-existent release method called "waterfall" as a straw man, where software isn't released until it works, practically eliminating technical debt. I'm hoping someone fleshes it out into a real management method. I'm not convinced this wasn't the plan in the first place, considering that the author of Cunningham's law ("The best way to get the right answer on the Internet is not to ask a question; it's to post the wrong answer") was a co-signer of the Agile Manifesto.

It'll take a lot of work at first, especially considering how industry-wide the technical debt is (see also: https://xkcd.com/2030/), but once done, having release-it-and-forget-it quality software would be a game changer.

> a non-existent release method called "waterfall" as a straw man

The person that invented the name never saw it, but waterfall development is extremely common and the dominant way large companies outsource software development even today.

The only thing that has changed is that those companies now track the implementation of the waterfall requirements in scrum ceremonies. And yes, a few more places actually adopted agile.

> I blame software updates. That's when software went from generally working on release to not at all.

I agree. So much software these days treats users as testers and is essentially a giant test-in-production gaffe.

Ha. I was tasked to teach (classic) Project Management without being super familiar with it.

Then I had to get familiar with the new stuff: waterfall, agile, whatever.

They are literally all nothing but hacks that violate the basic points of actual project management (e.g. projects have a clear end).

I think another part of this is that Tech is perhaps the only industry that hasn't quite gotten over itself yet.

Writing code is artistic the same way plumbing is artistic.

Writing code is artistic the same way home wiring is artistic.

Writing code is artistic the same way HVAC is artistic.

Which is to say, yes, there is satisfaction to be had, but companies don't care as long as it gets the job done without too many long-term problems, and never will care beyond that. What we call tech debt, an electrician calls aluminum wiring. What we call tech debt, a plumber calls lead solder joints. And I strongly suspect that one day, when the dust settles on how to do things correctly (just like it did for electricity, plumbing, flying, haircutting, and every other trade eventually), we will become a licensed field. Every industry has had that wild experimentation phase in the beginning, and has had that phase end.

> Writing code is artistic the same way home wiring is artistic.

Instead of home wiring, consider network wiring. We've all seen the examples of datacenter network wiring, with 'the good' being neat, labeled and easy to work with and 'the bad' being total chaos of wires, tangled, no labels, impossible to work with.

I.e., the people using the datacenter don't care as long as the packets flow. But the others working on the network cabling care about it A LOT. The artistry of it is for the other engineers, only indirectly for the customers.

Perhaps. But put another way:

Writing code is artistic the same way writing text is.

Whether that is a function call, an ad, a screen script, a newspaper article, or a chapter in a paperback, the writer has to know what they want to communicate, who the audience/users will be, the flow of the text, and how understandable it will be.

Most professionally engaged writers get paid for their output, but many more simply write because they want to, and it gives them pleasure. While I'm sure the jobs can be both monetarily and intellectually rewarding, I have yet to see people who do plumbing or electrical work for fun?

> companies don't care as long as it gets the job done without too many long-term problems

Companies don't care as long as it gets the job done without too many VERY SHORT TERM problems. Long term problems are for next quarter, no reason to worry about them.

And they somewhat have a point. What's the point of code quality, if it delays your startup 6 months, and the startup goes under? What's the point of code quality, if it will be replaced with the newest design or architecture change in 6 months? What's the point of planning for 5 years if a pandemic or supply chain shock could muck it up? What's the point of enforcing beautiful JQuery code... in 2012?

The problem isn't that companies make these tradeoffs. It's that we pretend we're not in the same boat as every other trade that deals with 'good enough' solutions under real-world constraints. We're not artists, we're tradesmen in 1920 arguing about the best home wiring practices. Imagine what it would be like if they were getting artistic about their beautiful knob-and-tube installations and the best way to color-code a fusebox; that's us.

What in the bad rhetoric is this? The trades did and still do have standards.

Hell there was a whole TikTok cycle where people learned there is a right and wrong way to lay tile/grout. One way looks fine until it breaks, the other lasts lifetimes.

It's the exact same trend as in software: big, shitty home builders hire crap tradespeople to build cheap slop houses for suckers, houses that require extensive ongoing maintenance. Meanwhile there are good builders and contractors that build durable quality for discerning customers.

The problem is exploitation of information asymmetries in the buyer market.

> The trades did and still do have standards.

Yes, they do; after regulation, and after the experimentation phase was forcibly ended. You can identify 'right and wrong' tile work, precisely because those standards were codified. This only reinforces my point: we're pre-standardization, they're post-standardization, and most pre-standardization ideas never work out anyway.

For a startup good quality code will never make a difference if everything else is wrong, i.e. product market fit etc. But conversely poor quality code can destroy a startup (the product cannot pivot fast enough, feature development grinds to a halt, developers leave, customers are unsatisfied etc.) even if everything else is right.

I don't see working for most of my employers as "artistic."

I do see it as more of a craft than a typical trade. There are just too many ways to do things to compare it to e.g. an electrician. Our industry does not have (for better or for worse) a "code" like the building trades, or even any mandated way to do things, and any attempts to impose one (cough cough, Ada, etc.) have in fact been met with outright defiance and contempt.

When I'm working on my own projects -- it's a mix of both. It's a more creative endeavour.

> I do see it as more of a craft than a typical trade. There are just too many ways to do things to compare it to e.g. an electrician.

There are sooo many ways to get electricity from one point to another. The reason that a lot of those options are no longer used is not because they don't exist but because they were legislated out. For example, if you want to run wild just run a single "hot" wire to all your outlets and connect each outlet's neutral to the nearest copper plumbing. Totally esoteric, but it would deliver electricity to appliances just fine. Safety is another matter.

I don't see this as really disproving my point.

If we look at most trades historically:

- Electricians in the 1920s? Infinite ways to do things. DC vs AC wars. Knob-and-tube vs conduit vs armored cable. Every electrician had their own "creative" approach to grounding. Regional variations, personal styles, competing philosophies. Almost all of those other ways are gone now. Early attempts to impose codes on electricians and electrical devices were disasters.

- Plumbers in the 1920s? Lead vs iron vs clay pipes. Every plumber had their own joint compound recipe. Creative interpretations of venting. Artistic trap designs. Now? Why does even installing a basic pipe require a license? We found out after enough cholera outbreaks, methane explosions, and backed-up city sewer systems.

- Doctors in the 1920s? Bloodletting, mercury treatments, lobotomies, and their own "creative" surgical techniques. They violently resisted the American Medical Association, licensing requirements, and standardized practices. The guy who suggested handwashing was literally driven insane by his colleagues.

We're early, not special. And just like society eventually had enough of amateur electricians, plumbers, and doctors in the 1920s, they'll have enough of us too. Give it 40 years, and they'll look at our data breaches and system designs the same way we look at exposed electrical wiring: obviously insane, no matter how many warnings were given.

While I agree with the general point of treating coding as any other craft or trade skill, I disagree that in 40 years non-technical people will be able to evaluate system design or data breaches. Programming is too arcane and esoteric for non-technical people. It all happens too far behind the scenes for people to connect the dots.

I always say that code quality should be a requirement like any other. Many businesses are fine with rough edges and cut corners if it means things sort of work today rather than being perfect tomorrow. Other businesses have a lower tolerance for failure and risk.

If you haven't noticed a dramatic decline in average software quality, you're not paying attention or willfully ignoring it. The article is right.

This is partly related to the explosion of new developers entering the industry, coupled with the classic "move fast and break things" mentality, and further exacerbated by the current "AI" wave. Junior developers don't have a clear path at becoming senior developers anymore. Most of them will overly rely on "AI" tools due to market pressure to deliver, stunting their growth. They will never learn how to troubleshoot, fix, and avoid introducing issues in the first place. They will never gain insight, instincts, understanding, and experience, beyond what is acquired by running "AI" tools in a loop. Of course, some will use these tools for actually learning and becoming better developers, but I reckon that most won't.

So the downward trend in quality will only continue, until the public is so dissatisfied with the state of the industry that it causes another crash similar to the one in 1983. This might happen at the same time as the "AI" bubble pop, or they might be separate events.

Is this measurable? Like code readability scores on the GitHub corpus over time?

Maybe. Personally I've observed an increase of major system and security failures in the past 5 years, especially failures that impact very large tech companies. You could measure these public failures and see if frequency or impact has increased.

The number of security failures now is nothing close to the golden age of malware in the 90s/early 2000s.

The #1 security exploit today is tricking the user into letting you in, because attacking the software is too hard.

You make a strong point, but now we also have smartphones, IoT devices, and cloud networks EVERYWHERE, and there is tons of shared open source code (supply chain attacks), and there are tons of open-source attacker tools, vuln databases, and exploits (see nuclei on GitHub).

Yes, many/most systems now offer some form of authentication, and many offer MFA, but look at the recent Redis vulns: there are thousands of Redis instances vulnerable to RCE just sitting on the public internet right now.

Bah.

It's #1 because it's easier than the alternative. But the alternative is also not hard. It's just not worth the effort.

The complaint is not about the readability of the code but of the quality and cost effectiveness of the deployed software.

Code readability has nothing to do with it.

I suppose it could be quantified by the amount of financial damage to businesses. We can start with high-profile incidents like the CrowdStrike one that we actually know about.

But I'm merely speaking as a user. Bugs are a daily occurrence in operating systems, games, web sites, and, increasingly, "smart" appliances. This is also more noticeable since software is everywhere these days compared to a decade or two ago, but based on averages alone, there's far more buggy software out there than robust and stable software.

Eh, after 20 years in the industry, I think that the overall quality of software is roughly the same. Matter of fact, my first job had by far the worst codebase I ever worked on. A masterclass in bad practices.

One salient difference is that typically abstraction layers trade performance (usually less than polemicists like the article author think) for improvements in developer efficiency, safety, generality, and iteration speed.

Current tools seem to get us worse results on bug counts, safety, and by some measures even developer efficiency.

Maybe we'll end up incorporating these tools the same way we did during previous cycles of tool adoption, but it's a difference worth noting.

I don't think software has gotten worse, quite the opposite, but Java and OOP were mistakes.

Every time someone says this I ask them “what is your solution for maintainable software architecture?” And they say “what is software architecture? I just write code”

I’ll bite: use objects sparingly, and mainly to namespace functions that operate on data. Use inheritance even more sparingly, because it’s a nightmare to work with a poorly conceived inheritance hierarchy, and they’re so easy to get wrong. Pure interfaces are an exception, in languages where you need them. Write mostly functions that transform data. Push IO to the edge where it’s easy to swap out.

Most importantly, never ever abstract over I/O. Those are the abstractions that leak out and cause havoc.
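
For what it's worth, here's a minimal Go sketch of that shape: the interesting logic is a pure function over plain data, and the only code that touches the outside world is main. The word-count example and all names are made up purely for illustration.

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

// Pure transformation: no IO, trivially unit-testable.
func wordCounts(lines []string) map[string]int {
	counts := make(map[string]int)
	for _, line := range lines {
		for _, w := range strings.Fields(strings.ToLower(line)) {
			counts[w]++
		}
	}
	return counts
}

// IO lives at the edge, in one obvious place, and isn't hidden behind
// an abstraction the rest of the code has to care about.
func main() {
	var lines []string
	sc := bufio.NewScanner(os.Stdin)
	for sc.Scan() {
		lines = append(lines, sc.Text())
	}
	if err := sc.Err(); err != nil {
		fmt.Fprintln(os.Stderr, "read error:", err)
		os.Exit(1)
	}

	for w, n := range wordCounts(lines) {
		fmt.Println(w, n)
	}
}
```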

Yeah Go and Rust made the right choice of not supporting inheritance at all.

Non-OOP is pretty mainstream practice already, and it's maintainable. They even redid React. Java conceded a bit with newer features like lambdas.