When I was starting out as a professional web developer 25 years ago, I had a friend who had retired from NASA and had worked on Apollo.

I asked him, “How did you deal with bugs?” He chuckled and said, “We didn’t have them.”

The average modern AI-prompting, React-using web developer could not fathom making software that killed people if it failed. We’ve normalized things not working well.

There's a different level of 'good enough' in each industry, and that's normal. When the worst a bad site can do is cost you revenue (or just a lost free user), you have less motivation to get it right than when a living human has to come back in one piece.

Yes, of course, but a culture of “good enough” can go too far. You may work in a lower-risk context, but you can still learn a lot from robust architectural thinking: handling edge cases, security, and more.

Low quality for a shopping cart feels fine until someone steals all the credit card numbers.

Likewise, perfectionism where it isn't needed can grind a team to a halt for no reason. In most cases the right balance is somewhere in the middle, and it should shift toward 100% correctness as the consequences get more dire.

This is not to say your code should be a buggy mess, but being 98% bug-free while pushing features as a SaaS product is certainly better than being 100% bug-free and losing ground to competitors.

True, though I'd say it's more about bug impact than bug-freeness. If that 2% of bugs sits in the most critical area of your app and drives users to abandon your product, then you're losing ground anyway.

That's one thing worth learning from mission-critical architecture: an awareness of the impact and risk tolerance of your code and its bugs, which in turn means an awareness of how, and in what context, users will actually use the software.