It's a simple, timeless, inescapable law of the universe that failures, while potentially damaging, are acceptable risks. The Pareto principle suggests that addressing only the most critical 20% of issues yields a disproportionate 80% of the benefit, while chasing the remaining bugs yields diminishing marginal returns.
We're seeing bugs in bigger slices because technology is, overall, a bigger pie. Full of bugs. The bigger the pie, the easier it is to eat around them.
Another principle at play might be "induced demand," most notoriously illustrated by widening highways, though it might just as well apply to the widening of RAM.
Are we profligate consumers of our rarefied, finite computing substrate? Perhaps, but the Maximum Power Transfer Theorem suggests that anything less than 50% waste heat would slow us down. What's the rush? That's above my pay grade.
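(For reference, since the 50% figure may look arbitrary: it's just the standard maximum-power-transfer derivation, sketched below in generic source/load terms; nothing here is specific to computing, it's only the analogy I'm leaning on.)

    % power delivered to a load R_L from a source V with internal resistance R_S
    P_L = \frac{V^2 R_L}{(R_S + R_L)^2}
    % efficiency: fraction of total power that reaches the load
    \eta = \frac{R_L}{R_S + R_L}
    % setting dP_L/dR_L = 0 gives R_L = R_S, at which point \eta = 1/2:
    % maximum power transfer coincides with exactly 50% waste heat in the source,
    % and pushing \eta above 1/2 necessarily means delivering less power, more slowly.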
I guess what I'm saying is that I don't see any sort of moral, procedural, or ideological decay at fault.
In my circles, QA is still very much a thing, only "shifted left" for tighter integration into CI/CD.
Edit: It's also worth reflecting on "The Mess We're In."[0] Approaches that avoid or mitigate the pitfalls common to writing software must be taught or rediscovered in every generation, or else wallow in the obscure quadrant of unknown-unknowns.
>… are acceptable risks.
Close. Failure-free is simply impossible, and whoever believes otherwise fails even harder and dies out.
This is not "acceptable", because acceptance implies a choice, and here there is none: no alternative, no way to refuse it. It is a fact of life, maybe even more so than gravity and mechanical friction.