The cause and effect here are reversed. Ever-increasing abstraction does not reduce quality; an ever-shrinking emphasis on quality leads to greater abstraction. You only abstract away the nuances when you stop caring about them. When you have so much memory that you can tolerate a massive memory leak, fixing the leak gets deprioritized, and the tool that saves time at the expense of abstracting away memory management becomes more attractive.

The real issue, though, is that the costs of these abstractions are obscured after the fact. There is almost certainly low-hanging fruit that could massively improve performance and reduce operating costs with little effort, but there is no will to analyze and quantify those costs. Adding new features is sexy: it's the stuff people want to work on and the stuff people get promoted for. You don't get promoted for preventing the issue that never happened, even if doing so saves billions of dollars. You can't brag about making your software imperceptibly faster, even if it saves a tremendous amount of time across the user base.

Software has only ever needed to be good enough, and exponential hardware improvements have been lowering that threshold for as long as anyone can remember. The system is not set up to favor people who care, and if nobody cares, nothing will ever improve.

If there were a good way to track every time a fixed bug would have been triggered had it gone unfixed, and if that cost were quantified and visible, there would be a massive push for better quality.
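One way to imagine doing this: after fixing a bug, leave the old failure check in place as a counter so the "issue that never happened" becomes a visible number. Below is a minimal sketch of that idea in Python; the bug ID, cost estimate, and metrics plumbing are all hypothetical placeholders, not any particular team's practice.

```python
# Sketch: count how often a fixed bug *would* have triggered, so the value
# of the fix is quantifiable instead of invisible.
import logging
from collections import Counter

logger = logging.getLogger(__name__)
prevented_bug_hits = Counter()  # in a real system this would feed a metrics backend


def record_prevented_bug(bug_id: str, estimated_cost_usd: float) -> None:
    """Record one occasion on which the pre-fix code path would have failed."""
    prevented_bug_hits[bug_id] += 1
    logger.info(
        "bug %s would have triggered (est. cost $%.2f); total hits so far: %d",
        bug_id, estimated_cost_usd, prevented_bug_hits[bug_id],
    )


def handle_request(payload: dict) -> str:
    # The fixed code handles a missing "user" key gracefully...
    user = payload.get("user", "anonymous")
    # ...but we still note every time the old code would have crashed here.
    if "user" not in payload:
        record_prevented_bug("BUG-1234", estimated_cost_usd=0.50)  # hypothetical ID/cost
    return f"hello, {user}"


if __name__ == "__main__":
    handle_request({"user": "alice"})
    handle_request({})  # would have been a KeyError before the fix
```

Even a crude counter like this turns "we fixed a bug nobody will ever see" into "this fix fires a thousand times a day," which is exactly the kind of visibility the incentive system currently lacks.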