There's almost no product or piece of software that I use today that doesn't have at least two bugs I run into on a daily basis. Every website, web app, mobile app, console app, etc. — they all have clearly user-affecting bugs. And nearly every one of them makes it hard for me to diagnose or report those bugs. I spend at least 15 to 30 minutes every day working around bugs just so I can live my life.

We have a vastly different software culture today. Constant, churning change is treated as superior to all else. I can't go two weeks without a mobile app forcing me to upgrade it so that it will keep operating. My Kubuntu 24.04 LTS box somehow has a constant stream of updates even though I've double-checked I'm on the LTS apt repos. Rolling-release distros are an actual thing people use intentionally (we used to call that the unstable branch).

I could speculate on specifics, but I'm not a software developer, so I don't see exactly what's going on with these teams. But software didn't use to be made or used this way. It felt like there were more adults in the room who would avoid making decisions that would clearly lead to problems. I think the values have changed to accept or ignore those problems. (I don't want to jump to the conclusion that "they're too ignorant to even know what potential problems exist," but it's a real possibility.)

I was a software developer, and I have some idea where this comes from. Keeping track of multiple versions of software, so that bug fixes stay separate from new features, is hard; calling whatever is in version control on the first Friday of every month "version N+1" is easy. Back when users had to pay for a stack of floppies or CDs to get a new version, you had to give them a compelling reason to do so. When it's nearly impossible for them to prevent the new version from being auto-installed, you don't.