In two decades or so of using Linux here and there, my only issues were Ubuntu forcing some very Microsoft-ish decisions on me, which I did not like, and the famously very stable Debian breaking on version upgrades (I have no idea why; I've been running mostly default Debian since forever). These days I mostly use Arch, plus Fedora on the shared computers I don't bother to configure to my liking, and both have been mostly flawless for years. There are things I don't like, but they are few and minuscule.

I used a MacBook Pro for over a decade, but I left macOS before this LiquidAss fiasco, so I can't really relate. Still, reading all these complaints about Windows and macOS, it looks like you guys' only issue with Linux is 'I did not make any effort to understand the system; I'll bring my weird, perverted Windows baggage and expect it to just work the same way.' Well, it won't. Take a weekend to research; take a month to play with Linux on some non-critical hardware (buy a used ThinkPad or ThinkCentre). I'd say Linux is quite ready for most things these days. Yes, not all hardware works well, but once you understand the reasons for that, you won't blame the community; you'll blame those who intentionally do nothing to make their own hardware work. I'm looking at you, Nvidia. And even they, it seems, have started doing something.

Having switched my desktops after almost a decade on macOS, it mostly feels like an upgrade. Even on a MacBook! I wish some software were better, but it's getting there slowly, even without my help.
Thanks for reinforcing my point