I fully get that macOS is not perfect, but checking out "modern" Linux (like a customized Arch) is a bit underwhelming. It still looks to me like Linux 20 years ago. And I started with Linux in the mid 90s. Not much has changed or improved on the pure fundamentals. I guess it's fine if all you do is sit in a CLI or spend your day in web browsers.
Day to day macOS driving to me is an absolute joy (granted, I'm still on Sonoma).
I do a lot of work in terminals, but I also enjoy other apps, where that uniformity of Cocoa comes into play. And if you go deeper into Mach/Darwin, it's extremely powerful. In userland, everything from the launcher to dtrace, dynamic linker tricks, and low-level hooks. There are a lot of cool macOS APIs to experiment with, public or private: AppleScript/Automator, private frameworks like SkyLight (nifty!)
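To give a taste of the dynamic linker tricks I mean, here's a minimal sketch of dyld function interposition via the __DATA,__interpose section. The file name and log message are just illustrative, and SIP/hardened-runtime binaries ignore DYLD_INSERT_LIBRARIES, so this only works on unprotected targets:

```c
// interpose_open.c -- sketch only: log every open(2) made by a target process.
// Build: clang -dynamiclib interpose_open.c -o libinterpose.dylib
// Run:   DYLD_INSERT_LIBRARIES=./libinterpose.dylib ./some_unprotected_binary
#include <fcntl.h>
#include <stdarg.h>
#include <stdio.h>

// Replacement for open(2): log the path, then call through to the original.
// Calls made from inside the interposing image are not themselves interposed,
// so calling open() here reaches the real libc implementation.
static int my_open(const char *path, int flags, ...) {
    mode_t mode = 0;
    if (flags & O_CREAT) {            // mode argument only exists with O_CREAT
        va_list ap;
        va_start(ap, flags);
        mode = (mode_t)va_arg(ap, int);
        va_end(ap);
    }
    fprintf(stderr, "[interpose] open(\"%s\", 0x%x)\n", path, flags);
    return open(path, flags, mode);
}

// dyld reads the __DATA,__interpose section at load time and rebinds every
// reference to `open` in the other loaded images to `my_open`.
__attribute__((used, section("__DATA,__interpose")))
static const struct { const void *replacement; const void *replacee; } interposers[] = {
    { (const void *)my_open, (const void *)open },
};
```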
Oh and don't get me started on MLX...
To me, as a developer and as a power user, macOS delivers everything - and more.
Sometimes it does change, and when that happens it's for the worse.
Some developers suddenly realize that some system is old, and then they try to redo it from scratch.
And when they do that, they throw decades of feature development down the drain:
- Xorg: Was Wayland worth the 10+ years of manpower needed to catch up?
- Synaptics: Now we have libinput, less configurable and with way fewer features
- Gnome: Something that happens when the devs think "If Apple can, then we can too" but without the money to invest in good UX (Gnome2 had actual UX research done by Sun)
- Systemd: I'll concede that nobody liked System V. But we also had OpenRC, which strangely got ignored.
Sometimes "developercracy" is terrible, and we spend years arguing if Rust or Not, instead of trying to make good software
I agree with every point made, except there are two caveats:
1) I am a bona fide systemd hater, and I am bent out of shape about the fact that other init systems (more akin to SMF) were (and are) routinely ignored when discussing what was available. But: I feel like Linux desktops are better now for systemd, even if, personally, I can’t tolerate how it spiders into everything.
2) Wayland was a “We have pushed X as far as it will go, and now we’re going to have to pay down our tech debt” moment by the X11 developers themselves.
I know it was “baby with the bathwater”, but in theory we don’t need to do that again for the next 50 years because we have a significantly better baseline for how computers are actually used. The performance ceiling has been lifted because of Wayland; consistent support for multiple monitors and fractional scaling are things we have today because of Wayland.
I won’t argue about security, because honestly most people seem to want as little security as possible if it infringes on software that used to work a certain way. But it should be mentioned at some point that a lack of security posture eventually leads to a pretty terrible experience.
So, yes, Wayland was worth the 10y cost, because the debt was due and with interest. Kicking the can down the road would most likely kill desktop Linux eventually.
> because we have a significantly better baseline for how computers are actually used.
Except, we don't. X was device-agnostic. Wayland makes some assumptions which will be wrong in 10 years. And being a monolith does not help.
this "device-agnosticism" is also the source of many of X11's modern problems. Because the X server has to handle all rendering and input, it acts as a middleman. For every frame, an application has to send rendering commands to the X server, which then composites the scene and sends it to the display. This extra step introduces latency and makes it difficult to implement modern features like smooth animations, variable refresh rates, and HDR. In contrast, Wayland's design is based on the idea of direct rendering. The application renders a frame directly to a buffer in memory, and then the Wayland compositor takes this finished buffer and displays it. This approach is highly efficient for modern GPUs and display technology. The trade-off is that it ties the display protocol more closely to the graphics hardware, but this is a necessary step to achieve the high performance and low latency that modern users expect.
"Wayland makes some assumptions which will be wrong in 10 years."
This is a fair and common criticism. Yes, Wayland assumes a graphics stack that is based on OpenGL/Vulkan and a kernel with a Direct Rendering Manager (DRM). This works well today because modern Linux graphics drivers are built around this model.
However, an X11 advocate might argue that this tight coupling could be a problem if a new, fundamentally different type of display technology or graphics hardware emerges. With its modular design, X11 could theoretically adapt by adding new extensions.
Wayland developers have addressed this by keeping the core protocol simple and extensible. New features, like HDR or adaptive sync, are implemented as extensions to the base protocol. The hope is that this design allows Wayland to evolve without the bloat and complexity that burdened X11. While it's impossible to predict the future, Wayland's developers believe that its modular design is flexible enough to handle future changes in display technology.
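For what it's worth, the extension mechanism is easy to see from a client's point of view: everything beyond the core protocol is advertised by the compositor as an extra global, which clients discover at runtime and bind only if they understand it. Here's a minimal sketch, assuming libwayland-client and a running Wayland session:

```c
// list_globals.c -- print every interface the running compositor advertises.
// Build: cc list_globals.c $(pkg-config --cflags --libs wayland-client) -o list_globals
#include <stdio.h>
#include <wayland-client.h>

// One callback per advertised global. Core objects (wl_compositor, wl_shm, ...)
// and optional extensions (e.g. wp_fractional_scale_manager_v1, wp_presentation)
// all arrive through the same mechanism; clients bind only what they understand.
static void on_global(void *data, struct wl_registry *registry,
                      uint32_t name, const char *interface, uint32_t version) {
    (void)data; (void)registry; (void)name;
    printf("%-45s v%u\n", interface, version);
}

static void on_global_remove(void *data, struct wl_registry *registry, uint32_t name) {
    (void)data; (void)registry; (void)name;   // not needed for a one-shot listing
}

static const struct wl_registry_listener listener = { on_global, on_global_remove };

int main(void) {
    struct wl_display *display = wl_display_connect(NULL);   // uses $WAYLAND_DISPLAY
    if (!display) { fprintf(stderr, "no Wayland display\n"); return 1; }

    struct wl_registry *registry = wl_display_get_registry(display);
    wl_registry_add_listener(registry, &listener, NULL);
    wl_display_roundtrip(display);            // block until all globals are announced

    wl_registry_destroy(registry);
    wl_display_disconnect(display);
    return 0;
}
```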
Which I think is fair.
> introduces latency and makes it difficult to implement modern features like smooth animations,
Move X [Wayland] into kernel space [provided stability isn't a concern].
It's easy to type that out, of course.
I backed up and upgraded to Tahoe earlier today so I could fully immerse myself in the disaster of a release that is being reported here today. After a half day with Tahoe I simply cannot understand all the drama.
It's fine. Maybe a bit more of a departure UI-wise and not as polished as the previous release, but whatever; I don't see how this would cause anyone to throw in the towel on Apple and move to Linux. I've used both for years and would never choose to use Linux exclusively or even primarily.
Yeah... that's my take so far. There are some issues here and there, but the incredible ranting that's been going on seems overblown.
> It still looks to me like Linux 20 years ago.
I know this seems like a downside to you, but the person you are replying to notes this as something they love about the platform. It not changing over time "just to change" is the point.
I love(d) Linux and I've used it a lot over the years, but I finally got fed up and bought a Macbook Pro a few weeks ago. I find myself fiddling with my Linux machine way more than I'd like. I'm sure whatever distro I was using (Ubuntu) will seem like the problem to someone who likes another distro, but that's just more fiddling to me. It took me a while to get to where Zoom video calls with screen sharing worked properly. The last straw was that my wifi card stopped working one day. Debugging OS issues with no internet is really hard.
I've got my dev environment set up on my new Macbook Pro and everything is working perfectly and I'm very happy.
> It still looks to me like Linux 20 years ago. And I started with Linux in the mid 90s. Not much has changed or improved on the pure fundamentals.
Point taken, but that is exactly the quality I said I liked about it. I hope that 20 years from now my desktop will be exactly the same. The disjoint UI bothers me to an extent. I mostly use KDE apps or things built with Qt, but you're right that nothing is uniform. That said, I'd take disunity if it means stability. I don't care if the buttons in different apps look different, but don't take them away. Just look at what they did to Mail.app: in 2010 it was beautiful. Last I used it, in 2020, it seemed like all of its power-user features were gone or hidden, and everything was under a dot-dot-dot menu instead of out on the toolbar.
> Not much has changed or improved on the pure fundamentals
I'm not sure what you're trying to say here. What DE are you talking about? Or are you talking about something else?
Depends on the role. I build AI agents for a living, and for that edge case Linux is better.
> It still looks to me like Linux 20 years ago.
Like visually? I personally don't care much for animations, transitions, or rounded corners (that last one I actually hate, because you can't even disable them on a Mac). I'm not a florist, I'm a programmer. I want efficiency, not looks, bells, and whistles. Although I recently started using Hyprland, and oh my, those window animations and transitions are super nice, not to mention that you can completely control every aspect of them.
> Oh and don't get me started on MLX...
You mean the pittance Apple gives you because they refuse to sign CUDA drivers? That MLX?
If you're into LLMs, you should consider using Apple Silicon so you can keep developing on your own machine with MLX. MLX isn't just a replacement for CUDA, no way; it's made for the Apple architecture. The new M5 chip will likely have a serious ANE and the ability to use the unified memory. Then we're talking powerhouse. Current Mac Studios are already more cost-effective than, say, an RTX 6000.
That's just a unified memory architecture. Strix Halo has this too; it's not only Apple. Memory bandwidth is lower than the Ultra series but faster than or on par with every other M-series processor.