I used Mac for 10 years and started feeling the same irritation around 2017. In 2020 I finally bit the bullet and switched to Linux. The initial investment into getting a usable Arch desktop was horrible. It took several days just to get to the point of something that I could boot into and be productive. It takes me a long time to get things working the way I want, but I kind of enjoy that aspect of total customization. The best part is, though, nothing ever changes. I get things working the way I want, and it just stays that way year after year. No UI language updates, no replacing my default shell, nothing. It just keeps working the way I like.
Now if they could just produce a touchpad as good as a MacBook's, give me 8-10 hours of battery life, and make the construction feel slim and solid rather than like it's going to get crushed in my backpack, I'd be satisfied.
I fully get that macOS is not perfect, but checking out "modern" Linux (like a customized Arch) is a bit underwhelming. It still looks to me like Linux 20 years ago. And I started with Linux in the mid 90s. Not much has changed or improved in the pure fundamentals. I guess it's fine if all you do is sit in a CLI or spend your day in web browsers.
Day to day macOS driving to me is an absolute joy (granted, I'm still on Sonoma).
I do a lot of work in terminals, but I also enjoy other apps, where that uniformity of Cocoa comes into play. And if you go deeper into Mach/Darwin, it's extremely powerful. In userland there's everything from the launcher to dtrace, dynamic linker tricks, and low-level hooks. There are a lot of cool macOS APIs to experiment with, public or private: AppleScript/Automator, private frameworks like SkyLight (nifty!)
Oh and don't get me started on MLX...
To me, as a developer and as a power user, macOS delivers everything - and more.
Sometimes it does change, and when that happens it's for the worse.
Some developers suddenly realize that system X is old, and then they try to rewrite it from scratch.
And when they do that, they throw decades of feature development down the drain:
- Xorg: Was Wayland worth the 10+ years of manpower needed to catch up?
- Synaptics: Now we have libinput, less configurable and with way fewer features
- Gnome: Something that happens when the devs think "If Apple can, then we can too" but without the money to invest in good UX (Gnome2 had actual UX research done by Sun)
- Systemd: I'll concede that nobody liked SysV init. But we also had OpenRC, and it strangely got ignored.
Sometimes "developercracy" is terrible, and we spend years arguing over Rust-or-not instead of trying to make good software.
I agree with every point made, except there are two caveats:
1) I am a bona fide systemd hater, and I am bent out of shape about the fact that other init systems (more akin to SMF) were (and are) routinely ignored when discussing what was available. But: I feel like Linux desktops are better now for systemd, even if I personally can't tolerate how it spiders into everything.
2) Wayland was a “We have pushed X as far as it will go, and now we’re going to have to pay down our tech debt” by the X11 developers themselves.
I know it was “baby with the bathwater”, but in theory we don’t need to do that again for the next 50 years because we have a significantly better baseline for how computers are actually used. The performance ceiling has been lifted because of Wayland; consistent support for multiple monitors and fractional scaling are things we have today because of Wayland.
I won’t argue about security, because honestly most people seem to want as little security as possible if it infringes on software that used to work a certain way, but it should be mentioned at some point that a lack of security posture leads to a pretty terrible experience eventually.
So, yes, Wayland was worth the 10y cost, because the debt was due and with interest. Kicking the can down the road would most likely kill desktop Linux eventually.
> because we have a significantly better baseline for how computers are actually used.
Except they don't. X was device-agnostic. Wayland makes some assumptions which will be wrong in 10 years. And being a monolith does not help.
This "device-agnosticism" is also the source of many of X11's modern problems. Because the X server has to handle all rendering and input, it acts as a middleman. For every frame, an application has to send rendering commands to the X server, which then composites the scene and sends it to the display. This extra step introduces latency and makes it difficult to implement modern features like smooth animations, variable refresh rates, and HDR.

In contrast, Wayland's design is based on the idea of direct rendering. The application renders a frame directly to a buffer in memory, and then the Wayland compositor takes this finished buffer and displays it. This approach is highly efficient for modern GPUs and display technology. The trade-off is that it ties the display protocol more closely to the graphics hardware, but this is a necessary step to achieve the high performance and low latency that modern users expect.
"Wayland makes some assumptions which will be wrong in 10 years."
This is a fair and common criticism. Yes, Wayland assumes a graphics stack that is based on OpenGL/Vulkan and a kernel with a Direct Rendering Manager (DRM). This works well today because modern Linux graphics drivers are built around this model.
However, an X11 advocate might argue that this tight coupling could be a problem if a new, fundamentally different type of display technology or graphics hardware emerges. With its modular design, X11 could theoretically adapt by adding new extensions.
Wayland developers have addressed this by keeping the core protocol simple and extensible. New features, like HDR or adaptive sync, are implemented as extensions to the base protocol. The hope is that this design allows Wayland to evolve without the bloat and complexity that burdened X11. While it's impossible to predict the future, Wayland's developers believe that its modular design is flexible enough to handle future changes in display technology.
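For concreteness, Wayland extensions are plain XML protocol definitions that tools like wayland-scanner turn into language bindings; here is a minimal hypothetical sketch (the protocol, interface, and request names below are invented for illustration, not a real extension):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<protocol name="example_hdr_hints_v1">
  <!-- Hypothetical extension: lets a client attach HDR metadata
       to a surface without touching the core protocol. -->
  <interface name="example_hdr_hints_v1" version="1">
    <request name="set_peak_luminance">
      <arg name="surface" type="object" interface="wl_surface"/>
      <arg name="nits" type="uint" summary="peak luminance in cd/m2"/>
    </request>
  </interface>
</protocol>
```

A compositor that doesn't implement an extension simply never advertises it in the registry, so clients can fall back gracefully; that's the mechanism that lets the core protocol stay small.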
Which I think is fair.
> introduces latency and makes it difficult to implement modern features like smooth animations,
Move X [Wayland] into kernel space [provided stability isn't a concern].
It's easy to type that out, of course.
I backed up and upgraded to Tahoe earlier today so I could fully immerse myself in the disaster of a release that is being reported here today. After a half day with Tahoe I simply cannot understand all the drama.
It's fine. Maybe a bit more of a departure UI-wise and not as polished as the previous release, but whatever; I don't see how this would cause anyone to throw in the towel on Apple and move to Linux. I've used both for years and would never choose to use Linux exclusively or even primarily.
Yeah... that's my take so far. There are some issues here and there, but the incredible ranting that's been going on seems overblown.
> It still looks to me like Linux 20 years ago.
I know this seems like a downside to you, but the person you are replying to notes this as something they love about the platform. It not changing over time "just to change" is the point.
I love(d) Linux and I've used it a lot over the years, but I finally got fed up and bought a Macbook Pro a few weeks ago. I find myself fiddling with my Linux machine way more than I'd like. I'm sure whatever distro I was using (Ubuntu) will seem like the problem to someone who likes another distro, but that's just more fiddling to me. It took me a while to get to where Zoom video calls with screen sharing worked properly. The last straw was that my wifi card stopped working one day. Debugging OS issues with no internet is really hard.
I've got my dev environment set up on my new Macbook Pro and everything is working perfectly and I'm very happy.
> It still looks to me like Linux 20 years ago. And I started with Linux in the mid 90s. Not much has changed or improved on the pure fundamentals.
Point taken, but that is exactly the quality I said I liked about it. I hope that 20 years from now my desktop will be exactly the same. The disjoint UI bothers me to an extent. I mostly use KDE apps or things built with Qt, but you're right that nothing is uniform. That said, I'd take disunity if it means stability. I don't care if the buttons in different apps look different, but don't take them away. Just look at what they did to Mail.app--in 2010 it was beautiful. Last I used it in 2020 it seemed like all the power user features of it were gone or hidden and everything was under a dot dot dot menu instead of out on the toolbar.
> Not much has changed or improved on the pure fundamentals
I'm not sure what you're trying to say here. What DE are you talking about? Or are you talking about something else?
Depends on the role. I build AI agents for a living, and for this edge case Linux is better.
> It still looks to me like Linux 20 years ago.
Like visually? I personally don't care much for animations, transitions, or rounded corners (that last one I actually hate, because you can't even disable them on a Mac). I'm not a florist, I am a programmer. I want efficiency, not looks, bells, and whistles. Although I recently started using Hyprland, and oh my, those window animations and transitions are super nice, not to mention that you can completely control every aspect of them.
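To illustrate that last point, window animations in Hyprland are configured entirely in hyprland.conf; here's a rough sketch based on the wiki's example (option names can shift between versions, so treat this as an approximation):

```ini
animations {
    enabled = true
    # custom cubic bezier curve, then per-category animation rules:
    # name, on/off, speed (in 100ms units), curve[, style]
    bezier = myBezier, 0.05, 0.9, 0.1, 1.05
    animation = windows, 1, 7, myBezier
    animation = windowsOut, 1, 7, default, popin 80%
    animation = fade, 1, 5, default
    animation = workspaces, 1, 6, default
}
```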
> Oh and don't get me started on MLX...
The pittance Apple gives you because they refuse to sign CUDA drivers? That MLX?
If you're into LLMs, you need to consider using Apple Silicon so you can continue developing on your own machine with MLX. MLX isn't just a replacement for CUDA, no way - it's made for the Apple arch. The new M5 chip will likely have a serious ANE and the ability to use the unified memory. Then we're talking powerhouse. Current Mac Studios are already more cost-effective than, say, an RTX 6000.
That's just a unified memory architecture. Strix Halo has this too; it's not only Apple. Memory bandwidth is lower than the Ultra series but faster than or on par with every other M-series processor.
> The initial investment into getting a usable Arch desktop was horrible.
It is my conviction that very few should go down the Arch route. If you want to sysadmin Linux or learn how to do so, fine. But if you want to do something else with your computer I'd strongly recommend looking into one of the https://universal-blue.org images (I use https://getaurora.dev btw).
These are based on atomic Fedora and my experience is that they offer extreme stability while still staying on the edge of development. Could we call it NixOS for mere mortals? Probably not if you ask the Nix peeps. :)
> It is my conviction that very few should go down the Arch route. If you want to sysadmin Linux or learn how to do so, fine.
I don't think sysadmin is fair, but certainly it's true that a lot of 'how do I do the equivalent of Windows/macOS built-in foobar' questions will have the answer 'well this is a non-exhaustive list of possible things you could install to do that'.
Which is to say that first time around it's almost inevitably going to be a lot of setup. But then it won't change, or when it does it will just be whichever puzzle piece changed - not 'Arch reimagined everything with Liquid Glass'.
>These are based on atomic Fedora and my experience is that they offer extreme stability
These were my hopes. Up until a new update introduced something that broke my nvidia drivers "integration".
After a few days I decided to try to update the system once more (which killed the oldest snapshot) and I was left with the system that can only be run in 1024 mode. I've tried every suggestion from the web to no avail.
It's been working well with my NVIDIA-equipped laptop and a desktop machine with a Radeon GPU. I would never buy NVIDIA for Linux use again, though. GPU support needs to be in the kernel, so Arc or Radeon it is.
I attempted to make an ostree+Nix distro a while back, but ublue didn't exist at the time and the build documentation was a masterclass in obfuscation.
Immutable is definitely the future.
For better or worse, I'm exactly the person who should go down this route.
:D
If you gotta do it, you gotta do it.
> The best part is, though, nothing ever changes.
Wasn’t true when they switched to systemd, or when KDE 4 came out, or when the new Gnome came out, or when the kernel renamed Ethernet interfaces to enps-whatever.
Or when they switched from applications requesting exclusive access to ALSA audio devices to using sound servers for mixing, or when Xorg autoconfiguration was introduced (obviating manual Xorg.conf creation), or when the modesetting DDX replaced vendor specific DDX packages, or when Wayland was introduced with full backward compatibility with Xorg via XWayland. I suppose that last one is more of a lack of change.
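As an aside, the interface-renaming example is one of the few that's trivially reversible: the predictable-name scheme comes from systemd/udev and can be disabled from the kernel command line. A sketch assuming a GRUB-based system:

```shell
# /etc/default/grub -- revert to classic eth0-style interface names
GRUB_CMDLINE_LINUX="net.ifnames=0 biosdevname=0"

# then regenerate the GRUB config, e.g.:
#   grub-mkconfig -o /boot/grub/grub.cfg
```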
> when Wayland was introduced with full backward compatibility with Xorg
you shouldn't believe everything you read on the internet.
Those are changes that were announced years before they happened, and you could find distributions still using the old stuff. That's not like unavoidable changes where you only have three years of holding on to the old stuff before being abandoned.
Some of us use Slackware, with fvwm.
My biggest gripe with macOS vs. Linux is that the former allows IT departments to completely control the machine. They overload these otherwise very capable machines with so much crap that things start to crawl; they enable FileVault and install antivirus and NGFW clients; they forcefully diminish the admin role to that of a regular user, forcing users to go through Admin-by-Request whenever they need to fart out anything sudo in the terminal.
I tried so hard to find reasons to like macOS, but frankly, if workplaces didn't force Macs, I would've totally chosen to use Linux. The only things I miss in Linux are the JXA/AppleScript automation engine and Hammerspoon, nothing else - I don't use their web browser, or their mail app, GarageBand, or other crap like iTunes, because frankly, they never felt to me like good solutions to specific problems, more like freeware to use before transitioning to better alternatives. Even the built-in terminal I use only to bootstrap Homebrew. Another good thing I should mention is that macOS really does set a good standard for accessibility features; even though I'm lucky not to have to rely on them, I'm sure many people do.
I honestly don't know how Apple has been getting away with so much crap for years - software developers are probably one of the biggest demographics of Mac users, and Apple keeps screwing them over, yet they stay loyal - partly because businesses force them to use Macs, partly because the alternatives suck even more - Linux ain't perfect and Windows is outright evil (really, I can't even rebind Win+L key on my computer? Fuck you MSFT!).
>The best part is, though, nothing ever changes. I get things working the way I want, and it just stays that way year after year. No UI language updates, no replacing my default shell, nothing. It just keeps working the way I like.
While Arch might make you safer by virtue of choice, some of the more "beginner friendly" distros aren't immune to changing things seemingly overnight. Ubuntu, for instance, dropped GNOME for Unity, which I still have bad memories of to this day.
Yeah, I didn't mean to write a "Linux is better" polemic, just that building my own Linux desktop works better for me. I also haven't used Gnome since ~2009 except briefly to discover that Gnome 3 was trash. Unity was a travesty and I also hated it when Ubuntu force fed me that.
You're right. Locking yourself into a distro, especially the more user-friendly ones, can get you into just as much of a dictatorship as macOS.
I use i3 on X11 like a neanderthal and mostly Qt/KDE apps. I'll switch to Sway and Wayland when things stop working.
I'd switch if I were you - well, I did, after holding out for what I thought was a long time. A lot of minor nuisances I hadn't gotten around to investigating with X just disappeared. I think you'd find that, by comparison, you're actually nearer 'not working' than you realise; my sway experience has been much smoother.
Out of curiosity, what kind of work do you do on Linux? Because some workflows are simply not possible on Linux due to missing applications.
I work on computer vision stuff. Most of my time is in the browser or on the terminal (Alacritty, tmux, neovim). I've actually become more of a GUI user since switching from macOS to Linux. I used to be way more die-hard about CLI-everything, but I love using `git gui` now. I think Dolphin is the superior file manager experience available to mankind. Okular is an excellent PDF reader, though it still leaves me missing Preview a tiny bit.
I'm fairly serious about photography and spend a lot of time editing and post-processing photos. This is a major shortcoming of my desktop, and I may end up getting a MacBook just for this purpose. digiKam is just okay for organizing photos, and RawTherapee is barely okay. I don't mind its ugly UI, but I'm discovering that even aside from UI considerations it just can't produce the results that Adobe can. Things like noise reduction are just not there.
I exclusively used macOS for about 8 years and now I'm on Bluefin, working on generic web development stacks... The last bit of software I'm missing on Linux is the Serif apps. Gaming (in my particular case) is sorted; I was shocked to install Steam on Bluefin and have almost my entire library available.
Some workflows are not possible on Windows, either. I avoid using Windows, thus I don't care about the pain of having to depend on it - I have zero interest in what's not possible somewhere else. What's your point?
The only thing from macOS I truly miss when in Linux is JXA/Applescript automation engine. That's the only thing I miss.