The main problem with Vulkan isn't the programming model or the lack of features; those are being tackled by Khronos. The problem is with coverage and update distribution. It's all over the place! If you develop general-purpose software (like Zed), you can't assume that even basic things like dynamic rendering are supported uniformly. There are always weird systems with old drivers (looking at you, Ubuntu 22 LTS), hardware vendors abandoning and forcibly deprecating working hardware, and of course driver bugs... So, by the time I can rely on the shiny new descriptor heap/buffer features, I'll have more gray hair and other things on the horizon.
> Ubuntu LTS
This is why I try to encourage new Linux users away from Ubuntu: it lags behind on often-important functionality. It is now an enterprise OS (where durability is more important than functionality), and it's not really suitable for a power user (like someone who would use Zed).
My understanding with Mesa is that it has very few dependencies and is ABI stable, so freezing Mesa updates is counterproductive. I'm not sure about Snaps, but Flatpak ships as its own system managing Mesa versions.
> Flatpak ships as its own system managing Mesa versions.
Mixing and matching the kernel and the userspace Mesa components is subject to limitations. However, it will transparently fall back to software rendering, so you might not notice if you aren't doing anything intensive.
Relatedly, being a container, Flatpak has no choice but to ship the Mesa userspace component. If it didn't, nothing would work.
> My understanding with Mesa is that it has very few dependencies
Some of the shader compilers require LLVM which is a giant dependency to say the least. But with Valve's ACO for RADV I think that could technically be omitted.
" It is now an enterprise OS"
You really want enterprise standards support for your graphics API.
Bleeding edge is not nice in graphics. And the more complex the systems get, the more edge cases you hit.
I mean in general. If you are writing a high-end game engine, don't listen to me; you know better. But if you are a mid-tier graphics wonk like myself, 20-year-old concepts are usually quite Pareto-optimal for _lots_ of stuff and should be robustly covered by most APIs.
If I could give one piece of advice to myself 20 years ago:
For anything practical - focus on the platform-native graphics API. Windows - DirectX. Mac - OpenGL (20 years ago! This predates Metal; today of course it would be Metal).
I don't think that advice would be much different today (apart from Metal) IF you don't know what to do and just want to start doing graphics. For senior peeps who know the field, do whatever's right for you, of course.
Linux - good luck. Find the API that has the best support for your card & driver combo - meaning likely the most stabilized one with the most users.
You don't have to run LTS. There is a new release every 6 months.
Especially a 4 year old LTS. But I guess the point was that you will run into some users that do when you ship to the general audience.
You run into the same problem on other platforms too of course (eg Android)
I've been running Linux for a very long time.
Ubuntu has never ever been the most stable or useful distro. What it did have was apt and more up-to-date stuff than Debian.
I would never willingly choose Ubuntu if allowed other options (Fedora, Debian, maybe CoreOS, etc)
I have a lot of respect for Canonical for driving a distro that was very "noob friendly" in an ecosystem where that's genuinely hard.
But I mostly agree with you. Once you get out of that phase, I don't really see much value in Ubuntu. I'd pick pretty much anything else for everything I do these days. Debian/Fedora/Alpine on the server. Arch on the desktop.
Not to mention the OP mentioned 22 LTS, which isn't even the most current LTS.
And this is a prime example of development-centric thinking prioritizing developer comfort over the capabilities and usability of the actual software. Rather than targeting stable, older feature sets, it's always targeting the bleeding edge, being confused when that doesn't work on machines other than the developers' own, and then blaming everyone else for the decision. 4 years (LTS) is not a long time. 4 years is the minimum that software should be able to live.
Ubuntu's perfectly fine if you avoid LTS versions.
Which one would you recommend for regular users and power users?
If you want something relatively uninteresting: Fedora or Debian (honestly, stable is fine).
If you want something extremely reliable, more modern, but may require some learning to tweak: Silverblue or Kinoite.
Debian updates even less frequently than Ubuntu and stays with years old versions of packages. If you're looking for fresh, Debian is not it. Maybe Arch?
Yeah, the folks in here recommending Debian as a solution to this problem are insane.
I love Debian, it's a great distro. It's NOT the distro I'd pick to drive things like my laptop or personal development machine. At least not if you have even a passing interest in:
- Using team communication apps (slack/teams/discord)
- Using software built for windows (Wine/Proton)
- Gaming (of any form)
- Wayland support (or any other large project delivering new features relatively quickly)
- Hardware support (modern linux kernels)
I'd recommend it immediately as a replacement for Ubuntu as a server, but I won't run it for daily drivers.
Again - Arch (or its derivatives) is basically the best you can get in that space.
I think Debian Stable, Ubuntu LTS, and derivatives thereof are particularly poor fits for general consumers, who are more likely to run the OS on a random machine picked up from Best Buy, probably built with hardware that kernels older than what ships in Fedora are unlikely to support.
The stable/testing/etc distinction doesn't really help, either, because it's an alien concept to those outside of technical spheres.
I strongly believe that the Fedora model is the best fit for the broadest spread of users. Arch is nice for those capable of keeping it wrangled but that's a much smaller group of people.
I agree that they are a poor fit for a random user, especially since the Debian installer is not as intuitive, but I disagree about hardware support.
I decided to try Debian stable on my brand new gaming PC and it worked fine out of the box. Combined with the Steam flatpak for gaming, I have fewer issues than my friends who game on Arch.
I agree though that Fedora is probably a good general recommendation.
I find this a very reasonable take.
I'll add - I think the complexity is somewhat "over-stated" for Arch at this point. There was absolutely a period where just reading the entire install guide (much less actually completing it) was enough to turn a large number of even fairly technical people off the distro. Archinstall removed a lot of that headache.
And once it's up, it's generally just fine. I moved both my spouse and my children to Arch instead of Windows 11, and they don't seem particularly bothered. They install most of their own software using flatpaks through the store GUI in Gnome, or through Steam, the browser does most of the heavy lifting these days anyways.
I basically just grab their machine and run `pacman -Syu` on it once in a while, and help install something more complicated once in a blue moon.
Still requires someone who doesn't mind dropping into a terminal, but it's definitely not what I'd consider "all that challenging".
YMMV, but the issue I usually run into with Arch is that unless you watch patch notes like a hawk, updates will break random things every so often, which I find quite frustrating. The risk of this increases the longer the system goes without updates, due to accumulated missing config-file migrations and such.
Even as someone who uses the terminal daily it's more involved than I really care for.
> but the issue I usually run into with Arch is that unless you watch patch notes like a hawk,
The good news is you can run `yay -Pwwq` to get the latest Arch news headlines straight in your terminal.
I've wrapped that with running `pacman -Syu` into a little helper script so that I always get to see the news before I run an update.
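A minimal sketch of what such a wrapper can look like (the script itself and its prompt wording are my own; `yay -Pwwq` and `pacman -Syu` are the commands from above):

```sh
#!/bin/sh
# Show recent Arch news headlines, then confirm before updating.
yay -Pwwq
printf 'Proceed with pacman -Syu? [y/N] '
read -r reply
if [ "$reply" = y ] || [ "$reply" = Y ]; then
    sudo pacman -Syu
fi
```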
This is built into my dotfiles by default at https://github.com/nickjj/dotfiles.
Debian has multiple editions; if you want something Arch-like, go for sid/testing.
Stable is stable as in "must not be broken at all costs" kind of stable.
Basically everything works just fine. There's occasionally a rare crash or GNOME reset where you need to log in again, but other than that, not many problems.
Again, I like Debian a lot as a distro (much more than Ubuntu), but it's just not the same as a distro like Arch, even when you're on testing. Sid is close, but between Arch and sid... I've actually found fewer issues on Arch, and since there's an existing expectation that the community maintains and documents much of the software in AUR, there's almost always someone actually paying attention and updating things, rather than only getting around to it later.
It's not that Debian is a bad release; it's the difference between a game on Steam being completely unavailable for a few hours (Arch) or 10 days (Debian testing) due to an upstream issue.
I swapped a while back, mostly because I kept hitting issues that are accurately described and resolved by steps coming from Arch's community, even on distros like Debian and Fedora.
---
The power of Debian is still that Ubuntu has made it very popular for folks doing commercial/closed-source releases to provide a .deb by default. It won't always work... but at least they're targeting your distro (or, almost always, Ubuntu, which is usually close enough).
Same for Fedora with the Red Hat enterprise connections.
But I've generally found that the community in Arch is doing a better job at actually dogfooding, testing, and fixing the commercial software than most of the companies that release it... which is sad, but reality.
Arch has plenty of its own issues, but "stale software" isn't the one to challenge it on. Better to give it a pass over arch/platform support limitations, security or stability needs, etc. Those are all entirely valid critiques, and reasonable drivers for sticking to something like Debian.
No, Debian is stable as in “it shall not change”.
There are times when there are known bugs in Debian that are purposely not fixed but instead documented and worked around. That’s part of the stability promise: the behaviour shall not change, which sometimes includes “bug as a feature”.
Over time I evolved to Debian testing for the base system and Nix for getting precise versions of tools, which worked fairly well. But I just converted my last Debian box to NixOS.
I've been using Debian testing on my daily-driver desktop(s) for the last, checks notes, 20 years now?
Servers and headless boxes use stable, and all machines are updated regularly. Most importantly, a stable-to-stable (i.e. 12 to 13) upgrade takes around 5 minutes incl. the final reboot.
I reinstalled Debian once. I had to migrate my system to 64 bit, and there was no clear way to move from 32 to 64 bit at that time. Well, once in 20 years is not bad, if you ask me.
I've had a couple of outages due to major version upgrades: the worst was the one that introduced systemd, but I don't think I've ever irreparably lost a box. The main reasons I like NixOS now are:
1) nix means I have to install a lot fewer packages globally, which prevents accidentally using the wrong version of a package in a project.
2) I like having a version controlled record of what my systems look like (and I actually like the nix language)
You're allowed to throw Debian testing or Arch in a chroot. The only thing that doesn't work well for is gaming, since it's possible for the Mesa version to diverge too far.
Not joking, Arch. Pick Gnome/KDE/Sway as you please.
Arch is a wonderful daily driver distro for folks who can deal with even a small amount of configuration.
Excellent software availability through AUR, excellent update times (pretty much immediate).
The only downside is there's not a ton of direct commercial software packaged for it by default (e.g. most companies that care ship a .deb or an .rpm), but that's easily made up for by the rest of the AUR.
It's not even particularly hard to install anymore - run `archinstall` (https://wiki.archlinux.org/title/Archinstall), make some choices, get a decent distro.
Throw in that steam support is pretty great... and it's generally one of the best distros available right now for general use by even a moderate user.
Also fine as a daily driver for kids/spouses as long as there's someone in the house to run pacman every now and then, or help install new stuff.
Arch or Endeavour
Debian/testing, with stable pinned on at low priority.
It slows down for a couple months around release, but generally provides pretty reliable & up to date experience with a very good OS.
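For anyone curious, a sketch of what that pinning can look like in `/etc/apt/preferences` (the priority numbers are illustrative; both suites also need entries in your sources list):

```
Package: *
Pin: release a=testing
Pin-Priority: 900

Package: *
Pin: release a=stable
Pin-Priority: 200
```

With stable below 500 it never upgrades anything on its own, but stays available as an explicit fallback when testing has a gap.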
Dance dance the red spiral.
A stable-testing mix is quite exotic. What are you trying to achieve here?
It's rare, but every now and then testing has an unsatisfiable dependency. It's usually resolved within a day or so. But I keep a lower distro around basically to ensure I have a fallback, so I'm not blocked right now. The next update should likely get me back to testing.
You can go for sid too :)
I run sid (Debian's unstable branch) on all my systems, with experimental pinned on at low priority. It's great, I love it!
I'm not quite bold enough to recommend it to people, but if anyone asks, I would definitely say yes to running sid. An apt pin for testing at low priority is good to have, just because sometimes one library updates before everything that uses it has, and you can get unsatisfiable dependencies.
I encourage them away from Ubuntu because of the Snaps. If people want an enterprise distro that lags upstream by a lot, they should go with Debian.
Yes, this is the problem. They tout this new latest and greatest extension that fixes and simplifies a lot, yet you go look up the extension on vulkan.gpuinfo.org and see ... currently 0.3% of all devices support it. Which means you can't in any way use it. So you wait 5 years, and now maybe 20% of devices support it. Then you wait another 5 years, and maybe 75% of devices support it. And maybe you can get away with limiting your code to running on 75% of devices. Or, you wait another 5 years to get into the 90s.
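In practice that means every shiny extension has to be feature-detected at runtime, with the old path kept alive as the fallback for a decade. A minimal C sketch of that gating, assuming an already-created VkPhysicalDevice (`vkEnumerateDeviceExtensionProperties` and the `VK_EXT_DESCRIPTOR_BUFFER_EXTENSION_NAME` macro are real Vulkan names; the helper function is illustrative and error handling is omitted):

```c
#include <stdlib.h>
#include <string.h>
#include <vulkan/vulkan.h>

/* Returns 1 if the physical device advertises the named extension. */
static int device_supports_ext(VkPhysicalDevice dev, const char *name) {
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(dev, NULL, &count, NULL);
    VkExtensionProperties *props = malloc(count * sizeof *props);
    if (!props) return 0;
    vkEnumerateDeviceExtensionProperties(dev, NULL, &count, props);
    int found = 0;
    for (uint32_t i = 0; i < count; i++) {
        if (strcmp(props[i].extensionName, name) == 0) { found = 1; break; }
    }
    free(props);
    return found;
}

/* e.g. take the descriptor-buffer path only when available:
 * if (device_supports_ext(dev, VK_EXT_DESCRIPTOR_BUFFER_EXTENSION_NAME)) ...
 * else fall back to classic descriptor sets. */
```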
> There are always weird systems with old drivers (looking at Ubuntu 22 LTS)
While I agree with your general point, RHEL stands out way, way more to me. Ubuntu 22.04 and RHEL 9 were both released in 2022. Where Ubuntu 22.04 has general support until mid-2027 and security support until mid-2032, RHEL 9 has "production" support through mid-2032 and extended support until mid-2034.
Wikipedia sources for Ubuntu [0] and RHEL [1]:
[0] https://en.wikipedia.org/wiki/Ubuntu#Releases
[1] https://upload.wikimedia.org/wikipedia/en/timeline/fcppf7prx...
Tbh, we should more readily abandon GPU vendors that refuse to go with the times. If we cater to them for too long, they have no reason to adapt.
I had a relatively recent graphics card (5 years old perhaps?). I don't care about 3D or games, or whatever.
So I was sad not to be able to run a text editor (let's be honest, Zed is nice, but it's just displaying text). And somehow the non-accelerated version was eating 24 cores. Just for text.
https://github.com/zed-industries/zed/discussions/23623
I ended up buying a new graphics card in the end.
I just wish everyone could get along somehow.
The fact that we need advanced GPU acceleration for a text editor is concerning.
Such is life when built-in laptop displays are now pushing a billion pixels per second; rendering anything on the CPU adds up fast.
Sublime Text spent over a decade tuning their CPU renderer and it still didn't cut it at high resolutions.
https://www.sublimetext.com/blog/articles/hardware-accelerat...
Most of the pixels don't change every second, though. Compositors do have damage-tracking APIs, so you only need to render what changed. Scrolling can be mostly offset transforms (browsers do that; they'd be unbearably slow otherwise).
That’s not the slow part. The slow part is moving any data at all to the GPU - doesn’t super matter if it’s a megabyte or a kilobyte. And you need it there anyway, because that’s what the display is attached to.
Now, the situation is that your display is directly attached to a humongously overpowered beefcake of a coprocessor (the GPU), which is hyper-optimized for calculating pixel stuff, and it can do it orders of magnitude faster than you can tell it manually how to update even a single pixel.
Not using it is silly when you look at it that way.
Sure, use it. But it very much shouldn't be needed, and if a bug keeps you from using it, your performance outside video games should still be fine. Your average new frame only changes a couple of pixels, and a CPU can copy rectangles at full memory speed.
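To make that concrete, a toy C sketch of a damage-rect blit over a 32-bit framebuffer (the `Rect` type and `blit_damage` function are illustrative, not any compositor's real API):

```c
#include <stdint.h>
#include <string.h>

typedef struct { int x, y, w, h; } Rect;

/* Copy only the damaged rectangle from the back buffer to the
 * front buffer, row by row; everything outside it is untouched. */
static void blit_damage(uint32_t *front, const uint32_t *back,
                        int stride_px, Rect d) {
    for (int row = 0; row < d.h; row++) {
        size_t off = (size_t)(d.y + row) * stride_px + d.x;
        memcpy(front + off, back + off, (size_t)d.w * sizeof(uint32_t));
    }
}
```

A caret blink or cursor move done this way is a handful of memcpys, which is why CPU-side damage tracking stays cheap until you hit full-screen changes like video or fast scrolling.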
Text editor developers get bored too!
No. I remember a phone app (WhatsApp?) doggedly supporting every godforsaken phone, even the Nokias with the zillion incompatible Java versions. A developer should go where the customers are.
What does help is an industry-accepted benchmark, easily run by everyone. I remember browser CSS being all over the place until that whatsitsname benchmark (with the smiley face) demonstrated which emperors had no clothes. Everyone could surf to the test and check how well their favorite browser did. Scores went up quickly, and today CSS is in a lot better shape.
The Acid2 test is the benchmark you’re thinking of, for anyone not aware: acid2.acidtests.org
> we should more readily abandon GPU vendors
This was so much more practical before the market coalesced to just 3 players. Matrox, it's time for your comeback arc! And maybe a desktop PCIe packaging for Mali?
The market is not just 3 players. These days we have these things called smartphones, and they all include a variety of different GPUs. And even more devices than just those include decently powerful GPUs as well. If you look at the Contributors section of the extension in the post, and at all the companies involved, you'll have a better idea.
NVidia says no new gamer GPUs in 2026, and increasing prices through 2030. They're too focused on enterprise AI machines.