> Still, I think it's also a case of the same thing: sell it to the right people. Find people making new distros suffering problems with DEB and RPM and tell them your tool can save them a lot of pain.

To quote one of my favorite Clone Wars episodes: Fifty tried, fifty died [1].

There have been so, so many attempts at solving the "how to ship binary builds for Linux" question... both deb and rpm have their good and bad sides, and on top of that you've got `alien`, flatpak, Docker images, the sledgehammer aka shipping everything as a fully static binary (e.g. UT2004 did this), or outright banning prebuilt binaries (the Gentoo and buildroot way). But none of that is the actual problem that needs solving.

The actual problem is dependency hell. You might be lucky enough to be able to transplant a Debian deb into an Ubuntu installation and vice versa, or a SLES rpm into RHEL, but only if the host-side shared libraries that the package depends on are binary-compatible enough with what the package expects.
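As an illustration of what "compatible enough on a binary level" means in practice (using `/bin/ls` as a stand-in for any packaged binary), standard ELF tooling shows exactly what a binary demands from the host:

```shell
# List the shared libraries the dynamic linker must resolve on the host.
# If the host ships a different soname (say libfoo.so.5 instead of
# libfoo.so.6), the transplanted package simply won't start.
ldd /bin/ls

# Show the versioned glibc symbols the binary requires; a host glibc
# older than the newest version listed here will refuse to load it.
objdump -T /bin/ls | grep -o 'GLIBC_[0-9.]*' | sort -Vu
```

This is why a package built against a newer distro release so often fails on an older one even when the library names match.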

That massively drives up the complexity of shipping software, even for a single Linux distribution. In contrast to Windows, where Microsoft still invests significant financial resources into API-level backwards compatibility, this is not a thing in any Linux distribution. Even if you're focusing just on Debian and Ubuntu, you have to compile your software at least four different times (once each for Debian Stable, Debian Testing, Ubuntu <current rolling release> and Ubuntu <current LTS>), simply because of different versions of dependencies. Oh, and in the worst case you might need different codepaths to account for API changes between these different dependency versions.
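A sketch of what that build matrix looks like in practice, as a bash dry-run (the docker commands are printed, not executed; `./build.sh` is a hypothetical per-project build script, and the targets use Docker Hub's real `debian:stable`/`debian:testing`/`ubuntu:rolling`/`ubuntu:latest` tags):

```shell
# One compile per distro/release combination, because each ships
# different dependency versions. Echo instead of run for illustration.
for target in debian:stable debian:testing ubuntu:rolling ubuntu:latest; do
    out="out/${target/:/-}"   # bash substitution, e.g. out/debian-stable
    echo docker run --rm -v "$PWD:/src" -w /src "$target" ./build.sh "$out"
done
```

Four builds for two distros; every additional distro family (Fedora, openSUSE, Arch...) multiplies the matrix further.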

And even if you had some sort of DSL that generated the respective package manager control files for the most common combinations of package manager, distribution and actively supported release, there's still so, so much work involved in setting up and maintaining the repositories. Add in actually submitting your packages upstream (which is only possible for reasonably open source packages in the first place), and the process becomes even more of a nightmare.

And that's all before digging into the topics of autotools, vendoring (hello nodejs/php/python ecosystems), digital signature keyrings, desktop manager ecosystems and god knows what else. Oh, and distribution bureaucracy is even more of a nightmare... because you now have to deal with quirks in other people's software too, and in the worst case with a time span of many years of your own releases plus the distribution release cadence!

Shipping software that's not fully OSS on Linux sucks, and shipping closed source software for Linux sucks even more. Windows has had that sort of developer experience figured out from day one. Even if you didn't want to pirate or pay up for InstallShield, it was and is trivial to just write a program, compile it, and have the resulting executable run everywhere.

[1] https://starwars.fandom.com/wiki/Mystery_of_a_Thousand_Moons

IMO, binary compatibility on Linux isn't really solvable. There's just a thousand tiny projects that make up the Linux base that aren't on the same page, and that's not about to change.

I do think packaging can be improved. I hate almost everything about how dpkg works; it's amazing how much friction it adds. So I'm squarely in the RPM camp because I find the tooling a lot more tolerable, but surely further improvements can still be made.

Anyway, the ecosystem stays healthy because of code contributions. So what’s the point of binary compatibility (from the point of view of the people actually making Linux work: open source developers and repo maintainers)?

> So what’s the point of binary compatibility (from the point of view of the people actually making Linux work: Open Source developers and repo maintainers)?

Want to see Linux on the desktop actually happen? Then provide a hassle-free way to ship commercial software that isn't "pray that WINE works well enough", aka using win32 as the ABI layer.

Of course we can stay on our high horses and demand that everything be open source and that life for closed source developers be made as difficult as possible (the Linux kernel is particularly egregious in that regard), but then we don't get to whine about why Linux on the desktop hasn't fucking happened yet.

I don’t really know what the point of this “Linux on the desktop” event would be, or even what it is. (Clearly it isn’t just Linux on desktops, because that’s been working fine forever).

The whole point of my comment was to keep in mind the incentives of different sub-groups. If “Linux on the desktop” doesn’t benefit the people that make Linux work, I don’t see what the big deal is.

> I don’t really know what the point of this “Linux on the desktop” event would be, or even what it is.

Getting Linux adopted in F500 companies as the default desktop OS. That is the actual litmus test, because (large) companies need an OS that can be centrally managed with ease, doesn't generate a flood of DPU (Dumbest Possible User) support demand and can run the proprietary software that's vital to the company's needs in addition to the various spyware required by cybersecurity insurances and auditors these days.

At the moment, Linux just Is Not There. Windows has GPOs and AD (which, in addition, tie into Office 365 perfectly fine), Mac has JAMF and a few other MDM solutions. Much corporate software doesn't even run properly under WINE (not surprising, since the focus of Proton, and by extension WINE, is gaming), there's a myriad of ways of doing central management, and good luck trying to re-educate employees who have been at the company so long they grew roots into their chairs.

Sounds like an unpleasant task, done for users that don’t really want it, and don’t have much incentive or ability to contribute back to the community.

It sort of feels like we’re talking past each other. I’ve been trying to point out that, due to the community nature of these open source projects, development tends to follow the interests of the people who contribute open source code to them. You’ve listed a number of challenges or thresholds that you think are important. However, after reading your comments, I can’t articulate who those thresholds are important to or why they are worth pursuing. I don’t need another litmus test, I need some reason to care about testing.

The idea of “Linux on the desktop” was a popular meme for a while, but I think it is a short-hand expression for a collection of things, some of which were achieved a decade ago, some of which weren’t, where there’s a strong correlation between “things that were accomplished” and “things that open source community contributors cared about,” and the remainder… were ignored because nobody wanted to do them.

I only started packaging relatively recently. Using OBS definitely made things easier, but it's crazy how much nicer RPM is than dpkg. It's so much better to have more-or-less everything inside a spec file with macros, versus dpkg's mess of static, purpose-specific files.
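For the unfamiliar, here's a minimal, entirely hypothetical spec file sketch (package name, paths and changelog entry are made up). The appeal is the shape: metadata, build steps and the file list all live in one file, where dpkg splits the equivalent across debian/control, debian/rules, debian/changelog and friends:

```spec
# hello.spec -- hypothetical toy package, single-file layout
Name:           hello
Version:        1.0
Release:        1%{?dist}
Summary:        Toy example package
License:        MIT
Source0:        %{name}-%{version}.tar.gz

%description
A toy example showing the single-file spec layout.

%prep
%autosetup

%build
%make_build

%install
%make_install

%files
%{_bindir}/hello

%changelog
* Mon Jan 01 2024 Example Packager <packager@example.com> - 1.0-1
- Initial package
```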

> ...or outright banning prebuilt binaries (the Gentoo and buildroot way).

You, uh, haven't used Gentoo in like twenty years, have you? You've been able to host your own prebuilt binaries (or use the prebuilts of others who bothered sharing them) for as long as I can remember (FWIW, I started using Gentoo in 2002 or 2004). The Gentoo folks decided to set up official binary package servers at the end of 2023 (look at the Dec 29, 2023 news item on the Gentoo home page for more info).