One of the follow up messages is interesting: https://lists.debian.org/debian-devel/2025/10/msg00288.html

> Rust is already a hard requirement on all Debian release architectures and ports except for alpha, hppa, m68k, and sh4 (which do not provide sqv).

Wonder what this means for those architectures then?

Who is still using these machines? Genuine question, not trolling.

It looks like the last machines of each architecture were released:

Alpha in 2007

HP-PA in 2008

m68k in pre-2000 though derivatives are used in embedded systems

sh4 in 1998 (though possibly still in use via the "J2 core" built on expired patents)

This means that most are nearly 20 years old or older.

Rust target triples exist for:

m68k: https://doc.rust-lang.org/nightly/rustc/platform-support/m68... and https://doc.rust-lang.org/nightly/rustc/platform-support/m68... both at Tier 3.

(Did not find target triples for the others.)

If you are using these machines, what are you using them for? (Again, genuinely curious)

Debian just cut i386, and Wikipedia [0] says the i386 was discontinued in 2007. These systems are all of the same vintage, so it does not seem a huge leap for them to be culled from the support list.

[0] https://en.wikipedia.org/wiki/I386

The actual Intel 80386 processor was discontinued in 2007, but the i386 architecture -- ia32, 32-bit x86 -- lived on for longer in the semi-mainstream (there were 32-bit Intel Atoms introduced to the market as late as 2012, AMD Geodes were sold until at least 2019, and I believe some VIA C3/C7 derivatives made it into the 2010s as well) and is in fact still in production to this day for the embedded and industrial markets (Vortex86, etc).

All of those are i586 or i686 though right? AMD Geode LX is the only i586 CPU produced in the last 15 years.

Everything else is at least i686 and Rust has perfectly adequate i686 support.

Is there any major distro left with pre i686 support?

Debian's i386 is actually i686 I believe.

i386-only (32-bit) processors were discontinued, but 64-bit processors can operate in 32-bit mode, so the toolchain was still widely available and there was still demand for an i386 OS that would run on modern hardware in i386 mode for some ancient software.

Yeah, and you can still run i386 binaries on widely available amd64 CPUs. So this is an even stronger argument for killing these other obsolete platforms.

You can only run the binaries if you have the libc to run them.

You say that like it is a higher or even comparable barrier to having alpha/m68k/sh4 hardware; it isn't.

Uh? I don't understand what you're saying.

> Who is still using these machines? Genuine question, not trolling.

Either legacy systems (which are most certainly not running the current bleeding-edge Debian) or retro computing enthusiasts.

These platforms are long obsolete and there are no practical reasons to run them besides "I have a box in the corner that's running untouched for the last 20 years" and "for fun". I can get a more powerful and power efficient computer (than any of these systems) from my local e-waste recycling facility for free.

It’s usually a loud minority of trolls or hobbyists. It just takes one to spark a doubt.

Here is one famous example of a dude who’s managed to get PRs merged in dozens of packages, just to make them compatible with ancient versions of nodejs https://news.ycombinator.com/item?id=44831811

Wow that was an interesting read. I find it amusing that nobody seems to really know who he is or what his motives are, yet his code is run on millions of machines every day.

There’s hobbyists using m68k Macs, Amigas, and Atari STs. Definitely a small niche, and those running Linux on those machines more so.

Sure, but almost nobody uses or wants modern linuxes on those machines. It's almost always described (in their own words) as "some ancient crusty version of Debian"

Nobody wants to play games on Linux given the small userbase compared to Windows. Yet people work on it.

You could make this argument for so many usecases but apparently people just enjoy bashing retrocomputing here.

> Nobody wants to play games on Linux given the small userbase compared to Windows.

According to the last Steam survey, 3% of players use Linux. Steam has 130 million active players, so that means there are roughly 4 million people playing on Linux. Definitely not "nobody", and way bigger than the whole retrocomputing community.

By the way, I am also one of those retrocomputing guys, I have a Pentium 2 running Windows 98 right here. IMHO, trying to shoehorn modern software on old hardware is ridiculous, the whole point of retro hardware is using retro software.

> Who is still using these machines? Genuine question, not trolling.

Well, there are so many things where you could argue about the relevance of a userbase.

If the size of the userbase were the only argument, Valve could just drop support for the Linux userbase, which is just 2-3% of their overall userbase.

Not your point, but Linux compatibility is Valve protecting themselves from the existential risk that is a capricious Microsoft. At one point, it seemed Microsoft was trying to make the Microsoft Store be the mechanism for distributing all software. Linux being viable for gaming is/was their safety net to avoid being locked out from the ecosystem.

Are there even close to 2% of Debian users using a DEC Alpha?

popcon.debian.org reports 3 alpha installations and 261750 amd64 installations. Assuming comparable opt-in rates, fewer than 0.002% of users (3 / 261750 ≈ 0.001%) are using alpha.

The other mentioned architectures hppa, m68k and sh4 are at a similar level.

Valve isn't a good example. They have strong Linux support so they can sell Steamdecks without licensing with Microsoft. Without their work on Proton, Steam effectively lives or dies by the will of Microsoft.

Think about any time a computer is used in something designed to last 30+ years.

Cars, airplanes, construction equipment, etc.

I am pretty sure that those machines are not running Debian.

And almost certainly not whatever the next stable release of Debian is.

Why not? How do you know that? Debian is used pretty widely

they might run Debian but not upstream Debian/stable

you mainly find that with systems needing certification

these are the kinds of situations where having a C language spec isn't enough; you instead need a spec of the specific compiler version

similarly, they tend to run the same checkout of the OS with project-specific security updates back-ported to it, instead of doing generic system updates (because every single update needs to be re-certified)

but that is such a huge effort that companies don't want to run a full OS at all. Just the kernel and the most minimal selection of packages you really need, and not one more binary than that.

and they might have picked Debian as an initial source for their packages, kernel etc. but it isn't really Debian anymore

You'd be surprised.

Even if they are, they are not updating to latest Debian stable.

They try if they are internet connected

I mean they can't in the first place, because ports only have unstable available

You would be wrong. People want new software

If we are talking about embedded control systems, no, you don't want new software; you want your machine to do what it is supposed to do. At my workplace we have some old VME crates running VxWorks, and nobody is gonna update those to the latest Linux distro.

This is incorrect. Internet connections and 3rd party integrations have changed this view of “the software doesn’t need to change once it leaves the factory”.

John Deere, Caterpillar, etc are leaning heavily into the “connected industrial equipment” world. GE engines on airplanes have updatable software and relay telemetry back to GE from flights.

The embedded world changed. You just might have missed it if your view is what shipped out before 2010.

My experience is in big scientific experiments like particle accelerators, I guess other fields are different. Still, my experience is that:

1) The control network is air gapped, any kind of direct Internet connection is very much forbidden.

2) Embedded real-time stuff usually runs on VxWorks or RTEMS, not Linux. If it is Linux, it is a specialized distro like NI Linux.

3) Anything designed in the last 15 years uses ARM. Older systems use PowerPC. Nobody has used Alpha, HPPA, SH4 or m68k in ages. So if you really want to run Debian on it, just go ahead and use Armbian.

I don't think those systems can/should be updated using apt though.

people enjoy running vintage stuff, and running modern stuff on some vintage thing is kinda cool.

But yeah, those can figure out how to keep their own port

Here’s Debian’s “Supported Architectures”: https://wiki.debian.org/SupportedArchitectures. These platforms are all in ‘unofficial’ status (like, they work, but are not officially supported by the core Debian project).

Who is actually _running_ Debian Trixie on these platforms now?

It is counter-intuitive to me that these platforms are still unofficially supported, but 32-bit x86 [edit: and all MIPS architectures!] are not!

I am emotionally sad to see them fall by the wayside (and weirdly motivated to dig out a 68k Amiga or ‘very old Macintosh’ and try running Trixie…) but, even from a community standpoint, I find it hard to understand where and how these ports are actually used.

They’re not going to fall to the wayside, we’re eventually getting Rust support on these.

It’s just a bit annoying that Rust proponents are being so pushy in some cases as if Rust was the solution to everything.

Can you actually install Debian on an Amiga - or other 68k system - now? I’ve been searching around the web and I haven’t found much evidence that you can.

This is not intended to bash you or anyone else who’s working on it - I think it’s a cool project (I have in the recent past got an 86duino ZERO to run Gentoo, just to see if an obscure old-ish piece of hardware can be useful with modern Linux on it - and it can). I do understand the reason a project like Debian might not want to have to spend resources even just to make it easier to do though.

Here is a recent blog post where somebody tried to install Debian on an Amiga 4000 with an 68040 running at 25 MHz.

https://sandervanderburg.blogspot.com/2025/01/running-linux-...

I couldn't find which Debian version they tried, but I think it's implied that it's a recent one. They ran into memory issues: they had only 48MB while the recommendation is 64MB. It did boot, though, until it threw errors because of the memory constraints.

They got a working system by trying Debian 3.1 though.

They're >20 years old and nobody seriously uses them for modern software. Come the fuck on. Let them go.

I was running Debian unstable on my 64-bit SPARC up until I moved a few months ago and haven't set the system back up again while unpacking.

It's been somewhat useful for finding weird edge cases in software where for whatever reason, it doesn't reproduce easily on AArch64 or x86, but does there. (Or vice-versa, sometimes.)

I don't know that I'd say that's sufficient reason to motivate dozens of people to maintain support, but it's not purely academic entertainment or nostalgia, for that.

m68k has an LLVM port already, so Rust can be implemented for that platform.[0] It would be nice to have LLVM backends for alpha, hppa and sh4 - these older architectures tend to be quite simple, so a working LLVM backend has plenty of value as a reference and for educational use.

(LLVM even used to have an in-tree DEC Alpha backend, though that was back in 2011 and not relevant to any version of Rust.)

[0] Looks like there is basic initial support but no 'core' or 'std' builds yet. https://doc.rust-lang.org/rustc/platform-support/m68k-unknow... This should potentially be fixable.

Shouldn't the codegen_gcc project[1] theoretically allow Rust to target backends that only GCC supports?

[1] https://github.com/rust-lang/rustc_codegen_gcc

LLVM is desirable for other reasons (LLVMpipe for example), so investing into an LLVM port is probably a better use of limited resources than improving and maintaining a retargeted rustc for GCC.

yes and no

yes, from a pure code generation aspect

no, as all the conditionally compiled platform-specific code is missing.

So using it with #![no_core] should work (assuming the WIP part of the backend isn't a problem). But beyond that you have to first port libcore (should be doable) and then libstd (quite a bunch of work).
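
To illustrate what that conditionally compiled platform-specific code looks like in practice, here is a minimal sketch (my own illustration, not code taken from core/std or the m68k port; the constant and its m68k value are assumptions) of the kind of cfg-gated items libcore and libstd are full of, each of which has to exist and be correct before the library builds for a new target:

    // Illustrative only: mimics the cfg-gated style used throughout core/std.
    #![no_std]

    /// Minimum alignment this illustrative crate assumes for allocations.
    #[cfg(target_arch = "m68k")]
    pub const MIN_ALIGN: usize = 2; // assumption for the sketch: mirrors the GCC ABI's 2-byte int alignment

    #[cfg(not(target_arch = "m68k"))]
    pub const MIN_ALIGN: usize = core::mem::align_of::<usize>();

    /// The sort of per-target accessor that libstd's platform layer is full of.
    pub fn min_align() -> usize {
        MIN_ALIGN
    }

Multiply that by every syscall wrapper, atomic, and unwinding detail in the platform layer and it becomes clear why porting libstd is quite a bunch of work.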

Yes, that is being worked on.

As far as I understand it, the m68k LLVM port is not ABI-compatible with GCC on Linux because the alignment is wrong: https://wiki.debian.org/M68k/Alignment (page says that the LLVM packages are FTBFS because of this)

The principled way to address this would be to define a new ABI suffix for the m68k-unknown-linux-gnu target triple, with the existing 'gnu' retaining GCC compatibility and its 2-byte alignment for int.
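
To make the mismatch concrete: the alignment of int/i32 feeds straight into repr(C) struct layout, so code built under the two conventions disagrees about field offsets and sizes, and anything crossing an FFI boundary gets silently misread. Here is a minimal sketch (the Packet struct is my own illustrative example; the 2-byte figure comes from the Debian wiki page above, and the program simply prints whatever the current target's ABI uses):

    use std::mem::{align_of, size_of};

    // Hypothetical wire struct, only here to show how int alignment changes C layout.
    #[allow(dead_code)]
    #[repr(C)]
    struct Packet {
        tag: u8,
        value: i32, // the GCC m68k ABI aligns i32 to 2 bytes; the LLVM port reportedly differs, hence the incompatibility
    }

    fn main() {
        // Under the GCC m68k ABI these print 2 / 2 / 6; a toolchain with 4-byte int
        // alignment prints 4 / 4 / 8, so the two sides disagree about the same struct.
        println!("align_of::<i32>()   = {}", align_of::<i32>());
        println!("offset of `value`   = {}", std::mem::offset_of!(Packet, value));
        println!("size_of::<Packet>() = {}", size_of::<Packet>());
    }

That layout disagreement is why a separate ABI suffix is cleaner than quietly changing what the existing 'gnu' triple means.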

They are of no commercial interest to Ubuntu.

While that seems like it would be true, is that really relevant to Debian? :)

The person making the post is getting paid by Ubuntu.

sure, but most frequent/core contributors for Linux (the kernel) and most big distros are paid by some company with commercial interests, which mostly don't involve spending time on maintaining any of these old architectures

sure, some are also paid by a foundation, which is in turn paid by companies but with a degree of decoupling of influence.

and some pay themselves, i.e. fully voluntary work, but most devs can't afford to do that in a long-term, high-time-commitment manner. So a lot of major changes and contributions end up coming from people directly or indirectly "paid" by some company.

and that's pretty common across most "older, larger, sustainable and still developed OSS"

Needs a 'perhaps' with a question mark, or some proof.

You could just read his signature in the mailing list.

https://mastodon.social/@juliank

>Senior Engineer at Canonical.

Yes, that's true and there are synergies, but keep in mind I also have a mind of my own.

[flagged]

I think that's unkind absent any proof otherwise which you haven't posted.

In reality, a lot of the heavily used parts of Linux (and open source generally) are going to have some commercial involvement. Or are we suggesting that no one should be paid by any of the companies backing and using Linux? Because to me that sounds worse.

Whether you like it or not, Linux/Open Source hasn't been entirely the preserve of unpaid people doing it for fun for a long time.

https://en.wikipedia.org/wiki/The_Scorpion_and_the_Frog

I don't understand how people keep giving the benefit of the doubt to corporations.

They aren't people. They feel no guilt and have no shame.

Julian Klode is not a corporation, though, so we should give him the benefit of the doubt.

The whole thread started with someone making the wrong point that:

>They are of no commercial interest to Ubuntu.

Which is plainly false.

Julian can believe whatever he wants, in fact the more naively idealistic he is the better for Canonical.

The fact of the matter is that corporate Linux benefits from impossible-to-run setups. That's the whole business model. The more convoluted the setup is, the more money they make. Rust fits into this business model perfectly.

You'd think people would have learned this after the Nth rug pull from supposed champions of free software like RedHat.

My whole point, which you've missed, is that the _interest_ is what we should be looking at. I don't care if he says he does or doesn't act on it. He has the interest.

Instead of insinuating, can you say what exactly the impropriety here would be, in your opinion?

https://news.ycombinator.com/newsguidelines.html

> Assume good faith.

[flagged]

Let me evolve that for you.

You should not only avoid the appearance of impropriety, but also impropriety itself.

Are you just being clever by mocking someone with a job, or do you think there is something wrong with this technical decision?

[flagged]

That's obnoxious. Do you only promote the interests of your employer? Do you expect people to assume that?

It's not even as if Canonical has a particularly bad reputation for that kind of thing.

[deleted]

more relevantly, they are all "unofficially" supported architectures, and none are in a "might get official support" context, so they have all been of very limited interest to Debian as a whole for quite a while

Those all seem to be completely obsolete so I guess they can just stay on the latest version of Debian that supports them, or make their own distro. (Or add Rust support I guess but that's probably not realistic.)

> Wonder what this means for those architectures then?

They will be rebranded as "retro computing devices"

Most don't need to be rebranded. Alpha and hppa are retro computing and haven't been available to buy for 18 and 15 years respectively. SH4 died this year. Only m68k is still holding on, but it is a rounding error in the number of users.

Aren’t m68k computers only a few models from the 1990s and 1980s, and some more recent hobby projects? That’s squarely in the retro computing enthusiasts category.

I’m not in the Debian world, but those do seem to me like the types of systems that could use their own specialized distros rather than being a burden to the mass market ones. It’s not as if you could run a stock configuration of any desktop environment on them anyway.

m68k is used in a number of embedded systems today. It is unknown (to me) how many of those run Linux (as opposed to some other embedded OS), but I would guess at least some do. I also don't know how many run (or want to run) Debian vs something else (a custom Yocto distribution is my first guess), but that might be non-zero. It is possible someone is running a non-Debian distribution and using Debian packages to provide their updates.

Does or should debian care? I don't know.

All I find searching for “embedded m68k Linux distro” is people looking for, or coming up with, alternatives, as Debian was already “too big” fifteen years ago.

I don’t get the fuss around the “retro computing” verbiage. I doubt anyone is actually running Debian on these devices out of necessity; someone who plays baroque music on reconstructed period instruments won’t balk at being called an “early music” enthusiast.

Well, we are on a mission to create The Universal Operating System. So maybe.

But I'm not sure. I think the new Rust dependencies are good. In an ideal world, the people who care about niche systems step up to help Rust target those systems.

> In an ideal world, the people who care about niche systems step up to help Rust target those systems.

I’m actually the person who added the m68k target to the Rust compiler and was also one of the driving forces of getting the backend into LLVM.

Generally speaking, getting a new backend into the Rust compiler is not trivial as it depends on LLVM support at the moment which is why asking someone to just do it is a bit arrogant.

Luckily, both rustc_codegen_gcc and gccrs are being worked on, so this problem will be resolved in the future.

Sorry, I didn't mean to insinuate that there's anything minor about it, or that nobody is doing the work. I should have phrased myself differently.

I'll try to rephrase: if we never want to give up support for a platform we've supported in the past, then I think we only have two options: (1) never adopt new technology where support for said platforms doesn't come for free, or (2) leave it up to those who care about the niches to ensure support.

Neither is pain-free, but the first seems like a recipe for stagnation.

It's lovely to see the two alternative compiler paths for Rust moving forward though! Thank you!

Interesting bit about SH-4. I thought that Renesas had previously promised parts availability until 2029?

SH-4 is on the Product Longevity Program https://www.renesas.com/en/support/product-longevity-program... but I really cannot easily figure out what that actually means. It's marked as "Last Time Buy" right now.
