> Dependencies will be added, for very basic system utilities, on (parts of) a software ecosystem which is still a "moving target", not standardized,

This is the status quo and always has been. gcc has plenty of extensions that are not part of a language standard that are used in core tools. Perl has never had a standard and is used all over the place.

If you're designing an OS distribution, you would have your base system written to adhere strictly to language standards, without relying on flaky extensions (not that GCC's C extensions are flaky; I'd guess most or all of them have been stable since the 1990s), and with minimal reliance on additional tools.

For example, IIUC, you can build a perl interpreter using only a C compiler and GNU Make. And if you don't even have a C compiler, GCC itself is quite bootstrappable; see here for the x86 / x86_64 procedure:

https://stackoverflow.com/a/65708958/1593077

and you can get into that on other platforms anywhere along the bootstrapping chain. And then you can again easily build perl; see:

https://codereflections.com/2023/12/24/bootstrapping-perl-wi...
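To make the "C compiler plus GNU Make" claim concrete, here is a rough sketch of building perl from a source tarball. The version number and install prefix are illustrative, not prescriptive; perl's `Configure` script genuinely needs only a C toolchain and make:

```shell
# Illustrative perl-from-source build; version and prefix are examples.
tar xzf perl-5.38.2.tar.gz
cd perl-5.38.2

# -d: use defaults, -e: don't ask about missing answers, -s: silent.
# Together, -des makes Configure non-interactive.
./Configure -des -Dcc=cc -Dprefix=/opt/perl

make            # compile miniperl, then the full interpreter
make test       # optional self-test of the fresh build
make install    # install under the chosen prefix
```

Note that `Configure` here is perl's own shell-based probe script, not GNU autoconf's `configure`; it works with any reasonable C compiler, which is what makes perl a relatively early-bootstrappable tool.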

It feels like you may have conflated the issue in this thread, using Rust in apt, which comes much, much later in distribution bringup than this bootstrapping, with using Rust in something like the Linux kernel, which is more relevant to the kinds of bootstrapping discussions you posted.

apt is so late in the process that these bootstrapping discussions aren't all that relevant to it. My point was that at the same layer of the OS there are many, many components that don't meet the criteria you posted, perl included.

The GCC bootstrap procedure you cited has 13 steps, and many of the tools involved postdate distributions' reliance on GCC. A similar procedure could produce a Rust compiler.
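As a sketch of what such a Rust procedure looks like in practice, one existing route is mrustc, a C++ re-implementation of rustc that exists specifically to break the rustc-needs-rustc cycle. The exact make targets and pinned versions below follow my understanding of the mrustc build and may differ between releases; treat this as an outline, not a recipe:

```shell
# Illustrative rustc bootstrap via mrustc (targets/versions may vary).
git clone https://github.com/thepowersgang/mrustc
cd mrustc

# Build mrustc itself; it needs only a C++ toolchain and make.
make

# Fetch and patch the rustc source tree that this mrustc release pins.
make RUSTCSRC

# Use mrustc to compile libstd and then rustc from that source tree.
make -f minicargo.mk

# The mrustc-built rustc can now compile a stock rustc,
# closing the loop just as the 13-step GCC chain does for C.
```

So the difference between GCC and Rust here is one of maturity and packaging of the bootstrap chain, not of kind.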