On this topic, Rust just added an (unstable for now) option to delay codegen until needed https://blog.rust-lang.org/inside-rust/2025/07/15/call-for-t...

Just like Zig's decision, it's a double-edged sword: on one hand you avoid compiling code that will never be needed. On the other hand you make incremental compilation worse, since that code will have to be recompiled more often.

> HTTP, for example, can go into Zig's standard library, whereas C or Rust cannot do that.

Rust _could_ put HTTP in the standard library, the dead code would simply be removed when linking. The reason it's not in the stdlib has more to do with the fact that:

- it would need to be supported by a team that's already overworked

- it would need to be supported indefinitely, so it would have to be the right interface on the first try

This effectively prevents any big, complex feature from making it into the Rust stdlib.

> This helps control package explosion.

Arguably it helps control the number of "packages", but not the amount of code in those packages. If I split a Rust "package" into 4 crates to get better compile times, it's still mostly the same code, but the package count you see is now 4 times bigger.

It's funny because the biggest impact in my eyes is social and not technical: the standard library is controlled by project governance whereas libraries are not.

Zig moves fast with its BDFL structure, Rust moves slowly with its committees, and both are by design. I'm excited to watch the Zig standard library evolve quickly, especially once the language has settled things like I/O.

And potentially, 4 times the number of different authors, and 4 times the supply-chain attack surface.

It's not just about the number of bytes.

OP's point was that Zig reduces the number of packages created for purely technical reasons (how code gets compiled), and that kind of split generally doesn't result in different packages maintained by different people.