IME Zig's breaking changes are quite manageable for a lot of application types, since most of the breakage these days happens in the stdlib and not in the language. And if you just want to read and write files, the high-level file I/O interfaces are nearly identical; they just moved to a different namespace and now require a std.Io pointer to be passed in.

And tbh, I take a 'living' language any day over a language that's ossified because of strict backward compatibility requirements. When updating a 3rd-party dependency to a new major version, it's also expected that the code needs fixing (in Zig those breaking changes land in minor versions, but for a 0.x language that's expected too).

I actually hope that even after 1.x, Zig will have a strategy to keep the stdlib lean by aggressively removing deprecated interfaces (maybe via separate stdlib interface versions, e.g. `const std = @import("std/v1");`); those versions could be slim compatibility wrappers around a single core stdlib implementation.

> I take a 'living' language any day over a language that's ossified because of strict backward compatibility requirements

Maybe you would, but >95% of serious projects wouldn't. The typical lifetime of a codebase intended for a lasting application is over 15 or 20 years (in industrial control or aerospace, where low-level languages are commonly used, codebases typically last for over 30 years), and while such changes are manageable early on, they become less so over time.

You say "strict" as if it were out of some kind of stubborn princple, where in fact backward compatibility is one of the things people who write "serious" software want most. Backward compatibility is so popular that at some point it's hard to find any feature that is in high-enough demand to justify breaking it. Even in established languages there's always a group of people who want somethng badly enough they don't mind breaking compatibility for it, but they're almost always a rather small minority. Furthermore, a good record of preserving compatibility in the past makes a language more attractive even for greenfield projects written by people who care about backward compatibility, who, in "serious" software, make up the majority. When you pick a language for such a project, the expectation of how the language will evolve over the next 20 years is a major concern on day one (a startup might not care, but most such software is not written by startups).

> The typical lifetime of a codebase intended for a lasting application is over 15 or 20 years (in industrial control or aerospace).

Either those applications are actively maintained, or they aren't. Part of the active maintenance is to decide whether to upgrade to a new compiler toolchain version (when in doubt: "never change a running system"); old compiler toolchains won't suddenly stop working.

FWIW, trying to build a 20 or 30 year old C or C++ application with a modern compiler isn't exactly trivial either, depending on the complexity of the code base, especially when there's UB lurking in the code or the code depends on specific compiler bugs being present. Changing anything in a project setup always comes with risks attached.

> Part of the active maintenance is to decide whether to upgrade to a new compiler toolchain version

Of course, but you want to make that as easy as you can. Compatibility is never binary (which is why I hate semantic versioning), but you should strive for the greatest compatibility for the greatest portion of users.

> FWIW, trying to build a 20 or 30 year old C or C++ application with a modern compiler isn't exactly trivial either

I know that well (especially for C++; in C the situation is somewhat different), and the backward compatibility of C++ compilers leaves much to be desired.

You could pin versions, and probably should. However, willful disregard of prior interfaces encourages developers to follow suit in their own code.

It’s not like Clojure or Common Lisp, where decades-old software still runs the same today, mostly unmodified, with any changes mainly down to code that was written against a different environment or compiler implementation. This is largely because those communities take breaking user code way more seriously. A lot of code written in these languages seems to have a similar timelessness, too. Software can be “done”.

I would also add that Rust manages this very well. Editions let you make breaking changes without actually breaking any code, since every package (crate) specifies the edition it uses. So when, in 30 years, you're writing code in Rust 2055, you can still import a crate that hasn't been updated since 2015 :)
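A concrete, well-known instance of that mechanism (my example, not something the commenter mentioned): `async` and `await` only became keywords in edition 2018, so code that used them as plain identifiers keeps compiling in crates that declare the older edition:

```rust
// lib.rs of a crate whose Cargo.toml declares `edition = "2015"`.
// `async` and `await` are ordinary identifiers in that edition,
// so this still compiles on a current rustc.
pub fn async(await: u32) -> u32 {
    await + 1
}
```

A crate on edition 2018 or later can still depend on this one; it just spells the call with a raw identifier, e.g. `old_crate::r#async(41)` (`old_crate` being a hypothetical crate name).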

Unfortunately editions don't allow breaking changes in the standard library, because Rust code written in different "editions" must be allowed to interoperate freely even within a single build. The resulting constraint is roughly similar to that of never ever breaking ABI in C++.

> The resulting constraint is roughly similar to that of never ever breaking ABI in C++.

No, not even remotely. ABI stability in C++ means that C++ is stuck with suboptimal implementations of stdlib components (std::regex being the classic example), whereas Rust only stabilizes the exposed interface without freezing implementation details.

> Unfortunately editions don't allow breaking changes in the standard library

Surprisingly, this isn't true in practice either. The only thing Rust needs to guarantee here is that once a specific symbol is exported from the stdlib, it stays exported forever. That still leaves an immense amount of flexibility. For example, a new edition could "remove" a deprecated function by disallowing any use of the symbol in that edition, while code on an older edition can still access it. Likewise, it's possible to "swap out" a deprecated item for a new one: move the deprecated item to a new namespace and turn the old path into an alias to that new location, then in the new edition re-point the alias at the new item while leaving the old item accessible (people are exploring this possibility for making non-poisoning mutexes the default in the next edition).
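A rough sketch of that aliasing move in ordinary library code (module and function names are invented for illustration; the real stdlib change would need edition-dependent resolution on top of this):

```rust
// The item physically moves to a new home...
pub mod sync_v2 {
    /// Stand-in for the replacement item (e.g. a non-poisoning mutex API).
    pub fn lock_behavior() -> &'static str {
        "new default"
    }
}

// ...while the old path survives as a plain re-export. The symbol is
// still exported forever, satisfying the guarantee, and a later edition
// could re-point or hide this alias without touching older editions.
pub use sync_v2::lock_behavior;
```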

Only because Rust is a source-only language for distribution.

One business domain that Rust currently doesn't have an answer for is selling commercial SDKs as binary libraries; those are exactly the customers who get pissed off when C and C++ compilers break ABIs.

Microsoft mentions this among the adoption issues they are having with Rust (see talks from Victor Ciura). While they can work around it with DLLs and COM/WinRT, it isn't optimal; after all, Rust's safety gets reduced to the OS ABI for DLLs and COM.

Rust allows binary libraries with a C ABI. Having safety within any given module is still a big deal, and it's hard to guarantee safety across dynamic modules when the code that's actually loaded can be overridden by a separately-built version.
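To make the C-ABI route concrete, here's a minimal sketch (crate layout and names are invented): the real logic stays in safe Rust, and only the boundary functions opt into the C calling convention. Building with `crate-type = ["cdylib"]` (or `"staticlib"`) in Cargo.toml yields a library that C, C++, or C# callers can link like any C library.

```rust
// lib.rs of the hypothetical SDK crate.

// Ordinary safe Rust does the real work behind the boundary.
fn saturating_sum(a: i32, b: i32) -> i32 {
    a.checked_add(b).unwrap_or(i32::MAX)
}

// `#[no_mangle]` keeps the symbol name predictable, and `extern "C"`
// pins the calling convention -- this boundary is the stable ABI.
#[no_mangle]
pub extern "C" fn mylib_add(a: i32, b: i32) -> i32 {
    saturating_sum(a, b)
}
```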

I'm not expecting to convince you of this position, but I find it to be a feature, not a bug, that Rust is inherently hostile to companies whose business models rely on tossing closed-source proprietary blobs over the wall. I'm fairly certain that Andrew Kelley would say the same thing about Zig. Give me the source or GTFO.

In the end it is a matter of which industries the Rust community sees as relevant for gaining adoption, and in which ones the community is content for Rust to never take off.

Do you know one industry that likes very much tossing closed-source proprietary blobs over the wall?

Game studios, and everyone that works in the games industry providing tooling for AAA studios.

> Game studios, and everyone that works in the games industry providing tooling for AAA studios.

You know what else is common in the games industry? C# and NDAs.

C# means that game development is no longer a C/C++ monoculture, and if someone can make their engine or middleware usable with C# through an API shim, Native AOT, or some other integration, there are similar paths forward for using Rust, Zig, or whatever else.

NDAs mean that making source available isn't as much of a concern. Quite a bit of the modern game development stack is actually source-available, especially when you're talking about game engines.

Do you know what C# has that Rust doesn't? A binary distribution format for libraries with a defined ABI.

> I'm fairly certain that Andrew Kelley would say the same thing about Zig. Give me the source or GTFO.

Thus it will never even be considered outside the tech bubble.

Compiler vendors are free to choose what ABI stability their C++ implementations provide.

The ISO C++ standard is silent on what the ABI actually looks like; the fact that the ABI doesn't break in most C and C++ compilers is a consequence of those compilers' customers not being happy about breakage.

> Compiler vendors are free to choose what ABI stability their C++ implementations provide.

In theory. In practice the standards committee, consisting of compiler vendors and some of their users, shapes the standard, and the standard just so happens to avoid ABI breakage.

This is in part why Google bowed out of C++ standardization years ago.

I know, but still: go try to push for ABI breaks on Android.

Sure, but considering that Zig is a modern C alternative, one cannot afford to forget that C has been successful in part because it stayed small and consistent for so long.

The entire C language, C ABI, and standard library specs, combined, are probably fewer words than the Promise spec in ECMA-262.

A small language that stays consistent and predictable lets developers evolve best practices, patterns, design choices, and tooling around it. C has achieved all of that.

No evolving language has anywhere near that freedom.

I don't want an ever-evolving Zig either, for what it's worth. And I like Zig.

I don't think any one developer can resolve all of the design tensions a programming language has; a language can't be made ergonomic all on its own.

But a small, modern, stable C would still be welcome, besides Odin.

I'm pretty sure the point of aggressively evolving now is so that it basically won't have to evolve at some point in the future?

Besides Odin? Does Odin give you most of this?