> When mathematical notation evolves, old proofs do not become obsolete! There is no analogy to a "breaking change" in math.

I disagree. The development of non-Euclidean geometry broke a lot of theorems that had been used for centuries but failed to generalize. All of a sudden, parallel lines could meet.

> Can we write a family of nested programming languages where core features are guaranteed not to change in breaking ways, and you take on progressively more risk as you use features more to the "outside" of the language?

We could; the problem is that everyone disagrees on what that core should be. Should it be memory-efficient? Fast? Secure? Simple? Easy to formally verify? Easy for beginners? Should it work on old architectures? On embedded ones? Depending on who you ask and what your goals are, you'll pick a different set of core features, and thus a different notation for your core language.

That's the difference between math and programming languages. Everyone agrees on math's overall purpose: it's a tool to understand, formalise and reason about abstractions, and mathematical notation should make that easier.

That being said, the most serious candidate for your "core language guaranteed not to change and that you can build onto" would be ANSI C. It's been around for more than 35 years, it's a standard, it's virtually everywhere, you can write a conforming compiler for a brand-new architecture, even an embedded microchip, fairly easily, and most if not all of today's popular languages are built on it (C++ of course, but also C#, Java, JavaScript, Python, Go, PHP, Perl, Haskell, Rust: all have a C base), and they all use a C FFI. I'm not sure ANSI C was the best thing that ever happened to our industry, though.
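To make the FFI point concrete, here's a minimal sketch of what that lowest-common-denominator interface looks like; the file name, function, and build flags are just illustrative, assuming a typical Unix toolchain:

```c
/* add.c -- a tiny C89-conforming function, the sort of interface
 * nearly every language's C FFI knows how to call.
 * Build as a shared library, e.g.:
 *   cc -ansi -pedantic -fPIC -shared add.c -o libadd.so
 */
int add(int a, int b)
{
    return a + b;
}
```

Plain types, no name mangling, a stable calling convention: Python can load it with ctypes (`ctypes.CDLL("./libadd.so").add(2, 3)`), Rust can declare it `extern "C"`, Java can reach it through JNI, and so on.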

> Should it be memory-efficient? Fast? Secure? Simple? Easy to formally verify? Easy for beginners? Should it work on old architectures? On embedded ones?

What do any of these have to do with guarantees of long-term compatibility? I'm not arguing that there should be One Programming Language To Rule Them All; I'm asking whether we can design better guarantees of long-term compatibility into new programming languages.

The biggest source of incompatibility isn't in the programming languages. It's either in the specifications ("hmm, maybe one byte isn't enough for a character after all; let's break all those assumptions") or in the hardware ("maybe 16 bits isn't enough address space").
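A classic example of the first kind, as a sketch (assuming the source file is saved as UTF-8): code that equates "byte" with "character" keeps compiling forever, but its assumption quietly stops holding, even though the language itself never changed.

```c
#include <stdio.h>
#include <string.h>

/* strlen counts bytes, not characters. Those were the same thing
 * under the one-byte-per-character assumption; under UTF-8 they
 * aren't, yet the language and this program are unchanged. */
int main(void)
{
    const char *ascii = "cafe"; /* 4 characters, 4 bytes */
    const char *utf8  = "café"; /* 4 characters, 5 bytes in UTF-8 */

    printf("%lu\n", (unsigned long)strlen(ascii)); /* prints 4 */
    printf("%lu\n", (unsigned long)strlen(utf8));  /* prints 5 */
    return 0;
}
```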