Yeah. I think it's disingenuous to talk about breaking things for users, as though people are forced to use a newer language standard.

C99 "broke" implicit declarations, but few if any people were forced to use C99, and it never even became the default in, say, GCC (the default jumped from gnu89 straight to gnu11 with GCC 5, released in 2015).

Agreed. The biggest concern with this point of view is that the developer then has to ensure the older compiler stays functional as OSes and execution environments progress. That may be a reach, but I think one of the strong imperatives of moving to a more fully defined standard for C's semantics would be forcing compiler implementations to be very clear about which standard they support and what guarantees they make about how long they will support it.

The above is because I would hope that one result of pushing a nearly fully defined standard (whether well-defined or implementation-defined) would be a strong interconnectedness and compositionality among all of its semantic constructs. Compiler implementers then could not fall back on a mish-mash of partial standards compliance and claim that undefined behavior lets them simply omit certain constructs. I would like to think that having the language be very clearly defined would all but require complete adoption of a given standard for a compiler to be compliant.

I am aware that such a radical realignment of C's structure is nearly impossible, but if C cannot or will not do it, there may be an opening for an extremely similar language to piggyback its way into common use. That language might also satisfy some of the arguments/positions in TFA concerning 'Trust the Programmer': it could provide an 'unsafe'/non-conforming escape hatch out to C, directly in C syntax, for the cases where a developer needs or wants a non-conforming construct.