One thing I love about Go: instead of piling on fancy, latest-hype features until the language collapses or every upgrade becomes a nightmare, it just adds useful stuff and gets out of the way.
I know. I recently upgraded some large codebases, skipping several releases, without any issues.
The compatibility guarantee is a massive win. It's exciting to have a boring language to build on that doesn't change much but just gradually gets better.
Really? My experience is that, of C, C++, Go, Python, and Rust, Go BY FAR breaks code most often (the Python 2->3 change excepted).
Sure, most of that is not the compiler or standard library, but dependencies. But I'm not talking about random open-source libraries (I can't blame the core team for those), but things like protobuf breaking EVERY TIME. Or x/net, x/crypto, or whatever.
But also yes, from random dependencies. It seems that language-culturally, Go authors are fine with breaking changes. Whereas I don't see that with people making Rust crates. And multiple times I've dug out C++ projects that I have not touched in 25 years, and they just work.
The stdlib has been very very stable since the first release - I still use some code from Go 1.0 days which has not evolved much.
The x/ packages are more unstable, yes; that's why they're outside the stdlib. Though I haven't personally noticed any breakage and have never been bitten by this. What breakage did you see?
I think protobuf is notorious for breaking (but more from user changes). I don't use it I'm afraid so have no opinion on that, though it has gone through some major revisions so perhaps that's what you mean?
I don't tend to use much third party code apart from the standard library and some x libraries (most libraries are internal to the org), I'm sure if you do have a lot of external dependencies you might have a different experience.
Well, for C++ the backwards compatibility is even better. Unless you're using `gets()` or `auto_ptr`, old C++ code either just continues to compile perfectly, or was always broken.
Sure, the Go standard library is in some sense bigger, so it's nice of them to not break that. But short of a Python2->3 or Perl5->6 migration, isn't that just table stakes for a language?
The only good thing about Go is that its standard library has enough coverage to do a reasonable number of things. The only good thing. But any time you need to step outside of that, it starts a bit-rotting timer that ticks very quickly.
> though [protobuf] has gone through some major revisions so perhaps that's what you mean?
No, it seems it's broken way more often than that, requiring manual changes.
> But any time you need to step outside of that, it starts a bit-rotting timer that ticks very quickly.
This is not my experience with my own or third party code. I can't remember any regressions I experienced caused by code changes to the large stdlib at all in the last decade, and perhaps one caused by changes to a third party library (sendgrid, who changed their API with breaking changes, not really a Go problem).
A 'bit-rotting timer' isn't very specific or convincing, do you have examples in mind?
Isn't the x for experimental, meaning breaking API changes are expected?
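One practical mitigation, whatever the x/ stability policy turns out to be (this is my own suggestion, not from the thread): Go modules record an exact version of each dependency in go.mod, so a build never silently picks up a breaking change until you explicitly upgrade. A minimal sketch, with illustrative module paths and version numbers:

```
// go.mod (sketch; module path and versions are illustrative)
module example.com/myapp

go 1.21

// The build uses exactly this version of x/net until you run
// `go get golang.org/x/net@latest` (or edit this line) yourself.
require golang.org/x/net v0.17.0
```

So even if an x/ repo ships a breaking change, existing builds stay reproducible; the breakage only surfaces when you choose to upgrade.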