There’s also a cost: installs take much longer, you need the full toolchain installed, and builds are no longer reproducible due to variations in the local build environment. If everything you do is a first-party CI build of a binary image you deploy, that’s fine, but for tools you’re installing outside of that kind of environment it adds friction.
Agreed. In the JS world? Hell no. Ironically, doing a local build would itself pull in a bunch of dependencies, whereas now you can at least technically have just the one built dependency.
None of these are problems for Go: the pull-through proxy is fast and removes the need for a toolchain if you just want to download the source, and Go builds are fully bit-for-bit reproducible.
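Rough sketch of what I mean (the module path and version are just illustrative, and the reproducibility check assumes the same Go release on both machines):

    # the module proxy is plain HTTPS, so you can fetch source with no toolchain at all
    curl -sO https://proxy.golang.org/github.com/google/go-containerregistry/@v/v0.19.0.zip

    # reproducibility: with CGO off and -trimpath, two machines should emit identical bytes
    CGO_ENABLED=0 go build -trimpath -o crane ./cmd/crane
    sha256sum crane    # compare hashes bit for bit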
That would be an impossible expectation on the Go toolchain. The pull-through proxy can’t magically avoid the need to transfer all of the dependencies to my device, including any native code or other resources. Large projects are going to need to download a lot of stuff: think about how some cloud clients generate code dynamically from API definitions, or how many codecs wrap native code.
Similarly, newer versions of Go change the compiler (which, to be fair, is a good thing), so even if I start with the same source in Git I might not end up with the same compiled bytes.
Again, none of this is a bad thing: it just means that I want to compile binaries once and ship those, so they don’t unexpectedly change in the future and my CI pipeline doesn’t need a full Go build stage when all I want is to use Crane to do something with a container.
sometimes i think shipping source + compiler would be faster...
The other day I was wondering why the Terraform AWS provider binary is now around 800 MB compiled: https://github.com/hashicorp/terraform-provider-aws/issues/3...
We have a Terraform monorepo with many small workspaces (i.e. state files). The amount of disk space used by the .terraform directories on a fully init’ed clone is wild.
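A shared provider cache takes some of the sting out of it (sketch; the cache path is just an example):

    # providers get downloaded once and, where the OS allows it, symlinked into
    # each workspace's .terraform dir instead of being copied per workspace
    export TF_PLUGIN_CACHE_DIR="$HOME/.terraform.d/plugin-cache"
    mkdir -p "$TF_PLUGIN_CACHE_DIR"
    terraform init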
As a lot of these npm "packages" are glorified code snippets that should never have been individual libraries, perhaps this would drive people to standardise and improve the build tooling, or even move towards having sensibly sized libraries?
Yes, there’s widespread recognition that its small standard library makes JavaScript uniquely dependent on huge trees of packages, and that many of them (e.g. is-arrayish from last week) are no longer necessary but still linger from the era when it was even worse.
However, this isn’t a problem specific to JavaScript: Python, for example, has a much richer standard library and we still see the same types of attacks on PyPI. The entire open source world has been built on a concept of trust which was arguably always more optimistic than realistic, and everyone is now pivoting, especially after cryptocurrency’s inherent insecurity created enough of a profit margin to incentivize serious attacks.