So, what's your counterproposal?

Each of these tools provides real value.

* Bundlers drastically improve runtime performance, but it's tricky to figure out what to bundle where and how.

* Linting tools and type-safety checkers detect bugs before they happen, but they can be arbitrarily complex, and they benefit from type annotations. (TypeScript won the type-annotation war in the marketplace against competing systems, including Meta's Flow and Google's Closure Compiler.)

* Code formatters automatically ensure consistent formatting.

* Package installers are really important, and they tackle a hugely complex problem in a performance-sensitive and security-sensitive area. (Managing dependency conflicts/diamonds, caching, platform-specific builds…)

As long as developers benefit from using bundlers, linters, type checkers, code formatters, and package installers, and as long as it's possible to make these tools faster and/or better, someone's going to try.

And here you are, incredulous that anyone thinks this is OK…? Because we should just … not use these tools? Not make them faster? Not improve their DX? Standardize on one and then staunchly refuse to improve it…?

I'm being a little coy because I do have a very detailed proposal.

I want the JS toolchain to stay written in JS, but I want to unify the design and architecture of all the tools you mentioned so that they all use a common syntax tree format and can share data, e.g. between the linter and the formatter, or the bundler and the type checker.
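Something like this sketch, to make it concrete. None of these names are a real API; they just illustrate the architecture: every tool becomes a pass over one shared tree instead of owning its own parser.

```ts
// Hypothetical sketch of a shared syntax-tree contract between tools.

interface SharedNode {
  type: string;      // ESTree-style node kind, e.g. "CallExpression"
  start: number;     // offsets into the original source
  end: number;
  children: SharedNode[];
}

interface SharedProgram {
  source: string;
  root: SharedNode;
  // Side tables the tools can read and write without re-parsing:
  comments: Array<{ start: number; end: number; text: string }>;
  typeInfo?: Map<SharedNode, string>; // e.g. filled in by the type checker,
                                      // read by the linter
}

// A linter, formatter, or bundler is just a pass over the same tree.
type ToolPass = (program: SharedProgram) => void;

function runPipeline(
  source: string,
  parse: (s: string) => SharedProgram,
  passes: ToolPass[],
): SharedProgram {
  const program = parse(source); // parse exactly once
  for (const pass of passes) pass(program);
  return program;
}
```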

Yeah, it's a shame that few people realize running three (or more) different programs, each with its own parser and its own AST, is the bigger problem.

Not just because of perf (though the perf aspect is annoying), but because of how often the three will get out of sync and produce bizarre results.

Hasn't that already been tried (10+ years ago) with projects like https://github.com/jquery/esprima, which have since seen their usage dramatically reduced for performance reasons?

Yeah, you're correct. But that means I have the benefit of ten years of development in the web platform, as well as hindsight on the earlier effort.

I would say the reason the perf costs felt bad there is that the abstraction was unsuccessful. Throughput isn't that big a deal for a parser if you only need to parse the parts of the code that have actually changed.
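A minimal sketch of what I mean, assuming a content-hash cache in front of any ESTree-style parser (the names here are illustrative):

```ts
// Hypothetical sketch: only re-parse files whose contents changed.
// `parse` stands in for any parser; the cache is keyed by file name.

import { createHash } from "node:crypto";

interface CacheEntry {
  hash: string;
  ast: unknown;
}

const astCache = new Map<string, CacheEntry>();

function parseIncremental(
  fileName: string,
  source: string,
  parse: (s: string) => unknown,
): unknown {
  const hash = createHash("sha256").update(source).digest("hex");
  const cached = astCache.get(fileName);
  if (cached && cached.hash === hash) {
    return cached.ast; // unchanged file: reuse the old tree, zero parse cost
  }
  const ast = parse(source); // only changed files pay the parse cost
  astCache.set(fileName, { hash, ast });
  return ast;
}
```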

You can rip fast builds from my cold, dead hands. I'm not going back to JS-only tooling, and I've been here since the gulp days.

All I can say for sure is that the old tools weren't slow because it's impossible to build fast tools on a JS runtime.

And anyway, these new tools tend to have a "perf cliff": you get all the speed of the new tool as long as you stay away from the JS integration API used to support the "long tail" of use cases. Once you fall off the cliff, though, you're back in the old slow-JS cost regime...
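To sketch what that cliff looks like (purely illustrative, not any real tool's API): the fast path never leaves the native core, and every file routed through a JS hook pays boundary-crossing plus ordinary JS execution cost.

```ts
// Hypothetical sketch of the "perf cliff". `nativeTransform` stands in
// for work done inside a native-core tool; `jsHook` is the escape hatch
// that supports the long tail of use cases.

type JsHook = (source: string) => string;

// Stand-in for the native fast path.
function nativeTransform(source: string): string {
  return source;
}

function transformAll(files: string[], jsHook?: JsHook): string[] {
  return files.map((source) => {
    const fast = nativeTransform(source); // stays on the fast path
    // Falling off the cliff: crossing into JS copies the source across
    // the boundary and runs plugin code at ordinary JS speed.
    return jsHook ? jsHook(fast) : fast;
  });
}
```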