> No one claims that good type systems prevent buggy software. But, they do seem to improve programmer productivity.
They really don’t. How did you arrive at such a conclusion?
Not that I can answer for OP, but as a personal anecdote: I've never been more productive than when writing Rust; it's a goddamn delight. Every codebase feels like it could have been my own, and you can get up to speed from 0 to 100 in no time.
Yeah, I’ve been working mainly in Rust for the last few years. The compile-time checks are so effective that runtime bugs are rare. You can refactor half the codebase and not run the app for a week, and when you do, it just works. I’ve never had that experience in another language.
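To make that concrete, here's a minimal, hypothetical sketch of the kind of refactor the compiler catches: add a variant to an enum, and every exhaustive match over it stops compiling until the new case is handled.

    // Hypothetical example: adding a variant to an enum forces every
    // match site to be updated before the code compiles at all.
    enum Event {
        Click { x: i32, y: i32 },
        KeyPress(char),
        // Add `Scroll(i32)` here and the exhaustive `match` below
        // becomes a compile error until it handles the new case.
    }

    fn describe(e: &Event) -> String {
        match e {
            Event::Click { x, y } => format!("click at ({x}, {y})"),
            Event::KeyPress(c) => format!("key '{c}'"),
        }
    }

    fn main() {
        println!("{}", describe(&Event::KeyPress('a')));
    }

That exhaustiveness check is what makes "refactor for a week, then it just works" plausible: the compiler walks you through every site the change touches.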
Through empirical evidence? Do you think that the vast majority of software devs moved to typing for no reason?
> Do you think that the vast majority of software devs moved to typing for no reason?
It is quite clear that this industry is mostly driven by hype and fads, not by empirical studies.
The empirical evidence for the claim that static typing and complex type systems reduce bugs or improve productivity is inconclusive at best.
It's a bad reason. A lot of "best practices" are temporary blindnesses, comparable, in some sense, to the supposed love of BASIC before (or despite) Dijkstra. So yes, it's possible there is no good reason. Though I don't think that's the case here.
We don't actually have empirical evidence on the topic, surprisingly.
It's just people's hunches.
I feel like the terms logical, empirical, rational, and objective are used interchangeably by the general public, with one being in vogue at any given time.