People adapt to the circumstances. A lot of Python use is no longer about fast iteration at the REPL. Instead, we are shipping Python to execute on clusters in very long-running jobs, or inside servers. It's not only about having to start all over after hours of work; it's that concurrent and distributed execution environments are simply hostile to interactive programming. You can no longer afford to wait for an exception and launch the debugger post-mortem, and even if you do, it's not very useful.
And now my personal opinion: if we are going the static typing way, I would prefer to simply use Scala or similar instead of Python with types. Unfortunately, in the same way that high-performance languages like C attract premature optimizers, static types attract premature "abstracters" (C++ attracts both). I also think that dynamic languages have the largest libraries for reasons of technical merit: being more "fluid" makes them easier to mix, and in the long term the ecosystem converges organically on certain interfaces between libraries.
And so here we are with the half-baked approach of gradual typing and # type: ignore everywhere.
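A minimal sketch of the failure mode (names made up): once untyped code enters the picture, Any satisfies everything, real bugs type-check clean, and the workaround is sprinkling # type: ignore.

    # A gradually typed module: some functions annotated, some not.
    def typed_total(counts: list[int]) -> int:
        return sum(counts)

    def legacy_counts():           # no annotations: the checker treats the result as Any
        return ["1", "2", "3"]     # actually strings, but the checker can't see that

    # Any silently satisfies list[int], so this real bug type-checks clean...
    print(typed_total(legacy_counts()))   # ...then raises TypeError at runtime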
Here we are because:
* Types are expensive and don't tend to pay off on spiky/experimental/MVP code, most of which gets thrown away.
* Types are incredibly valuable on hardened production code.
* Most good production code started out spiky, experimental, or as an MVP, and transitioned.
And so here we are with gradual typing, because "throwing away all the code and rewriting it to be 'perfect' in another language" has been known for years to be a shitty way to build products.
I'm mystified that more people here don't see that the value and cost of types is NOT binary ("they're good!" "they're bad!") but exists on a continuum that is contingent on the status of the app and sometimes even the individual feature.
> Types are expensive and don't tend to pay off on spiky/experimental/MVP code, most of which gets thrown away.
I've spent so much time writing typed code that I now find it harder to write POC code in dynamic languages, because I use types to help reason about how I want to architect something.
E.g. "this function should calculate x and return it": well, if you already know what you want the function to do, then you know what types you want. And if you don't know what types you want, then you haven't actually decided what that function should do ahead of building it.
Now you might say "the point of experimental code is to figure out what you want functions to do." But even if you're writing an MVP, you should know what each function should do by the time you've finished writing it. Because if you don't know how to build a function, then how do you even know that the runtime will execute it correctly?
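To make that concrete, here's the kind of thing I mean (a made-up example). Writing the signature is writing the decision; the body is almost an afterthought:

    from decimal import Decimal

    # Deciding the signature *is* deciding what the function does:
    # line-item prices in, tax rate in, total out.
    def invoice_total(prices: list[Decimal], tax_rate: Decimal) -> Decimal:
        subtotal = sum(prices, Decimal("0"))
        return subtotal * (1 + tax_rate)

    print(invoice_total([Decimal("9.99"), Decimal("5.00")], Decimal("0.08")))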
Python doesn't have "no types"; in fact it is strict about types. You just don't have to waste time reading and writing them early on.
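A standard illustration of that strictness:

    # Python is strongly typed: every value carries a type, and mixing them
    # fails loudly at runtime even though you never wrote an annotation.
    try:
        "1" + 1
    except TypeError as e:
        print(e)   # can only concatenate str (not "int") to str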
While a boon during prototyping, a project may need more structural support as the design solidifies, as it grows, or as a varied, growing team takes responsibility.
At some point those factors dominate, to the extent that "may need" approaches "must have."
My point is that if you don't know what types you need, then you can't be trusted to write the function to begin with. So you don't actually save that much time in the end; typing out type names simply isn't the time-consuming part of prototyping.
But when it comes to refactoring, having type safety makes it very easy to use static analysis (typically the compiler) to check for type-related bugs during that refactor.
I've spent a fair number of years across a great many PL paradigms, and I've honestly never found loosely typed languages any faster for prototyping.
That all said, I will say that a lot of this also comes down to what you’re used to. If you’re used to thinking about data structures then your mind will go straight there when prototyping. If you’re not used to strictly typed languages, then you’ll find it a distraction.
Right after hello world you need a list of arguments or a dictionary of numbers to names. Types.
Writing map = {} is a few times faster than map: dict[int, str] = {}. Now multiply by ten instances. Oh wait, I'm going to change that to a tuple of pairs instead.
It takes me about three times longer to write equivalent Rust than Python, and sometimes it’s worth it.
Rust is slower to prototype in than Python because Rust is a low-level language, not because it's strictly typed, so that's not really a fair comparison. For example, assembly doesn't have any types at all and yet is slower to prototype in than Rust.
Let's take Visual Basic 6, for example. That was very quick to prototype in even with Option Explicit (which forces explicit variable declarations) set. Quicker, even, than Python.
TypeScript isn't any slower to prototype in than vanilla JavaScript (bar setting up the build pipeline; man, does the JavaScript ecosystem suck at DevEx!).
Writing map = {} only saves you a few keystrokes. And unless you're typing really slowly with one finger, like an 80-year-old using a keyboard for the first time, you'll find the real input bottleneck isn't how quickly you can type your data structures into code, but how quickly your brain can turn a product spec / Jira ticket into a mental abstraction.
> Oh wait, I’m going to change that to a tuple of pairs instead
And that’s exactly when you want the static analysis of a strict type system to jump in and say “hang on mate, you’ve forgotten to change these references too” ;)
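Concretely, a sketch of what that looks like with mypy (invented names): change the annotation and the stale references light up before anything runs.

    # Originally: names: dict[int, str] = {}
    # After refactoring to a tuple of pairs:
    names: tuple[tuple[int, str], ...] = ((1, "one"), (2, "two"))

    value = names.get(1)   # leftover dict-style call; mypy reports something like:
    # error: "tuple[tuple[int, str], ...]" has no attribute "get"  [attr-defined]
    # (at runtime this would surface later, as an AttributeError)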
Having worked on various code bases across a variety of languages, the refactors that scare me the most aren't the ones in large code bases; they're the ones in Python or JavaScript, because I don't have a robust type system providing compile-time safety.
There’s an old adage that goes something like this: “don’t put off to runtime what can be done in compile time.”
As computers have gotten exponentially faster, we seem to have forgotten this rule, to our own detriment.
I've found the transition point where types become useful starts even within a few hundred lines of code, and I've found types aren't that restrictive, if at all, especially if the language started out typed. For the rare case where I need to discard types, an escape hatch is usually available, and needing one is a code smell that you're doing something wrong.
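For reference, the escape hatches I mean are the standard ones: typing.Any, typing.cast, and # type: ignore. A made-up config example of the smell:

    from typing import Any, cast

    def load_config() -> Any:                 # Any opts the value out of checking
        return {"port": "8080"}

    port = cast(int, load_config()["port"])   # cast asserts a type but verifies nothing
    # Both lines are legal, and both are the smell: the honest fix is to model
    # the config's real shape instead of waving the checker away.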
Even in a recent toy one-hour Python interview question, having types would've saved me some issues and caught an error that wasn't obvious. It probably would've saved 10 minutes of the interview.
Yep, depends on your memory context capacity.
For me, I often don't feel any pain points when working below about 1kloc (when doing JS), but if a project is above 500loc it's often a tad painful to resume it months later, when I've started to forget why I used certain data structures that aren't directly visible. (Adding types at that point is usually the best choice, since it gives a refresher on the code at the same time as doing a soundness check.)
The transition point where type hints become valuable, or even necessary, isn't about how many lines of code you have; it's about how much you rely upon their correctness.
Type strictness also isn't binary. A program with lots of dicts that should be classes doesn't get much safer just because you wrote : dict[str, dict] everywhere.
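A quick sketch of the difference (invented shapes): the blanket annotation checks nothing about the keys, while a class does.

    from dataclasses import dataclass

    # Loosely typed: the annotation says almost nothing about the shape.
    user: dict[str, dict] = {"profile": {"name": "Ada"}}
    # user["porfile"]["name"]   # typo type-checks fine; KeyError only at runtime

    # The same data as classes:
    @dataclass
    class Profile:
        name: str

    @dataclass
    class User:
        profile: Profile

    u = User(profile=Profile(name="Ada"))
    # u.porfile.name            # mypy: "User" has no attribute "porfile"
    print(u.profile.name)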
> * Types are expensive and don't tend to pay off on spiky/experimental/MVP code, most of which gets thrown away.
Press "X" to doubt. Types help _a_ _lot_ by providing autocomplete and inspections, and by helping you find errors while you're typing.
This significantly improves iteration speed, as you don't need to run the code to detect that you mistyped a variable somewhere.
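For instance (a contrived sketch): with an annotated parameter, the checker knows the shape of the data and flags slips that a plain linter can't see.

    from dataclasses import dataclass

    @dataclass
    class Order:
        quantity: int
        unit_price: float

    def total(order: Order) -> float:
        # The annotation gives the IDE autocomplete for .quantity/.unit_price
        # and flags typos as you type:
        # return order.quantety * order.unit_price   # error: no attribute "quantety"
        return order.quantity * order.unit_price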
PyCharm, pyflakes, et al. can do most of this without written types.
Autocomplete can't help with the more interesting questions, like "should I use itertools or collections?"
In some fields throwing away and rewriting is the standard, and it works, more or less. I'm thinking of scientific/engineering software: prototype in Python or Matlab, then convert to C or C++ for performance or deployment constraints. It happens frequently with compilers too. I think migrating languages is actually more successful than writing second versions.
> * Types are expensive and don't tend to pay off on spiky/experimental/MVP code, most of which gets thrown away.
This is what people say, but I don't think it's correct. What is correct is that, say, ten to twenty years ago, all the statically typed languages had other unacceptable drawbacks, and "types bad" became shorthand for those issues.
I'm talking about C (a nonstarter for obvious reasons), C++ (a huge mess, footguns, very difficult, presumably requires a CMake guy), and Java (very restrictive, slow iteration and startup, etc.). Compared to those, just using Python sounds decent.
Nowadays we have Go and Rust, both of which are pretty easy to iterate in (for different reasons).
> Nowadays we have Go and Rust, both of which are pretty easy to iterate in (for different reasons).
It's common for Rust to become very difficult to iterate in.
https://news.ycombinator.com/item?id=40172033
I think Java was the main one. C/C++ are (relatively) close to the metal, system-level languages with explicit memory management - and were tacitly accepted to be the "complicated" ones, with dynamic typing not really applicable at that level.
But Java was the high-level, GCed application-development language, and more importantly, it was the one dominating many university CS programs as a teaching language before Python took that role. (Yeah, I'm grossly oversimplifying; sincere apologies to the functional crowd! :) )
The height of the "static typing sucks!" craze was more like a "The Java type system sucks!" craze...
For me it was more the "Java can't easily process strings" craze that made it impractical to use for scripts or small-to-medium projects.
Not to mention boilerplate BS.
Recently, Java has improved a lot on these fronts. Too bad it’s twenty-five years late.
The issue with moving the ship to where its passengers want it to be is that it makes it more difficult for new passengers to get on.
This is clearly seen with TypeScript and the "just use JS" movement.
Furthermore, with LLMs, it should be easier than ever to experiment in one language and use another language for production loads.
I don't think types are expensive for MVP code unless they're highly complicated (but why would you do that?). Primitives and interfaces are super easy to type and worth the extra couple of seconds.
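For instance (a hypothetical sketch): a primitive costs one annotation, and a small interface costs a few-line Protocol.

    from typing import Protocol

    # Primitives: trivial to annotate.
    retries: int = 3
    endpoint: str = "https://example.com/api"

    # A small interface: a Protocol is four lines and types every caller.
    class Storage(Protocol):
        def save(self, key: str, value: bytes) -> None: ...

    def upload(store: Storage, key: str, payload: bytes) -> None:
        store.save(key, payload)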
Software quality only pays off in the long run. In the short run, garbage is quick and gets the job done.
Also, in my experience, the long run for software arrives in a couple of weeks.
PHP is a great example of the convergence of interfaces. Now they have different “PSR” standards for all sorts of things. There is one for HTTP clients, formatting, cache interfaces, etc. As long as your library implements the spec, it will work with everything else and then library authors are free to experiment on the implementation and contribute huge changes to the entire ecosystem when they find a performance breakthrough.
Types seem like a “feature” of mature software. You don’t need to use them all the time, but for the people stuck on legacy systems, having the type system as a tool in their belt can help to reduce business complexity and risk as the platform continues to age because tooling can be built to assert and test code with fewer external dependencies.
Python is ubiquitous in ML; often you have no choice but to use it.