The thing that finally got me on board with optional type hints in Python was realizing that they're mainly valuable as documentation.

But it's really valuable documentation! Knowing what types are expected and returned just by looking at a function signature is super useful.

The best kind of documentation is the kind you can trust is accurate. Type defs wouldn't be close to as useful if you didn't really trust them. Similarly, doctests are some of the most useful documentation because you can be sure they are accurate.
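For example, a doctest is documentation that `python -m doctest` will actually execute and verify, so it can't silently drift out of date. A minimal sketch (`slugify` here is just a made-up illustration):

    import re

    def slugify(title):
        """Convert a title into a URL slug.

        >>> slugify("Hello, World!")
        'hello-world'
        """
        return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")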

The best docs are the ones you can trust are accurate. The second best docs are ones that you can programmatically validate. The worst docs are the ones that can’t be validated without lots of specialized effort.

Python’s type hints are in the second category.

I’d almost switch the order here! In a world with agentic coding agents that can constantly check for type errors from the language server powering the errors/warnings in your IDE, and reconcile them against prose in docstrings… types you can programmatically validate are incredibly valuable.

Do you have an example of the first?

When I wrote that, I was thinking about typed, compiled languages' documentation generated by the compiler at build time. Assuming that version drift ("D'oh, I was reading the docs for v1.2.3 but running v4.5.6") is user error and not a docs-trustworthiness issue, that'd qualify.

But now that I'm coming back to it, I think that this might be a larger category than I first envisioned, including projects whose build/release processes very reliably include the generation+validation+publication of updated docs. That doesn't imply a specific language or release automation, just a strong track record of doc-accuracy linked to releases.

In other words, if a user can validate/regenerate the docs for a project, that gets it 9/10 points. The remaining point is the squishier "the first party docs are always available and well-validated for accuracy" stuff.

Languages with strong static type systems

Is there a mainstream language where you can’t arbitrarily cast a variable to any other type?

This resonates with me so much. I feel like half the comments in this thread are missing the value of typing, but maybe they've never had the misfortune of working with hundreds of other developers on a project with no defined contracts on aggregates / value objects outside of code comments and wishful thinking.

I've worked on large python codebases for large companies for the last ~6 years of my career; types have been the single biggest boon to developer productivity and error reduction on these codebases.

Just having to THINK about types eliminates so many opportunities for errors, and if your type is too complex to express, it's _usually_ a code smell; most often these situations can be re-written in a more sane, albeit slightly verbose, fashion rather than using the more "custom" typing features.

No one gets points for writing "magical" code in large organizations, and typing makes sure of this. There's absolutely nothing wrong with writing "boring" python.

Could we have accomplished this by simply having used a different language from the beginning? Absolutely, but oftentimes that's not an option for a company with a mature stack.

TL;DR -- Typing in Python is an exceptional tool to scale your engineering organization on a code-base.

Decent argument in principle. It still sucks for non-obvious types though:

https://old.reddit.com/r/Python/comments/10zdidm/why_type_hi...

Edit: Yes, one can sometimes go with Any, depending on the linter setup, but that's missing the point, isn't it?

The correct response to this is to figure out what the use case for your function is, i.e. "add two numbers". Set the input and output as Decimal and call it a day.

Sure. Let me just quickly refactor Pytorch.

Actually, it's not missing the point. Sometimes you really do want duck typing, in which case you allow Any. It's not all-or-nothing.

What the reddit post is demonstrating is that the Python type system is still too naive in many respects (and that there are implementation divergences in behavior). In other languages, this is a solved problem - and very ergonomic and safe.

As the top comment says, if you don't know or want to define the type just use Any. That's what it's there for.

That entire Reddit post is a clueless expert beginner rant about something they don't really understand, unfortunate that it's survived as long as it has or that anyone is taking it as any sort of authoritative argument just because it's long.

> if you don't know or want to define the type

That's not the issue the reddit post is raising. The reddit post is pointing out that what a "type" is is not as simple as it looks. Particularly in a language like Python where user-defined types proliferate, and can add dunder methods that affect statements that involve built-in operations. "Just use Any" doesn't solve any of those problems.

> just use Any.

All the above said: not putting a type in at all is even easier than using Any, and is semantically equivalent.

The Reddit post falls under the case of "don't know" the type. If you want to allow users to pass in any objects, try to add and fail at runtime... that's exactly what Any is for.
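Concretely, with the `slow_add` example from the post, that looks something like this sketch:

    from typing import Any

    def slow_add(a: Any, b: Any) -> Any:
        # Any switches the checker off for these values: every call site
        # passes, and incompatible operands only fail when + is evaluated.
        return a + b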

But the entire post is built upon the premise that accepting all types is good API design. Which it isn't, at all.

> The Reddit post falls under the case of "don't know" the type.

No, it doesn't. The desired type is known; it's "Addable" (i.e., "doesn't throw an exception when the built-in add operator is used"). The problem is expressing that in Python's type notation in a way that catches all edge cases.

> If you want to allow users to pass in any objects, try to add and fail at runtime

Which is not what the post author wants to do. They want to find a way to use Python's type notation to catch those errors with the type checker, so they don't happen at runtime.

> the entire post is built upon the premise that accepting all types is good API design

It is based on no such thing. I don't know where you're getting that from.

> The desired type is known; it's "Addable" (i.e., "doesn't throw an exception when the built-in add operator is used").

The mistake both you and the reddit post's author make is treating the `+` operator the same as you would an interface method. Despite Python having __add__/__radd__ methods, this isn't true, nor is it true in many other programming languages. For example, Go doesn't have a way to express "can use the + operator" at all, and "can use comparison operators" is defined as an explicit union between built-in types.[0] In C# you could only do this as of .NET 7, which was released in Nov 2022[1] -- was the C# type system unusable for the 17 years prior, when it didn't support this scenario?

If this were any operation on `a` and `b` other than a built-in operator, such as `a.foo(b)`, it would be trivial to define a Protocol (which the author does in Step 4) and have everything work as expected. It's only because of a misunderstanding of basic Python that the author continues to struggle for another 1000 words before concluding that type checking is bad. It's an extremely cherry-picked and unrealistic scenario, either from someone who is clueless, or from someone who knows what they're doing and is intentionally being malicious in order to engagement bait.[2]
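For the method case, a minimal sketch of the Protocol approach (names hypothetical):

    from typing import Protocol

    class SupportsFoo(Protocol):
        # Structural typing: anything with a compatible foo() matches;
        # no inheritance from SupportsFoo is required.
        def foo(self, other: "SupportsFoo") -> "SupportsFoo": ...

    def combine(a: SupportsFoo, b: SupportsFoo) -> SupportsFoo:
        return a.foo(b)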

This isn't to say Python (or Go, or C#) has the best type system, and it certainly lacks features compared to Rust, which is a very valid complaint. But "I can't express 'type which supports the '+' operator'" is an insanely esoteric and unusual case, unsupported in many languages, and it's disingenuous to use it as an excuse for why people shouldn't bother with type hinting at all.

[0] https://pkg.go.dev/cmp#Ordered

[1] https://learn.microsoft.com/en-us/dotnet/standard/generics/m...

[2] actually reading through the reddit comments, the author specifically says they were engagement baiting so... I guess they had enough Python knowledge to trick people into thinking type hinting was bad, fair enough!

> treating the `+` operator the same as you would an interface method

In other words, you agree that the Python type hint system does not give you a good, built-in way to express the "Addable" type.

Which means you are contradicting your claims that the type the article wants to express is "unknown" and that the article is advocating using "Any" for this case. The type is not unknown--it's exactly what I said: "doesn't throw an exception when using the + operator". That type is just not expressible in Python's type hint system in the way that would be needed. And "Any" doesn't address this problem, because the article is not saying that every pair of objects should be addable.

> "I can't express 'type which supports the '+' operator'" is an insanely esoteric and unusual case

I don't see why. Addition is a very commonly used operation, and being able to have a type system that can express "this function takes two arguments that can be added using the addition operator" seems like something any type system that delivers the goods it claims to deliver ought to have.

> unsupported in many languages

Yes, which means many languages have type systems that claim to deliver things they can't actually deliver. They can mostly deliver them, but "mostly" isn't what advocates of using type systems in all programs claim. So I think the article is making a useful point about the limitations of type systems.

> it's disingenuous to use it as an excuse for why people shouldn't bother with type hinting at all.

The article never says that either. You are attacking straw men.

> I don't see why. Addition is a very commonly used operation, and being able to have a type system that can express "this function takes two arguments that can be added using the addition operator" seems like something any type system that delivers the goods it claims to deliver ought to have.

If your comparison is Rust, sure, but you can't even express this in Java. No, Java's type system is not great, but it's a type system that's been used for approximately 500 trillion lines of production code powering critical systems and nobody has ever said "Java sucks because I can't express 'supports the + operator' as a generic type". (It sucks for many other reasons.)

Again, it is factually and objectively an esoteric and unusual case. Nobody in the real world is writing generics like this, only academics or people writing programming blogs about esoterica.

If your argument is that all type systems are bad or deficient, fine, but calling out Python for this when it has the exact same deficiency as basically every other mainstream language is asinine.

> The article never says that either. You are attacking straw men.

The article says "Turning even the simplest function that relied on Duck Typing into a Type Hinted function that is useful can be painfully difficult." The subterfuge is that this is not even remotely close to a simple function because the type being expressed, "supports the + operator", is not even remotely close to a simple type.

> it is factually and objectively an esoteric and unusual case.

Sorry, but your unsupported opinion is not "factual and objective".

> If your argument is that all type systems are bad or deficient

I said no such thing, any more than the article did. Again you are attacking a straw man. (If you had said "limited in what they can express", I might buy that. But you didn't.)

I think I've said all I have to say in this subthread.

It's factual and objective that billions, if not trillions of lines of Java and Go have been deployed and the language still cannot express "supports the + operator" as a type constraint. In production, non-academic settings, people don't generally write code like that.

Again, this is an esoteric limitation from the perspective of writing code that runs working software, not a programming language theory perspective.

How many of those lines of code would have benefited from being able to express that type constraint, if the language made it possible?

You have no idea, and nor does anyone else. But that's what you would need "factual and objective" evidence about to support the claim you made.

By your argument, anything that programming languages don't currently support, must be an "esoteric limitation" because billions if not trillions of lines of code have been written without it. Which would mean programming languages would never add new features at all. But it's certainly "factual and objective" that programming languages add new features all the time. Maybe this is another feature that at some point a language will add, and programmers will find it useful. You don't even seem to be considering such a possibility.

> But the entire post is built upon the premise that accepting all types is good API design. Which it isn't, at all.

Was Tim Peters also wrong way back in the day when he counseled Guido van Rossum to allow floats to be added to integers without a cast, like other popular languages?

How is `float | int` anywhere close to equivalent to `Any`?
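A `float | int` annotation still constrains every call site, e.g. this sketch:

    def add_numbers(a: float | int, b: float | int) -> float:
        return a + b

    add_numbers(1, 2.5)     # fine
    add_numbers("a", "b")   # flagged by the type checker; Any would let it through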

How is "responds to the `__add__` method" anywhere close to equivalent to `Any`?

If your implication is that "implementing __add__ means you can use the + operator", you are incorrect. This is a common Python beginner mistake, but it isn't really a Python type checking issue; it's a complexity of how Python built-ins interact with magic methods.

My suggestion -- don't rely on magic methods.

This is a strange and aggressive bit of pedantry. Yes, you'd also need `__radd__` for classes that participate in heterogeneous-type addition, but it's clear what was meant in context. The fundamentals are not all "beginner" level, and beginners wouldn't be implementing operator overloads in the first place (most educators hold off on classes entirely for quite a while; they're pure syntactic sugar after all, and the use case is often hard to explain to beginners).
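For the record, the fallback being alluded to looks roughly like this (a sketch with a hypothetical `Meters` class):

    class Meters:
        def __init__(self, n):
            self.n = n

        def __add__(self, other):
            if isinstance(other, (int, float)):
                return Meters(self.n + other)
            return NotImplemented

        # Called for `3 + Meters(2)` after int.__add__ returns NotImplemented
        __radd__ = __add__

    print((Meters(2) + 3).n)  # 5
    print((3 + Meters(2)).n)  # 5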

Regardless, none of that bears on the original `slow_add` example from the Reddit page. The entire point is that we have an intuition about what can be "added", but can't express it in the type system in any meaningful way. Because the rule is something like "anything that says it can be added according to the protocol — which in practical terms is probably any two roughly-numeric types except for the exceptions, and also most container types but only with other instances of the same type, and also some third-party things that represent more advanced mathematical constructs where it makes sense".

And saying "don't rely on magic methods" does precisely nothing about the fact that people want the + symbol in their code to work this way. It does suggest that `slow_add` is a bad thing to have in an API (although that was already fairly obvious). But in general you do get these issues cropping up.

Dynamic typing has its place, and many people really like it, myself included. Type inference (as in the Haskell family) solves the noise problem (for those who consider it a problem rather than something useful) and is elegant in itself, but just not the strictly superior thing that its advocates make it out to be. People still use Lisp family languages, and for good reason.

But maybe Steve Yegge would make the point better.

> This is a strange and aggressive bit of pedantry.

There's nothing pedantic about it. That's how Python works, and getting into the nuts and bolts of how Python works is precisely why the linked article makes type hinting appear so difficult.

> The entire point is that we have an intuition about what can be "added", but can't express it in the type system in any meaningful way.

As the post explores, your intuition is also incorrect. For example, as the author discovers in the process, addition via __add__/__radd__ is not addition in the algebraic field sense. There is no guarantee that adding types T + T will yield a T. Or that both operands are of the same type at all, as would be the case with "adding" a string and int. Or that A + B == B + A. We can't rely on intuition for type systems.

> your intuition is also incorrect.

No, it definitionally isn't. The entire point is that `+` is being used to represent operations where `+` makes intuitive sense. When language designers are revisiting the decision to use the `+` symbol to represent string concatenation, how many of them are thinking about algebraic fields, seriously?

And all of this is exactly why you can't just say that it's universally bad API design to "accept all types". Because the alternative may entail rejecting types for no good reason. Again, dynamically typed languages exist for a reason and have persisted for a reason (and Python in particular has claimed the market share it has for a reason) and are not just some strictly inferior thing.

> you can't just say that it's universally bad API design to "accept all types"

Note, though, that that's not really the API design choice that's at stake here. Python will still throw an exception at runtime if you use the + operator between objects that don't support being added together. So the API design choice is between that error showing up as a runtime exception, vs. showing up as flagged by the type checker prior to runtime.

Or, to put it another way, the API design choice is whether or not to insist that your language provide explicit type definitions (or at least a way to express them) for every single interface it supports, even implicit ones like the + operator, and even given that user code can redefine such interfaces using magic methods. Python's API design choice is to not care, even with its type hinting system--i.e., to accept that there will be interface definitions that simply can't be captured using the type hinting system. I personally am fine with that choice, but it is a design choice that language users should be aware of.
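To make the trade-off concrete, a small sketch:

    def shout(tag, count):
        return tag + count       # no hints: the TypeError only appears at runtime

    def shout_checked(tag: str, count: int) -> str:
        return tag + count       # same bug, but mypy/pyright flag it before it runs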

> No, it definitionally isn't. The entire point is that `+` is being used to represent operations where `+` makes intuitive sense.

Huh? There's no restriction in Python's type system that says `+` has to "make sense".

  import requests

  class Smoothie:
      def __init__(self, fruits):
          self.fruits = fruits

      def __repr__(self):
          return " and ".join(self.fruits) + " smoothie"

  class Fruit:
      def __init__(self, name):
          self._name = name

      def __add__(self, other):
          if isinstance(other, Fruit):
              return Smoothie([self._name, other._name])
          return requests.get("https://google.com")

  if __name__ == "__main__":
      print(Fruit("banana") + Fruit("mango"))
      print(Fruit("banana") + 123)
> banana and mango smoothie

> <Response [200]>

So we have Fruit + Fruit = Smoothie. Overly cute, but sensible from a CS101 OOP definition and potentially code someone might encounter in the real world, and demonstrates how not all T + T -> T. And we have Fruit + number = requests.Response. Complete nonsense, but totally valid in Python. If you're writing a generic method `slow_add` that needs to support `a + b` for any two types -- yes, you have to support this nonsense.

I guess that's the difference between the Python and the TypeScript approach here. In general, if something is possible, valid, and idiomatic in JavaScript, then TypeScript attempts to model it in the type system. That's how you get things like conditional types and mapped types that allow the type system to validate quite complex patterns. That makes the type system more complex, but it means that it's possible to use existing JavaScript patterns and code. TypeScript is quite deliberately not a new language, but a way of describing the implicit types used in JavaScript. Tools like `any` are therefore an absolute last resort, and you want to avoid it wherever possible.

When I've used Python's type checkers, I have more the feeling that the goal is to create a new, typed subset of the language, that is less capable but also easier to apply types to. Then anything that falls outside that subset gets `Any` applied to it and that's good enough. The problem I find with that is that `Any` is incredibly infective - as soon as it shows up somewhere in a program, it's very difficult to prevent it from leaking all over the place, meaning you're often back in the same place you were before you added types, but now with the added nuisance of a bunch of types as documentation that you can't trust.
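A contrived sketch of that leakage:

    from typing import Any

    def load_config() -> Any:        # a single Any at the boundary...
        return {"retries": "3"}

    def retries() -> int:
        cfg = load_config()
        # ...and this type-checks cleanly despite returning a str at runtime,
        # because everything derived from cfg is also Any.
        return cfg["retries"]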

> My suggestion -- don't rely on magic methods.

So no e.g. numpy or torch then?

> The thing that finally got me on board with optional type hints in Python was realizing that they're mainly valuable as documentation.

> But it's really valuable documentation! Knowing what types are expected and returned just by looking at a function signature is super useful.

So ... you didn't have this realisation prior to using Python type hints? Not from any other language you used prior to Python?

I didn't. I've been mainly a Python, PHP and JavaScript programmer for ~25 years and my experience with typed languages was mostly pre-type-inference Java, which felt wildly less productive than my main languages.

> I didn't. I've been mainly a Python, PHP and JavaScript programmer for ~25 years

Maybe it's time you expanded your horizons, then. Try a few statically typed languages.

Even plain C gives you a level of confidence in deployed code that you will not get in Python, PHP or Javascript.

Maybe if your C has aggressive test coverage and you’re using Valgrind religiously and always checking errno when you’re supposed to and you’re checking the return value of everything. Otherwise lol. C as it’s written by middling teams is a soup of macros, three-star variables, and questionable data structure implementations, where everybody fiddles with everybody else’s data. I’ll take good C over bad Python, but good C is rare.

> C as it’s written by middling teams is a soup of macros, three-star variables, and questionable data structure implementations, where everybody fiddles with everybody else’s data. I’ll take good C over bad Python, but good C is rare.

Ironically, the worst production C written in 2025 is almost guaranteed to be better than the average production Python, Javascript, etc.

The only people really choosing C in 2025 are those with a ton of experience under their belt, who are comfortable with the language and its footguns due to decades of experience.

IOW, those people with little experience are not choosing C, and those that do choose it have already, over decades, internalised patterns to mitigate many of the problems.

At the end of the day, in 2025, I'd still rather maintain a system written in a statically typed language than a system written in a dynamically typed language.

> The only people really choosing C in 2025 are those with a ton of experience under their belt, who are comfortable with the language and its footguns due to decades of experience.

Experienced users of C can't be the only people who use it if the language is going to thrive. It's very bad for a language when the only ones who speak it are those who speak it well. The only way you get good C programmers is by cultivating bad C programmers, you can't have one without the other. If you cut off the bad programmers (by shunning or just not appealing to them, or loading your language with too many beginner footguns), there's no pipeline to creating experts, and the language dies when the experts do.

The people who come along to work on their legacy systems are better described as archaeologists than programmers. COBOL of course is the typical example; there's no real COBOL programming community to speak of, just COBOL archaeologists who maintain those systems until they too shall die and it becomes someone else's problem, like the old Knight at the end of Indiana Jones.

> Experienced users of C can't be the only people who use it if the language is going to thrive.

I don't think it's going to thrive. It's going to die. Slowly, via attrition, but there you go.

I find automated tests give me plenty of confidence in the Python code I deploy. I'd rather deploy a codebase with comprehensive tests and no types over one with types and no tests.

I've been dabbling with Go for a few projects and found the type system for that to be pleasant and non-frustrating.

I feel like Go is a very natural step from Python because it's still pretty easy and fast to start with.

(and if you want to embrace static types, the language starting with them might get advantages over an optional backwards compatible type system)

You may have read this already, but the biggest surprise one of the Go creators had was that Go was motivated by unhappiness with C++, and they expected to win over C++ users, but instead people came from Python and Ruby: https://commandcenter.blogspot.com/2012/06/less-is-exponenti...

> I'd rather deploy a codebase with comprehensive tests and no types over one with types and no tests.

With Python, PHP and Javascript, your only option is "comprehensive tests and no types".

With statically typed languages, you have options other than "types with no tests". For example, static typing with tests.

Don't get me wrong; I like dynamically typed languages. I like Lisp in particular. But, TBH, in statically typed languages I find myself producing tests that test the business logic, while in Python I find myself producing tests that ensure all callers in a runtime call-chain have the correct type.

BTW: You did well to choose Go for dipping your toes into statically typed languages - the testing comes built in with the tooling.

That was more because of Java than static typing.

Type hints as documentation are a gateway drug to type hints for bug finding. Keep at it :)

This is a naive realization. When type checking is used to the maximum extent, it becomes just as important as unit testing. It is an actual safety contribution to the code.

Many old school python developers don't realize how important typing actually is. It's not just documentation. It can actually roughly reduce dev time by 50% and increase safety by roughly 2x.

It's claims like that which used to put me off embracing type hints!

I'd been programming for 20+ years and I genuinely couldn't think of any situations where I'd had a non-trivial bug that I could have avoided if I'd had a type checker - claims like "reduce dev time by 50%" didn't feel credible to me, so I stuck with my previous development habits.

Those habits involved a lot of work performed interactively first - using the Python terminal, Jupyter notebooks, the Firefox/Chrome developer tools console. Maybe that's why I never felt like types were saving me any time (and in fact were slowing me down).

Then I had my "they're just interactive documentation" realization and finally they started to click for me.

It depends on the project. If you're always working on one project and you have all the time in the world to learn it (or maybe you wrote it), then you can get away with dynamic types. It's still worse, but possible.

But if you aren't familiar with a project then dynamic typing makes it an order of magnitude harder to navigate and understand.

I tried to contribute some features to a couple of big projects - VSCode and Gitlab. VSCode, very easy. I could follow the flow trivially, just click stuff to go to it etc. Where abstract interfaces are used it's a little more annoying but overall wasn't hard and I have contributed a few features & fixes.

Gitlab, absolutely no chance. It's full of magically generated identifiers so even grepping doesn't work. If you find a method like `foo_bar` it's literally impossible to find where it is called without being familiar with the entire codebase (or asking someone who is) and therefore knowing that there's a text file somewhere called `foo.csv` that lists `bar` and the method name is generated from that (or whatever).

In VSCode it was literally right-click->find all references.

I have yet to succeed in modifying Gitlab at all.

I did contribute some features to gitlab-runner, but again that is written in Go so it is possible.

So in some cases those claims are not an exaggeration - static types take you from "I give up" to "not too hard".

> In VSCode it was literally right-click->find all references.

Flip side of this is that I hate trying to read code written by teams relying heavily on such features, since typically zero time was spent on neatly organizing the code and naming things to make it actually readable (from top to bottom) or grep-able. Things are randomly spread out in tiny files over countless directories, and it's a maze you stumble around just clicking identifiers to jump somewhere. Where something is rarely matters, as the IDE will find it. I never develop any kind of mental image of that style of code, and it completely rules out casually browsing the code using simpler tools.

That hasn't been my experience at all. I think maybe it feels more like a maze because when you go-to-definition you often don't actually check where you are in the filesystem, so you don't build a mental map of the repo as quickly as you do when you are forced to manually search through all the files. But I wouldn't say that is better.

Kind of like how you don't learn an area when you always use satnav as quickly as you do when you manually navigate with paper maps. But do you want to go back to paper maps? I don't.

Static type checking (which is what I assume you mean by "typing") can also be a massive pain in the ass that stands in the way of incremental development, even if the end-goal is to ship an api with clear type signatures.

There are developers who design apis by trying to figure out readable invocations. These developers discover, rather than design, type hierarchies and library interfaces.

> Many old school python developers don't realize how important typing actually is.

I don't think this is true. There's simply a communication breakdown where type-first developers don't see the benefits of disabling static checking to design interfaces, and interface-first developers don't see why they should put static checking ahead of interface iteration speed.

> Static type checking (which is what I assume you mean by "typing") can also be a massive pain in the ass that stands in the way of incremental development,

No, they don't. There is nothing about types that would make incremental development harder. They keep having the same benefits when you work incrementally.

> There is nothing about types that would make incremental development harder.

Oh, please, this is either lack of imagination or lack of effort to think. You've never wanted to test a subset of a library halfway through a refactor?

Yes, type checkers are very good at tracking refactoring progress. If it turns out that you can proceed to test some subset, then congratulations, you found a new submodule.

I am continually astounded by the stubborn incuriosity of humans with a bone to pick.

What in the world are you talking about? Please specify how the lack of types helped you in your aforementioned scenario.

I don't think it's a lack of curiosity from others. It's more like a fundamental lack of knowledge from you. Let's hear it. What is it you're actually talking about? Testing a subset of a library halfway through a refactor? How does a lack of types help with that?

> There are developers who design apis by trying to figure out readable invocations. These developers discover, rather than design, type hierarchies and library interfaces.

My hunch is that the people who see no downsides whatsoever in static typing are those who mostly just consume APIs.

There are downsides. But the upsides outweigh the downsides.

I'm not just a consumer of APIs. I've done game programming, robotics, and embedded systems development (with C++ and Rust); frontend web development (with and without React, with jQuery, Angular, TypeScript, plain JS, Zod); and backend web development (with Golang, Haskell, Node.js TypeScript, and lots and lots of Python across many of the most popular frameworks: Flask + SQLAlchemy, Django, FastAPI + Pydantic).

I've done a lot. I can tell you: if you don't see how types outweigh their absence, you're a programmer whose experience is heavily weighted toward untyped programming. You don't have balanced experience to make a good judgement. Usually these people have a "data scientist" background: data analyst or data scientist or machine learning engineer... etc. These guys start programming heavily in the Python world WITHOUT types, and they develop unbalanced opinions shaped by their initial style of programming. If this describes you, then stop and think... I'm probably right.

You are wrong. I learned programming mostly in C++ in the late 90's, and programmed in C, C++ and Java in professional settings for a decade or so, and still do from time to time.

Hm, if you want to and have time, can you give me a specific example of where no types are clearly superior to types? Maybe you can convince me, but I still think your opinion is wrong despite your relevant experience.

>There are developers who design apis by trying to figure out readable invocations. These developers discover, rather than design, type hierarchies and library interfaces.

No, you're one of the old school python developers. Types don't hinder creativity, they augment it. The downside is the slight annoyance of updating both a type definition and the runtime definition vs. just updating the runtime definition.

Let me give you an example of how the lack of types hinders creativity.

Let's say you have an interface that is immensely complex: many nested structures, thousands of keys. And let's say you want to change the design by shifting 3 or 4 things around. Let's also say this interface is utilized by hundreds of other methods and functions.

When you move 3 or 4 things around in a complex interface you're going to break a subset of those hundreds of other methods or functions. You're not going to know where they break if you don't have type checking enabled. You're only going to know if you tediously check every single method/function OR if it crashes during runtime.

With a statically typed definition you can make that change, and the type checker will identify EVERY single place where the methods that use that type need to change as well. This allows you to be creative and make any willy-nilly change you want, because you are confident that ANY breakage will be caught by the type checker. This speeds up creativity; without it, you will be slowed down, and even afraid to make the breaking change.
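A toy version of what that looks like in practice (hypothetical names):

    from dataclasses import dataclass

    @dataclass
    class Order:
        total: float    # suppose this field was just renamed from `amount`

    def invoice(order: Order) -> str:
        # The checker flags this stale call site the moment the field is
        # renamed; no runtime crash is needed to find it.
        return f"due: {order.amount}"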

You are basically the stereotype I described. An old school python developer. Likely one who got used to programming without types and now hasn't utilized types extensively enough to see the benefit.

>I don't think this is true. There's simply a communication breakdown where type-first developers don't see the benefits of disabling static checking to design interfaces, and interface-first developers don't see why they should put static checking ahead of interface iteration speed.

This is true. You're it. You just don't know it. When I say these developers don't know I'm literally saying they think like you and believe the same things you believe BECAUSE they lack knowledge and have bad habits.

The habit thing is what causes the warped knowledge. You're actually slowed down by types because you're not used to them: you spent years coding in python without types, so it's ingrained for you to test and think without types. Adding types becomes a bit of an initial overhead for these types of people because their programming style is so entrenched.

Once you get used to it, and once you see that it's really just a slight additional effort, then you will get it. But it takes a bit of discipline and practice to get there.

> It can actually roughly reduce dev time by 50% and increase safety by roughly 2x.

Type annotations don’t double productivity. What does “increase safety by 2×” even mean? What metric are you tracking there?

In my experience, the main non-documentation benefit of type annotations is warning where the code is assuming a value where None might be present. Mixing up any other kind of types is an extremely rare scenario, but NoneType gets everywhere if you let it.
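For example (a sketch with a hypothetical lookup):

    def find_name(user_id: int) -> str | None:
        return None  # stand-in for a lookup that can miss

    def greet(user_id: int) -> str:
        name = find_name(user_id)
        # The checker warns here: name may be None, so .title() can blow up
        return "Hello, " + name.title()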

> Type annotations don’t double productivity.

Obviously this post is still firmly in made-up statistics land, but I agree with OP: in some cases they absolutely do.

New code written by yourself? No, probably not. But refactoring a hairy old enterprise codebase? Absolutely a 2×, 3× multiplier to productivity / time-to-correctness there.

>Type annotations don’t double productivity. What does “increase safety by 2×” even mean? What metric are you tracking there?

My own anecdotal metric. Isn't that obvious? The initial post was an anecdotal opinion as well. I don't see a problem here.

>In my experience, the main non-documentation benefit of type annotations is warning where the code is assuming a value where None might be present. Mixing up any other kind of types is an extremely rare scenario, but NoneType gets everywhere if you let it.

It's not just None. Imagine some highly complex object with nested values and you have some function like this:

   def modify_direction(direction_object) -> ...
wtf is direction_object? Is it in Cartesian or is it in polar? Is it in 2D or 3D? Most old school python devs literally have to find where modify_direction is called and they find this:

   def modify_data(data) -> ...
       ...
       modify_direction(data.quat)
Ok, then you have to find where modify_data is called, and so on and so forth until you get to here:

   def combine_data(quat) -> ...

   def create_quat() -> quat

And then boom you figure out what it does by actually reading all the complex quaternion math create_quat does.

Absolutely insane. If I have a type, I can just look at the type to figure everything out... you can see how much faster it is.
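i.e. with a hypothetical Quaternion type, the signature alone answers the question:

    from dataclasses import dataclass

    @dataclass
    class Quaternion:
        w: float
        x: float
        y: float
        z: float

    def modify_direction(direction: Quaternion) -> Quaternion:
        ...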

Oh and get this. Let's say there's someone who feels euler angles are better. So he changes create_quat to create_euler. He modifies all the places create_quat is used (which is about 40 places) and he misses 3 or 4 places where it's called.

He then ships it to production. Boom. The extra time debugging production when it crashes, and also the extra time tediously finding where create_quat was used... all of that could have been saved by a type checker.

I'm a big python guy. But I'm also big into haskell. So I know both the typing worlds and the untyped worlds really well. Most people who complain like you literally have mostly come from a python background where typing isn't used much. Maybe you used types occasionally but not in a big way.

If you used both untyped languages and typed languages extensively you will know that types are intrinsically better. It's not even a contest. Anyone who still debates this stuff just lacks experience.

> If you used both untyped languages and typed languages extensively you will know that types are intrinsically better. It's not even a contest. Anyone who still debates this stuff just lacks experience.

Or have enough experience to have lived through e.g. the J2EE and C++ template hells and to see where this is going.

Typing can get extreme, to the point where it becomes proof-based typing. So I know what you mean here. I've lived through it and done it.

In general, types outweigh no types EVEN with the above.

> My own anecdotal metric. Isn't that obvious? The initial post was an anecdotal opinion as well. I don't see a problem here.

WTF is “an anecdotal metric”‽ That just sounds like an evasive way to say “I want to make up numbers I can’t justify”.

> wtf is direction_object? Is it in Cartesian or is it in polar? Is it in 2D or 3D?

This seems very domain-specific.

> Most people who complain like you literally have mostly come from a python background where typing isn't used much. Maybe you used types occasionally but not in a big way.

> If you used both untyped languages and typed languages extensively you will know that types are intrinsically better. It's not even a contest. Anyone who still debates this stuff just lacks experience.

I’ve got many years of experience with static typed languages over a 25 year career. Just because somebody disagrees with you, it doesn’t mean they are a clueless junior.

> WTF is “an anecdotal metric”

It's a metric (how much more productive he is), and anecdotal (based only on his experience). Pretty obvious, I would have thought.

> This seems very domain-specific.

It was an example from one domain but all domains have types of things. Are you really trying to say that only 3D games specifically would benefit from static types?

> Just because somebody disagrees with you, it doesn’t mean they are a clueless junior.

Clueless senior then I guess? Honestly I don't know how you can have this much experience and still not come to the obvious conclusion. Perhaps you only write small scripts or solo projects where it's more feasible to get away without static types?

What would you say to someone who said "I have 25 years of experience reading books with punctuation and I think that punctuation is a waste of time. Just because you disagree with me doesn't mean I'm clueless."?

>WTF is “an anecdotal metric”‽ That just sounds like an evasive way to say “I want to make up numbers I can’t justify”.

What, I have to have scientific papers for every fucking opinion I have? The initial parent post was an anecdotal opinion. Your post is an opinion. I can't have opinions here without citing a scientific paper that's 20 pages long and no one is going to read but just blindly trust because it's "science"? Come on. What I'm saying is self-evident to people who know. There are thousands of things like this in the world where people just know, even though statistical proof hasn't been measured or established. For example, eating horse shit every day probably isn't healthy even though there isn't SCIENCE that directly proves this action is unhealthy. Type checking is just one of those things.

OBVIOUSLY I think development is overall much better, much faster and much safer with types. I can't prove it with metrics, but I'm confident my "anecdotal" metrics, which I prefaced with "roughly", are "roughly" ballpark trueish.

>This seems very domain-specific.

Domain specific? Basic orientation with quaternions and euler angles is specific to reality. Orientation and rotations exist in reality, and there are thousands and thousands of domains that use them.

Also the example itself is generic. Replace euler angles and quats with vectors and polar coordinates. Or cats and dogs. Same shit.

>I’ve got many years of experience with static typed languages over a 25 year career. Just because somebody disagrees with you, it doesn’t mean they are a clueless junior.

The number of years of experience is irrelevant. I know tons of developers with only 5 years of experience who are better than me, and tons of developers with 25+ who are horrible.

I got 25 years as well. If someone disagrees with me (on this specific topic), it absolutely doesn't mean they are a junior. It means they lack knowledge and experience. This is a fact. It's not an insult. It just means that for this specific thing they don't have experience or knowledge, which is typical. I'm sure there are tons of things where you have more experience. Just not this topic.

If you have experience with static languages, it likely isn't that extensive. You're likely more of an old school python guy who spent a ton of time programming without types.

> What, I have to have scientific papers for every fucking opinion I have?

No, but if you’re going to say things like “increase safety by roughly 2x” and you can’t even identify the unit, then you are misleading people.

It’s absolutely fine to have an opinion. It’s not fine to make numbers up.

> I'm confident my "anecdotal" metrics, which I prefaced with "roughly", are "roughly" ballpark trueish.

Okay, so if it’s 1.5×, 2.0×, or 2.5×… again, what metric? What unit are we dealing with?

You’re claiming that it’s “in the ballpark”, but what is “in the ballpark”? The problem is not one of accuracy, the problem is that it’s made up.

> If someone disagrees with me (on this specific topic), it absolutely doesn't mean they are a junior. It means they lack knowledge and experience. This is a fact.

It’s not a fact, it’s ridiculous. You genuinely believe that if somebody disagrees with you, it’s a fact that they lack knowledge and experience? It’s not even remotely possible for somebody to have an informed difference of opinion with you?

>No, but if you’re going to say things like “increase safety by roughly 2x” and you can’t even identify the unit, then you are misleading people.

So when I talk about multipliers I have to have a unit? What is the unit of safety? I can't say something like 2x more safe? I just have to say more safe? What if I want to emphasize that it can DOUBLE safety?

Basically with your insane logic people can't talk about productivity or safety or multipliers at the same time because none of these concepts have units.

Look, I told YOU it's anecdotal, and EVERYONE can read it. You're no longer "deceived", and neither is anyone else.

>Okay, so if it’s 1.5×, 2.0×, or 2.5×… again, what metric? What unit are we dealing with?

If you don't have the capacity to understand what I'm talking about without me specifying a unit, then I'll make one up:

I call it safety units: the number of errors you catch in production. That's my unit: 1 caught error in prod in a year. For untyped languages, let's say you catch about 20 errors a year. With types, that goes down to 10.

>It’s not a fact, it’s ridiculous. You genuinely believe that if somebody disagrees with you, it’s a fact that they lack knowledge and experience? It’s not even remotely possible for somebody to have an informed difference of opinion with you?

What? And you think all opinions are equal, and everyone has the freedom to have any opinion they want, and no one can be right or wrong because everything is just an opinion? Do all opinions need to be fully respected even when they're insane?

Like my example: if you have the opinion that eating horse shit is healthy, I'm going to make a judgement call that your opinion is WRONG. Lack of typing is one of these "opinions".

Take a step back and look at what you are saying:

> If someone disagrees with me (on this specific topic), it absolutely doesn't mean they are a junior. It means they lack knowledge and experience. This is a fact.

You think it’s impossible for anybody to have an informed opinion that disagrees with yours. You literally think yours is the only possible valid opinion. If that doesn’t set off big warning bells in your head, you are in dire need of a change in attitude.

This conversation is not productive, let’s end it.

>You think it’s impossible for anybody to have an informed opinion that disagrees with yours. You literally think yours is the only possible valid opinion. If that doesn’t set off big warning bells in your head, you are in dire need of a change in attitude.

I mean, do you think we should have a fair and balanced discussion about the merits of child molestation and rape? We should respect other people's opinions and not tell them they are wrong if their opinion differs? That's what I think of your opinion. I think your opinion is utterly wrong, and I do think my opinion is the valid one.

Now that doesn't mean I disrespect your opinion. That doesn't mean you're not allowed to have a different opinion. It just means I tell you straight up: you're wrong and you lack experience. You're free to disagree with that and tell me the exact same thing. I'm just blunt, and I welcome you to be just as blunt with me. Which you have been.

The thing I don't like about you is that you turned this into a discussion about opinions and the nature of holding opinions. Dude, just talk about the topic. If you think I'm wrong, tell me straight up. Talk about why I'm wrong. Don't talk about my character, and in what manner I should formulate opinions, and what I think are facts.

>This conversation is not productive, let’s end it.

I agree, let's end it. But let's be utterly clear: YOU chose to end it with your actions, by shifting the conversation into saying stuff like "you literally think yours is the only possible opinion." Bro, all you need to do is state why you think my opinion is garbage and prove it wrong. That's the direction of the conversation; you ended it by shifting it to a debate about my character.
