The Reddit post falls under the case of "don't know" the type. If you want to allow users to pass in any objects, attempt the addition, and fail at runtime if it doesn't work... that's exactly what `Any` is for.
But the entire post is built upon the premise that accepting all types is good API design. Which it isn't, at all.
> The Reddit post falls under the case of "don't know" the type.
No, it doesn't. The desired type is known; it's "Addable" (i.e., "doesn't throw an exception when the built-in add operator is used"). The problem is expressing that in Python's type notation in a way that catches all edge cases.
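To make that concrete, here's roughly the Protocol shape being reached for. This is a sketch only (the `Addable` spelling is mine, not the post's), and it deliberately ignores `__radd__`:

```python
from typing import Protocol, TypeVar

A = TypeVar("A", contravariant=True)
R = TypeVar("R", covariant=True)

class Addable(Protocol[A, R]):
    """Anything whose __add__ accepts an A and returns an R."""
    def __add__(self, other: A) -> R: ...

T = TypeVar("T")
U = TypeVar("U")

def slow_add(a: Addable[T, U], b: T) -> U:
    return a + b
```

This handles the easy cases, but a checker will still reject calls that only succeed via `__radd__` on the right operand (e.g. `1 + Fraction(1, 2)` works at runtime, but `int.__add__` doesn't accept a `Fraction`), which is exactly the edge-case territory the post gets lost in.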
> If you want to allow users to pass in any objects, try to add and fail at runtime
Which is not what the post author wants to do. They want to find a way to use Python's type notation to catch those errors with the type checker, so they don't happen at runtime.
> the entire post is built upon the premise that accepting all types is good API design
It is based on no such thing. I don't know where you're getting that from.
> The desired type is known; it's "Addable" (i.e., "doesn't throw an exception when the built-in add operator is used").
The mistake both you and the Reddit post's author make is treating the `+` operator the same as you would an interface method. Despite Python having `__add__`/`__radd__` methods, `+` does not behave like an ordinary interface method, and the same is true in many other programming languages. For example, Go doesn't have a way to express "can use the + operator" at all, and "can use comparison operators" is defined as an explicit union of built-in types.[0] In C# you could only do this as of .NET 7, which was released in Nov 2022[1] -- was the C# type system unusable for the 17 years prior, when it didn't support this scenario?
If this were any operation on `a` and `b` other than a built-in operator, such as `a.foo(b)`, it would be trivial to define a Protocol (which the author does in Step 4) and have everything work as expected; see the sketch below. It's only because of a misunderstanding of basic Python that the author continues to struggle for another 1000 words before concluding that type checking is bad. It's an extremely cherry-picked and unrealistic scenario, either from someone who is clueless or from someone who knows what they're doing and is intentionally being malicious in order to engagement bait.[2]
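For contrast, the method-call version is barely more than a one-liner (`Fooable` and `apply` are made-up names):

```python
from typing import Protocol

class Fooable(Protocol):
    def foo(self, other: object) -> object: ...

def apply(a: Fooable, b: object) -> object:
    # an ordinary method call: any class defining foo() matches structurally
    return a.foo(b)
```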
This isn't to say Python (or Go, or C#) has the best type system, and it certainly falls short of Rust's, which is a valid complaint. But "I can't express 'type which supports the + operator'" is such an insanely esoteric and unusual case, unsupported in many languages, that it's disingenuous to use it as an excuse for why people shouldn't bother with type hinting at all.
[0] https://pkg.go.dev/cmp#Ordered
[1] https://learn.microsoft.com/en-us/dotnet/standard/generics/m...
[2] actually reading through the reddit comments, the author specifically says they were engagement baiting so... I guess they had enough Python knowledge to trick people into thinking type hinting was bad, fair enough!
> treating the `+` operator the same as you would an interface method
In other words, you agree that the Python type hint system does not give you a good, built-in way to express the "Addable" type.
Which means you are contradicting your claims that the type the article wants to express is "unknown" and that the article is advocating using "Any" for this case. The type is not unknown--it's exactly what I said: "doesn't throw an exception when using the + operator". That type is just not expressible in Python's type hint system in the way that would be needed. And "Any" doesn't address this problem, because the article is not saying that every pair of objects should be addable.
> "I can't express 'type which supports the '+' operator'" is an insanely esoteric and unusual case
I don't see why. Addition is a very commonly used operation, and being able to have a type system that can express "this function takes two arguments that can be added using the addition operator" seems like something any type system that delivers the goods it claims to deliver ought to have.
> unsupported in many languages
Yes, which means many languages have type systems that claim to deliver things they can't actually deliver. They can mostly deliver them, but "mostly" isn't what advocates of using type systems in all programs claim. So I think the article is making a useful point about the limitations of type systems.
> it's disingenuous to use it as an excuse for why people shouldn't bother with type hinting at all.
The article never says that either. You are attacking straw men.
> I don't see why. Addition is a very commonly used operation, and being able to have a type system that can express "this function takes two arguments that can be added using the addition operator" seems like something any type system that delivers the goods it claims to deliver ought to have.
If your comparison is Rust, sure, but you can't even express this in Java. No, Java's type system is not great, but it's a type system that's been used for approximately 500 trillion lines of production code powering critical systems and nobody has ever said "Java sucks because I can't express 'supports the + operator' as a generic type". (It sucks for many other reasons.)
Again, it is factually and objectively an esoteric and unusual case. Nobody in the real world is writing generics like this, only academics or people writing programming blogs about esoterica.
If your argument is that all type systems are bad or deficient, fine, but calling out Python for this when it has the exact same deficiency as basically every other mainstream language is asinine.
> The article never says that either. You are attacking straw men.
The article says "Turning even the simplest function that relied on Duck Typing into a Type Hinted function that is useful can be painfully difficult." The subterfuge is that this is not even remotely close to a simple function because the type being expressed, "supports the + operator", is not even remotely close to a simple type.
> it is factually and objectively an esoteric and unusual case.
Sorry, but your unsupported opinion is not "factual and objective".
> If your argument is that all type systems are bad or deficient
I said no such thing, any more than the article did. Again you are attacking a straw man. (If you had said "limited in what they can express", I might buy that. But you didn't.)
I think I've said all I have to say in this subthread.
It's factual and objective that billions, if not trillions, of lines of Java and Go have been deployed while those languages still cannot express "supports the + operator" as a type constraint. In production, non-academic settings, people don't generally write code like that.
Again, this is an esoteric limitation from the perspective of writing code that runs working software, not a programming language theory perspective.
How many of those lines of code would have benefited from being able to express that type constraint, if the language made it possible?
You have no idea, and nor does anyone else. But that's what you would need "factual and objective" evidence about to support the claim you made.
By your argument, anything that programming languages don't currently support, must be an "esoteric limitation" because billions if not trillions of lines of code have been written without it. Which would mean programming languages would never add new features at all. But it's certainly "factual and objective" that programming languages add new features all the time. Maybe this is another feature that at some point a language will add, and programmers will find it useful. You don't even seem to be considering such a possibility.
> But the entire post is built upon the premise that accepting all types is good API design. Which it isn't, at all.
Was Tim Peters also wrong way back in the day when he counseled Guido van Rossum to allow floats to be added to integers without a cast, like other popular languages?
How is `float | int` anywhere close to equivalent to `Any`?
How is "responds to the `__add__` method" anywhere close to equivalent to `Any`?
If your implication is that "implementing `__add__` means you can use the `+` operator", you are incorrect. This is a common Python beginner mistake, but it isn't really a Python type checking issue; it's complexity in how Python's built-ins interact with magic methods.
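A quick illustration with a made-up class:

```python
class Meters:
    def __init__(self, value: float):
        self.value = value

    def __add__(self, other):
        if isinstance(other, Meters):
            return Meters(self.value + other.value)
        return NotImplemented  # tells Python to try the other operand's __radd__

Meters(1) + Meters(2)  # fine
Meters(1) + 5          # TypeError, even though Meters implements __add__
5 + Meters(1)          # TypeError: int.__add__ declines and Meters has no __radd__
```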
My suggestion -- don't rely on magic methods.
This is a strange and aggressive bit of pedantry. Yes, you'd also need `__radd__` for classes that participate in heterogeneous-type addition, but it's clear what was meant in context. The fundamentals are not all "beginner" level, and beginners wouldn't be implementing operator overloads in the first place (most educators hold off on classes entirely for quite a while; they're pure syntactic sugar, after all, and the use case is often hard to explain to a beginner).
Regardless, none of that bears on the original `slow_add` example from the Reddit page. The entire point is that we have an intuition about what can be "added", but can't express it in the type system in any meaningful way. Because the rule is something like "anything that says it can be added according to the protocol — which in practical terms is probably any two roughly-numeric types except for the exceptions, and also most container types but only with other instances of the same type, and also some third-party things that represent more advanced mathematical constructs where it makes sense".
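A few concrete data points on how ad hoc that rule is:

```python
1 + 2.5          # 3.5: mixed numeric types are fine
[1, 2] + [3]     # [1, 2, 3]: lists concatenate with lists...
[1, 2] + (3,)    # TypeError: ...but not with tuples
"ab" + "cd"      # 'abcd': strings concatenate with strings
"ab" + 1         # TypeError
{1, 2} + {3}     # TypeError: sets don't support + at all
```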
And saying "don't rely on magic methods" does precisely nothing about the fact that people want the + symbol in their code to work this way. It does suggest that `slow_add` is a bad thing to have in an API (although that was already fairly obvious). But in general you do get these issues cropping up.
Dynamic typing has its place, and many people really like it, myself included. Type inference (as in the Haskell family) solves the noise problem (for those who consider the noise a problem rather than something useful) and is elegant in itself, but static typing is still not the strictly superior thing its advocates make it out to be. People still use Lisp-family languages, and for good reason.
But maybe Steve Yegge would make the point better.
> This is a strange and aggressive bit of pedantry.
There's nothing pedantic about it. That's how Python works, and getting into the nuts and bolts of how Python works is precisely why the linked article makes type hinting appear so difficult.
> The entire point is that we have an intuition about what can be "added", but can't express it in the type system in any meaningful way.
As the post explores, your intuition is also incorrect. For example, as the author discovers in the process, addition via `__add__`/`__radd__` is not addition in the algebraic-field sense. There is no guarantee that adding types T + T will yield a T. Or that both operands are of the same type at all, as would be the case with "adding" a `date` and a `timedelta`. Or that A + B == B + A. We can't rely on intuition for type systems.
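All of these are stock standard-library behavior:

```python
from datetime import date, timedelta

True + True                           # 2: bool + bool yields int, so T + T is not T
date(2024, 1, 1) + timedelta(days=1)  # date(2024, 1, 2): operands of different types
"ab" + "cd" == "cd" + "ab"            # False: + is not even commutative here
```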
> your intuition is also incorrect.
No, it definitionally isn't. The entire point is that `+` is being used to represent operations where `+` makes intuitive sense. When language designers revisit the decision to use the `+` symbol for string concatenation, how many of them are thinking about algebraic fields, seriously?
And all of this is exactly why you can't just say that it's universally bad API design to "accept all types". Because the alternative may entail rejecting types for no good reason. Again, dynamically typed languages exist for a reason and have persisted for a reason (and Python in particular has claimed the market share it has for a reason) and are not just some strictly inferior thing.
> you can't just say that it's universally bad API design to "accept all types"
Note, though, that that's not really the API design choice that's at stake here. Python will still throw an exception at runtime if you use the + operator between objects that don't support being added together. So the API design choice is between that error showing up as a runtime exception, vs. showing up as flagged by the type checker prior to runtime.
Or, to put it another way, the API design choice is whether or not to insist that your language provide explicit type definitions (or at least a way to express them) for every single interface it supports, even implicit ones like the + operator, and even given that user code can redefine such interfaces using magic methods. Python's API design choice is to not care, even with its type hinting system--i.e., to accept that there will be interface definitions that simply can't be captured using the type hinting system. I personally am fine with that choice, but it is a design choice that language users should be aware of.
> No, it definitionally isn't. The entire point is that `+` is being used to represent operations where `+` makes intuitive sense.
Huh? There's no restriction in Python's type system that says `+` has to "make sense".
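Here's a sketch of the kind of classes that make the point (the names and the URL are purely illustrative):

```python
import requests

class Smoothie:
    def __init__(self, *fruits: "Fruit"):
        self.fruits = fruits

    def __repr__(self) -> str:
        return " and ".join(f.name for f in self.fruits) + " smoothie"

class Fruit:
    def __init__(self, name: str):
        self.name = name

    def __add__(self, other):
        if isinstance(other, Fruit):
            return Smoothie(self, other)
        # nonsense, but perfectly legal Python:
        return requests.get("https://example.com")

print(Fruit("banana") + Fruit("mango"))
print(Fruit("banana") + 1)
```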
> banana and mango smoothie
> <Response [200]>
So we have Fruit + Fruit = Smoothie. Overly cute, but sensible by CS101 OOP standards and potentially code someone might encounter in the real world, and it demonstrates that not every T + T yields a T. And we have Fruit + number = requests.Response. Complete nonsense, but totally valid in Python. If you're writing a generic function `slow_add` that needs to support `a + b` for any two types -- yes, you have to support this nonsense.
I guess that's the difference between the Python and the TypeScript approach here. In general, if something is possible, valid, and idiomatic in JavaScript, then TypeScript attempts to model it in the type system. That's how you get things like conditional types and mapped types that allow the type system to validate quite complex patterns. That makes the type system more complex, but it means that it's possible to use existing JavaScript patterns and code. TypeScript is quite deliberately not a new language, but a way of describing the implicit types used in JavaScript. Tools like `any` are therefore an absolute last resort, and you want to avoid it wherever possible.
When I've used Python's type checkers, I have more the feeling that the goal is to create a new, typed subset of the language that is less capable but also easier to apply types to. Then anything that falls outside that subset gets `Any` applied to it and that's good enough. The problem I find with that is that `Any` is incredibly infective: as soon as it shows up somewhere in a program, it's very difficult to prevent it from leaking all over the place, meaning you're often back where you were before you added types, but now with the added nuisance of a bunch of types-as-documentation that you can't trust.
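A small sketch of that infectiousness (the function is hypothetical, but `json.loads` really is typed as returning `Any`):

```python
import json
from typing import Any

def parse_config(raw: str) -> Any:   # Any enters the program here
    return json.loads(raw)

config = parse_config('{"server": {"port": 8080}}')  # config: Any
port = config["server"]["port"]  # still Any, however deep you index
port.upper()                     # nonsense on an int, yet no checker complains
```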
> My suggestion -- don't rely on magic methods.
So no e.g. numpy or torch then?