> treating the `+` operator the same as you would an interface method
In other words, you agree that the Python type hint system does not give you a good, built-in way to express the "Addable" type.
Which means you are contradicting your claims that the type the article wants to express is "unknown" and that the article is advocating using "Any" for this case. The type is not unknown--it's exactly what I said: "doesn't throw an exception when using the + operator". That type is just not expressible in Python's type hint system in the way that would be needed. And "Any" doesn't address this problem, because the article is not saying that every pair of objects should be addable.
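For concreteness, the closest approximation I know of is a self-referential Protocol. This is my own sketch (the names `SupportsAdd` and `combine` are invented, not from the article), and even it only captures part of what `+` actually accepts:

```python
from typing import Protocol, TypeVar

T = TypeVar("T", bound="SupportsAdd")

class SupportsAdd(Protocol):
    # "Anything whose __add__ takes and returns its own type."
    def __add__(self: T, other: T) -> T: ...

def combine(a: T, b: T) -> T:
    # Covers same-type additions (int, str, list, ...), but says nothing
    # about mixed operands or __radd__, both of which plain `+` also handles.
    return a + b
```

So even this much machinery still falls short of "doesn't throw an exception when using the + operator", which is the point.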
> "I can't express 'type which supports the '+' operator'" is an insanely esoteric and unusual case
I don't see why. Addition is a very commonly used operation, and the ability to express "this function takes two arguments that can be added using the addition operator" seems like something any type system that delivers the goods it claims to deliver ought to offer.
> unsupported in many languages
Yes, which means many languages have type systems that claim to deliver things they can't actually deliver. They can mostly deliver them, but "mostly" isn't what advocates of using type systems in all programs claim. So I think the article is making a useful point about the limitations of type systems.
> it's disingenuous to use it as an excuse for why people shouldn't bother with type hinting at all.
The article never says that either. You are attacking straw men.
> I don't see why. Addition is a very commonly used operation, and the ability to express "this function takes two arguments that can be added using the addition operator" seems like something any type system that delivers the goods it claims to deliver ought to offer.
If your comparison is Rust, sure, but you can't even express this in Java. No, Java's type system is not great, but it's a type system that's been used for approximately 500 trillion lines of production code powering critical systems and nobody has ever said "Java sucks because I can't express 'supports the + operator' as a generic type". (It sucks for many other reasons.)
Again, it is factually and objectively an esoteric and unusual case. Nobody in the real world is writing generics like this, only academics or people writing programming blogs about esoterica.
If your argument is that all type systems are bad or deficient, fine, but calling out Python for this when it has the exact same deficiency as basically every other mainstream language is asinine.
> The article never says that either. You are attacking straw men.
The article says "Turning even the simplest function that relied on Duck Typing into a Type Hinted function that is useful can be painfully difficult." The subterfuge is that this is not even remotely close to a simple function because the type being expressed, "supports the + operator", is not even remotely close to a simple type.
> it is factually and objectively an esoteric and unusual case.
Sorry, but your unsupported opinion is not "factual and objective".
> If your argument is that all type systems are bad or deficient
I said no such thing, any more than the article did. Again you are attacking a straw man. (If you had said "limited in what they can express", I might buy that. But you didn't.)
I think I've said all I have to say in this subthread.
It's factual and objective that billions, if not trillions, of lines of Java and Go have been deployed and those languages still cannot express "supports the + operator" as a type constraint. In production, non-academic settings, people don't generally write code like that.
Again, this is an esoteric limitation from the perspective of writing code that runs working software, not a programming language theory perspective.
How many of those lines of code would have benefited from being able to express that type constraint, if the language made it possible?
You have no idea, and neither does anyone else. But that's what you would need "factual and objective" evidence about to support the claim you made.
By your argument, anything that programming languages don't currently support must be an "esoteric limitation", because billions, if not trillions, of lines of code have been written without it. Which would mean programming languages would never add new features at all. But it's certainly "factual and objective" that programming languages add new features all the time. Maybe this is another feature that at some point a language will add, and programmers will find it useful. You don't even seem to be considering such a possibility.