What in the Hacker News in this comment?
Mathematical notation evolved to its modern state over centuries. It's optimized heavily for its purpose. Version numbers? You're being facetious, right?
>evolved
Yes, it evolved. It wasn't designed.
>Version numbers?
Without version numbers, it has to be backwards-compatible, making it difficult to remove cruft. What would programming be like if all the code you wrote needed to work as IBM mainframe assembly?
Tau is a good case study. Everyone seems to agree tau is better than pi. How much adoption has it seen? Is this what "heavy optimization" looks like?
It took hundreds of years for Arabic numerals to replace Roman numerals in Europe. A medieval mathematician could have truthfully said: "We've been using Roman numerals for hundreds of years; they work fine." That would've been Stockholm syndrome. I get the same sense from your comment. Take a deep breath and watch this video: https://www.youtube.com/watch?v=KgzQuE1pR1w
>You're being facetious, right?
I'm being provocative. Not facetious. "Strong opinions, weakly held."
> Without version numbers, it has to be backwards-compatible
If there’s one thing that mathematical notation is NOT, it’s backwards compatible. Fields happily reuse symbols from other fields with slightly or even completely different meanings.
https://en.wikipedia.org/wiki/Glossary_of_mathematical_symbo... has lots of examples. For instance:
÷ (division sign)
Widely used for denoting division in Anglophone countries, it is no longer in common use in mathematics and its use is "not recommended". In some countries, it can indicate subtraction.
~ (tilde)
1. Between two numbers, either it is used instead of ≈ to mean "approximately equal", or it means "has the same order of magnitude as".
2. Denotes the asymptotic equivalence of two functions or sequences.
3. Often used for denoting other types of similarity, for example, matrix similarity or similarity of geometric shapes.
4. Standard notation for an equivalence relation.
5. In probability and statistics, may specify the probability distribution of a random variable. For example, X∼N(0,1) means that the distribution of the random variable X is standard normal.
6. Notation for proportionality. See also ∝ for a less ambiguous symbol.
Even individual mathematicians are known to have broken backwards compatibility. https://en.wikipedia.org/wiki/History_of_mathematical_notati...
*Euler used i to represent the square root of negative one (√-1), although he earlier used it as an infinite number*
Even simple definitions have changed over time, for example:
- how numbers are written
- is zero a number?
- is one a number?
- is one a prime number?
> Fields happily reuse symbols from other fields with slightly or even completely different meanings.
Symbol reuse doesn't imply a break in backwards compatibility. As you suggest with "other fields", context allows determining how the symbols are used. It is quite common in all types of languages to reuse symbols for different purposes, relying on context to identify what purpose is in force.
Backwards incompatibility means that something from the past can no longer be used with modern methods. Mathematical notation from long ago doesn't much look like what we're familiar with today, but we can still make use of it. It wasn't rendered inoperable by modern notation.
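Programming languages work the same way: one symbol carries several meanings, and context decides which one is in force. A small Python sketch (my example, not from the thread):

```python
# The same symbol '*', three different meanings, disambiguated by context:
print(2 * 3)         # multiplication of numbers -> 6
print("ab" * 3)      # sequence repetition -> 'ababab'
print([*range(3)])   # iterable unpacking -> [0, 1, 2]
```

None of these uses breaks the others; the reader (and the interpreter) resolves the symbol from its surroundings, just as a mathematician resolves ~ differently in statistics than in geometry.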
> Mathematical notation from long ago doesn't much look like what we're familiar with today, but we can still make use of it.
But few modern mathematicians can understand it. Given enough data, they can figure out what it means, but that’s similar to (in this somewhat weak analogy) running code in an emulator.
What we can readily make use of are mathematical results from long ago.
> Given enough data, they can figure out what it means
Right, whereas something that isn't backwards compatible couldn't be figured out no matter how much data is given. Consider this line of Python:
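A line that fits, assuming (my guess) the intended example was the division operator, whose meaning changed between Python 2 and Python 3:

```python
# Under Python 2 this prints 0 (floor division of two ints);
# under Python 3 it prints 0.5 (true division).
print(1 / 2)
```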
There is no way you can know what the output should be. That is, unless we introduce synthetic context (i.e. a version number). Absent synthetic context, we can reasonably assume that natural context is sufficient, and where natural context is sufficient, backwards compatibility is present.
> What we can readily make use of are mathematical results from long ago.
To some degree, but mostly we've translated the old notation into modern notation for the sake of familiarity. And certainly a lot of programming that gets done is exactly that: Rewriting the exact same functionality in something more familiar.
But like mathematics, while there may have been a lot of churn in the early days, when nothing existed before and everyone was still trying to figure out what works, programming notation has largely settled on what is familiar with reasonable stability, and no doubt it will only grow more stable as it matures.
Mathematical notation isn't at all backwards compatible, and it certainly isn't consistent. It doesn't have to be, because the execution environment is the abstract machine of your mind, not some rigidly defined ISA or programming language.
> Everyone seems to agree tau is better than pi. How much adoption has it seen?
> It took hundreds of years for Arabic numerals to replace Roman numerals in Europe.
What on earth does this have to do with version numbers for math? I appreciate this is Hacker News and we're all just pissing into the wind, but this is extra nonsensical to me.
The reason math is slow to change has nothing to do with backwards compatibility. We don't need to institute Math 2.0 to change mathematical notation. If you want to use tau right now, the only barrier is other people's understanding. I personally like to use it, and if I anticipate its use will be confusing to a reader, I just write `tau = 2pi` at the top of the paper. Still, others have their preference, so I'm forced to understand papers (i.e. the vast majority) which still use pi.
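Incidentally, the same "just define it at the top" move works in code, and here adoption has actually happened: Python's standard library has shipped `math.tau` since 3.6.

```python
import math

# The one-line definition, mirroring writing "tau = 2pi" at the top of a paper.
tau = 2 * math.pi

# The stdlib constant agrees exactly (doubling a float is exact).
assert tau == math.tau
print(tau)  # 6.283185307179586
```

No "Math 2.0" was required; a constant and a willingness to use it sufficed.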
Which points to the real reason math is slow to change: people are slow to change. If things seem to be working one way, we all have to be convinced to do something different, and that takes time. It also requires there to actually be a better way.
> Is this what "heavy optimization" looks like?
I look forward to your heavily-optimized Math 2.0 which will replace existing mathematical notation and prove me utterly wrong.