One thing that annoys me is:
Why doesn’t kilobyte continue to mean 1024, with a new word like kilodebyte introduced to mean 1000? Byte, to me, implies a binary number system, and if you want to introduce new nomenclature to reduce confusion, give the new meaning a new name and let the older or more prevalent one keep the old name in its domain…
Because kilo- already has a meaning. And both usages of kilobyte were (and are) in use. If we are going to fix the problem, we might as well fix it right.
Sure, outside of computing, in other sciences, it has that meaning, but in binary computing, prefix + byte traditionally implied binary quantities.
Many things acquire domain-specific, nuanced meanings…
Even in computing, the binary definition is only used for memory sizes. E.g., storage capacities, network speeds, and clock rates all use the standard decimal definition.
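For concreteness, here's a minimal Python sketch (my illustration, not anything from the thread) of how the two definitions diverge in practice: the same drive sold as "500 GB" under decimal prefixes reads as roughly 465 GiB under binary ones.

```python
# Decimal (SI) prefixes: how storage vendors and networking count.
def as_decimal_gb(n_bytes: int) -> str:
    return f"{n_bytes / 10**9:.2f} GB"   # 1 GB = 10**9 bytes

# Binary (IEC) prefixes: how operating systems often report memory.
def as_binary_gib(n_bytes: int) -> str:
    return f"{n_bytes / 2**30:.2f} GiB"  # 1 GiB = 2**30 bytes

disk = 500 * 10**9                # a "500 GB" drive as advertised
print(as_decimal_gb(disk))        # 500.00 GB
print(as_binary_gib(disk))        # 465.66 GiB -- the familiar "missing" space
```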
And yet in computing, a 1 kHz clock is still 1,000 cycles per second, and 1 MFLOPS is still 1,000,000 floating-point operations per second.
The comment you replied to explained that:
"in binary computing traditionally prefix + byte implied binary number quantities."
There are no bytes involved in Hz or FLOPs.
> Why don’t kilobyte continue to mean 1024
Because it never did!
Which universe do you hail from? Because nobody except pedants has relented to this demand from non-computer scientists to conform to a standardization that has nothing to do with them or the work they do.