That is a prescriptivist way of thinking about language, which is useful if you enjoy feeling righteous about correctness, but not so helpful for understanding how communication actually works. In reality-reality, "kilobyte" may mean either "1000 bytes" or "1024 bytes", depending on who is saying it, whom they are saying it to, and what they are saying it about.

You are free to intend only one meaning in your own communication, but you may sometimes find yourself being misunderstood: that, too, is reality.

It's not even really prescriptivist thinking… "Kilobyte" meaning both 1,000 B and 1,024 B is well-established usage, heavily dependent on context (with the 1,000 B reading mostly coming from HDD manufacturers who want to inflate their drive sizes, and … the abomination that is the 1.44 MB diskette…). But a word can be dependent on context, even in prescriptivist settings.
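For the curious, the diskette's "megabyte" is neither the decimal nor the binary unit but a mix of the two. A quick Python sketch of the arithmetic (the 2,880-sector figure is the standard 3.5″ HD diskette layout):

```python
# The "1.44 MB" diskette uses a mixed unit: 1 "MB" = 1000 * 1024 bytes.
floppy_bytes = 2880 * 512                    # 2,880 sectors of 512 B = 1,474,560 bytes
mixed_mb     = floppy_bytes / (1000 * 1024)  # 1.44 "MB" in the mixed unit
si_mb        = floppy_bytes / 1_000_000      # 1.47456 MB by the SI definition
mib          = floppy_bytes / (1024 * 1024)  # 1.40625 MiB by the binary definition

print(mixed_mb, si_mb, mib)  # 1.44 1.47456 1.40625
```

So "1.44 MB" is accurate under neither reading of "mega", which is what earns it the label "abomination" above.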

E.g., M-W lists both, with the 1,024 B definition even listed first. Wiktionary also lists the 1,024 B definition, though it is tagged as "informal".

As a prescriptivist myself, I would love it if the world could standardize on kilo = 1000 and kibi = 1024, but that'll likely take some time … plus the introduction of the word to the wider public, who I do not think are generally aware of the binary prefixes, and some large companies deciding to use the term, which they likely won't do, since companies seem content to accept low-grade perpetual confusion over some short-term confusion during the switch.

Does anyone, other than HDD manufacturers who want to inflate their drive sizes, actually want a 1000-based kilobyte? What would such a unit be useful for? I suspect that a world which standardized on kibi = 1024 would be a world which abandoned the word "kilobyte" altogether.

> with the context mostly being HDD manufacturers who want to inflate their drive sizes

This is a myth. The first IBM hard drive, in 1956, stored 5,000,000 characters, before bytes were even in common usage. Drives have always been base 10; it's not a conspiracy.

Drives are base 10, line speeds are base 10, clock rates are base 10; pretty much everything but RAM is base 10. Base 2 is the exception, not the rule.

I understand the usual meaning, but I use the correct meaning when precision is required.

How can there be both a "usual meaning" and a "correct meaning" when you assert that there is only one meaning and that "There's no possible discussion over this fact"?

You can say that one meaning is more correct than the other, but that doesn't make the other meaning vanish from existence.

When precision is required, you either use kibibytes or define your kilobytes explicitly. Otherwise there is a real risk that the other party does not share your understanding of what a kilobyte should mean in that context. Then the numbers you use have at most one significant figure.
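To illustrate how far apart the two readings drift, here is a minimal Python sketch (the helper names are mine, just for illustration) that formats the same byte count under each convention:

```python
KILO = 1000  # SI kilobyte (kB)
KIBI = 1024  # IEC kibibyte (KiB)

def as_kilobytes(n_bytes: int) -> str:
    """Format a byte count using the SI (base-10) kilobyte."""
    return f"{n_bytes / KILO:.1f} kB"

def as_kibibytes(n_bytes: int) -> str:
    """Format a byte count using the IEC (base-2) kibibyte."""
    return f"{n_bytes / KIBI:.1f} KiB"

n = 65536
print(as_kilobytes(n))  # 65.5 kB
print(as_kibibytes(n))  # 64.0 KiB

# The gap widens with each prefix step: ~2.4% at kilo vs kibi,
# but roughly 10% by the time you reach tera vs tebi.
gap = 1024**4 / 1000**4  # ≈ 1.0995
```

A ~2.4% disagreement at the kilo scale may be tolerable, but at larger prefixes it easily exceeds the one-significant-figure risk described above, which is why spelling out the convention matters.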

The correct meaning has always been 1024 bytes where I’m from. Then I worked with more people like you.

Now, it depends.