You've got it exactly the wrong way around. And that with such great confidence!
There was always confusion about whether a kilobyte was 1000 or 1024 bytes. Early diskettes always used 1000; only when the 8-bit home computer era started was the 1024 convention firmly established.
Before that it made no sense to talk about kilo as 1024. Earlier computers measured space in records and words, and I guess you can see how, in 1960, no one would use kilo to mean 1024 for a 13-bit computer with 40-byte records. A kiloword was, naturally, 1000 words, so why would a kilobyte be 1024?
1024 being near-ubiquitous was only the case in the 90s or so - except for drive manufacturing and signal processing. Binary prefixes didn't invent the confusion; they were a partial solution. As you point out, while it's possible to clearly indicate binary prefixes, we have no unambiguous notation for decimal bytes.
> Early diskettes always used 1000
Even worse, the 3.5" HD floppy disk format used a confusing combination of the two. Its true capacity (when formatted as FAT12) is 1,474,560 bytes. Divide that by 1024 and you get 1440KB; divide that figure by 1000 and you get the oft-quoted (and often printed on the disk itself) "1.44MB", which is inaccurate no matter how you look at it.
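For concreteness, here's the arithmetic behind all three readings of that number (a quick sketch in Python, nothing beyond plain division):

    capacity = 1_474_560               # bytes on a FAT12-formatted 3.5" HD floppy
    print(capacity / (1024 * 1024))    # 1.40625  -> "1.41 MiB" (pure binary)
    print(capacity / (1000 * 1000))    # 1.47456  -> "1.47 MB"  (pure decimal)
    print(capacity / (1024 * 1000))    # 1.44     -> the marketed "1.44 MB" (mixed units)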
I'm not seeing evidence for a 1970s 1000-byte kilobyte. Wikipedia's floppy disk page mentions the IBM Diskette 1 at 242,944 bytes (a multiple of 256), and then 5¼-inch disks at 368,640 bytes and 1,228,800 bytes, both multiples of 1024. Those factors reflect the sector sizes. Nobody had a 1000-byte sector, I'll assert.
The wiki page agrees with parent: "The double-sided, high-density 1.44 MB (actually 1440 KiB = 1.41 MiB or 1.47 MB) disk drive, which would become the most popular, first shipped in 1986"
To make things even more confusing, the high-density floppy introduced on the Amiga 3000 stored 1760 KiB.
At least there it stored exactly 3,520 512-byte sectors, or 1,760 KB. They didn't describe them as 1.76MB floppies.
Human history is full of cases where silly mistakes became precedent. HTTP "referer" is just another example.
I wonder if there's a Wikipedia article listing these...
It's "referer" in the HTTP standard, but "referrer" when correctly spelled in English. https://en.wikipedia.org/wiki/HTTP_referer
It's way older than the 1990s! In computing, "K" has meant 1024 since at least the 1970s.
Example: in 1972, the DEC PDP-11/40 handbook [0] said on its first page: "16-bit word (two 8-bit bytes), direct addressing of 32K 16-bit words or 64K 8-bit bytes (K = 1024)". Same with Intel - in 1977 [1], they proudly said "Static 1K RAMs" on the first page.
[0] https://pdos.csail.mit.edu/6.828/2005/readings/pdp11-40.pdf
[1] https://deramp.com/downloads/mfe_archive/050-Component%20Spe...
It was exactly this - and nobody cared until the disks (the only thing that used decimal K) started getting so big that it was noticeable. With a 64K system you're talking 1,536 "extra" bytes of memory - or 1,536 bytes of memory lost when transferring to disk.
But it was once hard drives started hitting about a gigabyte that everyone started noticing and howling.
It was earlier than the 90s, and came with popular 8-bit CPUs in the 80s. The Z-80 microprocessor could address 64KB (which was 65,536 bytes) on its 16-bit address bus.
Similarly, the 4104 chip was a "4K x 1 bit" RAM chip and stored 4096 bits. You'd see this in the whole 41xx series, and beyond.
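Both figures are plain powers of two; a trivial Python check, just to spell out the arithmetic:

    assert 2 ** 16 == 64 * 1024    # 16 address lines -> 65,536 addresses, i.e. "64K"
    assert 2 ** 12 == 4 * 1024     # a "4K x 1" RAM part stores 4,096 bits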
> The Z-80 microprocessor could address 64KB (which was 65,536 bytes) on its 16-bit address bus.
I was going to say that the distinction between what it could address and what they called that capacity is an important one, but then I found this fun ad from 1976 [1].
"16K Bytes of RAM Memory, expandable to 60K Bytes", "4K Bytes of ROM/RAM Monitor software", seems pretty unambiguous that you're correct.
Interestingly, Wikipedia at least implies the IBM System/360 popularized the base-2 prefixes [2], citing their 1964 documentation, but I can't find any such use in the main core storage docs they cite [3]. Amusingly, the only use of "kb" I can find in the PDF is for data rate off magnetic tape, which is explicitly defined as "kb = thousands of bytes per second", and the only reference to "kilo-" is for "kilobaud", which would again have been base-10. If we give them the benefit of the doubt on this, presumably it was from later System/360 publications, where they would have had enough storage to need prefixes to describe it.
[1] https://commons.wikimedia.org/wiki/File:Zilog_Z-80_Microproc...
[2] https://en.wikipedia.org/wiki/Byte#Units_based_on_powers_of_...
[3] http://www.bitsavers.org/pdf/ibm/360/systemSummary/A22-6810-...
Even then it was not universal. For example, that Apple I ad that got posted a few days ago mentioned that "the system is expandable to 65K". https://upload.wikimedia.org/wikipedia/commons/4/48/Apple_1_...
Someone here the other day said that it could accept 64KB of RAM plus 1KB of ROM, for 65KB total memory.
I don't know if that's correct, but at least it'd explain the mismatch.
Seems like a typo given that the ad contains many mentions of K (8K, 32K) and they're all of the 1024 variety.
If you're using base 10, you can get "8K" and "32K" by dividing by 1000 and rounding down. The 1024/1000 distinction only becomes significant at 65,536.
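Spelled out (a quick Python sketch comparing the truncated labels):

    for size in (8192, 32768, 65536):
        print(size, size // 1000, size // 1024)   # decimal-K label vs binary-K label
    # 8192 -> 8 vs 8; 32768 -> 32 vs 32; 65536 -> 65 vs 64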
Still, the advertisement is filled with details like the number of chips, the number of pins, etc. If you're dealing with chips and pins, it's always going to be base-2.
> only when the 8-bit home computer era started was the 1024 convention firmly established.
That's the microcomputer era that has defined the vast majority of our relationship with computers.
IMO, having lived through this era, the only people pushing 1,000-byte kilobytes were storage manufacturers, because it allowed them to bump their numbers up.
https://www.latimes.com/archives/la-xpm-2007-nov-03-fi-seaga...
> 1024 being near-ubiquitous was only the case in the 90s or so
More like late 60s. In fact, in the 70s and 80s, I remember the storage vendors being excoriated for "lying" by following the SI standard.
There were two proposals to fix things in the late 60s, by Donald Morrison and Donald Knuth. Neither was accepted.
Another article suggesting we just roll over and accept the decimal versions is here:
https://cacm.acm.org/opinion/si-and-binary-prefixes-clearing...
This article helpfully explains that decimal KB has been "standard" since the very late 90s.
But when such an august personality as Donald Knuth declares the proposal DOA, I have no heartburn using binary KB.
https://www-cs-faculty.stanford.edu/~knuth/news99.html