> 1 kilobyte is precisely 1000 bytes
Agreed. For the naysayers out there, consider these problems (worked through below):
* You have 1 "MB" of RAM on a 1 MHz system bus which can transfer 1 byte per clock cycle. How many seconds does it take to read the entire memory?
* You have 128 "GB" of RAM and you have an empty 128 GB SSD. Can you successfully hibernate the computer system by storing all of RAM on the SSD?
* My camera shoots 6000×4000 pixels = exactly 24 megapixels. If you assume RGB24 color (3 bytes per pixel), how many MB of RAM or disk space does it take to store one raw bitmap image matrix without headers?
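Here's a rough sketch of the arithmetic (Python, purely for illustration), assuming the RAM vendor means binary units while the bus frequency, SSD capacity, and pixel counts are plain decimal:

```python
# Back-of-the-envelope arithmetic for the three problems above,
# comparing the decimal (SI) and binary readings of the prefixes.

MB_SI  = 1000**2   # 1,000,000 bytes
MB_BIN = 1024**2   # 1,048,576 bytes
GB_SI  = 1000**3
GB_BIN = 1024**3

# Problem 1: 1 "MB" of RAM over a 1 MHz bus moving 1 byte per cycle.
# MHz is unambiguously 10^6 cycles per second.
bus_bytes_per_second = 1_000_000
print(MB_SI  / bus_bytes_per_second)   # 1.0 s with a decimal megabyte
print(MB_BIN / bus_bytes_per_second)   # ~1.049 s with a binary "megabyte"

# Problem 2: hibernating 128 "GB" of RAM onto an exactly-128-GB SSD.
ram_bytes = 128 * GB_BIN   # what the RAM vendor means
ssd_bytes = 128 * GB_SI    # what the SSD vendor means
print(ram_bytes <= ssd_bytes)          # False -- it does not fit

# Problem 3: one raw 24-megapixel RGB24 bitmap, no headers.
image_bytes = 6000 * 4000 * 3          # 72,000,000 bytes
print(image_bytes / MB_SI)             # exactly 72.0 decimal MB
print(image_bytes / MB_BIN)            # ~68.66 binary "MB"
```

Read everything decimally and the answers are a clean 1 second, "yes, exactly", and 72 MB; with RAM counted in binary units they become 1.048576 seconds, "no, it doesn't fit", and 68.66 "MB".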
The SI definitions are correct: kilo- always means a thousand, mega- always means a million, et cetera. The computer industry abused these definitions because 1000 is close to 1024, creating endless confusion. It is an idiotic act of self-harm when one "megahertz" of clock speed is not the same mega- as one "megabyte" of RAM. IEC 60027 prefixes are correct: there is no ambiguity when kibi- (Ki) is defined as 1024, and it can coexist alongside kilo- meaning 1000.
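For reference, here is how far apart the two prefix systems drift at each step (a small Python sketch, nothing beyond the definitions above):

```python
# SI (decimal) vs IEC 60027 (binary) prefixes, and how far apart they drift.
prefixes = [("kilo/kibi", "k/Ki", 1), ("mega/mebi", "M/Mi", 2),
            ("giga/gibi", "G/Gi", 3), ("tera/tebi", "T/Ti", 4)]
for name, symbols, n in prefixes:
    si, iec = 1000**n, 1024**n
    print(f"{name:9} {symbols:5} {si:>16,} vs {iec:>16,}  ({(iec / si - 1) * 100:+.1f}%)")
```

The gap is already 2.4% at kilo/kibi and grows to 10% at tera/tebi, so "close to 1024" gets less and less true as sizes grow.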
The whole point of the metric system is to create universal units whose meanings don't change depending on context. Having kilo- be overloaded (like method overloading) to mean 1000 and 1024 violates this principle.
If you want to wade into the bad old world of context-dependent units, look no further than traditional measures. International mile or nautical mile? Pound avoirdupois or troy pound? Pound-force or pound-mass? US gallon or UK gallon? US shoe size for children, women, or men? Short ton or long ton? Did you know that just a few centuries ago, every town had a different definition of a foot and a pound, making trade needlessly complicated and inviting open scams and frauds?
> The computer industry abused these definitions because 1000 is close to 1024, creating endless confusion.
They didn't abuse the definitions. It's simply the result of dealing with pins, wires, and bits. Take your problems, for example: you won't ever have a system with 1 "MB" of RAM where that's 1,000,000 bytes. The 8086 processor had 20 address lines, so its address space was 2^20 = 1,048,576 bytes, and that was its "1 MB". SI units make no sense for computers.
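For what it's worth, the address-space arithmetic (a quick Python sketch, assuming one byte per address as on the 8086):

```python
# 8086: 20 address lines -> 2^20 distinct byte addresses.
address_lines = 20
addressable_bytes = 2 ** address_lines
print(f"{addressable_bytes:,}")        # 1,048,576 -- the 8086's "1 MB"
print(addressable_bytes / 1000**2)     # 1.048576 decimal megabytes
```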
The only problem is unscrupulous hardware vendors using SI units on computers to sell you less capacity but advertise more.
> They didn't abuse the definitions.
Yes they did. Kilo- means 1000 in SI/metric. The computer industry decided, "Gee, that looks awfully close to 1024. Let's sneakily make it mean 1024 in our context and sell our RAM that way."
> It's simply the result of dealing with pins, wires, and bits. Take your problems, for example: you won't ever have a system with 1 "MB" of RAM where that's 1,000,000 bytes.
I'm not disputing that. I'm 100% on board with RAM being manufactured and operated in power-of-2 sizes. I have a problem with how these numbers are being marketed and communicated.
> SI units make no sense for computers.
Exactly! Therefore, use IEC 60027 prefixes like kibi-, because they are the ones that reflect the binary nature of computers. Only use SI if you genuinely respect SI definitions.
> Exactly! Therefore, use IEC 60027 prefixes like kibi-, because they are the ones that reflect the binary nature of computers. Only use SI if you genuinely respect SI definitions.
You have to sort of remember that these didn't exist at the time that "kilobyte" came around. The binary prefixes are — relatively speaking — very new.
> Yes they did. Kilo- means 1000 in SI/metric.
I'm happy to say it isn't an SI unit. Kilo meaning 1000 makes no sense for computers, so let's just never use it to mean that.
> Therefore, use IEC 60027 prefixes like kibi-,
No. They're dumb. They sound stupid, they were decades too late, etc. This was a stupid plan. We can define kilo as 1024 for computers -- we could have done that easily -- and just not call them SI units if that makes people uncomfortable. This is how we all actually work. So rather than be pedantic about it, let's make the language and units reflect their actual usage. Easy.
Well, you're joking, but the entire RAM industry still lists their chips in Gb (gigaBITS) to avoid confusion.
32 Gb RAM chip = 4 GiB of RAM.
That's still wrong and you've solved nothing. 32 Gb = 32 000 000 000 bits = 4 000 000 000 bytes = 4 GB (real SI gigabytes).
If you think 32 Gb means binary gibibits, then you're disagreeing with Ethernet (e.g. 2.5 Gb/s), Thunderbolt (e.g. 40 Gb/s), and other communication standards.
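Spelled out side by side (a quick Python sketch of the two readings, nothing more):

```python
# Two readings of a "32 Gb" DRAM chip, converted to bytes.
BITS_PER_BYTE = 8

decimal_bytes = 32 * 1000**3 // BITS_PER_BYTE   # 4,000,000,000  -> 4 GB (SI)
binary_bytes  = 32 * 1024**3 // BITS_PER_BYTE   # 4,294,967,296  -> 4 GiB

print(f"{decimal_bytes:,} vs {binary_bytes:,}")
print(f"difference: {(binary_bytes / decimal_bytes - 1) * 100:.1f}%")  # ~7.4%
```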
That's why I keep hammering on the same point: Creating context-dependent prefixes sows endless confusion. The only way to stop the confusion is to respect the real definitions.
It's not wrong. It's the standard definition for that industry.
Damn, you're right. It's double-confusing now.