It's way older than the 1990s! In computing, "K" has meant 1024 since at least the 1970s.

Example: in 1972, the DEC PDP-11/40 handbook [0] said on the first page: "16-bit word (two 8-bit bytes), direct addressing of 32K 16-bit words or 64K 8-bit bytes (K = 1024)". Same with Intel: in 1977 [1], they proudly said "Static 1K RAMs" on the first page.

[0] https://pdos.csail.mit.edu/6.828/2005/readings/pdp11-40.pdf

[1] https://deramp.com/downloads/mfe_archive/050-Component%20Spe...

It was exactly this - and nobody cared until disks (the only thing that used decimal K) started getting so big that the difference was noticeable. With a 64K system you're talking 1,536 "extra" bytes of memory - or 1,536 bytes of memory lost when transferring to disk.

But once hard drives started hitting about a gigabyte, everyone started noticing and howling.
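For concreteness, the gap at each scale can be checked in a few lines of Python (my own illustration; the `discrepancy` helper is just for this sketch, not anything from the handbooks):

```python
# Bytes "lost" when a count labeled in decimal units is actually binary (or vice versa)
def discrepancy(n_units, binary_unit, decimal_unit):
    return n_units * binary_unit - n_units * decimal_unit

# 64K of RAM: binary K (1024) vs decimal K (1000)
print(discrepancy(64, 1024, 1000))  # 1536 bytes, as noted above

# At the gigabyte scale (2**30 vs 10**9) the gap is no longer ignorable
gap = 2**30 - 10**9
print(gap, f"{gap / 10**9:.1%}")  # 73741824 bytes, about 7.4% of the drive
```

The percentage gap compounds with each power: ~2.4% at K, ~4.9% at M, ~7.4% at G, which is why nobody howled until drives hit the gigabyte range.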