I still find it weird that they didn't put A, B, ... right after the digits; that would make binary-to-hexadecimal conversion more efficient.
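To make that concrete, here's a tiny C sketch (the `nibble_to_hex` name is my own, purely illustrative) of the branch the current layout forces when turning a 4-bit value into its hex character:

```c
#include <stdio.h>

/* With the real ASCII layout, value -> hex digit needs a branch (or a
 * lookup), because 'A' (0x41) does not follow '9' (0x39) directly. */
char nibble_to_hex(unsigned v)
{
    return (char)(v < 10 ? '0' + v : 'A' + (v - 10));
}

/* Had A-F been placed right after the digits, '0' + v would cover all
 * sixteen values with a single addition. */
int main(void)
{
    for (unsigned v = 0; v < 16; v++)
        putchar(nibble_to_hex(v));
    putchar('\n');               /* prints 0123456789ABCDEF */
    return 0;
}
```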
Going off the timelines on Wikipedia, the first version of ASCII was published (1963) before the 0-9,A-F hex notation became widely used (>=1966):
- https://en.wikipedia.org/wiki/ASCII#History
- https://en.wikipedia.org/wiki/Hexadecimal#Cultural_history
The alphanumeric codepoints are well placed hexadecimally speaking, though. I don't imagine that was just an accident. For example, they could've put '0' at octal 050 / 0x28, but they put it at 060 / 0x30. That suggests to me that they did have hexadecimal in mind.
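For illustration (my own sketch, nothing from the standard itself): with '0' at 0x30 the ten digits share a high nibble, so a digit's value is just its low nibble, which wouldn't hold if '0' sat at 0x28.

```c
#include <stdio.h>

/* With '0' at 0x30, the digits 0x30..0x39 share a high nibble, so a
 * digit's value is just its low nibble.  Had '0' been at 0x28, the
 * digits would straddle 0x28..0x31 and no single mask would work. */
int main(void)
{
    char c = '7';                                /* 0x37 */
    printf("'%c' & 0x0F = %d\n", c, c & 0x0F);   /* prints 7 */
    printf("'%c' - '0'  = %d\n", c, c - '0');    /* same thing */
    return 0;
}
```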
If you think about it, it's a binary consideration rather than a hexadecimal one.
If you have to prominently represent 10 things in binary, then it's neat to allocate a slot of size 16 and leave the remaining 6 positions as padding. Which is to say, it's neat to start the digits from a codepoint whose low four bits are all zeroes.
It's more of a cause for hexadecimal notation than an effect of it. Currently 'A' is 0x41 (octal 0101), 'a' is 0x61 (octal 0141), and '0' is 0x30 (octal 060). These are fairly simple to remember when converting between alphanumerics and their codepoints. That seems more advantageous, especially if you might reasonably be looking at punch cards.
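A quick sketch of the kind of arithmetic those round bases allow (just an illustration I'm adding, assuming plain ASCII):

```c
#include <stdio.h>

/* The round bases ('0' = 0x30, 'A' = 0x41, 'a' = 0x61) make the
 * conversions easy to do in your head or in code:
 *   digit value   = c & 0x0F
 *   letter number = c & 0x1F          (A/a -> 1, B/b -> 2, ...)
 *   case toggle   = flip the 0x20 bit */
int main(void)
{
    printf("'9' & 0x0F = %d\n", '9' & 0x0F);   /* 9 */
    printf("'K' & 0x1F = %d\n", 'K' & 0x1F);   /* 11, the 11th letter */
    printf("'K' | 0x20 = %c\n", 'K' | 0x20);   /* 'k' */
    return 0;
}
```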
[0-9A-Z] is 36 symbols, which doesn't fit in 5 bits, and that would get in the way of the shift/ctrl bits.
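A rough sketch of that arithmetic, if it helps (my own illustration, assuming plain ASCII):

```c
#include <stdio.h>

/* 10 digits + 26 letters = 36 symbols, more than the 32 values a
 * 5-bit field can hold.  The actual layout keeps each letter's index
 * within 5 bits, which is what makes the Ctrl mapping a simple mask. */
int main(void)
{
    printf("digits + letters = %d (2^5 = %d)\n", 10 + 26, 1 << 5);
    printf("Ctrl-G = 'G' & 0x1F = 0x%02X (BEL)\n", 'G' & 0x1F);
    return 0;
}
```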
I'm not sure if our convention for hexadecimal notation is old enough to have been a consideration.
EDIT: it would need to predate the 6-bit teletype codes that preceded ASCII.
They put : ; immediately after the digits because they were considered the least used of the major punctuation, so that they could be replaced by ‘digits’ 10 and 11 where desired.
(I'm almost reluctant to spoil the fun for the kids these days, but https://en.wikipedia.org/wiki/%C2%A3sd )
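To spell out the 'digits' 10 and 11 trick concretely (my own illustration, assuming plain ASCII):

```c
#include <stdio.h>

/* Because the digits run contiguously from '0' (0x30), the usual
 * '0' + value rule spills naturally into ':' (0x3A) and ';' (0x3B)
 * as "digits" 10 and 11 -- handy for base-12 quantities like pence. */
int main(void)
{
    for (int v = 0; v <= 11; v++)
        putchar('0' + v);       /* prints 0123456789:; */
    putchar('\n');
    return 0;
}
```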