In the same spirit, the `char` type default-initializes to 0xFF, which is an invalid Unicode value.
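A minimal sketch of what that looks like in practice (assuming a standard dmd/ldc toolchain; `char.init` being 0xFF is part of the language spec):

```d
void main()
{
    char c;                 // no explicit initializer
    assert(c == char.init); // char.init is defined as 0xFF
    assert(c == 0xFF);      // an invalid UTF-8 code unit, so stray defaults surface fast
}
```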

It's the same idea for pointers, which default-initialize to null.
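Same sketch for the pointer case:

```d
void main()
{
    int* p;            // no explicit initializer
    assert(p is null); // dereferencing traps instead of reading garbage
}
```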

Not too familiar with D, but isn't 0xFF ÿ (Latin Small Letter Y with Diaeresis) in Unicode? It's not valid UTF-8 or ASCII, but it's still a valid codepoint in Unicode.

I'm a fan of the idea in general, and don't think there's a better byte to use as an obviously-wrong default.

It's an invalid 8-bit code unit, which is what matters. It's a valid codepoint, but codepoints are just abstract numbers, not byte patterns.
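To see the distinction in actual bytes: the codepoint U+00FF is fine, but its UTF-8 encoding is the two bytes 0xC3 0xBF, and a well-formed UTF-8 stream never contains a raw 0xFF byte (lead bytes top out at 0xF4, continuation bytes run 0x80–0xBF). A quick sketch using Phobos's `std.utf.encode`:

```d
import std.utf : encode;

void main()
{
    char[4] buf;
    const len = encode(buf, '\u00FF'); // encode codepoint U+00FF (ÿ) as UTF-8
    assert(len == 2);
    assert(buf[0] == 0xC3 && buf[1] == 0xBF); // no raw 0xFF byte anywhere
}
```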

The only byte representation that's 1:1 with Unicode code points is UTF-32, which I imagine D's `char` type can't store.
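For what it's worth, D has a separate `dchar` type for exactly that: a 32-bit UTF-32 code unit where the stored value and the code point coincide. Its default init is U+FFFF, a Unicode noncharacter, so the same poison-value trick carries over. A short sketch:

```d
import std.stdio;

void main()
{
    dchar d;              // 32-bit UTF-32 code unit
    assert(d == 0xFFFF);  // dchar.init: U+FFFF, a Unicode noncharacter
    dchar y = 0x00FF;     // a dchar *can* hold U+00FF directly
    writeln(y);           // prints ÿ
}
```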