Not too familiar with D, but isn't 0xFF ÿ (U+00FF, Latin Small Letter Y with Diaeresis) in Unicode? It's not valid UTF-8 or ASCII, but it's still a valid codepoint in Unicode.
I'm a fan of the idea in general, and don't think there's a better byte to use as an obviously-wrong default.
It's an invalid 8-bit code unit, which is what matters. It's a valid codepoint, but codepoints are just abstract numbers, not byte patterns.
The only byte representation that's 1:1 with Unicode code points is UTF-32, which I imagine the D char type can't store.
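To make the code unit vs. codepoint distinction concrete, here's a minimal sketch in D (assuming a recent DMD/Phobos): U+00FF is a fine codepoint and fits in a dchar (a UTF-32 code unit), but as a lone byte 0xFF can never appear in well-formed UTF-8, which is why std.utf.validate rejects it.

```d
import std.stdio;
import std.utf : validate, UTFException;

void main()
{
    char c;                       // default-initialized
    assert(c == 0xFF);            // char.init is 0xFF, never a valid UTF-8 code unit

    dchar d = '\u00FF';           // U+00FF (ÿ) is a valid codepoint...
    writeln(d);                   // ...and a dchar (UTF-32 code unit) can hold it

    string s = "ÿ";               // in UTF-8 this is the two bytes 0xC3 0xBF
    assert(s.length == 2);
    validate(s);                  // well-formed, no exception

    char[] bad = [char.init];     // a lone 0xFF byte...
    try
    {
        validate(bad);            // ...is rejected as malformed UTF-8
    }
    catch (UTFException e)
    {
        writeln("invalid UTF-8: ", e.msg);
    }
}
```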