> But it's not quite the same as the real thing.
To be fair, with modern "retina" HDR displays, it should be very very close.
The most important element of the CRT look is the fast phosphor decay: a CRT behaves as an impulse display, which is why it shows almost none of the sample-and-hold motion blur that LCDs and OLEDs have. No other hardware can simulate it perfectly, but a 480Hz OLED display comes close:
https://blurbusters.com/crt-simulation-in-a-gpu-shader-looks...
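For a sense of how that kind of simulation works: roughly speaking, each 60Hz source frame is spread across the 480Hz panel's subframes with brightness falling off quickly, mimicking the phosphor. A minimal sketch of that idea in Python (the subframe count, decay constant, and function names are illustrative assumptions of mine, not the actual Blur Busters shader):

```python
import numpy as np

# Illustrative parameters, not taken from the actual Blur Busters shader.
SOURCE_HZ = 60                         # original content frame rate
DISPLAY_HZ = 480                       # OLED refresh rate
SUBFRAMES = DISPLAY_HZ // SOURCE_HZ    # 8 subframes shown per source frame
PHOSPHOR_TAU_MS = 1.5                  # assumed exponential decay constant

def subframe_weights(tau_ms=PHOSPHOR_TAU_MS):
    """Relative brightness of each 480Hz subframe for one 60Hz frame.

    An exponential decay concentrates almost all the light in the first
    subframe or two, which is what turns a sample-and-hold panel into
    something that behaves like an impulse display.
    """
    subframe_ms = 1000.0 / DISPLAY_HZ
    t = np.arange(SUBFRAMES) * subframe_ms
    w = np.exp(-t / tau_ms)
    return w / w.sum()  # normalised; a real shader also has to fight peak-brightness limits

def simulate_frame(frame, weights):
    """The stack of dimmed subframes the 480Hz panel would show for one source frame."""
    return [frame * w for w in weights]

print("per-subframe brightness weights:", np.round(subframe_weights(), 3))
```

With these made-up numbers, roughly three quarters of the light lands in the first 2ms subframe, which is the point: the image flashes and fades instead of being held for the full 16.7ms.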
> it should be very very close
It should. It isn't. For some obscure reason, VGA colours look different on every modern LCD.
Most modern displays are calibrated to some reasonable level, and can easily accommodate the very limited gamut of an old CRT, especially anything supporting HDR10. I suspect this is more a case of "they need to be fudged so they're wrong" than anything else.
I don't think an old CRT's gamut is "very limited". Only plasma screens were as good.
Plasma has great contrast and a slightly wider gamut than a CRT. Neither one has a particularly good gamut unless you're comparing to sRGB. Many current screens can do much better.
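To put rough numbers on that, here's a small sketch comparing the CIE 1931 xy triangle areas spanned by typical CRT phosphor primaries (SMPTE-C), sRGB/Rec.709, and DCI-P3. The chromaticity coordinates are the standard published ones; using plain xy triangle area as the gamut metric is a simplification of mine:

```python
# Shoelace area of the (R, G, B) primary triangle in CIE 1931 xy space.
# Plain xy area is a crude gamut metric, but enough for a rough comparison.

def triangle_area(primaries):
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

gamuts = {
    # Standard xy chromaticities for each primary set (R, G, B)
    "SMPTE-C (typical CRT)": [(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)],
    "sRGB / Rec.709":        [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":                [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
}

srgb_area = triangle_area(gamuts["sRGB / Rec.709"])
for name, prims in gamuts.items():
    area = triangle_area(prims)
    print(f"{name:24s} area={area:.4f}  ({area / srgb_area:.2f}x sRGB)")
```

Running it puts the CRT triangle at roughly 0.9x the sRGB area and P3 at roughly 1.4x, which is the sense in which a CRT gamut only looks good next to sRGB.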