Surely dithering is good enough to display 10-bit colour on a non-10-bit monitor with good colour.
The banding disappears, and the noise introduced by dithering should hide the lack of depth from human eyes completely.
So that's a modern use for graphical dithering, and I'm sure there are more. It's not all for retro art.
> Surely dithering is good enough to display 10-bit colour on a non-10-bit monitor with good colour.
Indeed, but most software does not do this. It's not on people's minds.
When people hear dithering they think of palette dithering.
People think 8 bits/channel is enough, but it is not.
It gives you only 256 levels across any gradient, and because of gamma correction it's practically quite a bit fewer: roughly 14% fewer, i.e. about 220 usable steps, on a full 256-step gradient that went through an sRGB OETF.
I maintain a Rust crate that solely concerns itself with dithering for such cases as the one you mention.
The readme has a gradient on top that should make it obvious why this matters.
https://crates.io/crates/dithereens
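Here's a rough sketch of the core idea (just an illustration, not the crate's actual API): add about one quantization step of uniform noise before rounding, so the error that would otherwise pile up into visible bands turns into broadband noise that averages out to the right value.

    // Sketch of dithered quantization: add ~1 LSB of uniform noise before
    // rounding so the quantization error becomes noise instead of banding.

    /// Quantize `x` in 0.0..=1.0 to `levels` discrete steps with random dithering.
    fn dither_quantize(x: f32, levels: u32, rng01: &mut impl FnMut() -> f32) -> u32 {
        let max = (levels - 1) as f32;
        let noise = rng01() - 0.5; // uniform in -0.5..0.5, i.e. one output step wide
        (x * max + noise).round().clamp(0.0, max) as u32
    }

    fn main() {
        // Tiny xorshift64 just to keep the example self-contained; any uniform
        // RNG (or a blue-noise/Bayer threshold per pixel) works here.
        let mut seed = 0x2545F4914F6CDD1Du64;
        let mut rng = move || {
            seed ^= seed << 13;
            seed ^= seed >> 7;
            seed ^= seed << 17;
            (seed >> 40) as f32 / (1 << 24) as f32
        };
        // A smooth ramp quantized to 8 levels: hard bands without the noise,
        // values that average to the true gradient with it.
        for i in 0..=15 {
            let x = i as f32 / 15.0;
            println!("{x:.3} -> {}", dither_quantize(x, 8, &mut rng));
        }
    }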
> Indeed, but most software does not do this. It's not on people's minds.
It can be done in hardware, or even in the graphics driver. If you combine spatial dithering (which works well at high resolutions such as 1440p or 4K) with temporal dithering, you should expect very good results.
Modern printing uses dithering. When you’re going from a high bit depth down to 10 bits or 8 bits, it’s completely fine to use random dithering. Random might not be as fun to write about as Bayer dither patterns or error diffusion, but for higher dynamic ranges it’s just as effective.
Black and white digital dithering of course has its roots in halftoning, which printing has used for more than a century and which continues to be used for newspapers, for example.
Photoshop dithers by default when converting from 16 bits per channel to 8 bits. Lots of other image software doesn’t do this, but you start paying attention quickly (I learned this from experience) when you order a poster-sized print for $100 and it comes out with visible banding you couldn’t see on your monitor.
LCDs have actually used temporal dithering for ages to fake higher bit depth: it let TN panels with 6 bits of precision per channel appear to have 8 bits of precision, and now it's being used for HDR10 too!
Temporal dithering is much simpler algorithmically, since you don't have to diffuse error over an area; you just flicker a specific pixel between similar shades at the refresh rate.
https://en.wikipedia.org/wiki/Frame_rate_control
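A toy sketch of that (purely illustrative, not how any particular panel wires it up; real FRC also offsets the pattern spatially so neighbouring pixels don't flicker in phase): drive a 6-bit panel from 8-bit input by using the two leftover bits as a duty cycle between adjacent codes.

    // Toy frame-rate-control sketch: alternate a pixel between the two nearest
    // 6-bit codes so the time average matches the 8-bit target.

    /// Pick the 6-bit code to display on frame `frame` for an 8-bit target.
    fn frc_6bit(target_8bit: u8, frame: u32) -> u8 {
        let base = target_8bit >> 2;                 // nearest-below 6-bit code
        let remainder = (target_8bit & 0b11) as u32; // the two "missing" bits, 0..=3
        // Show `base + 1` on `remainder` out of every 4 frames so the time
        // average lands on the 8-bit target.
        if remainder != 0 && frame % 4 < remainder && base < 63 {
            base + 1
        } else {
            base
        }
    }

    fn main() {
        // 8-bit 130 = 6-bit 32 with remainder 2: over 4 frames the pixel shows
        // 33, 33, 32, 32, averaging 32.5 six-bit steps, i.e. exactly 130/4.
        for frame in 0..4 {
            println!("frame {frame}: code {}", frc_6bit(130, frame));
        }
    }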
Yep, and you can really tell on certain shades of grey when the screen mysteriously flickers. Thanks Dell.