This is halftone (i.e., an apparent palette with more colors than the actual palette, achieved by ensuring that you aren't just rounding the same way everywhere), but it isn't dithering in my opinion. To me, dithering means fading away the banding that occurs when the palette (or the apparent palette achieved via halftone) isn't large enough to avoid banding on its own.

The halftone technique demonstrated here takes a palette of 2 colors and increases it to something on the order of 20 apparent colors, but even with 20 there are extremely obvious bands.

That banding can be virtually eliminated by having way more colors (say, 256 if grayscale, 256^3 if RGB) or it can be virtually eliminated via dithering. I suspect the "error diffusion" technique (which is teased at the end of this demo) does what I'm talking about.
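For reference, error diffusion in the Floyd–Steinberg style looks roughly like this (a minimal grayscale sketch of my own, not the demo's actual code):

```python
def floyd_steinberg(img, levels=2):
    """Quantize a grayscale image (rows of floats in 0..255) to `levels`
    evenly spaced gray levels, pushing each pixel's rounding error onto
    the neighbors that haven't been visited yet."""
    h, w = len(img), len(img[0])
    out = [list(map(float, row)) for row in img]
    step = 255 / (levels - 1)
    for y in range(h):
        for x in range(w):
            old = out[y][x]
            new = round(old / step) * step      # nearest available level
            out[y][x] = new
            err = old - new
            # standard Floyd-Steinberg weights: 7/16, 3/16, 5/16, 1/16
            if x + 1 < w:
                out[y][x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    out[y + 1][x - 1] += err * 3 / 16
                out[y + 1][x] += err * 5 / 16
                if x + 1 < w:
                    out[y + 1][x + 1] += err * 1 / 16
    return out
```

Because each pixel's error is repaid by its neighbors, the local average of the output tracks the input, which is exactly what kills the banding.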

Noise is the key to dithering, and I don't see any noise in this demo. Everything is deterministic.

But the presentation is spectacular!

Error-diffusion dithering or dithering with a precomputed blue-noise or white-noise pattern are also deterministic.

One standard point of view is that what introduces noise is quantization, in the sense that the quantized image has an unwanted difference from the desired image, and dithering consists of techniques to shape that noise. The Bayer-matrix ordered dithering algorithm presented here concentrates the noise at high frequencies where the eye is less sensitive to it, but it still retains some noise at lower frequencies, which is the banding you're commenting on.
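For concreteness, the Bayer-matrix algorithm being discussed boils down to something like this (a minimal 1-bit sketch of my own, not the demo's code):

```python
# 4x4 Bayer matrix; dividing by 16 spreads the thresholds over [0, 1).
BAYER4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def ordered_dither(img):
    """Threshold a grayscale image (0..255) to black/white using a
    position-dependent threshold instead of a fixed one. Fully
    deterministic: the same input always yields the same output."""
    h, w = len(img), len(img[0])
    return [
        [255 if img[y][x] / 255 > (BAYER4[y % 4][x % 4] + 0.5) / 16 else 0
         for x in range(w)]
        for y in range(h)
    ]
```

The tiling of the matrix is what concentrates the quantization noise at high spatial frequencies, and the fixed 4x4 period is also where the visible crosshatch pattern comes from.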

Dave Long had the sharp observation earlier today that Bresenham-style line drawing is a form of dithering, where the signal being represented is the position of the pen rather than the brightness of the visual field. (I see that dreamcompiler made the same observation two days ago: https://news.ycombinator.com/item?id=45728962, but I don't know if Dave saw it there.)

We had a good discussion of dithering here a couple of days ago, connected with a significantly more comprehensive article with a less fancy presentation: https://news.ycombinator.com/item?id=45728231

> Dave Long had the sharp observation earlier today that Bresenham-style line drawing is a form of dithering, where the signal being represented is the position of the pen rather than the brightness of the visual field.

Bresenham is the equivalent of a sharp quantization, with no dithering. Using true 1D dithering for line drawing would instead result in a "wobbly" almost hand-drawn output.

You're still thinking about dithering being about colors. It's about finding the best member of set B to stand in for a member of set A when |A|>|B|.

In color dithering A is the set of colors in the original image and B is a smaller set of colors. Often just pure black and pure white, but that doesn't have to be the case.

In Bresenham A is the relative x,y coordinates of the ideal next pixel in the line if the pixels were infinitely small (thus |A| = infinity), while B contains the relative x,y coordinates of the 3 pixels that are actually available: +1,0; 0,+1; and +1,+1 (with appropriate rotation for the other 3 quadrants).

An important feature of Bresenham's is that the error inherent in this assignment is carried forth into the decision made for the next pixel, such that the total error is diffused along the line and its average value stays close to zero. Such error diffusion is also a feature of the best color dithering algorithms, but not the one described in TFA -- ordered dithering -- because ordered dithering is not a very good algorithm and is not used much today except when its peculiar artifacts are desired.
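A sketch of that error-carrying mechanism in its usual integer form (first octant only, 0 <= slope <= 1; variable names are mine):

```python
def bresenham(x0, y0, x1, y1):
    """First-octant Bresenham: 0 <= dy <= dx. The running `err` term
    tracks how far the drawn line has drifted from the ideal one,
    using only integer addition and subtraction -- no division."""
    dx, dy = x1 - x0, y1 - y0
    assert 0 <= dy <= dx, "first octant only"
    pixels = []
    err = 2 * dy - dx          # decision variable, scaled to stay integral
    y = y0
    for x in range(x0, x1 + 1):
        pixels.append((x, y))
        if err > 0:            # accumulated error says we're too low...
            y += 1             # ...so take the diagonal step (+1,+1)
            err -= 2 * dx      # and pay that error back
        err += 2 * dy
    return pixels
```

The `err -= 2 * dx` line is the diffusion: the overshoot from taking a whole-pixel diagonal step is subtracted back out, so the average error along the line stays near zero.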

And yes, Bresenham's original algorithm does set each pixel it chooses to pure black, but this has nothing to do with its error diffusion mechanism. Bresenham's with grayscale is also a thing and it results in even better line approximations, but it's usually referred to as antialiased Bresenham.

I think zozbot was talking about the position and not the color, and they are correct that the positions of the pixels that get drawn in Bresenham are the nearest neighbors of the ideal positions, rather than having any sort of dithering applied to them. (I'm reluctant to call "nearest neighbor" a sort of dithering.) What's getting dithered is the slope, not the position. I was wrong about that.

It's error-diffusion dithering. You maintain an error accumulator from one pixel to the next, which keeps track of your departure from the desired slope, and you make a diagonal rather than paraxial move when it overflows. (Or is that only DDA?) But I guess you're right that the position isn't being dithered. It's the slope.

I'm tempted to try the wobbly algorithm now to see what it looks like!
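For anyone else curious, one way to get the "wobbly" version is to dither the step decision against a random threshold instead of rounding (a toy sketch of my own, not from any of the linked material):

```python
import random

def wobbly_line(x0, y0, x1, y1, rng=None):
    """Like first-octant Bresenham, but the step-up decision compares
    the ideal fractional y against a random threshold instead of
    rounding to the nearest pixel. The line is right on average, but
    each rendering with a different seed wobbles differently."""
    rng = rng or random.Random(1)          # fixed seed for reproducibility
    dx, dy = x1 - x0, y1 - y0
    pixels = []
    y = y0
    for x in range(x0, x1 + 1):
        ideal_y = y0 + dy * (x - x0) / dx  # exact y for this column
        frac = ideal_y - y                 # how far we lag the ideal line
        if frac > rng.random():            # dithered step decision
            y += 1
        pixels.append((x, y))
    return pixels
```

Since the algorithm always steps once the lag reaches a full pixel, the drawn line never strays more than about a pixel and a half from the ideal one, which is the hand-drawn look.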

The cool thing about Bresenham is that you're constantly picking the pixel closest to the desired slope but you never have to do the explicit division of deltaY/deltaX. The division happens implicitly as you draw the line, using only addition and subtraction of integers. Which was quite handy for the primitive computers circa 1960 when Jack Bresenham invented it.

You mean the desired position. If you were always picking the pixel closest to the desired slope, your lines would all be at multiples of 45°.

I agree that the lack of division is surprising and beneficial.

You made me curious. It looks like dithering is still an accepted name for this kind of technique: https://en.wikipedia.org/wiki/Ordered_dithering

Glad I snuck in that it's just my opinion! But the article you linked to sort of admits what I'm saying:

> The above thresholding matrix approach describes the Bayer family of ordered dithering algorithms. A number of other algorithms are also known; they generally involve changes in the threshold matrix, equivalent to the "noise" in general descriptions of dithering.

Basically, I'm leaning into "general descriptions of dithering" with my noise requirement, and the lack of noise in "ordered dithering" leads me to consider it not-quite-dithering.

The very first sentence of the general Dithering article [0] connects with my perspective as well:

> preventing large-scale patterns such as color banding

Aside: I probably misspoke with the word "halftone" earlier; apparently that's a specific thing as opposed to an umbrella term. I'm not sure there's a great word (other than "dither"...) for techniques to trade resolution for color.

[0] https://en.wikipedia.org/wiki/Dither

Dithering is the right term. It was called this even as far back as the Win 3.1 era, when program installers typically showed you a full-screen window with a background gradient that went from dark blue to black and used ordered dithering to make it look semi-respectable.

The threshold map of ordered dithering is still a source of noise; it just happens to be carefully pre-baked so that quantizing (original image + noise) produces an output more legible than what you'd get from just mapping the original image pixels to the nearest available color.

The error diffusion is static and baked into the thresholds chosen, but it's still there and choosing the error diffusion properly still matters to getting a reasonable output.

I think that text is somewhat misleading because it leads people to believe that ordered dithering is not a kind of dithering, and also because the noise in general descriptions of dithering is not equivalent to changes in the threshold matrix. Rather, it is equivalent to differences between the input image and the output image. So I've fixed it.

I'm a pixel artist and everyone I know who uses this kind of technique calls it dithering.

Dithering as a mechanism to reduce/remove banding can be very impressive.

The color NeXT machines only had 12-bit displays, 4 bits per channel, but with careful use of dithering it was often indistinguishable from a 24-bit display (so-called "true color").

> Dithering means fading away the banding that occurs when the palette (or the apparent palette achieved via halftone) isn't large enough to avoid banding on its own.

Here[0] is a good summary of dithering/noise to reduce color banding. Interestingly, for my game[1] I found that Valve's animated dithering technique (which uses variation over time in lieu of a fixed noise pattern) worked best, as the other techniques described there altered the perceived lightness of the resulting image.

[0]: https://blog.frost.kiwi/GLSL-noise-and-radial-gradient/

[1]: A multiplayer Noita-like (more at https://slowrush.dev), where the radiance-cascades-powered lighting tends to cause obvious banding patterns on really dark backgrounds.
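For illustration, the temporal idea can be sketched like this (my own toy stand-in, not Valve's actual shader): shift an ordered-dither threshold by the frame number, so the quantization error at each pixel averages out over time instead of sitting in a fixed pattern:

```python
# 4x4 Bayer matrix reused as the spatial part of the threshold.
BAYER4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def temporal_dither(gray, x, y, frame):
    """1-bit temporal dithering of a gray value in [0, 1]: the ordered
    threshold is offset by the frame number, so over any 16 consecutive
    frames each pixel cycles through every threshold and its
    time-average matches the true level to within 1/16."""
    t = (BAYER4[y % 4][x % 4] + frame) % 16
    return 1 if gray > (t + 0.5) / 16 else 0
```

Averaging a pixel over 16 frames recovers the input level almost exactly, which is why the eye (integrating over time) reads it as a smooth gradient rather than a pattern.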

It's called ordered dithering.