ditherpunk
January 20, 2021 12:37 PM

For your consideration: Two deep dives on dithering techniques, from random thresholds to blue noise to error diffusion. Here's a listicle with example animations from a wide range of historical 1-bit games, including one recent and one upcoming game. posted by kaibutsu (23 comments total) 46 users marked this as a favorite
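(For a taste of the error-diffusion family the links cover: Floyd–Steinberg pushes each pixel's quantization error onto its not-yet-visited neighbors. A minimal grayscale sketch, not tied to any of the linked implementations:)

```python
import numpy as np

def floyd_steinberg(img):
    """1-bit Floyd-Steinberg dither of a float image in [0,1].
    Each pixel's rounding error is spread over the neighbors
    that haven't been visited yet (right, and the row below)."""
    out = img.astype(float).copy()
    h, w = out.shape
    for y in range(h):
        for x in range(w):
            old = out[y, x]
            new = 1.0 if old >= 0.5 else 0.0
            out[y, x] = new
            err = old - new
            if x + 1 < w:
                out[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    out[y + 1, x - 1] += err * 3 / 16
                out[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    out[y + 1, x + 1] += err * 1 / 16
    return out

gray = np.full((32, 32), 0.25)       # flat 25% gray
dithered = floyd_steinberg(gray)
print(dithered.mean())               # ≈ 0.25: dot density matches the tone
```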
 
This is really cool stuff! I only knew dithering in the context of telescope observations, and I really enjoyed these deep dives in the context of old video games. Thanks for posting!
posted by CompanionCube at 1:30 PM on January 20, 2021


Oh, that's interesting; how does dithering get used for telescope observations?

(I spent way too much effort a number of years back writing custom image processing for timelapse photography with a raspberry pi... Astrophotography was a great source of inspiration, so I'm hopeful you have more tricks for me!)
posted by kaibutsu at 1:46 PM on January 20, 2021


Computerphile has covered dithering a few times, too, if video is your thing:

Ordered Dithering - Computerphile

Error Diffusion Dithering - Computerphile

I loved the Obra Dinn thread back in the day - I think Lucas does interesting games, even if they don't really click with me. I should see if there's a good Obra Dinn Let's Play out there....
posted by Kyol at 2:16 PM on January 20, 2021 [1 favorite]


1) This is great.
2) From the first deep dive: "nerdsniped," lol!
posted by BrashTech at 2:32 PM on January 20, 2021


I have a practical modern-day dithering question: how come dithering isn't still used to combat posterization when rendering video? You know the problem; it's particularly bad with H.264 video. A black sky that in the source is 3 levels of dark grey; 19, 20, 21, say. The compressor dutifully compresses them into blobby shapes, the renderer shows them, and what you see is a line of contrast at the edge between the regions. Even though they're almost the same shade (19 and 20), the human eye is very good at picking out contrast edges.

One solution to soften that line is dither near the transition. You need to do this carefully in video; the dots need to stay consistent frame to frame so you don't get an effect of rendered snow. But it can work. But I've never seen a video playback device try it. Why not? I think it'd be a really simple shader in GPU terms.
posted by Nelson at 2:36 PM on January 20, 2021 [3 favorites]
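(The frame-to-frame consistency Nelson asks for is what ordered — Bayer — dithering gives you for free: the threshold depends only on screen position, never on time. A minimal NumPy sketch of the idea, not taken from any real player:)

```python
import numpy as np

# 4x4 Bayer matrix: thresholds 0..15 arranged so quantization
# error is spread evenly across each 4x4 tile of the screen.
BAYER4 = np.array([[ 0,  8,  2, 10],
                   [12,  4, 14,  6],
                   [ 3, 11,  1,  9],
                   [15,  7, 13,  5]])

def ordered_dither(img, levels=4):
    """Quantize a float image in [0,1] to `levels` gray levels,
    using a position-dependent (hence frame-stable) threshold."""
    h, w = img.shape
    # Tile the Bayer matrix across the image; normalize to [0,1).
    t = np.tile(BAYER4, (h // 4 + 1, w // 4 + 1))[:h, :w] / 16.0
    steps = levels - 1
    # Add the sub-quantum offset before rounding down.
    return np.floor(img * steps + t) / steps

# A smooth horizontal gradient that would band badly at 4 levels:
grad = np.tile(np.linspace(0, 1, 64), (8, 1))
out = ordered_dither(grad, levels=4)
```

Because the threshold pattern is fixed in screen space, re-running this on every frame of a slowly changing gradient produces no "rendered snow" — only pixels whose underlying value crosses a threshold change.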


I've noticed a sort of dithering on Amazon Prime Video that I wonder if it's just an overlay to break up the banding or if it's actually in the video stream. I'm not sure I know how to confirm it either way, frankly. (or if it's just "hey look we're so good we've got Film Grain™ in our stream!")

I think the other solution in video is going from 8-bit color to 10-bit, so you have more than 256 shades to choose from.
posted by Kyol at 2:42 PM on January 20, 2021


10-bit helps in that the contrast between the blobs is smaller, but the problem is still there and perceptible.

What seems to work in practice for me is H.265 video. I don't know why. Maybe its encoding algorithms somehow do gradients better? Or the decoders are actually dithering? Also a lot of H.265 is 10-bit, so if it's rendered down to an 8-bit display there may be a dithering step. But all my guesses could be wrong.
posted by Nelson at 2:56 PM on January 20, 2021


(Turns out dithering in astrophotography refers to moving the camera by small random perturbations while taking images that will later be aligned and stacked. Averaging over multiple images reduces time-dependent noise, which is a great first step, and what I was doing a bit of with my raspberry pi. The random perturbation gets rid of spatially dependent noise, like 'hot' pixels and so on, but then requires an alignment step, for which it probably helps to have a solid guarantee that all observed objects are at infinite distance. This seems completely orthogonal to the meaning of dithering in computer graphics, but if we take dithering to generally mean 'techniques of noise diffusion' it could fit... Cool stuff, tho, and I'm glad to have read about it.)
posted by kaibutsu at 3:15 PM on January 20, 2021 [2 favorites]
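(A toy simulation of the astrophotography sense of dithering described above — a hot pixel fixed on the sensor, a true scene, and several randomly offset exposures shifted back into alignment before averaging. Offsets, sizes, and the known-alignment shortcut are all illustrative; real pipelines solve for the alignment.)

```python
import numpy as np

rng = np.random.default_rng(0)

scene = np.zeros((32, 32))
scene[16, 16] = 1.0           # one real star

frames = []
for _ in range(16):
    dy, dx = rng.integers(-3, 4, size=2)   # random pointing offset
    exposure = np.roll(scene, (dy, dx), axis=(0, 1))
    exposure[5, 5] += 1.0                  # hot pixel, fixed on the sensor
    # Align by undoing the known offset (real pipelines estimate it).
    frames.append(np.roll(exposure, (-dy, -dx), axis=(0, 1)))

stack = np.mean(frames, axis=0)
# The star adds up coherently; the hot pixel lands somewhere different
# after each realignment and is diluted by the averaging.
print(stack[16, 16])   # 1.0: every aligned frame puts the star here
```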


One solution to soften that line is dither near the transition. You need to do this carefully in video; the dots need to stay consistent frame to frame so you don't get an effect of rendered snow. But it can work. But I've never seen a video playback device try it. Why not? I think it'd be a really simple shader in GPU terms.

This sounds like you'd end up with the 'motion warping' approach in the Obra Dinn writeup, which looks like it had problems smoothly interpolating between dithering neighborhoods. On the bright side, video codecs already do a lot of tracking of pixel movement from one frame to the next, so this kind of approach seems at least like something that could be tried. The final version in Obra Dinn relied on knowing the camera position (iiuc), which seems out of bounds for general-purpose video.
posted by kaibutsu at 3:29 PM on January 20, 2021


What I want is a program like Aseprite that can apply dithering on the fly to pixel art. It has a nice dithering tool, but that just applies a dither gradient that can't be edited after the fact. I want to be able to draw something, then say "OK, now conform this sprite to this palette with dithering as appropriate." Or even just a tool like an airbrush that applies dithering that can continue to be adjusted after the tool finishes.
posted by Mr.Encyclopedia at 3:43 PM on January 20, 2021 [1 favorite]


This is the content I crave!
posted by TwoWordReview at 4:05 PM on January 20, 2021 [1 favorite]


Thanks for this.

Others might be interested in one of my favorite toy apps, BitCam
posted by rustcellar at 4:31 PM on January 20, 2021


To combat banding in my iOS photography app, which simulated analogue processes including vignetting, I'd add a one-bit blue-noise dither pattern to the luminance channel. Helped heaps.
posted by seanmpuckett at 4:59 PM on January 20, 2021 [1 favorite]
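(A rough sketch of that trick. Real blue-noise textures are precomputed, e.g. with void-and-cluster; here high-pass-filtered white noise, rank-equalized to a uniform histogram, stands in for one. All names and parameters are illustrative:)

```python
import numpy as np

rng = np.random.default_rng(1)

def blueish_noise(shape):
    """Crude stand-in for a blue-noise texture: high-pass-filter
    white noise, then rank-equalize it to a uniform histogram."""
    n = rng.random(shape)
    # 3x3 box blur via shifted sums (wraps at edges; fine for a demo).
    blur = sum(np.roll(np.roll(n, dy, 0), dx, 1)
               for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    hp = n - blur                       # keep only the high frequencies
    ranks = hp.ravel().argsort().argsort()
    return ((ranks + 0.5) / ranks.size).reshape(shape)

def quantize(lum, levels=8, noise=None):
    """Quantize luminance in [0,1] to `levels` steps; `noise` in [0,1)
    is a sub-quantum dither signal added before rounding down."""
    steps = levels - 1
    t = 0.5 if noise is None else noise
    return np.clip(np.floor(lum * steps + t), 0, steps) / steps

grad = np.tile(np.linspace(0.3, 0.4, 256), (32, 1))  # subtle dark ramp
banded = quantize(grad)                              # two hard bands
dithered = quantize(grad, noise=blueish_noise(grad.shape))
```

Averaged over a small neighborhood, the dithered version tracks the original ramp instead of snapping to two bands, which is exactly what the eye stops noticing.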


I have a practical modern day dithering question.. how come dithering isn't still used to combat posterization

Off-the-cuff, not-so-expert opinion: smart TVs aren't all that smart and have processors far from powerful enough to do any kind of serious on-the-fly processing like that. Also, there are so many different possible combinations of formats, display sizes, and color spaces that a generalized algorithm is probably in the research-problem category.
posted by sammyo at 5:59 PM on January 20, 2021


how come dithering isn't still used to combat posterization when rendering video?

I think what you're describing is usually posterization from lossy compression, rather than from rendering.

Dithering is valuable when you have a high-accuracy idea of what gray level (let's say) you *ought* to have, but your output device is coarse. Some compression algorithms don't really get human perception of gradients, and so they go with blocky squares of a constant gray level that show up as those visible chunky boundaries. Those blocky squares are what the rendering device has got to work with. It likely does have enough gray levels to render those blocky squares to a :chef's kiss: of gray-level accuracy, but they're still blocky squares.

It's not a dithering problem, it's a de-aliasing problem. Once we're stuck with this blocky input, which we may presume to be samples from a smooth gradient, can we recover the smooth gradient and render that? (with or without output-device dithering as needed) This is a different type of problem, because we have to be making some guesses. Maybe an Autechre album does in fact have blocky patterns of 8x8 pixels and we hooliganly smooth it into gradients. Heck.
posted by away for regrooving at 11:00 PM on January 20, 2021 [3 favorites]
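(That "guess the smooth gradient back" step is roughly what debanding filters do — ffmpeg's gradfun works on a similar principle, iiuc: blur, but only accept the blurred value where it stays within about one quantization step of the input, so genuine edges, or deliberate Autechre blocks, survive. A toy version with made-up parameters:)

```python
import numpy as np

def box_blur(img, r=4):
    """(2r+1)^2 box blur via shifted sums (wraps at the borders)."""
    acc = np.zeros_like(img)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            acc += np.roll(np.roll(img, dy, 0), dx, 1)
    return acc / (2 * r + 1) ** 2

def deband(img, r=4, threshold=0.6 / 7):
    """Accept the blurred value only where it stays within one
    quantization step of the input, so real edges are left alone."""
    b = box_blur(img, r)
    return np.where(np.abs(b - img) < threshold, b, img)

# A gradient that was quantized to 8 levels (banding)...
grad = np.tile(np.linspace(0, 1, 128), (16, 1))
banded = np.round(grad * 7) / 7
debanded = deband(banded)

# ...while a genuine hard edge passes through unchanged:
edge = np.zeros((16, 128)); edge[:, 64:] = 1.0
```

The guesswork lives entirely in the threshold: set it too high and you smooth real detail, too low and the bands stay.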


Fantastic post. Really interesting to read about this technique.
posted by knapah at 11:56 PM on January 20, 2021


The solar-powered version of low tech magazine (previously) uses dithered images, ostensibly in order to save bandwidth, though I suspect its largest value is aesthetic. It does indeed look nice.
posted by trotz dem alten drachen at 4:37 AM on January 21, 2021 [1 favorite]


Ohh, at the risk of dating myself, that was such an interesting issue back when I was into graphics... today it's mostly for people who design printer drivers... and Kindle apps I guess? It's just slightly sad to see how a lot of awesome graphics work has been made mostly obsolete by "brute-force" technology advances.

But my favorite thing about dithering has always been that it's one of the most intuitive ways to think about the uncertainty principle... You can either know WHERE the pixel is or WHAT COLOR the pixel is, but you can't have both at the same time...
posted by kleinsteradikaleminderheit at 5:21 AM on January 21, 2021 [4 favorites]


https://xkcd.com/2414/
posted by hypnogogue at 6:12 AM on January 21, 2021


Hi kaibutsu, you're right, astronomical dithering is... philosophically related, at most. Space telescopes often use Lissajous curves to shift a star across a detector (which is then called dithering), to keep a single pixel that may have lost sensitivity from corrupting an observation. Ground-based telescopes do something similar, but often use simpler patterns.
posted by CompanionCube at 6:27 AM on January 21, 2021


Dithering and quantization can be and are also used to generate knitting patterns, for jacquard knits. Knitting has limited color palettes and low "resolution", like old video games.

(I use dithering a lot in my small knitting factory. This post is a joy to read.)
posted by romanb at 7:20 AM on January 21, 2021 [6 favorites]


Ohh, at the risk of dating myself, that was such an interesting issue back when I was into graphics... today it's mostly for people who design printer drivers... and Kindle apps I guess? It's just slightly sad to see how a lot of awesome graphics work has been made mostly obsolete by "brute-force" technology advances.

Yeah, I remember spending a _lot of time_ back in the early 90s playing around with image tools trying to find the right way to dither down my beautiful 24-bit pov-ray .BMPs into a format that didn't eat the hard drive for breakfast but didn't turn them into a godawful mess either.
posted by Kyol at 10:32 AM on January 21, 2021


Something I find interesting is that in the last several years, many AAA games have gone from using alpha blending to fade objects out to using dithering instead, often on a larger-than-pixel scale. See these examples from Super Mario Odyssey, for example. It's an interesting low-tech solution to a tricky problem (alpha blending can cause all sorts of artifacts due to draw order and stuff, but a dithered pixel is just fine writing into the ordinary depth buffer; this also keeps objects from showing through themselves, if dithered uniformly (like, with a normal alpha-blended fadeout you'd see the edges of Mario's nose through the back of his head as he faded out, but since the dithered nose pixels are directly behind the dithered head pixels the effect is as if Mario were a 2D sprite, but with depth handled correctly)). Dithering gives a single deliberate visual artifact in place of several possible distracting and unpredictable visual artifacts.
posted by NMcCoy at 1:15 PM on January 21, 2021 [4 favorites]
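(This technique is often called screen-door or stippled transparency: instead of blending, each pixel is kept or discarded outright by comparing the object's fade alpha against a screen-space threshold. In a fragment shader it's a one-line `discard`; here's the idea as a NumPy mask — a sketch of the general technique, not Odyssey's actual implementation:)

```python
import numpy as np

# 4x4 Bayer thresholds, normalized so coverage tracks alpha evenly.
BAYER4 = np.array([[ 0,  8,  2, 10],
                   [12,  4, 14,  6],
                   [ 3, 11,  1,  9],
                   [15,  7, 13,  5]]) / 16.0

def screen_door_mask(alpha, h, w):
    """True where a pixel of an object fading at `alpha` opacity is
    drawn at all. Drawn pixels write depth normally, which is why a
    uniformly dithered object can't show through itself."""
    t = np.tile(BAYER4, (h // 4 + 1, w // 4 + 1))[:h, :w]
    return (t + 0.5 / 16.0) <= alpha

m = screen_door_mask(0.5, 8, 8)
print(m.mean())   # 0.5: exactly half the pixels survive
```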



