Sub-Pixel Rendering
February 28, 2006 9:51 PM

How Sub-Pixel Rendering Works: a method of anti-aliasing, sub-pixel rendering (or ClearType, as Microsoft calls it) exploits the fact that pixels on LCD screens are actually made up of three sub-pixels: red, green, and blue. By addressing those sub-pixels individually when constructing fonts, it produces arguably smoother lines and easier-to-read type. Sadly (or happily), CRTs benefit little, if at all, from the technology.
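The mechanism is simple enough to sketch in a few lines of Python: rasterize the glyph at three times the horizontal resolution, then let each sample drive one sub-pixel channel. This is a minimal illustration of the idea, not the article's actual algorithm; the coverage values are invented.

```python
def subpixel_row(coverage):
    """Group 3x-horizontal-resolution coverage samples into (R, G, B)
    pixel values. Coverage is 0.0 (background) to 1.0 (full ink);
    ink is black on a white background."""
    assert len(coverage) % 3 == 0
    pixels = []
    for i in range(0, len(coverage), 3):
        r, g, b = coverage[i:i + 3]
        # On white, ink darkens each channel independently.
        pixels.append(tuple(round(255 * (1 - c)) for c in (r, g, b)))
    return pixels

# A vertical stem edge falling one sub-pixel into the first pixel:
print(subpixel_row([0.0, 0.0, 1.0,   1.0, 1.0, 1.0,   1.0, 0.0, 0.0]))
```

The middle pixel is solid black, while the edge pixels get partially lit channels, which is exactly where the color-fringe complaints further down the thread come from.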
posted by falconred (33 comments total)

Big fucking deal what Microsoft calls it. IBM invented it.

http://en.wikipedia.org/wiki/Subpixel_rendering
posted by Mikey-San at 10:00 PM on February 28, 2006


Poor CRTs, nobody loves them. There's this utter behemoth (24"?) weighing about 50kg sitting on the desk behind me, not plugged in because all the lazy workers want a cool LCD screen instead. Wish I could nick it and take it home. Except for the footprint.

Thing I want to know about subpixel rendering is, what, if anything, do I have to do to make it work? On Win2k at work, XP Pro at home.

Also, I'm not an engineer, but this looks like totally bloody obvious technology, surprised it took that long for someone to 'invent' it (or really, discover latent capacity).
posted by wilful at 10:06 PM on February 28, 2006


So what happens when manufacturers don't arrange the 3 sub-pixels as expected?
posted by knave at 10:08 PM on February 28, 2006


Ack, didn't mean to paste that link into my post.

I totally suck at the Internet tonight.
posted by Mikey-San at 10:11 PM on February 28, 2006


wilful - If you have XP, you can download a ClearType Tuner to activate it, or let Microsoft hijack your machine through your browser.

knave - from the article: "Since a few LCD panels have their sub-pixels arranged in B-G-R instead of R-G-B order, any industrial strength delivery of sub-pixel rendering technology will require a user-settable (or operating system readable) option to inform the system's LCD rendering engine whether the sub-pixels are arranged in 'forward' or 'reverse' order"
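In code, the user-settable option that quote describes amounts to reordering the channel triplet before it reaches the panel; a minimal sketch (the function name is mine, not from the article):

```python
def apply_subpixel_order(r, g, b, order="RGB"):
    """Return the channel triplet in the panel's physical sub-pixel
    order. Only the two layouts the article mentions are handled."""
    if order == "BGR":
        return (b, g, r)
    if order != "RGB":
        raise ValueError("unknown sub-pixel layout: " + order)
    return (r, g, b)

# The same rendered values, sent to the two panel types:
print(apply_subpixel_order(10, 20, 30, "RGB"))
print(apply_subpixel_order(10, 20, 30, "BGR"))
```

Getting this wrong doesn't just waste the effect: the fringes land on the wrong side of every stroke, which looks worse than no sub-pixel rendering at all.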
posted by falconred at 10:13 PM on February 28, 2006


I don't care what anyone says, ClearType looks decidedly better than default on a CRT. Then again, I'm a blind fuck with 120 dpi font settings and Text Size->Largest so what do I know.
posted by kjh at 10:16 PM on February 28, 2006


It's convenient that the page with the demo software tells exactly how old this news is: 2,157 days.

Kind of depressing actually, because I can't believe this article came out 6 years ago.
posted by smackfu at 10:37 PM on February 28, 2006


It does wonders for readability on CRTs, as kjh says. I wish people would stop repeating what they're told.

Also, what smackfu said. Heh.
posted by stavrosthewonderchicken at 11:17 PM on February 28, 2006


Gibson claims "no color fringe can be seen when these images are viewed at normal magnification and resolution". Maybe that's true for many or most people, but for me it makes an obnoxious shimmering rainbow halo around each letter.

I'm not sure whether the sub-pixel trick or regular anti-aliasing is worse.

The more fuzzy and blurry they make fonts, the more people crow about how "beautiful" and "smooth" they are. It literally hurts my eyes. The only thing keeping me from switching from Windows to Linux is that Windows allows anti-aliasing to be turned off. Am I the only one who likes clear, sharp fonts?

'Scuze the rant, this one always provokes me.
posted by jam_pony at 11:20 PM on February 28, 2006


jam_pony: Linux also allows it to be turned off. In my GNOME/Ubuntu desktop, it's in the system menu, in Preferences->Font.

So, go ahead and switch.
posted by Joakim Ziegler at 11:24 PM on February 28, 2006


KDE forces AA and it takes incredible effort to get rid of it. Time to try Gnome! Thanks for the tip.
posted by jam_pony at 11:27 PM on February 28, 2006


jam_pony: AA can be switched off easily in KDE as well, in the control panel.
posted by salmacis at 11:35 PM on February 28, 2006


jam_pony, I notice it too. It is much worse with fonts that were not designed for the screen.

Possibly before I die there will be dynamic display media with fine enough resolution that these stupid tricks will not be necessary.
posted by i_am_joe's_spleen at 11:49 PM on February 28, 2006


There's a checkbox in KDE but it has no effect, at least not in recent versions of SUSE.

Currently we're limited in resolution because of the way programs' graphic elements are mapped to pixels. High resolution just makes everything smaller, rather than more detailed/sharper at the same size.

It will take new graphics subsystems and new GUI-writing methods to enable true high-resolution. This kind of revolution was actually advertised as a feature of Vista but MS couldn't get it together in time for the release and dropped it.
posted by jam_pony at 11:57 PM on February 28, 2006


That was an interesting article. Also, I didn't know there was a Cleartype Tuner, which I just messed with -- I'm never entirely sure if I like it or not, though I tend to always turn it off after a while. It's not that it's worse, exactly, it just seems wrong.

On the other hand, I'm pretty sure Mac OS X uses sub-pixel rendering, and it looks fine and nice. Maybe they just do it better?
posted by blacklite at 12:22 AM on March 1, 2006


I've used ClearType on my CRT (21", 1600x1200 resolution) and it does make the fonts smoother. I do notice some odd coloring of the letters, but eh, it's an improvement.
posted by Talanvor at 12:41 AM on March 1, 2006


There's a checkbox in KDE but it has no effect, at least not in recent versions of SUSE.

Did you remember to restart your X server? (Ctrl-Alt-Backspace from memory)
posted by salmacis at 2:13 AM on March 1, 2006


Count me in as one who always uses it and it looks spectacular on either a CRT or LCD.

I have a true 24-bit LCD panel, however: the LG.Philips LM201U04, the same one Apple uses for its 20" displays. There are LCDs out there that are not true 24-bit but rather 18-bit, which probably affects how well this technology functions, not to mention the argument about LCDs being worse than CRTs for print. I've found my LCD to be wonderful for colour reproduction for my print jobs.
posted by juiceCake at 6:08 AM on March 1, 2006


Yowza. I'd read about this, but I use Windows so seldom these days that I hadn't bothered. Now that I'm showing stuff to clients on a Windows system, I checked it out -- crazy improvement.

I vaguely remember reading something about this w.r.t. Win:Mac differences. Is this why the "font smoothing" on Macs is so much better than on XP? Or does Apple just use a better regular old smoothing algorithm?

(jam_pony: I don't think you're crazy. Some people just have different perceptual capabilities and sensitivities. I can see beat-frequency flicker on CRTs unless they're exactly 60Hz or very very fast; my fiance gets migraines from strong colognes and perfumes; diff't strokes...)
posted by lodurr at 6:49 AM on March 1, 2006


... also, jam_pony et al, FWIW, I found that there was significant "rainbow" haloing in black-on-white at certain type sizes, after I first ran the wizard; I tweaked it a bit, and got rid of that.
posted by lodurr at 6:55 AM on March 1, 2006


The more fuzzy and blurry they make fonts, the more people crow about how "beautiful" and "smooth" they are. It literally hurts my eyes. The only thing keeping me from switching from Windows to Linux is that Windows allows anti-aliasing to be turned off. Am I the only one who likes clear, sharp fonts?

No, you're not! Windows does it properly: very small and very large fonts are anti-aliased (you couldn't read the small ones without it, and at larger sizes the 'blur' on the letter doesn't look so bad, because it's so small relatively).

But at normal sizes, egads, it's hideous. It makes me feel like there is something wrong with my monitor, or my eyes. Not painful, but incredibly annoying. I can't stand using a Mac with blurry fonts day in, day out.
posted by delmoi at 7:16 AM on March 1, 2006


I vaguely remember reading something about this w.r.t. Win:Mac differences. Is this why the "font smoothing" on Macs is so much better than on XP? Or does Apple just use a better regular old smoothing algorithm?

I dunno, as I said it looks like a blurry muddy mess to me.
posted by delmoi at 7:16 AM on March 1, 2006


By the way, there are two issues with sub-pixel rendering.

Sub-pixel rendering is separate from anti-aliasing. You can do sub-pixel rendering without making your fonts blurry *at all*. With regular font rendering a pixel is either on or off; with sub-pixel rendering you just turn different sub-pixels on or off. Simple, and just as chunky (and sharp!) as before.

Adding anti-aliasing will make the text blurry, whether you're using sub-pixels or not. In theory, an anti-aliased font with sub-pixels could be sharper than one without.
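delmoi's distinction can be shown directly: bi-level sub-pixel rendering keeps every sub-pixel fully on or fully off, it just makes the on/off decision at three times the horizontal resolution. A sketch of that idea, with invented coverage values:

```python
def bilevel_subpixel(coverage_3x, threshold=0.5):
    """Bi-level (non-anti-aliased) sub-pixel rendering: no gray
    levels at all, but each of the three channels in a pixel is
    switched independently, giving 3x effective horizontal
    resolution. Ink is black on a white background."""
    pixels = []
    for i in range(0, len(coverage_3x), 3):
        # Each channel is fully dark (0) or fully bright (255).
        r, g, b = (0 if c >= threshold else 255
                   for c in coverage_3x[i:i + 3])
        pixels.append((r, g, b))
    return pixels

# An edge that covers only the leftmost (red) sub-pixel:
print(bilevel_subpixel([1.0, 0.0, 0.0]))
```

Every output value is 0 or 255, so the result is as "chunky" as bi-level rendering, just positioned in thirds of a pixel.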
posted by delmoi at 7:20 AM on March 1, 2006


Big fucking deal what Microsoft calls it. IBM invented it.

As Gibson rightly points out, the first computer to use sub-pixel rendering was the Apple II.

I remember encountering my first Apple II when I was seven. After a while I noticed that green plus purple somehow equalled white. Man. That blew my fucking puny-ass seven-year-old mind.
posted by suckerpunch at 7:47 AM on March 1, 2006


it looks great, but on my screen (a Dell 2001wfp, the widescreen 21" lcd) you see rainbow effects outside the letters and my eyes start to hurt.
posted by shmegegge at 9:56 AM on March 1, 2006


Accursed rainbows!
posted by TwelveTwo at 10:17 AM on March 1, 2006


Those of you complaining about Linux's font rendering compared to Windows: the main difference is that the TrueType bytecode interpreter is disabled by default in Linux's TrueType library (FreeType), because of patent concerns.

If you want Windows-quality font rendering in Linux, you need to build or obtain FreeType with the bytecode interpreter enabled. This HOWTO explains how in this section, although the whole thing is a worthwhile read.
posted by mendel at 10:17 AM on March 1, 2006


CRTs use this crazy thing called "a gray color" instead of sub-pixels for AA. It works nicely.
posted by skallas at 1:53 PM on March 1, 2006


wow, that's exactly the kind of thing a person would say when they haven't read the article!
posted by shmegegge at 4:17 PM on March 1, 2006


No, the article (which leans heavily in favor of LCD AA) goes out of its way to show the "blurriness" of CRT AA. I'm not seeing it on the handful of CRT displays I use. Sub-pixel AA does look really nice, but the criticism that AA is not for CRTs and looks "blurry" is usually a configuration issue.
posted by skallas at 6:48 AM on March 2, 2006


no, the article says that anti-aliasing on any type of monitor is worse than sub-pixel rendering in terms of producing accurately smooth curved and angled edges. it then goes on to say that sub-pixel rendering doesn't work well on CRTs. I'm not a fan of the former statement, but the latter is true.
posted by shmegegge at 8:18 AM on March 2, 2006


On CRTs the subpixels are not independently addressable, so subpixel rendering won't work on them. That is, you have no way of telling whether the red subpixel for a particular pixel is to the left of the green one, to the right of it, above it, or below it. There might even be more than one red subpixel for a given pixel if the screen resolution is low enough and the dot pitch of the monitor is high enough. So anti-aliasing with grayscale (or more precisely, degrees of transparency with the background) is about the best you can hope for.
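The grayscale blend kindall describes is just linear interpolation between ink and background, weighted by coverage; a one-function sketch (not from the article):

```python
def blend(coverage, ink=0, background=255):
    """Grayscale anti-aliasing: mix ink and background in proportion
    to how much of the pixel the glyph covers (0.0 to 1.0)."""
    return round(coverage * ink + (1 - coverage) * background)

# Fully uncovered, half-covered, and fully covered pixels:
print(blend(0.0), blend(0.5), blend(1.0))
```

Since a CRT's electron beam doesn't care where its pixel's phosphor triads sit, this whole-pixel blend is the finest-grained control available there, which is why grayscale AA is "the best you can hope for" on a CRT.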

On an LCD you really need a digital signal for it to work best. If you have an analog signal it's tough to get the subpixels to line up properly.
posted by kindall at 8:32 AM on March 2, 2006


I've edited the Wikipedia article to include a bit too much detail on the Apple II issue.
posted by kindall at 12:43 PM on March 2, 2006



