New monitor technology brought to light
October 4, 2005 2:08 AM   Subscribe

High Dynamic Range (HDR) imagery rendered in software is only half the story if your monitor can't actually display that full dynamic range. Bit-Tech has an excellent article on an actual HDR-capable display brought out by a crowd called Brightside Technologies (formerly Sunnybrook Technologies).

Needless to say, you want this. And it can be had, for the bargain price of $50,000 USD.

Here's more on various HDR display technologies. Brightside HDR is also covered at HardwareSecrets and Tom's Hardware Guide. The Max Planck Institut has their take on HDR, Hyperfocal Design has a few good links if you're interested in trying your hand at HDR content creation, and here's Brightside's own take on HDR and gaming.
posted by crocos (35 comments total)
 
I bet this will make Doom 5 or 6 really, really horrifying.

(I say 5 or 6 because I'm trying to allow time for the cost to drop enough that your average power gamer can afford such technology. Am I too early? Maybe 7 or 8?)
posted by deusdiabolus at 2:37 AM on October 4, 2005


I've long thought that monitors are due for a big generational change sometime; what we have now, outside of the LCD transition, which was welcome but more incremental, isn't all that different from the Trinitron wave of the late 80s.

Computers are nominally 100x faster now compared to 1989, and hard disks and memory have scaled even more, but DPI has only gone up from ~70 to ~100 and brightness/contrast isn't much better than Trinitron (though the newest LCDs are looking better).

I've got a nice LCD flat panel and laptop from 2002, and I don't intend to replace either until displays are more than incrementally better.
posted by Heywood Mogroot at 2:41 AM on October 4, 2005


Well, it was 10 years between Doom 2 and 3. Maybe Doom 3.5? Technology drops in price rather quickly these days.
posted by neckro23 at 2:42 AM on October 4, 2005


Whoa. Just read through that article on the Brightside, and I'm a bit in shock here. Now I just need to come up with $50K to acquire this magical god-box of image happy making.

I eagerly await the point where displays of this beauty become available. Not sure if this exact kind will ever be feasible, or if some other advance will allow mass-produced HDR displays, but I suppose I don't really care. All I know is that I want that thing.
posted by Stunt at 2:54 AM on October 4, 2005


What about the contrast between HDR and a traditional CRT? Glancing through the article I didn't notice any comparison. Doesn't CRT have the same advantage as Brightside since each pixel is individually lit? How come there aren't High Definition CRTs?
posted by furtive at 3:47 AM on October 4, 2005


Sorry, this is all way over my head. How is HDR different from HDTV? It must be, with a price tag like that.
posted by zardoz at 5:43 AM on October 4, 2005


High dynamic range introduces some new problems. As with audio (do you really want a cannon going off in your living room that is as loud as a real cannon?), the same question arises if your display is bright enough to cause discomfort... Good problems to have, of course, but end users will be frustrated by these things.
posted by Chuckles at 5:57 AM on October 4, 2005


If you think that's hot, Sharp just announced a panel with a 1,000,000:1 contrast ratio.

zardoz, HDR is the next step beyond HDTV. HDTV brings an increase in resolution: more pixels in the image means higher visual clarity. But the contrast ratio is still low, and can't come even close to approximating the light levels we see in real life. Here is a fantastic introduction to HDR. Read that and you'll understand why HDR will be the next breakthrough for photography and video. Imagine a camera with such a wide, sensitive exposure range that in a single click you can capture perfect detail from shadows to highlights. Some academics and artists are already employing HDR techniques, such as this guy who was the subject of this MeFi post.

Chuckles, you bring up a very good point. People are already somewhat annoyed with "boomy" movie soundtracks when watching them at home. Fortunately home theater sound standards like Dolby Digital, which have a pretty sick amount of dynamic range, can compress things to be less boomy. Hopefully future consumer HDR systems will employ such compression so things still look awesome while you don't have to wear sunglasses.
posted by zsazsa at 6:03 AM on October 4, 2005


I've seen a smaller HDR LCD once. It's better than you could imagine.

It's like a set of small stereo speakers vs a nice speaker stack that has a full range of frequency response. Small speakers can offer great sound, but usually lack the low lows and the cleanest highs.
posted by tomplus2 at 6:09 AM on October 4, 2005


What about the contrast between HDR and a traditional CRT? Glancing through the article I didn't notice any comparison. Doesn't CRT have the same advantage as Brightside since each pixel is individually lit? How come there aren't High Definition CRTs?

That's a very good point. Comparing it with another LCD is cheating, since they're traditionally rubbish. I'd reckon a CRT with HDR images correctly fed into it could easily compete with the Brightside monitor. This whole thing is more of a hack to make LCDs less rubbish than it is to improve display technology.
posted by cillit bang at 6:09 AM on October 4, 2005


Well, I've been enjoying the poor man's version of HDR lately with Day of Defeat: Source (Half-Life 2 mod). I can only really approach the tech from a gamer's standpoint, and it's okay. Neat eye candy, a little bit more realistic, but almost distracting when glaringly obvious.

I just upgraded my monitor, but I embrace the notion of better displays, even though M$ is planning on using future monitors to enforce DRM and degrade the quality of illicit (or seemingly illicit) video. It will be supported by the operating system, and, if they go forward with the plan, will be a thorn in people's side for a while to come.

HDR displays sound great, though.
I just got over the hump of waiting for HDTVs to become affordable, now this? I'm keeping my bloody 27" NTSC until it blows itself up.
posted by Busithoth at 6:25 AM on October 4, 2005


I've also been enjoying the HDR effects of Valve's Day of Defeat: Source mod for the Half-Life 2 engine. Soon, Valve will be releasing a 'tech demo' for HDR named The Lost Coast.

With Day of Defeat: Source, I've found, like Busithoth, that the effect is neat...but a bit more gee-whiz than revolutionary. But no doubt - this is the next 'big thing' we'll be seeing in PC gaming, most likely because it's not a major tech upgrade. Valve licenses its technology, so if competing development houses can't buy it, I'm sure they have something in the labs right now to compete.

However, I will give the HDR tech a nice nod, seeing as how I have had one spectacular experience in Day of Defeat: I was sniping from a second-story, bombed-out building...and the sun blinded me through my scope when I turned to look down one tight alley. I was killed by my target...and after the initial curse, I realized, 'I've just been blinded by the sun...in a videogame.'
posted by NationalKato at 6:57 AM on October 4, 2005


So how can I accomplish this with my photography? I'm confused.
posted by VulcanMike at 7:14 AM on October 4, 2005


How come there aren't High Definition CRTs?

There are tons of them. This one is sitting in my living room right now.
posted by SweetJesus at 7:16 AM on October 4, 2005


CRTs could display HDR as well, but your standard CRT has only been designed to expect an 8-bit/channel input signal and map that to particular light intensities (this is what monitor gamma curves are all about). Even if your brights were brighter, you'd still only have 8 bits in = 256 signal values to map from, so any bright light source in an image wipes out detail in shadows, like in photos taken with the sun behind the subject. Most photo/video cameras simulate an iris, and automatically readjust their sensitivity levels to keep as much of the image data as possible in the range that can be captured and stored in that medium.

With HDR, the input/output bandwidth is 12-bit or 16-bit per channel, so you have a greater range of values that can be used to store brightness info, and you can preserve the detail in both bright areas and dark areas and re-display that broader output range as well.

Essentially, your own iris does more of the work, rather than having the image pre-adjusted by the camera or the display. It's a very different thing to look at.
posted by arialblack at 8:44 AM on October 4, 2005
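
(To put rough numbers on arialblack's point: a minimal Python/NumPy sketch, using a hypothetical 3000 cd/m² peak and a generic 2.2 gamma rather than any real monitor's measured transfer curve, of how coarse 256 input codes become once the output range gets large.)

```python
# Illustrative only: an 8-bit signal gives 256 codes to span the display's whole
# output range no matter how bright that range is, while 12/16-bit signals give
# far finer steps. Peak luminance and gamma below are assumptions, not measurements.
import numpy as np

peak_luminance = 3000.0   # cd/m^2, hypothetical HDR-class peak
gamma = 2.2               # generic CRT-style transfer curve

for bits in (8, 12, 16):
    codes = np.arange(2 ** bits) / (2 ** bits - 1)   # normalized signal values
    luminance = peak_luminance * codes ** gamma      # decoded output levels
    step = np.diff(luminance).max()                  # coarsest step (near white)
    print(f"{bits:2d}-bit: {2 ** bits:6d} levels, largest step = {step:.2f} cd/m^2")
```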


Vulcan - either buy a new camera, or (if your subject permits) take a series of bracketed exposures until you've captured a satisfactory range of detail in every area of the shot. Those can then be reassembled in software into a single HDR image.
posted by arialblack at 8:59 AM on October 4, 2005
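
(For VulcanMike and anyone else who wants to try this: a minimal sketch of the bracketed-exposure merge arialblack describes, using OpenCV's Debevec routines. The filenames and shutter speeds are placeholders for your own bracket; any HDR-capable tool will do, this is just one common library route.)

```python
# Sketch: merge bracketed exposures into one floating-point HDR image with OpenCV.
# The input files and exposure times are hypothetical placeholders.
import cv2
import numpy as np

files = ["shot_-2ev.jpg", "shot_0ev.jpg", "shot_+2ev.jpg"]   # your bracketed shots
times = np.array([1/250, 1/60, 1/15], dtype=np.float32)      # matching shutter speeds (s)

images = [cv2.imread(f) for f in files]

# Recover the camera response curve, then merge into a 32-bit radiance map.
response = cv2.createCalibrateDebevec().process(images, times)
hdr = cv2.createMergeDebevec().process(images, times, response)

cv2.imwrite("merged.hdr", hdr)   # Radiance .hdr keeps the full dynamic range
```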


Yeah. I saw the Brightside display a few months back at SIGGRAPH. Completely, utterly fantastic.

Side note: HDRShop
posted by effugas at 9:21 AM on October 4, 2005


arialblack: CRTs are analog. They can easily render 16-bit dynamic range.

The whole concept is bullshit. Any display device, even the Brightside, can fundamentally only display a range of colors between black (~0%) and white (100%). The problem is, normal practice in TV studios is to adjust the camera iris so that bright parts of the image are around 80-90%, which inevitably means the brightest parts of the image sometimes go over 100% and get clipped. All HDR amounts to is turning that iris knob down so that there's headroom for the very bright parts. To compensate, you turn the brightness of the display way up. No buzzwordy new hardware required.
posted by cillit bang at 9:33 AM on October 4, 2005


cillit bang, you misunderstand colour reproduction... What is the difference between the sun and a lightbulb if they are both just 100% white?
posted by Chuckles at 10:33 AM on October 4, 2005


With current cameras, HDR images are accomplished by merging multiple photographs taken at different exposures. You take one picture exposed for the highlights (which plunges the rest of the image into darkness), one exposed for the mids, and one exposed for the shadows (which blows out all the highlights). Photoshop CS2 has a plug-in that helps you intelligently merge the images so that you keep all the detail of the entire scene.
posted by pmbuko at 10:51 AM on October 4, 2005
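
(Once the exposures are merged, the 32-bit result still has to be squeezed back down to view it on an ordinary 8-bit display, which is roughly what the tone-mapping half of Photoshop's merge dialog is for. A minimal sketch with OpenCV's Reinhard operator, assuming a merged.hdr file like the one from the merge sketch above; this is not Photoshop's actual algorithm.)

```python
# Sketch: compress a 32-bit HDR radiance map for display on an 8-bit monitor.
import cv2
import numpy as np

hdr = cv2.imread("merged.hdr", cv2.IMREAD_ANYDEPTH | cv2.IMREAD_ANYCOLOR)

tonemap = cv2.createTonemapReinhard(gamma=2.2)   # one common global operator
ldr = tonemap.process(hdr)                       # float values roughly in [0, 1]

out = np.clip(ldr * 255, 0, 255).astype(np.uint8)
cv2.imwrite("tonemapped.jpg", out)
```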


Digital cameras only, of course.
posted by pmbuko at 10:53 AM on October 4, 2005


...but a bit more gee-whiz than revolutionary.

Have you played for a while with it enabled, then tried it again with it turned off? I have, and it really isn't the same anymore.

Furthermore, remember that DOD: Source really isn't a FULL implementation of True HDR. The Lost Coast demo will reveal a bit more about some of the other features of HDR. So I'll wait to pass judgment until then.

Either way, much like anti-aliasing...it's a welcome improvement to my gaming atmosphere. I love it.
posted by mr.curmudgeon at 10:55 AM on October 4, 2005


What is the difference between the sun and a lightbulb if they are both just 100% white?

Where did I say that?
posted by cillit bang at 11:29 AM on October 4, 2005


You didn't. However, I didn't put quotation marks around it and I did put a question mark at the end, so I'm not sure why you ask.

The point is that dynamic range is real, and 100% white doesn't adequately describe a light source.
posted by Chuckles at 11:44 AM on October 4, 2005


I wasn't talking about light sources; I was talking about displays, in the sense that the Brightside thing is just a slightly brighter TV, and nothing revolutionary.
posted by cillit bang at 11:53 AM on October 4, 2005


All they are doing is PWMing a grid of white LEDs? This isn't revolutionary at all; it's just a new product that uses existing technology. Hopefully this means it will be available pretty soon, because I'd love one. I'd be surprised if other companies aren't working on the same thing, given its relative simplicity.
posted by MillMan at 12:17 PM on October 4, 2005
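
(For what it's worth, the "grid of white LEDs behind an LCD" idea amounts to two stacked modulators whose outputs multiply. A toy sketch of that dual-modulation split, heavily simplified: it ignores the optical blur of the backlight and is not BrightSide's actual control algorithm.)

```python
# Toy dual-modulation split: a coarse LED backlight grid times a fine LCD panel.
# Scene, zone size, and peak luminance are made up for illustration.
import numpy as np

target = np.random.rand(240, 320) ** 4 * 3000.0   # fake scene luminance, cd/m^2
zone = 40                                         # one LED covers a 40x40 pixel zone

# Drive each LED bright enough for the brightest pixel in its zone...
leds = target.reshape(240 // zone, zone, 320 // zone, zone).max(axis=(1, 3))
backlight = np.kron(leds, np.ones((zone, zone)))  # upsample LED grid to pixel grid

# ...and let the LCD attenuate within the zone (transmittance between 0 and 1).
lcd = np.divide(target, backlight, out=np.zeros_like(target), where=backlight > 0)

reconstructed = backlight * lcd
print("LED grid:", leds.shape, "max error:", np.abs(reconstructed - target).max())
```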


But how much brighter is slightly? The Dell 2005FPW has a brightness of 300 cd/m² while the BrightSide DR37-P has a brightness of 3000 cd/m². I don't know exactly how the eye perceives brightness, but it isn't likely to be linear, so the DR37-P isn't likely to appear 10x brighter. Nonetheless, to suggest that it is just bullshit is really kind of silly.

The real question, which I alluded to earlier, is how bright does it have to be before improved performance is no longer useful - there is no point in having a monitor that can blind you. I know my Compaq P110 (late-'90s 21" Trinitron CRT) isn't bright enough. I have spent a few minutes with the 2005FPW and it is better...

Now we do have to come back to bit depth too. Just being able to produce 3000 cd/m² isn't enough; the steps between brightness levels have to be near or below the threshold of perception or you will start to see artifacts (colour bands, presumably others). That is why 16 bits comes into the discussion... The higher bit depth is in some sense a consequence of being able to produce more brightness.
posted by Chuckles at 12:26 PM on October 4, 2005
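
(A back-of-the-envelope check on the point that 10x the luminance won't look 10x brighter, using a cube-root brightness approximation; the real response depends heavily on adaptation and viewing conditions.)

```python
# Rough perceptual comparison of the two panels mentioned above.
dell_2005fpw = 300.0       # cd/m^2
brightside_dr37p = 3000.0  # cd/m^2

# Perceived brightness grows roughly with the cube root of luminance.
ratio = (brightside_dr37p / dell_2005fpw) ** (1.0 / 3.0)
print(f"~{ratio:.1f}x apparent brightness for 10x the luminance")  # ~2.2x
```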


The sun is pretty blue, not white. That's your eyes white balancing.

/derail
posted by lumpenprole at 12:33 PM on October 4, 2005


I just googled up an article comparing display technologies from ExtremeTech that looks very good.
posted by Chuckles at 12:43 PM on October 4, 2005


I'm sure the Brightside display is much brighter, in absolute terms, than other displays. But in a darkened room, where the screen is the only brightness reference point, absolute brightness isn't at all important. You could do the same "HDR" demo with an ordinary CRT and any under-exposed (as in less-exposed-than-normal) image, provided the shadow detail in the source image hadn't been crushed.
posted by cillit bang at 12:59 PM on October 4, 2005


Funny... I don't agree at all. I have never owned a display device that created as much dynamic range or absolute brightness as a movie (let's say a 70mm movie on a moderate-sized screen at a properly set up theatre, but I don't really know what a proper reference would be) and I think that even more dynamic range than that is probably desirable.

It is impossible to demo, of course; you can't possibly see the benefit of the extended dynamic range on a normal display device (ya ya, what is extended, what is normal). It is like those stupid Bose commercials advertising the sound quality of their speakers. Duh - how is that supposed to work...
posted by Chuckles at 1:42 PM on October 4, 2005


Ah, I think you're talking about film -> video transfers, which generally do squish a lot of "dynamic range" out of the original. But it's not inherent to the display device, just an attempt to make the picture look OK in the non-ideal conditions DVDs are normally watched in. This is about methodology rather than technology.
posted by cillit bang at 2:06 PM on October 4, 2005


I don't think it has anything to do with the transfer at all. There are at least three significant factors - camera, storage and playback - and they all matter. I saw a DVD projected with dual CRTs once... The same DVD would not have looked as bright with as much contrast on my monitor. Now the 2005FPW might have looked as good as the dual CRTs. On the other hand, the 2005FPW seems to show a little more colour banding than my monitor when looking at gradual gradients...

Anyway, this is getting silly. Current display technology doesn't come close to duplicating reality (ever hear of "the absolute sound"? How about "the absolute video"?). To what extent reality is actually desirable in a display is an interesting question, but at this point I would like a little more.
posted by Chuckles at 3:15 PM on October 4, 2005


cillit - I didn't mean to imply that the CRT as a technology couldn't, but that a run-of-the-mill CRT monitor expecting 8-bit VGA doesn't need to be held to the same standard in range and acceptable noise as one expected to reproduce finer differences in signal level over a wider range of contrasts - same for the converter on the gfx card and the cables. If you're losing the extra range between the computer and the screen, then it's just... brighter and higher contrast.

Can we just fold this in with that audiophile thread from last week?
posted by arialblack at 4:31 PM on October 4, 2005


zsazsa: From the last page of the Bit-Tech article:

STOP PRESS Sharp have announced an LCD with a claimed 1,000,000:1 contrast ratio. However, its peak luminance is just 500 cd/m², meaning all their engineers have done is reduce the minimum luminance to as low as 0.0005 cd/m². Of course, BrightSide's approach is superior at both ends of the spectrum: it can deliver 0 cd/m² blacks while simultaneously producing highlights at 4000 cd/m² - eight times the peak luminance of Sharp's LCD.
posted by crocos at 7:08 PM on October 4, 2005




This thread has been archived and is closed to new comments