Pixel Pickle
June 26, 2010 10:36 PM

Editors of the pop-culture magazine Wired gave the title "iPhone 4’s ‘Retina’ Display Claims Are False Marketing" to a highly critical article about the new iPhone's high-resolution "Retina" display, so called because the human eye supposedly cannot resolve its individual pixels at a normal viewing distance. A technician who worked on the Hubble telescope disagreed with the Wired editors' choice of rhetoric in very strong technical terms, while taking milder issue with Raymond Soneira, the display expert whose analysis the piece was based on. Neuroscientist and photographer Bryan Jones published his own highly readable technical analysis of the display's pixel arrangement, which helped him decide whether Apple's claim holds up.
posted by Blazecock Pileon (61 comments total) 6 users marked this as a favorite
 
Yep Wired sure does love those zingy headlines
posted by seagull.apollo at 10:48 PM on June 26, 2010


I've read some of them already, and think Wired's headline was exaggerating something at the hair-splitting level. From all reports the display is the best one on a smartphone today. I use Apple products, although not an iPhone, and while I don't always agree with Apple's actions and feel they shouldn't be immune to criticism, I'm happy they are around pushing everyone to make better phones, computers, and interfaces.

But at this point it seems like you're shilling for Apple.
posted by 6550 at 10:54 PM on June 26, 2010 [1 favorite]


I think the links are pretty interesting parts of the web that go beyond marketing and into some fun science. I invite you to take the time to read through them and skip over the thread if it doesn't interest you.
posted by Blazecock Pileon at 10:59 PM on June 26, 2010 [3 favorites]


Wait, so, Apple actually figured out "zoom and enhance!"???
posted by cthuljew at 11:01 PM on June 26, 2010 [1 favorite]


Yep Wired sure does love those zingy headlines

They sure do. Lest we forget the "Why the Japanese Hate the iPhone" article from early last year, when ... (ahem) ... "Apple iPhone Captures 72% of Japan Smartphone Market".
posted by cgomez at 11:07 PM on June 26, 2010 [1 favorite]


The Bryan Jones link is great.
posted by seagull.apollo at 11:07 PM on June 26, 2010


Hm, I think Bryan Jones dropped a factor of two: for a display to show X cycles/arcminute, it needs to have 2X pixels/arcminute. (You can think of this intuitively as needing one black pixel and one white pixel to make a single line, or more formally as the Nyquist sampling criterion.) You'll notice that his and WIRED's numbers are about a factor of two apart. So you may have to hold the phone as much as 18-20 inches from your eye in order to be theoretically unable to see the pixels. Oh noes.
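
As a back-of-the-envelope check on that factor of two, here's a rough sketch (the 326 ppi figure is Apple's published spec; the 12-inch viewing distance and the acuity numbers in the comments are my assumptions, not hattifattener's):

    import math

    ppi = 326          # iPhone 4 pixel density (Apple's published figure)
    distance_in = 12   # assumed viewing distance in inches

    # How many pixels fit inside one degree of visual angle at that distance.
    pixels_per_degree = ppi * 2 * distance_in * math.tan(math.radians(0.5))

    # Nyquist: the finest alternating light/dark pattern the panel can show
    # is one cycle per two pixels.
    cycles_per_degree = pixels_per_degree / 2

    print(round(pixels_per_degree))  # ~68 pixels per degree
    print(round(cycles_per_degree))  # ~34 cycles per degree, vs. ~30 for 20/20
                                     # vision and ~50-60 for very sharp eyes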

Regardless, I've toyed with a coworker's, and it is a very nice display.
posted by hattifattener at 11:13 PM on June 26, 2010 [1 favorite]


I can't see the pixels on my 3GS. Do I need glasses?
posted by furiousxgeorge at 11:24 PM on June 26, 2010 [4 favorites]


You guys know that Apple is a giant multinational corporation with an ad budget pushing a billion US dollars, and it doesn't need you ordinaries to defend it from every possible criticism of any of its products, services, business methods or any other aspect of its gigantic, near-Exxon-sized business?

Right?
posted by dirigibleman at 11:28 PM on June 26, 2010 [12 favorites]


Great link, cgomez. I have never seen such an erratum statement on an article before. In summary it says "our reporting on this article was both incompetent and dishonest, contains more fuckups than actual words and is printed on paper too slick to wipe your ass with so it's not even good for that. But we stand by its conclusions in its current form, which is not as originally published in ways we didn't even want to admit until you made us." I'd say it's pure Wired, but I've never been able to stand the thing so I probably am not the best person to judge.
posted by George_Spiggott at 11:35 PM on June 26, 2010 [1 favorite]


Right?

Right, if by "right?" you mean "I seem to be taking this a bit too seriously"
posted by setanor at 11:38 PM on June 26, 2010


It's relatively easy to read the New York Times front page on the iPhone 4 without zooming in. The text of non-feature articles looks to me like it's around 4pt type, and it's smooth and readable. Regardless of their visibility to the human eye, the pixels are indeed aplenty. It's the nicest display I've used on any device. As floam said, I think our eyes will all rejoice when someone releases a laptop display with a similar pixel density.
posted by pkingdesign at 12:36 AM on June 27, 2010


Calling it a 'retina' display because under the right conditions the human retina doesn't have enough resolution to distinguish nearby pixels is like saying a 1 megapixel camera has an 'optic nerve' sensor because the human optic nerve has about a million fibers.
posted by Pyry at 1:15 AM on June 27, 2010 [4 favorites]


The link to the Bryan Jones article (last link above) didn't work for me. I googled and got this. Interesting stuff! Is the resolution capacity of the average eye .78 arcminutes or should we be looking at the ideal .6 (as the Bad Astronomy link says, before it suggests 1 arcminute as the universal)? Anyway, in the link I've posted (and maybe elsewhere) Soneira says:
My analysis and comments on the Retina Display were widely distorted and transformed into an attack on Apple and Steve Jobs — they were not. I simply did a quantitative analysis of what was said in the context of my campaign to eliminate (or more realistically reduce) exaggeration in display specs. Apple's claim falls under glorious wording rather than numerical spec abuse — and even with quantitative analysis it's minor compared to what other manufacturers are saying. I sent Steve Jobs an Email explaining that and got a reply from him.
But, of course, Soneira is using a different metric to determine resolutionality.
This was very interesting and informative.
(And I heart the term "glorious wording")
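
For what it's worth, here's a rough sketch of how those acuity figures translate into required pixel density at a foot (my arithmetic, assuming one pixel per resolvable arcminute and setting aside hattifattener's factor-of-two point above):

    import math

    distance_in = 12  # assumed viewing distance in inches

    for acuity_arcmin in (1.0, 0.78, 0.6):
        # Size of the smallest resolvable detail at this distance, in inches.
        detail_in = distance_in * math.tan(math.radians(acuity_arcmin / 60))
        print(f"{acuity_arcmin} arcmin -> {1 / detail_in:.0f} ppi needed")

    # 1.0 arcmin -> ~286 ppi, 0.78 -> ~367 ppi, 0.6 -> ~477 ppi. The iPhone 4
    # is 326 ppi, which is why the verdict depends on which figure you pick.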
posted by CCBC at 1:33 AM on June 27, 2010 [1 favorite]


Nerds dissecting authenticity of claims to gadget-lust? Sure, we got that. You want that sensationalized? Highly technical? Accessible? No, seriously--you have OPTIONS.
posted by millions at 1:53 AM on June 27, 2010


I think Apple's claims are reasonable enough, but I don't like the branding. When I hear "retina display" I think of this kind of thing, not just a really hi-res phone screen. This is the most annoyingly confusing piece of tech-branding since the nothing-to-do-with-Java "JavaScript".
posted by L.P. Hatecraft at 3:12 AM on June 27, 2010 [1 favorite]


I'm more OK with "Retina display" than I am with Apple's advertising making the claim that the iPad is "magical." That takes some balls.
posted by emelenjr at 4:39 AM on June 27, 2010 [3 favorites]


I got an iPhone 4. A DPI this high is indeed pretty rad. I've never seen text that looks this good. I just want to know why my fricking two thousand dollar laptop can't do this. For many years things were stuck at <>

300 DPI @ 15 inches using a 1.6:1 display would be quad HD (3840x2400). For those of you playing the home game, that's 70 megs for a 32-bit double buffered frame buffer before even considering the massive texture sizes required for the windows themselves in any sort of backing store.

The iPhone 4 in comparison uses 4.6 megs for a double buffered frame buffer and doesn't really require a backing store because there aren't multiple windows to composite.
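
If anyone wants to check that arithmetic, a quick sketch (standard 4 bytes per pixel, double buffered; the dimensions are the ones above):

    def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=2):
        return width * height * bytes_per_pixel * buffers / 2**20

    print(framebuffer_mb(3840, 2400))  # ~70 MB for the quad-HD laptop case
    print(framebuffer_mb(960, 640))    # ~4.7 MB for the iPhone 4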

posted by Talez at 5:52 AM on June 27, 2010


I'm more OK with "Retina display" than I am with Apple's advertising making the claim that the iPad is "magical." That takes some balls.

I dunno, I set one of my preferences wrong on my iPad and blighted my neighbour's crops something terrible. Mind, he's not likely to notice now he's so obsessively in love with his pig.
posted by reynir at 6:32 AM on June 27, 2010 [15 favorites]


Talez, fwiw the way Core Image works is by using a whole lot of backing store ... one for each non-opaque subview making up the composite window. There are optimisations, but, yeah, there is a lot of memory being used to make that screen look good and animate fluidly. This, and multitasking, are why the phone got pushed to 512MB of RAM. Let me tell you that creating compelling imagery on the pad with a 3MB screen buffer and an OS that gets pissed when one uses more than 50MB or so for everything is freaking hard. The tricks and optimizations I have had to make for Paintbook to allow fast compositing of an arbitrary number of drawing layers are some of the gnarliest code I've dealt with.
posted by seanmpuckett at 6:40 AM on June 27, 2010


What's an iPhone?
posted by Fizz at 6:46 AM on June 27, 2010 [1 favorite]


I've been of the opinion for many years that displays should be 300dpi. That's the resolution that you need on paper for professional-level presentation. You get minor improvements in readability by going to 1200dpi, but 300dpi images look fantastic, especially for text. You have to stare very hard at 300dpi and 1200dpi samples of the same text to figure out which is which.

I spent many many hours reading 300dpi text in my youth, and I'd dearly love to have the same resolution on a big 30" screen.

And yes, I understand that this would require roughly 7800x4500 resolution for a 30" screen. So get with it already, LCD manufacturers! My eyes aren't getting any younger.
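
A rough sketch of where a figure like that comes from (my numbers, assuming a 30-inch diagonal; 16:9 and 16:10 shown for comparison):

    import math

    def pixels_at_dpi(diagonal_in, aspect_w, aspect_h, dpi=300):
        # Split the diagonal into width and height for the given aspect ratio.
        diag_units = math.hypot(aspect_w, aspect_h)
        width_in = diagonal_in * aspect_w / diag_units
        height_in = diagonal_in * aspect_h / diag_units
        return round(width_in * dpi), round(height_in * dpi)

    print(pixels_at_dpi(30, 16, 9))   # (7844, 4412) -- roughly 7800x4500
    print(pixels_at_dpi(30, 16, 10))  # (7632, 4770)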

Remember that old feeling of awe we used to get so frequently in computers, when we'd see something that would just blow us away with its beauty? A 30" screen at 7800x4500 resolution would have unbelievable "wow!" power.
posted by Malor at 7:07 AM on June 27, 2010 [3 favorites]


I am pretty sure that discrete pixel displays are not the right approach for super-high-res displays. A raster-scanning laser would permit essentially unlimited resolution, limited only by MEMS and driver response times. An optically folded system would permit an acceptably flat, self-contained desktop unit, although my ideal display is just a flat wall and a projector on the ceiling.
posted by seanmpuckett at 7:21 AM on June 27, 2010 [1 favorite]


Mod note: few comments removed - metatalk is your option.
posted by jessamyn (staff) at 7:47 AM on June 27, 2010


I think the problem is that people have always shopped for displays based on screen size. They'll buy a 30" or 36" display over a 20" or 24" without considering that the resolution might be the same on each, and that the image might actually look worse on the larger one. They just think "bigger=better". You can also judge size even if the monitor is turned off or connected to a bad source.

It seems to be changing though. I'm not sure it makes sense to have more than about a 20" or so monitor on a desktop (widescreen, 16:9), or 15-17" on a laptop ... so manufacturers are going to have to compete on something other than raw size. Resolution, which is more subtle than size but can still impress, seems like the next logical choice.

I'm no fan of the iPhone but I'm glad Apple is pushing things up from the arbitrarily low ~96 dpi de facto standards that have prevailed for so long.
posted by Kadin2048 at 7:50 AM on June 27, 2010


Jeez I was just going to bemoan the total lack of "fanboi" comments, and now jessamyn comes in and says she's cut em out.
posted by nevercalm at 7:56 AM on June 27, 2010


I didn't cut out any fanboi comments, just a lot of crappy snarking at one user. Everyone knows where MetaTalk is.
posted by jessamyn at 7:58 AM on June 27, 2010


I'm more OK with "Retina display" than I am with Apple's advertising making the claim that the iPad is "magical." That takes some balls.

I know! I mean I've played with one in the Apple Store like 15 or 20 times and I still can't decide if I want one. Sure it opens webpages faster than the laptop next to it, and it is location and movement aware, and you can actually download and watch movies, read books, create text, and a shitload of other things no one could imagine even 3 years ago, but "magical" is way out of line! I mean the damn thing doesn't even do Flash and there is no porn in the iTunes Store. Magical my ass.
posted by cjorgensen at 8:07 AM on June 27, 2010 [3 favorites]


I agree with what seems to be the consensus - that the display is nice; the difference between its resolution and human acuity at a distance of one foot is negligible; and that Wired overcooked its headline. But Soneira seems to be a bit of a purist. In his article on the Quattron four-color LCDs he assumes that the goal of the manufacturer ought to be to provide "accurate picture quality where the HDTV is showing the same picture seen at the production studio". I rather think their goal ought to be to produce an HDTV with a display that looks good in a living room, affected as it is by both the wall behind the TV and the room's lighting. I don't know whether exaggerated yellows would in fact look better against an off-white wall illuminated by tungsten lighting (each of which is slightly yellow), but I wouldn't dismiss the possibility.
posted by Joe in Australia at 8:17 AM on June 27, 2010


Do I have to remove my glasses for Apple to be right?
posted by pashdown at 8:24 AM on June 27, 2010


I love that the definition changes by distance. Whatever monitor you're using is a "retina display"...just get far enough away that you can't see the lines between the pixels.

...but yeah, nice display apple. you may have a cookie in hi-res.

(still gonna get a droidx)
posted by varion at 8:39 AM on June 27, 2010


and you can actually download and watch movies, read books, create text, and a shitload of other things no one could imagine even 3 years ago,

... What?
posted by Malice at 8:51 AM on June 27, 2010 [1 favorite]


I got an iPhone 4. A DPI this high is indeed pretty rad. I've never seen text that looks this good. I just want to know why my fricking two thousand dollar laptop can't do this. For many years things were stuck at <>
Well, your laptop display is probably a lot larger. It probably has more total pixels than your iPhone's display. There are also some Android phones that have similar display resolutions (250dpi rather than 300), and apparently some Windows Mobile phones have display resolution that's just as high, but then you'd be using Windows Mobile.

I really do like this trend of high pixel density displays. But it seems like this would be really easy to test. Just show a grid pattern and see if you are able to see individual pixels.
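
That grid test is easy to generate; a minimal sketch (assuming the Pillow imaging library and the iPhone 4's 960x640 panel, both my assumptions): view the result at 100% zoom and back away until the checkerboard dissolves into flat grey.

    from PIL import Image

    W, H = 960, 640  # iPhone 4 panel resolution
    img = Image.new("L", (W, H))
    # One-pixel checkerboard: alternating black and white single pixels.
    img.putdata([255 * ((x + y) % 2) for y in range(H) for x in range(W)])
    img.save("pixel_grid_test.png")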

Anyway, as to whether this post is shilling, it's very much "The iPhone 4: Awesome display or the AWESOMEIST DISPLAY POSSIBLE". Kind of annoying.
300 DPI @ 15 inches using a 1.6:1 display would be quad HD (3840x2400). For those of you playing the home game, that's 70 megs for a 32-bit double buffered frame buffer before even considering the massive texture sizes required for the windows themselves in any sort of backing store.

The iPhone 4 in comparison uses 4.6 megs for a double buffered frame buffer and doesn't really require a backing store because there aren't multiple windows to composite.
Windowing systems don't use textures to store background windows; the operating system directs the application that owns that window to redraw itself when more of it is shown. That's why on old PCs the windows would flicker when you moved them around. Modern OSes might cache window displays, though, since they have so much memory.

And 70 megabytes is not a lot for a modern graphics card. Video cards with half a gigabyte of RAM run about $29 on Newegg, and video cards with 2GB cost about $99. Modern video cards have no trouble running multiple monitors these days, and ATI "Eyefinity" cards can now drive up to six monitors, easily pumping out that kind of resolution without breaking a sweat. The limitation is the monitors, not the graphics cards.
posted by delmoi at 9:11 AM on June 27, 2010 [1 favorite]


Modern OSes might cache window displays, though, since they have so much memory.

From what I know, OS X has moved to a process much like that, where it sets aside video memory for each window, if you have enough available, and is then able to move windows around without needing application assistance. If you don't have enough, it can store the backing in system RAM, which slows things down a bunch, as it has to copy in multiple megs every time you move a window. Video memory is surprisingly important to routine use of a Mac, which is one of the reasons I've always found it odd that Macs come with such crappy video cards, relatively speaking.

I think Win7 is doing a similar thing with Aero, though I don't know the details.
posted by Malor at 9:31 AM on June 27, 2010


I would just like to say that while I am not a fan of Apple, I am a fan of retinas. I own 2 of them, and have found them useful and easy to operate. My only complaint would be one of the retina's accessories, the lens, which in the model I have seems to be of somewhat poor build quality. I had to purchase a fix at some expense. Nonetheless a useful gadget overall.
posted by Xezlec at 9:35 AM on June 27, 2010 [13 favorites]


And 70 megabytes is not a lot for a modern graphics card. Video cards with half a gigabyte of RAM run about $29 on Newegg, and video cards with 2GB cost about $99. Modern video cards have no trouble running multiple monitors these days, and ATI "Eyefinity" cards can now drive up to six monitors, easily pumping out that kind of resolution without breaking a sweat. The limitation is the monitors, not the graphics cards.

That's texture memory, though. It's not for pixel resolution. Graphics cards don't push out, say, a 1600x1200 bitmap however many times a second. They draw shapes with textures on them.

Look at it this way. 3840x2400x32 is 294,912,000 bits of data per frame. To get 30 frames per second, you need to be processing 8.8 billion bits per second. Even assuming that determining the color of a single pixel is something you can map to a single processor calculation, not a lot of systems can operate in the 8.8GHz range.
posted by kafziel at 9:39 AM on June 27, 2010


Haters gonna hate.
posted by Threeway Handshake at 9:46 AM on June 27, 2010 [1 favorite]


kafziel: actually, that's what happens, and much faster than that. Except that it's massively parallelized, to 128 bits per clock, times 8, 16, 24, sometimes upwards of hundreds of "texture units". That's why high end graphics cards can draw upwards of 300 watts -- they're ripping through terabits per second.
posted by seanmpuckett at 9:49 AM on June 27, 2010


Graphics cards don't push out, say, a 1600x1200 bitmap however many times a second.

They actually do, although seanmpuckett's oversimplifying a little.

You have an onboard frame buffer on those cards, and that frame buffer is sent, in full, every frame. On most LCDs, that's 60 frames a second. On CRTs, it often went to 85Hz or so. The newest LCDs are 120Hz. This is just a straight copy of the bits from the video RAM out to the display, and as far as I know, this is done serially, although I have some vague memory of "update zones" on an LCD, so don't take that as absolute truth. It's just a dirt-simple shoveling of a huge amount of data out to the monitor, each and every frame.

The part that he's talking about is the internal processing of the card, the 3D geometry and shaders, which are used to generate the bitmap to send to the monitor. It's quite common to have video cards generating new frames in sync with the video refresh, but it's not required. The image gets sent to the screen 60 times a second, no matter what, but it may send the same frame 2 or 3 times if the generation is running slow. The 3D engine will be constructing a new frame elsewhere in memory. When it's finished, it tells the video hardware to start sending from that part of the video memory instead (a screen flip), which it will continue to do until it's told otherwise. Then the card starts building another new frame. If it's going fast enough, you get a new frame every refresh. If the card's lagging behind, video frames will be repeated until new ones are ready.
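
In toy form, the flip scheme described above looks roughly like this (a sketch only, nothing like real driver code; the renderer fills the back buffer while scanout keeps reading the front one, and a slow frame just means the old front buffer gets shown again):

    import time

    WIDTH, HEIGHT, REFRESH_HZ = 2560, 1600, 60

    front = bytearray(WIDTH * HEIGHT * 4)  # buffer being scanned out to the monitor
    back = bytearray(WIDTH * HEIGHT * 4)   # buffer the 3D engine renders into

    def render_frame(buf, n):
        buf[0] = n % 256  # stand-in for real rendering work

    next_vsync = time.monotonic()
    for frame in range(300):
        render_frame(back, frame)
        # Wait for the next vertical refresh before flipping; if rendering
        # overruns, the display simply repeats the current front buffer.
        next_vsync += 1.0 / REFRESH_HZ
        time.sleep(max(0.0, next_vsync - time.monotonic()))
        front, back = back, front  # the "screen flip"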

Video cards, in other words, are frighteningly fast devices.
posted by Malor at 10:18 AM on June 27, 2010


MY IPHONE CANT STOP THE OIL SPILL
I MUST SKULLFUCK STEVE JOBS
posted by fungible at 11:01 AM on June 27, 2010 [1 favorite]


That's texture memory, though. It's not for pixel resolution. Graphics cards don't push out, say, a 1600x1200 bitmap however many times a second. They draw shapes with textures on them.

Look at it this way. 3840x2400x32 is 294,912,000 bits of data per frame. To get 30 frames per second, you need to be processing 8.8 billion bits per second. Even assuming that determining the color of a single pixel is something you can map to a single processor calculation, not a lot of systems can operate in the 8.8GHz range.
You obviously don't understand how this works. Old 2D video cards worked by storing a single framebuffer. Every time the screen refreshed, a custom microchip (not the CPU) would read data out of the frame buffer, convert it to an analogue signal, and send it to the screen. It's only when the screen changes that the CPU needs to do anything.

3D cards work the same way, except they have extra hardware to calculate 3D transformations, texture mapping and now 'pixel processors' that do calculations for extra 3D effects or whatever you want. But the important thing to understand is that they operate in parallel. Modern graphics cards may have thousands of 'pixel processors' that are used in 3D rendering. This card has 3,200 'stream processors' on two physical chips. (NVidia designs use fewer but more powerful processors.)

But just like the old 2D card, they still have a frame buffer; the 3D hardware just makes changes to the frame buffer, and you still have a custom chip (which would now probably be embedded in the same silicon as the 3D hardware) that converts the frame buffer into an image for the monitor.

And you're right that 8GHz is pretty fast for a silicon chip, but simple hardware like an image display chip can run faster than a CPU because it's so much simpler. But even if it's too fast for one chip, because the signal is digital you can have multiple chips send the data for multiple pixels in parallel.
posted by delmoi at 11:37 AM on June 27, 2010


Running the actual numbers, my 2560x1600 screen has 4,096,000 separate pixels, requiring (at least) 12,288,000 bytes to describe uniquely. It might require 16,384,000 if it's sending 32 bits per pixel.

At 60 refreshes a second, in other words, the graphics card is sending at least 737,280,000 bytes per second. That's just short of 6 gigabits a second, or roughly 7.9 gigabits a second if it's sending 32 bits per pixel. (I'm not sure what the on-wire format actually is; I suspect it's probably three bytes per pixel, not four.)

That's a hell of a lot of data, but you could hypothetically fit all that down a single 10 gigabit Ethernet link. I dunno if you could get that kind of throughput in actual practice, but the raw bandwidth is there, and even purchasable by ordinary mortals.

In comparison, what I actually want at that resolution (7800x4500) is 50.5 gigabits/second. So, yeah, that's a lot more data, but it would merely be very difficult and expensive, not outright impossible as it once was.
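
The same arithmetic as a sketch, for anyone who wants to poke at it (24 vs. 32 bits per pixel at 60 Hz; the actual on-wire format is an assumption either way):

    def link_gbps(width, height, bits_per_pixel=24, refresh_hz=60):
        return width * height * bits_per_pixel * refresh_hz / 1e9

    print(link_gbps(2560, 1600))      # ~5.9 Gbit/s at 24 bpp
    print(link_gbps(2560, 1600, 32))  # ~7.9 Gbit/s at 32 bpp
    print(link_gbps(7800, 4500))      # ~50.5 Gbit/s for the 300dpi 30" case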
posted by Malor at 12:30 PM on June 27, 2010


I just looked up Apple's new DisplayPort standard. The original rev supported 8.64 gigabits, and the update from last December now supports 17.28 gigabits. So there's outside confirmation that we're in the right ballpark.
posted by Malor at 12:34 PM on June 27, 2010


I am a fan of retinas. My only complaint would be one of the retina's accessories, the lens...

I'm more annoyed at the build defects they acknowledge but refuse to issue a recall on.
posted by Evilspork at 12:45 PM on June 27, 2010


DisplayPort isn't an Apple technology; it's just a successor to DVI. It's already present on high-end PC graphics cards.

Anyway, current graphics cards are already capable of handling resolutions like that. Here's a regular PC driving six monitors. AMD calls it Eyefinity. They have a video showing a six-monitor setup displaying video at a total resolution of 5700x2400.

The problem isn't the chips, it's just a limitation of the monitor. For a really high resolution screen, you would probably need to have multiple connections to the graphics card with today's graphics adapters. But the cards usually have multiple connectors built in today.

Again, there's no question that the hardware is capable of this; it's the monitors that aren't advancing. But graphics cards already let you use more than one monitor.
posted by delmoi at 1:38 PM on June 27, 2010


Windowing systems don't use textures to store background windows; the operating system directs the application that owns that window to redraw itself when more of it is shown. That's why on old PCs the windows would flicker when you moved them around. Modern OSes might cache window displays, though, since they have so much memory.

Yeah. This is 2010, not 1999. Windowing systems very much do use backing stores in VRAM these days, GPUs do the compositing, and the desktop is now an OpenGL/Direct3D app depending on your OS. Windows hasn't used that style of drawing method in 3 years, and Macs haven't in 10.

I think Win7 is doing a similar thing with Aero, though I don't know the details.

It's been going on since Windows Vista with the Desktop Compositing Engine. Although Microsoft, in their infinite wisdom, couldn't figure out "fallback shader ops to software" like CoreImage does and requires a DX9 level card just to get Aero started. There's no DX7 level where you can just run the geometry through the card and have hardware accelerated compositing. It's fancy PS2.0 shader drawn window bars or you're back to software.

Compare to Quartz Extreme which only requires AGP and a GPU capable of drawing arbitrary texture sizes (i.e. Geforce2 MX).
posted by Talez at 1:56 PM on June 27, 2010


Again, there's no question that the hardware is capable of this,

Well, yes and no. At the moment, it's still a hard problem. Displays today are about 100ppi, so a 300ppi monitor of the same size has three times the pixel density in each dimension, meaning nine times as many pixels and roughly nine times the power and bandwidth to drive properly.

In the current generation, ATI is kind of where it's at for 3D; NVidia's Fermi just isn't that great. (this may very well change in the next silicon respin, which should hit by October or so, but at this precise moment, ATI is better.) A 5870, their fastest solution that looks like a single logical GPU, will drive 2560x1600 pretty well, although it's a little underpowered at that resolution for the really advanced engines. It pulls about 300 watts under load.

So, to get my theoretical 7500x4800 monitor running properly, that means you'll need about 9 of those, pulling 2.7 kilowatts. Even if you had a PC with enough PCIe slots to put that many in, that's a huge power and cooling problem. You can only pull about 1500 watts out of most power receptacles, so you'd need a 220V line and a super-duper power supply at the very least, not to mention industrial-strength heat exhaust. So it would be extremely expensive, loud, and hot.

Now, if you accepted that you weren't going to run 3D at full resolution, but rather just wanted to put text and movies up there at InsaneRes(tm), you could drop back a fair bit on the required power. But you still have the 9x bandwidth issue. You'd need nine of the current DisplayPort connectors to drive a monitor with that many pixels, or five of the new generation. That's a lot of cabling, and you'd likely need three or four cards just to get enough ports, again giving you something of the same heat and power issues you get when trying to 3D-accelerate that many pixels.

So, yes, you could hypothetically do such a thing, but that leads us to this issue:

it's the monitors that aren't advancing.

LCDs are finicky beasts. Each pixel on the screen is three separate sub-pixels, and if you get more than a few failures on a given screen, it's unsellable. They're hard things to manufacture, taking extraordinarily good quality control. And when you raise the density of the individual elements that much, you both increase the chance of individual LCD subpixels failing (due to the new technology), and increase the number of pixels that must work.

They manufacture them as huge sheets of glass, and to sell that sheet of glass as a big screen, a very large area has to be nearly flawless. But no matter how flawed your manufacturing process is, you can often cut small pieces out that are perfect. The sheet of glass that would make just one 30" monitor can be turned into a whole lot of iPhone screens, some of which can then be junked without ruining the whole set. So the really cool tech goes into the small screens first.

That's also why you can get 24" screens for just a few hundred bucks, but 30" screens are still usually $1300 plus... it takes a much better manufacturing process to consistently make screens that large.

If we continue at the present rate of advance, I suspect my hypothetical screens should start to become actually feasible in about five years. But even then, they're going to be expensive as hell.

Plus, then we get into the whole mess of resolution independence in operating systems. Mac OS was supposed to go fully to scaled graphics in this release, but the iPhone has sucked so much of Apple's developer power away that I'm not sure we'll ever see a Mac OS that's fully resolution independent.

I think what's likely to happen instead is that they'll keep adding features to iOS, which is fully resolution-independent already, until it works well in computer format, and then try to foist off the same walled garden on computer users as well.
posted by Malor at 2:51 PM on June 27, 2010 [4 favorites]


though Microsoft, in their infinite wisdom, couldn't figure out "fallback shader ops to software" like CoreImage does and requires a DX9 level card just to get Aero started. There's no DX7 level where you can just run the geometry through the card and have hardware accelerated compositing. It's fancy PS2.0 shader drawn window bars or you're back to software.

In all honesty, it's not like DX9 cards are expensive anymore. Given the development expense of writing and supporting a software stack, it seems to me that just requiring hardware support is fairly reasonable. $75 cards will handle it fine.

Mac OS has been out for a long time, and their design goals for 3D UI compositing made sense at the time (particularly considering how weak their video usually is), but requiring DX9 to run Aero isn't an especially burdensome problem to pretty much anyone, and it avoids a problem for Microsoft. I think it was a smart decision.

It's not like the OS won't work without the hardware acceleration, it just looks nicer if you have it.
posted by Malor at 2:56 PM on June 27, 2010


Again, there's no question that the hardware is capable of this; it's the monitors that aren't advancing. But graphics cards already let you use more than one monitor.

This is a vast oversimplification of the problem. Your multiple-monitor setup is handling a large frame buffer but still reasonably sized windows in your backing store. All of your windows are going to take up 4x the memory they used to, font ligatures are going to require 4x the memory, and so will UI resources like buttons, tabs and such.

That's before you even think of the memory bandwidth increases and all of a sudden you're requiring a discrete GPU with tens of gigabytes a second back to GDDR3/GDDR5 just to display a simple desktop.
posted by Talez at 3:13 PM on June 27, 2010


In all honesty, it's not like DX9 cards are expensive anymore. Given the development expense of writing and supporting a software stack, it seems to me that just requiring hardware support is fairly reasonable. $75 cards will handle it fine.

Except the whole point of desktop compositing is to hardware accelerate the desktop on lower specced hardware. When Vista was released there were tens of millions of 945s out there that were dog slow because they weren't able to render should-be-optional shiny windows.
posted by Talez at 3:19 PM on June 27, 2010


Well, I never really used Vista, but Win7 only turns on Aero if you have hardware acceleration. I dunno about should-be-optional in the last generation, but in the current product you can run your desktop more or less like XP, if you don't have the hardware resources you should.

And, if you want the shiny, a $50 to $75 card will handle it fine.

Doesn't seem like much of an issue at this point, though it may have been two years ago.
posted by Malor at 4:22 PM on June 27, 2010


A 7500x4800 display isn't theoretical, it just isn't cheap. I've seen the Evans & Sutherland 8K (15360x8640) projector in action and it's pretty awesome. The engineer who showed it to me told me there isn't much in the way of 8K content available, so they had to make their own by stitching together images and rendering CG.

I asked him about the 8K digitization of films like Baraka and he told me that he contacted the team who did the digitization and they told him they downscaled it to BluRay 1080p and threw away the 8K version in the process because it was too difficult to store (379MB uncompressed a frame, 31TB uncompressed an hour). Not that coming up with that scale of storage is impossible, but I guess they figure they can just do it again in another decade from the film master.
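
Those storage numbers check out; a quick sketch (assuming 24-bit uncompressed frames at 24 fps, which is my assumption for a film scan):

    width, height = 15360, 8640   # the 8K projector resolution mentioned above
    bytes_per_frame = width * height * 3          # 24-bit colour, uncompressed
    bytes_per_hour = bytes_per_frame * 24 * 3600  # 24 frames per second

    print(bytes_per_frame / 2**20)  # ~380 MB per frame
    print(bytes_per_hour / 2**40)   # ~31 TB per hour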
posted by pashdown at 7:55 PM on June 27, 2010 [2 favorites]


So all this discussion of large screens running at 300dpi... We'd all love it but it will be years coming. Here's the more realistic question: how about the iPad v2? What are the chances we'll see a "Retina" display on that next year?

What's the math on that for the pixel density? Seems to me it'd be greater than a current 30" LCD monitor. If that's true the current hardware probably couldn't keep up with the screen even if they could manufacture it.
posted by razorwriter at 9:18 PM on June 27, 2010


For the iPad to match the iPhone in pixel density it would have to have a screen of about 2500 x 1900 pixels. Yeah, that's more than the 30" displays. Doesn't seem that likely that they could get that display in the next iPad while maintaining the current price point and profit. But I have been wondering about the iPad since the iPhone 4's introduction. The iPhone 4 has twice the RAM (512MB vs 256), the forward-facing video cam, and the high resolution display. To me the iPad seems lacking just a few months after introduction. Will Apple stick to a yearly release schedule for the iPad like they have with the iPhone? Will they update the iPad sooner with some or all of those features?
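
Here's the arithmetic behind that figure, as a sketch (assumes the original iPad's 9.7-inch, 4:3 panel and the iPhone 4's 326 ppi):

    import math

    diagonal_in, aspect_w, aspect_h, ppi = 9.7, 4, 3, 326

    diag_units = math.hypot(aspect_w, aspect_h)  # = 5 for a 4:3 panel
    width_px = round(diagonal_in * aspect_w / diag_units * ppi)
    height_px = round(diagonal_in * aspect_h / diag_units * ppi)

    print(width_px, height_px)  # 2530 1897 -- i.e. about 2500 x 1900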
posted by 6550 at 9:55 PM on June 27, 2010


31TB uncompressed an hour. Not that coming up with that scale of storage is impossible

Not impossible? Hell, it's not even expensive. Less than $2,000 for the disks. Or, if you're really cheap, 1200 gmail accounts.
posted by ryanrs at 12:26 AM on June 28, 2010 [1 favorite]


FREE part_24159_of_891048.raw
We've got the hottest mime/multipart
attachements cheap! Check out our
part_24159.raw today!
www.PriceGrabber.com

text ads by Google
posted by ryanrs at 12:35 AM on June 28, 2010 [1 favorite]


    "metatalk is your option"

    Put or call? Pls hurry, I have my broker on hold and that fucker ain't cheap.
    posted by Eideteker at 12:01 PM on June 28, 2010


Extra, extra! Marketing term might be dishonest! Read all about it!
posted by chairface at 2:17 PM on June 28, 2010


A 5870, their fastest solution that looks like a single logical GPU, will drive 2560x1600 pretty well, although it's a little underpowered at that resolution for the really advanced engines. It pulls about 300 watts under load. So, to get my theoretical 7500x4800 monitor running properly, that means you'll need about 9 of those

Malor, did you read about ATI's Eyefinity stuff? They were demoing 5700x2400 with standard graphics cards available today. You would probably only need two graphics cards to get your 7500x4800 example running. In fact here is a 63 megapixel display running on a standard PC with just four graphics cards. That's nearly twice as many pixels as your 7500x4800 example, running on a PC today, with just 4 graphics cards. (That's being split across 24 separate monitors, by the way. The limitation is entirely based on the displays, not the silicon.)
posted by delmoi at 11:54 PM on June 28, 2010


Oh, and on a 30 inch, 300 dpi screen, flaws and stuck pixels would be far less noticeable.
posted by delmoi at 11:55 PM on June 28, 2010


there was a time, not all that long ago in the grand scheme of things, when I would have actually understood most of malor's & delmoi's conversation. now i just whine when my brand-new hardware won't run Second Life because of an inadequate display adapter.
posted by lodurr at 9:31 AM on June 29, 2010

