The myth of megapixel cameras
April 16, 2000 1:59 PM

The myth of megapixel cameras is explained here in detail, finally "illuminating" why digital resolution is often worse than you'd expect. In brief, digital cameras interpolate to get a color image from a black and white CCD -- losing sharpness in the process, and taking up far more flash card space than reason dictates. Conclusion: buying into the latest technology isn't worth the expense, until camera companies wise up. Finally, evidence which backs up my faith in scanning photos taken on a (decidedly analog) Nikon N70! [via Honeyguide]
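
To make the interpolation point concrete, here is a minimal sketch of how a camera can rebuild full color from a single filtered sensor, assuming a common RGGB Bayer layout and plain neighbor averaging. It is not the page author's simulation program, just an illustration of the general idea; the function names and the averaging scheme are my own.

    # Minimal demosaicing sketch: each photosite measures only one color,
    # and the other two are estimated from neighboring photosites.

    def filter_color(row, col):
        """Which color filter covers the photosite at (row, col) in an RGGB tiling."""
        if row % 2 == 0 and col % 2 == 0:
            return "R"
        if row % 2 == 1 and col % 2 == 1:
            return "B"
        return "G"

    def demosaic(raw):
        """raw: 2-D list of sensor readings -> 2-D list of (R, G, B) guesses."""
        height, width = len(raw), len(raw[0])
        image = []
        for r in range(height):
            row_pixels = []
            for c in range(width):
                # Gather the measured samples of each color in the 3x3 neighborhood.
                samples = {"R": [], "G": [], "B": []}
                for dr in (-1, 0, 1):
                    for dc in (-1, 0, 1):
                        rr, cc = r + dr, c + dc
                        if 0 <= rr < height and 0 <= cc < width:
                            samples[filter_color(rr, cc)].append(raw[rr][cc])
                measured = filter_color(r, c)
                pixel = {}
                for color in ("R", "G", "B"):
                    if color == measured:
                        pixel[color] = raw[r][c]  # the one value actually measured here
                    else:
                        # The other two channels are interpolated, i.e. guessed.
                        pixel[color] = sum(samples[color]) / len(samples[color])
                row_pixels.append((pixel["R"], pixel["G"], pixel["B"]))
            image.append(row_pixels)
        return image

Two of every pixel's three color values come out of that averaging step, which is where the softness comes from, and the result is then stored as three channels per pixel even though only one was measured.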
posted by legibility (6 comments total)
 
I just picked up my first digital camera (Nikon 800) and am pretty pleased with it. The salesman was going on about megapixels, blah blah blah. I didn't understand, so I just bought one that was reviewed well.

I'm pretty pleased so far - either way, I'm still an awful photographer.

For me the choice was about use - I really only use the camera for the web, so there was just no reason for me not to go digital. Especially since the camera I picked up had a pretty good price. But for professionals who pay upwards of a grand or so for a camera, it's probably a different story.
posted by jbeaumont at 2:13 PM on April 16, 2000


Note that the person writing that page doesn't actually seem to own a digital camera. As the author says, "I wrote a program simulating the filtering and interpolation process of a digital camera." It strikes me as someone defending their current system, without even trying the new one. Basically, a completely conservative viewpoint.

Also, the cost comparison page is a little sketchy. The only way the author can make film cameras come out ahead in Scenario 2 is by using an insanely expensive digital camera. There is no chance that someone who already has a $400 film camera would EVER buy an $1800 digital camera.
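
If you want to play with the break-even arithmetic yourself, here is a rough sketch. Only the $400 and $1800 bodies come from the comparison being discussed; the per-roll and flash card prices are my own guesses, so change them to taste.

    # Rough film-vs-digital cost sketch with assumed consumable prices.
    import math

    FILM_BODY = 400.0
    DIGITAL_BODY = 1800.0
    COST_PER_ROLL = 10.0      # assumed: one 24-exposure roll plus developing
    SHOTS_PER_ROLL = 24
    FLASH_CARD = 100.0        # assumed: a one-time card purchase, reused

    def film_cost(shots):
        rolls = math.ceil(shots / float(SHOTS_PER_ROLL))
        return FILM_BODY + rolls * COST_PER_ROLL

    def digital_cost(shots):
        return DIGITAL_BODY + FLASH_CARD  # marginal cost per shot is about zero

    for shots in (500, 2000, 5000):
        print(shots, film_cost(shots), digital_cost(shots))

With those guesses the film setup stays cheaper until roughly 3600 shots, and only because the digital body is priced at $1800.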

Just my thoughts...
posted by smackfu at 6:36 PM on April 16, 2000


He also botched his physics.

In his explanation of dark count he implies that CCDs are "positive" and that light striking a cell adds charge to it.

Actually, a CCD is a negative. The camera charges all the cells, lets them sit there, then shifts them out and digitizes the voltage from each cell. Every time a photon hits a cell, it knocks some electrons out of it. So the more light hitting a cell, the less charge there will be left and the lower the voltage found by the digitizer. The software inverts all the numbers to get the positive image.
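
If it helps to see that as arithmetic, here's a toy version of the readout (the 255 full-charge value is just an invented unit, and real sensors are messier than this):

    # Toy readout: cells start fully charged, photons knock charge off,
    # and the software inverts the digitized values to get a positive image.
    FULL_CHARGE = 255  # invented full-well value, in arbitrary units

    def read_out(photon_hits):
        """photon_hits: photon count per cell -> positive image values."""
        remaining = [max(FULL_CHARGE - hits, 0) for hits in photon_hits]  # more light, less charge left
        return [FULL_CHARGE - charge for charge in remaining]             # invert back to a positive

    print(read_out([0, 50, 200, 400]))  # -> [0, 50, 200, 255]: bright cells read bright again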

It does look suspiciously reactionary, doesn't it? "I don't really know what it is but I hate it."
posted by Steven Den Beste at 8:29 PM on April 16, 2000


Minolta already has a dual-CCD digicam. I found this article to be less biased.
posted by gleemax at 10:53 PM on April 16, 2000


And CCDs are on the way out, anyway. Advances in manufacturing technology are making it practical to use CMOS chips -- effectively, RAM -- as the image sensor.

I will concur, though, that it ain't just the pixel count.

In the video arena, I've seen *VHS* tape that was broadcastable... because the camera was a 3-tube, with a glass lens.

Cheers,
-- jra
posted by baylink at 7:15 AM on April 17, 2000


Interestingly enough, there was an article in Byte, I believe, circa 1981 (boy is *my* memory hazy), which described building a scanner around a CMOS RAM chip as the image sensor. There was one particular manufacturer whose package could be split open to expose the chip to light. Spooky, huh?
posted by plinth at 2:30 PM on April 17, 2000




This thread has been archived and is closed to new comments