

Exporting from Lightroom
January 23, 2013 11:00 AM   Subscribe

An Analysis of Lightroom JPEG Export Quality Settings.
posted by The Girl Who Ate Boston (41 comments total) 50 users marked this as a favorite

 
Thanks for posting this!
posted by shothotbot at 11:06 AM on January 23, 2013 [2 favorites]


That was really interesting. I do a couple of extremely basic image editing things at work and have absolutely no training in it, so I've always wondered what practical difference the various compression settings really make. I'd like to keep the images on our blog relatively low in size while not impacting the quality noticeably.
posted by Horace Rumpole at 11:15 AM on January 23, 2013


Very interesting. Thanks.
posted by Outlawyr at 11:15 AM on January 23, 2013


This is actually really useful for me. I use lightroom to edit my product photos but never did much experimenting with what the compression really does. Thanks for sharing it!
posted by NoraReed at 11:17 AM on January 23, 2013 [1 favorite]


This is great -- this is one of those things that always pops up and I blindly drag it over to high quality never really understanding the impact on particular photos. In fact, there are several menus in my Adobe workflow that I plow through without ever really questioning or understanding. Like that window that pops up with the versions when you first save something--I never really look at all the options there. And I can't even begin to understand all the options in the printer dialog box. I really should drill down through these someday like this.
posted by This_Will_Be_Good at 11:23 AM on January 23, 2013


Heh, I want to send this to our CDN guys. The images on the site I work for get compressed so badly I want to scream.
posted by BlackLeotardFront at 11:25 AM on January 23, 2013


"Those who blindly use the maximum setting for their exports likely waste a lot of local disk space, upload bandwidth, and remote storage space. But conversely, those who blindly use some lesser setting risk posterization in the occasional photo with an unlucky sky gradient."

I appreciate the experiment, but this is still a real issue and the conclusion? I will still default to 100%. If you shoot in RAW, that's usually the copy you're keeping locally as Lightroom stores your edits without altering the original. The RAW images are what take up the real space. Yes, you might locally store a set of JPEGs, but you certainly don't have to. With internal hard drives of good quality being under $50/TB, you have to churn a lot of JPEGs to care one way or another. Plus, what you shoot is not what you care to export - only a fraction of shots taken are usually worth fully editing and exporting.

Then, online, many services such as Smugmug have unlimited storage for pros or prosumers at a pretty low price.

That leaves bandwidth. Yeah, annoying to wait more than you want to, but you have to upload a hell of a lot to really cause an issue. Even using consumer level connections, it's hard to imagine a day of shooting's JPEGs taking more than overnight to upload.

So, yeah, interesting, but I'm not sure it makes a big practical impact, especially since you'd have to re-export if you later wish you'd used 100% in the first place (plus who wants to track their 100% vs. 75% versions after the fact... ick).
posted by Muddler at 11:26 AM on January 23, 2013 [3 favorites]


The difference from the first example is stunning, and relates to what visual changes humans are sensitive to: we pick up on imperfections in a continuous tone much more readily than slight changes in varied detail. The JPEG compression algorithm is built around this difference, trying to preserve quality in these smooth gradient areas, but as well as it does, a photo like the sunset presents a daunting challenge.

Interesting. Can someone confirm this about the inner workings of JPEG?
posted by vidur at 11:30 AM on January 23, 2013


vidur: short answer: yes. Representing a simple linear gradient without artifacts is hard for JPEG. Much more information on wikipedia, but it quickly becomes very technical: the discrete cosine transform followed by quantization is the heart of JPEG compression, but the quantization step can have the effect of making tiny gradients disappear altogether, or making larger gradients not quite line up at the edges of blocks.
posted by jepler at 11:47 AM on January 23, 2013 [5 favorites]
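jepler's explanation can be sketched with a toy model (plain Python, 1-D, and emphatically not Lightroom's actual code): push a gentle 8-sample gradient through a DCT-II, quantize the coefficients the way JPEG does, and the small AC term that carried the slope rounds away to zero, leaving a flat block.

```python
import math

def dct(block):
    # Orthonormal 8-point DCT-II: the 1-D analogue of JPEG's 8x8 transform
    n = len(block)
    return [(math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n)) *
            sum(x * math.cos(math.pi * (2 * i + 1) * k / (2 * n))
                for i, x in enumerate(block))
            for k in range(n)]

def idct(coeffs):
    n = len(coeffs)
    return [sum((math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n)) * c *
                math.cos(math.pi * (2 * i + 1) * k / (2 * n))
                for k, c in enumerate(coeffs))
            for i in range(n)]

gradient = [100, 101, 102, 103, 104, 105, 106, 107]  # a gentle, sky-like ramp
step = 16  # one plausible quantization step at a middling quality setting
quantized = [round(c / step) * step for c in dct(gradient)]
rebuilt = [round(x) for x in idct(quantized)]
print(rebuilt)  # every sample collapses to the same value: the ramp is gone
```

Across a whole sky, each 8x8 block flattens to a slightly different constant, and those constants are the visible posterization bands.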


…hard edges are another tough case for jpeg. Since hard edges and nice linear gradients are so common in computer-generated images, this is why png is preferable to jpeg for some types of images. For instance, make a picture of 10pt lettering in a serif font and save it as jpeg at low quality. Around all the letters things will look a little mottled up to the edge of a DCT block.
posted by jepler at 11:53 AM on January 23, 2013 [1 favorite]
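The hard-edge case shows up in the same toy model (again, an illustration, not any real encoder's code): quantize the coefficients of a step edge and the rounding error smears back across the formerly flat pixels, which is the mottling you see around lettering.

```python
import math

def dct(block):
    # Orthonormal 8-point DCT-II: the 1-D analogue of JPEG's 8x8 transform
    n = len(block)
    return [(math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n)) *
            sum(x * math.cos(math.pi * (2 * i + 1) * k / (2 * n))
                for i, x in enumerate(block))
            for k in range(n)]

def idct(coeffs):
    n = len(coeffs)
    return [sum((math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n)) * c *
                math.cos(math.pi * (2 * i + 1) * k / (2 * n))
                for k, c in enumerate(coeffs))
            for i in range(n)]

edge = [0, 0, 0, 0, 255, 255, 255, 255]  # a hard black-to-white edge
step = 24  # coarse quantization, as at a low quality setting
quantized = [round(c / step) * step for c in dct(edge)]
rebuilt = idct(quantized)
# The formerly flat regions now wobble: that wobble is the mottling
errors = [abs(r - e) for r, e in zip(rebuilt, edge)]
print([round(x) for x in rebuilt], round(max(errors), 1))
```

A step edge needs many nonzero AC coefficients to represent, so coarse quantization disturbs several of them at once, and the damage lands on both sides of the edge.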


That leaves bandwidth. Yeah, annoying to wait more than you want to, but you have to upload a hell of a lot to really cause an issue. Even using consumer level connections, it's hard to imagine a day of shooting's JPEGs taking more than overnight to upload.

This treats upload speed as the primary concern. If you're just archiving your material, that's one thing. But if you're creating a graphic to share with a wide variety of people, you'll want a moderate amount of compression. Smart device users with strict bandwidth caps, and users still on dial-up or slow DSL, don't need to download a 1 MB image that could easily be compressed to half that size and still read just fine.

I am always amused when people include photos directly from their digital cameras or straight from a press package in websites, especially when they constrain the size below its native resolution.
posted by filthy light thief at 11:59 AM on January 23, 2013



That leaves bandwidth. Yeah, annoying to wait more than you want to, but you have to upload a hell of a lot to really cause an issue. Even using consumer level connections, it's hard to imagine a day of shooting's JPEGs taking more than overnight to upload.


It can still have a substantial impact if you roll your own website & pay rates based on bandwidth usage & storage consumption like I do. I don't use Lightroom, but I usually go for a quality 6-8 or so for my web stuff in Photoshop, as the size is quite a bit smaller than a max. jpeg, and you can't really resolve any artifacts. I do quality 10 for Flickr since it's a flat fee.
posted by Devils Rancher at 12:12 PM on January 23, 2013


Friedl is also a Regular Expression Master. I saw him talk at one of the first Perl conferences years ago. He came in jet lagged, sick -- I think -- and did his talk with transparencies, an anachronism even then. Even with all that, he rocked the house. I think I absorbed none of it but my mind was blown....
posted by Ogre Lawless at 12:17 PM on January 23, 2013 [1 favorite]


Devils Rancher and filthy light thief -

That's why I prefer the third party hosted services like smugmug and the like. Not only is your price fixed no matter the upload and download, the images served up as a default are not full resolution. You upload the full resolution, but what's served back is more appropriate for average monitors or in the case of mobile, a mobile screen. If you select original sizes to view, it takes a second or two, but then you get the entire resolution image to play with on the download side.

However, obviously if your bandwidth or online storage is a concern, this is very helpful. It's also helpful if you want to e-mail around a few photos. For that purpose, you can cut the quality down drastically if the average viewer will only be looking at the image on a screen and never printing. That's where I hit size limits - e-mails.
posted by Muddler at 12:19 PM on January 23, 2013


I will still default to 100%.

If you really don't care about file sizes, you may as well use a lossless format like TIFF or PNG, since even a "100%" JPEG is hugely lossy by comparison.
posted by Western Infidels at 12:37 PM on January 23, 2013 [1 favorite]


I appreciate the experiment, but this is still a real issue and the conclusion? I will still default to 100%.

Environment? Bandwidth and storage might be free to you, but the electricity and disk come from somewhere. The earth isn't going to collapse if one person doesn't squash down their JPEGs. But defining reasonable best practices in many little areas like this, with huge potential for waste, adds up.

As a total amateur whose livelihood will never be threatened by "good enough" quality, I learned today that the correct answer is 75. Thanks!
posted by gurple at 12:49 PM on January 23, 2013


Over 10 years ago I started using a freemium image editor called Image Forge that saves in a bunch of formats (I think the makers have gone OutOBiz, but I still keep a copy for my simplest graphics tasks), and one of the first things I noticed was the difference in file size between 100% and 85%, with relatively little degradation. Down to 75% was usually okay, but the reduction in file size became minimal, so I became an "85%-er," using that as my default setting for JPEGs no matter what program I'm using. (But now I'm thinking I should see if My Mileage May Vary with other software.)

Thanks for the geekily-useful post, Boston Eater.
posted by oneswellfoop at 12:53 PM on January 23, 2013


This article's (FPP's) title is somewhat misleading, BTW. This isn't about Lightroom; it's about saving pictures repeatedly in JPG format from any program. Lightroom just happens to be the tool they used to display the effect.

You can actually make an interesting filter out of this, too: build a macro that repeatedly saves the image at slightly differing compression ratios (so that something actually changes, instead of the first output being reused without change). After 10-25 reps, a sort of posterization occurs. Self-link to an example of the progression.
posted by IAmBroom at 1:03 PM on January 23, 2013


The interface for comparing the results was novel and effective. I wish actual export dialogs worked that way.

IAmBroom: unless I'm missing something, that's not what's being demonstrated. This isn't a chain of exports; the same file is being exported at different quality settings.
posted by cdward at 1:03 PM on January 23, 2013 [1 favorite]


gurple: The earth isn't going to collapse if one person doesn't squash down their JPEGs. But defining reasonable best practices in many little areas like this, with huge potential for waste, adds up.
That's like washing, drying, and reusing kleenex because we're running out of forests. Pretty sure the damage a few magazines' storage demands do to the environment is dwarfed by things like the disposable cup your last drink came in, the fact that you drove your own car to get it, and your house insulation being only R-12 instead of R-1200.
posted by IAmBroom at 1:03 PM on January 23, 2013


As someone who deals in thousands of JPG image files each year, exported from Lightroom and delivered to clients, this is very useful to me. I will be lowering my settings, and saving a lot in time, DVDs, and HD space. Thanks for the post.

And yes, IAmBroom, you are misunderstanding. This is indeed all about individual JPG export settings, as processed specifically through Lightroom's unique JPG compression algorithm, as the link makes clear.
posted by hamandcheese at 1:16 PM on January 23, 2013


IAmBroom -- I really liked the effect of that. The last picture had a "heroic" vibe to me. Probably just the haloing, but it was neat.
posted by jclarkin at 1:16 PM on January 23, 2013


My wife, who does a bit of paid photography work, is a 100%er. I've always been a 70-80%er, since you often get files half the size, and to my eye I can't see any difference. I've tried to convince her otherwise (especially when she's trying to cram hundreds of photos on a DVD to send to someone) but as far as she's concerned, why would you want to give people anything but "maximum quality"? Maybe this will convert (heh) her.
posted by Jimbob at 1:36 PM on January 23, 2013


I wonder if Lightroom uses the same code/scale as the Independent JPEG Group? I still use the command-line cjpeg/djpeg tools all the time, and the lossless cropping/scaling/rotation tricks of jpegtran are mad handy.

Surely I'm not the only one to be careful not to crop JPEGs on anything other than 8x8 pixel DCT tile boundaries ... am I?
posted by scruss at 1:38 PM on January 23, 2013
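scruss's habit can be sketched as a small helper (hypothetical; the function and names are mine, not from jpegtran): grow a crop rectangle so its origin lands on a DCT block boundary. With 2x2 chroma subsampling the MCU is 16x16, so you'd pass mcu=16 in that case; jpegtran itself will silently round unaligned offsets down to an iMCU boundary if you don't.

```python
def snap_crop(x, y, w, h, mcu=8):
    """Expand a crop rectangle so its top-left corner sits on an MCU
    boundary. JPEG's DCT blocks are 8x8 (16x16 with 2x2 chroma
    subsampling); a crop is only fully lossless when it aligns with them.
    The region grows, never shrinks, so nothing requested is cut off."""
    dx, dy = x % mcu, y % mcu
    return x - dx, y - dy, w + dx, h + dy

print(snap_crop(13, 9, 100, 50))  # -> (8, 8, 105, 51)
```

The aligned rectangle then drops straight into jpegtran, e.g. `jpegtran -crop 105x51+8+8 -copy all in.jpg out.jpg`.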


Surely I'm not the only one to be careful not to crop JPEGs on anything other than 8x8 pixel DCT tile boundaries ... am I?

I have often pondered whether it's worth doing this, but I've never bothered...
posted by Jimbob at 1:40 PM on January 23, 2013


That's like washing, drying, and reusing kleenex because we're running out of forests.

I disagree. Making optimal use of bits is what JPEG is about. That's what it's for. Learning to use it well does not require the sort of resource investment that something like Kleenex laundering would require.

Also, yuck.
posted by Western Infidels at 1:53 PM on January 23, 2013 [1 favorite]


As someone who does a lot of web graphics, this is important stuff. Managing file weight is paramount to providing a good web experience. Most images can be compressed to some extent without anyone noticing, but most times it involves a combination of things - not just the level at which you compress the image. As someone mentioned above, .pngs are a viable alternative many times, too.

Something I learned a long time ago about Photoshop from someone in the know: Photoshop actually uses a couple of algorithms when compressing jpegs. The switch between algorithms changes at 60%. So, if your image looks OK at 60%, it will usually look much better at 61%. My "go-to" setting with a quality starting image when saving for the web is 61%.
posted by Benny Andajetz at 2:01 PM on January 23, 2013 [2 favorites]


Another factor not mentioned is the image resolution. Modern cameras typically shoot photos in the 12-16 megapixel range, giving an image of 4500x3000 pixels. If you're going to put it on the web, it's likely you'll be displaying it at a size closer to 800x600. So if you downsample it before you save to jpeg, you'll save even more space (and bandwidth).
posted by CheeseDigestsAll at 3:26 PM on January 23, 2013
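As a rough rule of thumb (an approximation, since image content matters too), JPEG file size at a fixed quality scales with pixel count, so the downsampling CheeseDigestsAll describes dwarfs anything the quality slider can do:

```python
camera_px = 4500 * 3000  # ~13.5 MP straight off a modern camera
web_px = 800 * 600       # a typical blog display size
ratio = camera_px / web_px
print(round(ratio, 1))   # -> 28.1: ~28x fewer pixels before quality even enters
```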


Thanks, jepler.

Honest questions to 100%-ers: Have you ever used an image browsing app on mobile?

There are many such apps for reddit/imgur. And sometimes I just want a quick, quiet chuckle to myself without waiting 3 minutes for one r/mildlyinteresting image to load.
posted by vidur at 4:18 PM on January 23, 2013


Optimizing image sizes seems to be a lost art nowadays (uphill, both ways, snow, etc.)

A tiny (by today's standards) 15k image would add 10 full seconds delay to page loading if your user was on a 14.4kbps modem. There were all sorts of optimization tricks available for JPEGs that are still possible now, but not relevant any more.

One thing I remember was the ability to specify compression levels by area - you could tell the software to compress the background more, and compress the in-focus area (or problem areas) less. It would take several iterations until I got a result I was happy with but it would allow me to cut the load time of a page in half.

100% quality JPG has no reason to exist; use lossless PNG if you're archiving and a more heavily compressed JPG if you're displaying on the web.
posted by xdvesper at 4:44 PM on January 23, 2013
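xdvesper's 10-second figure is about right as an order of magnitude for raw transfer time (crude math, ignoring modem compression and protocol overhead):

```python
image_bytes = 15 * 1024   # a "tiny" 15k image
modem_bps = 14_400        # 14.4 kbps dial-up
seconds = image_bytes * 8 / modem_bps
print(round(seconds, 1))  # -> 8.5 seconds before any overhead
```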


If you're using 100% jpeg, you really should be using a lossless format instead.

If you're using jpeg for images that have lots of smooth gradients or sharp edges, you should be using png or possibly gif.

And the next person who sends me a web-sized 0% jpeg of their two-color company logo gets a potrace right in the eye
posted by ook at 5:03 PM on January 23, 2013


I'd like to see a comparison of Lightroom's Export with Photoshop's Save for Web and Devices. That little feature seems to do amazing things to file compression and I'd like to know what exactly it's doing that's saving so much space.
posted by girih knot at 6:23 PM on January 23, 2013


There were all sorts of optimization tricks available for JPEGs that are still possible now, but not relevant any more.

Oh sure they are still relevant. Haven't you ever used a mobile device to view a page that is overburdened with too many high rez images? I hope these space saving techniques become more widespread, and they will need to be, as Retina Display resolution becomes common.

I often see this sort of demonstration of the horrors of high compression and I wish they would learn to use Unsharp Mask to help improve image quality. Protip: you can apply different Unsharp Mask settings to different parts of the image. Pros capable of using this tip effectively will probably recognize how this will work in practice, and will know how to apply different settings to duplicate layers and merge them together using Layer Masks.
posted by charlie don't surf at 6:55 PM on January 23, 2013


Conservation is a fluid requirement. If you live in the desert, water is so precious that it's worthwhile to make a leather wineskin. But if you live on a river, it's probably a waste of effort and materials.

You have to be serving a shitload of data before transmitting insufficiently-compressed images would exceed the energy you'd spend optimizing them.

I'm a programmer, and it's sometimes almost physically painful to feel like I'm wasting memory, or that I could make that loop 20% faster if I can just work on it for another couple of hours. Then I look at the nutrition info on the soda I'm drinking, and realize that there's a threshold... eventually I reach a point where I burn more calories trying to improve efficiency than the equivalent electricity it would save.
posted by Riki tiki at 6:59 PM on January 23, 2013


Conservation is a fluid requirement. If you live in the desert, water is so precious that it's worthwhile to make a leather wineskin. But if you live on a river, it's probably a waste of effort and materials.

It's not the water. No matter where you go, life is precious.

..eventually I reach a point where I burn more calories trying to improve efficiency than the equivalent electricity it would save.

If this is how you program servers, you aren't wasting electricity, you are literally killing dozens of people.
posted by charlie don't surf at 7:29 PM on January 23, 2013 [2 favorites]


I've always used 80% and it's served me well.
posted by j03 at 10:39 PM on January 23, 2013


cdward: The interface for comparing the results was novel and effective. I wish actual export dialogs worked that way.

IAmBroom: unless I'm missing something that's not what is being demonstrated. This isn't a chain of exports, the same file is being exported at different quality settings.
No, I got that. My discussion of chain-of-exports effects was partly a derail... except that it happens if you use JPGs as intermediate files in your workflow process. Really, the chain-of-exports is just a geometric progression of a single-save's problems.
hamandcheese: And yes, IAmBroom, you are misunderstanding. This is indeed all about individual JPG export settings, as processed specifically through Lightroom's unique JPG compression algorithm, as the link makes clear.
Meh. Everything the article says is still true of any other JPG algorithm, albeit with slightly different parameters. JPGs are lossy; their losses follow a general pattern (NxN blocks of pixels share color palettes, for instance); the quality setting has a highly nonlinear effect on size (the relative change in file size from 100% to 90% is much, much greater than the change from 50% to 40%) and a somewhat nonlinear effect on visual quality (taking an image from 90% to 80% quality will more than double the image errors, but at high enough levels those errors are essentially unnoticeable for most applications).

What is different about their algorithm? Slight details. What is the important takeaway? If you care about image quality more than Aunt Tilda snapping the birthday cake does, you should probably familiarize yourself with the effects of your preferred workflow tools. That way, when you take the shot you intend to blow up to 8x10", you won't be surprised by the uncorrectable detritus scattered across your image.

For the record, I save at 95%, because I archive data on CDs/DVDs, and only keep 5% of my shots on average - so storage is dirt cheap. However, my tests with IrfanView (my 1st tool of choice for the initial cull) prove that in most cases 85% would keep me happy. The exceptions are highly-detailed backgrounds (like a wallpaper pattern).
posted by IAmBroom at 2:53 PM on January 24, 2013


If this is how you program servers, you aren't wasting electricity, you are literally killing dozens of people.

That's a hell of a thing to say and way beyond the magnitude of this topic; plus it's a misuse of "literally" (making people wait is not the same as killing them, I hope you'd agree), and it's simplistic to the point of being frequently wrong.

I once worked on a web application that (among other things) showed people detailed local weather alerts. Two weeks after it went live, there was a severe storm in the area and lots of people went to the site to get information. Had I spent my time obsessing about a few kilobytes here and there, it might not have been delivered in time to help in that situation... or worse, the added complexity might've introduced a bug that made the service fail at a crucial moment.

Bandwidth is not life. It is just one of many resources, and I'll argue that in 2013 it's rarely the most precious.
posted by Riki tiki at 1:25 PM on January 25, 2013 [1 favorite]


making people wait is not the same as killing them, I hope you'd agree

No, I absolutely agree with Steve Jobs, that waiting on PCs is literally killing people. Computers were designed to wait on people, so like Steve, it is my pet peeve when people are made to wait on computers. Every day when I go to work and wait for my Windows PC to boot, and I curse at the crapware that I don't have the admin password to uninstall, and wait for the crapware apps to time out and let the machine resume booting, and then the tedious waiting for the boot to finish, the whole process takes me about 5 minutes just to get to my work screen, and I can feel myself dying. When the machine was first delivered to my desk, and I waited and waited and waited, and when it finally loaded, I literally saw something like this, I could feel my life force being sucked from my body. Sometimes while I am waiting for my machine to boot, I make an estimate of how long I have left to live, and what percentage of it is being wasted today, right now in front of my eyes, and calculate what fractional amount of my own death has just occurred.

You know, there was a famous labor class action lawsuit against a company that had hundreds of employees in call centers who booted up their Win98 machines and had to wait for them to load completely before they could launch the timecard app and go on the clock. They sued, claiming that they wasted several minutes each morning, EVERY morning, at work at their desks and not getting paid for it, unable to do anything but watch passively as their life slowly dwindled away with every tick of the clock. The employees timed the process, calculated how many hours of wages were lost per year waiting for Windows to boot, across the entire company workforce. They sued for millions of dollars of lost wages and won easily.

And this is absolutely within the scope of this topic, which is using compression, and one of the main reasons you use compression is to make web pages load more quickly. Every day, I read websites on my original iPhone that use poorly optimized images to the point of making the site unusable. My online bank is a particularly egregious offender. It takes so long to load their damn graphics, after I log in, my iPhone will time out three times before it finishes loading the page and displays my account balance. Unless I keep tapping it to keep it awake, it will never load.
posted by charlie don't surf at 8:03 PM on January 25, 2013


I'll leave it at this, because I don't want to make this thread about me: though you may disagree on their merits, I hope you acknowledge that the above are reasonable points. Accusing someone of literally killing people, over a reasonable difference of opinion, is way out of line.
posted by Riki tiki at 6:29 PM on January 27, 2013 [1 favorite]


And you must acknowledge my points are reasonable, or at least significant enough to spend the effort to refute them. Let's consider this an ethics programming optimization effort, and consider that we probably agree on some points.

This isn't me just gratuitously calling you a murderer. Hey, we're all murderers of a sort, I do not exclude myself, and I wouldn't insult you by expecting you to consider your impact if it wasn't something I do myself. Our hardware is killing people with toxic wastes, sending them to their doom mining rare earths, and the energy we burn to run them is ruining the environment. We leave a wake of death behind us; we are even wiping out indigenous people in the rainforests so we can cut the trees and make toilet paper to wipe our asses. Every step we take squashes an insect, or people we have no more knowledge of than an insect. That's my perspective as a Buddhist: all life is created from death. But reducing your footprint, reducing your "collateral damage" of death in your wake, is a worthy goal.

This particular class action lawsuit WAS a referendum on human life. The employer said their employees' time spent waiting was worthless. They were essentially saying their lives are worthless unless spent in service of corporate efforts. A judge said the employees' time had value to them, as it represented a significant part of their lives and created an opportunity cost. Time spent waiting at work in futility was time taken away from the rest of their lives.

So please consider that, when you are doing your cost/benefit analysis of whether it's useful to continue optimization efforts. You may consider it a waste of time to spend, say, another hour of YOUR effort. But what about the USERS' effort? Your apps will probably live on long beyond the time you expect, and may have far more users than you ever know, making the ultimate cost of inefficiency incalculable. Yes, there are multiple bottlenecks beyond your control. It would be a shame if your program was the bottleneck, and people sat there waiting and waiting, wasting their lives on something you could have improved.
posted by charlie don't surf at 8:13 PM on January 27, 2013




This thread has been archived and is closed to new comments