RAW vs JPEG has been debated pretty much since shortly after the advent of digital post-processing software that allowed access to the RAW image data. So, once again, for fun: which is the better option?
I think JPEG can ultimately be the better format, though with one or two small qualifications.
JPEG (or .JPG) files get most of their reputation as inadequate and inferior because most self-appointed experts consider them a beginner’s format that no serious photographer would use – for no particular reason other than that with this file type, you are letting the camera and its processor make basic editing decisions, and the output is a highly compressed file with much of the original digital information absent. On the other hand, you are also guaranteeing yourself a usable image at minimum. Truthfully, most beginning photographers do start out shooting JPEG because it’s safe, and because it is the default file format for most digital cameras in the full-auto modes. This doesn’t make it just a beginner’s format, or any less usable in the final analysis. Many pro photographers use JPEGs, and more are doing so all the time.
The next step in the beginning photographer’s evolution is that they start watching videos online and maybe reading various posts and articles, wherein they discover RAW files and their supposed superiority. This is a pretty standard progression for a digital photographer, even though at that point they don’t yet fully understand what RAW files are or what to do with them.
The standard line of advice goes like this: if you are a serious photographer, you should be shooting RAW exclusively. If not, then at least set your camera to save RAW plus JPEG, so that you have the RAW files for the future, when you become more experienced at editing. A bizarre thought process. At first it seems sensible, but if you think about it, it basically says, “start out shooting junk, then when you’re better, you can shoot big-boy pictures” – an argument that breaks down rather quickly when challenged. Unfortunately, the RAW side usually wins just by force of numbers, even though the winning argument is a bit of a fallacy, as we will see in a bit.
The basic premise is that with film, the photographer had much latitude when printing the negative in a darkroom, and RAW files give the digital photographer this same (or even greater) latitude, which, when approached correctly, can give a degree of refinement to the final image.
What this doesn’t acknowledge is that most photographers in the film-only era didn’t develop or print in the darkroom. They either didn’t like the environment or the process. Darkroom work is a separate skill set from photography, and a good darkroom tech was highly valued. Many photographers left their film at the lab and then went back to shooting. Obviously there were some photographers who developed and printed their own work, but they were fewer than you might think. On the other hand, there were well-financed hobbyists who did build and use home darkrooms. Does that mean RAW files are better suited to hobbyists, then? Possibly, but probably not.
When I first started shooting seriously with digital, around 2007, there really wasn’t a lot of information on RAW editors to speak of; software that would actually open the files was scarce and expensive. So, like many people, I was a JPEG shooter and happy. I was editing some in Photoshop Elements and, overall, was pleased with the results. I still have two black-and-white prints from that time matted, framed, and hanging in our living room. Then I was away from photography for a while, and when I came back to it in 2013 I began to hear more and more about RAW files and the amazing latitude (especially regarding highlight and shadow recovery) you had when processing the image.
So, after a bit, I started shooting RAW and processing in Lightroom just like every other serious photographer – or so I thought. I have to admit that in the beginning it was interesting to spend time at the computer editing RAW files. The control I had over the image was pretty amazing. Of course, I followed a pretty standard path and found that many (if not most) of my images were over-processed (I’m being kind to myself here…), bringing me back to the old advice that just because you can do something doesn’t mean you should. So I went back to a much more minimalist style of editing, still using RAW files.
There was, however, a nagging thought in the back of my mind, and it went something like this: I used to shoot and edit JPEGs – mostly minor adjustments to exposure, contrast, and maybe a little highlight and shadow adjustment. A lot of what you might call minor “tweaking.” Why am I spending all of this time and computer power to end up at the same result? Why am I not shooting JPEG? The answer, of course, was that I was too “serious” a photographer to shoot JPEG. Serious images relied on editing skill. Wait… what?
It’s been said that if you shoot RAW, you are an editor, and if you shoot JPEG, you are a photographer. I do believe there is much truth in this. I prefer to be a photographer with some editing skills rather than the other way around. I never was a darkroom tech; why am I an editor now?
Then I started to think about it and realized a few simple truths. First, there isn’t any such thing as a true raw data file. If there were, all you would see would be a series of 1s and 0s on your screen – no image at all. The camera’s onboard processor has already interpreted those ones and zeros to form an image file. So when you open a RAW file in Adobe Lightroom, Camera Raw, or some other program, it is already a processed image – just with more information, and without the file compression that you get with a JPEG. Some cameras even offer a compressed RAW file option (hmm…). So if we think we are dealing with true, unmolested original data, we’re just kidding ourselves. Of course, this is also where the argument arises about which camera’s so-called color science is superior.
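For readers who like to see the idea concretely: the point that raw sensor data only becomes an image after software interprets it can be sketched in a few lines of Python. This is a toy illustration, not any camera’s actual pipeline – real converters use far more sophisticated demosaicing, white balance, and tone curves. The Bayer layout (RGGB) and the nearest-neighbor approach here are simplifying assumptions for the sake of the sketch.

```python
# Toy illustration: a sensor records ONE brightness value per photosite,
# behind a Bayer color filter. An RGB image only exists after software
# interprets ("demosaics") those values. Assumed pattern: RGGB.

def demosaic_nearest(raw, width, height):
    """Interpret a flat list of per-photosite values as RGB pixels by
    copying the nearest sample of each missing color (nearest-neighbor
    demosaicing, the crudest possible interpretation)."""
    def sample(x, y):
        # Clamp to the sensor edges, then read the photosite value.
        x = min(max(x, 0), width - 1)
        y = min(max(y, 0), height - 1)
        return raw[y * width + x]

    rgb = []
    for y in range(height):
        row = []
        for x in range(width):
            # In an RGGB mosaic, the color a photosite records depends
            # on the parity of its coordinates.
            if y % 2 == 0 and x % 2 == 0:      # red site
                r, g, b = sample(x, y), sample(x + 1, y), sample(x + 1, y + 1)
            elif y % 2 == 0:                    # green site, red row
                r, g, b = sample(x - 1, y), sample(x, y), sample(x, y + 1)
            elif x % 2 == 0:                    # green site, blue row
                r, g, b = sample(x, y - 1), sample(x, y), sample(x + 1, y)
            else:                               # blue site
                r, g, b = sample(x - 1, y - 1), sample(x - 1, y), sample(x, y)
            row.append((r, g, b))
        rgb.append(row)
    return rgb

# A toy 4x4 "sensor readout": just 16 brightness values, no colors at all.
raw = [10, 200, 12, 210,
       190, 30, 195, 28,
       11, 205, 13, 208,
       188, 29, 192, 31]
pixels = demosaic_nearest(raw, 4, 4)
print(pixels[0][0])  # → (10, 200, 30): an interpreted RGB pixel
```

The point of the sketch: every RGB value above is an *inference* from neighboring photosites, made by software. A RAW converter and a camera’s JPEG engine are doing the same kind of interpretation – just with different amounts of data retained afterward.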
Next, I began to think that if you use JPEG and there is less information to manipulate, then you had better get it right in camera. Then came the epiphany: this is exactly the same as color transparency (slide) film used to be. If you shot slide film, you had better get everything correct in camera, because there was no practical printing or darkroom process by which you could correct things like you could with negative film. Yes, in the pre-digital days, the ability to successfully shoot transparency film was the mark of a true pro. Yet now we think of JPEGs (which can easily be compared to slides) as amateurish. That is a serious flaw in the thought process.
Admittedly, in the early days of digital photography, JPEGs weren’t that good, and photographers really wanted to get at the RAW data so they could process their own JPEGs. I would submit that JPEGs are now exponentially better than ever, and there is no practical reason to shoot RAW except in rare cases where you may want to shoot and save both, to give yourself a safety margin in case the JPEG of a specific image fails. It happens.
Even Reuters (a kind of well-known news agency) only accepts JPEGs from their staff photographers and stringers, as part of an ethical position statement. By extension, this means that JPEGs are of high enough quality that one of the world’s premier photo news services recognizes the quality of current JPEG output.
So, then, should we be shooting JPEG? Well, it is my belief, after shooting RAW for many years, that JPEG is, day to day, the better format for most photographers and most uses.
There are caveats though. Most cameras now come with picture profiles (or something similar – it’s all in the name) that are not the same as the automatic “scene modes” you find on most cameras. Most of these profiles are adjustable to an extent, especially regarding vibrancy, contrast, and sharpness. You will have to experiment a little to find the sweet spot with these settings that gets you the result you want.
I would rather be shooting photos, or writing about photography, than sitting in front of my computer or iPad editing RAW files when, in the end, my edits come out about the same as the camera’s JPEGs – something I have confirmed by shooting JPEG and RAW at the same time.
So, in the end, for probably 80% of photographers 99% of the time, JPEG is the better solution.
OK, so maybe I was a bit overzealous at first. There are some specific types of photography where RAW files might be the better choice: landscape photography where you might use focus stacking or HDR techniques; formal portraits where you are going to do extensive editing to improve skin and adjust lighting; wedding photography, because you want the bride to look her best; and high-end real estate, architecture, or interior design photography, because you are going to be blending several images in Photoshop. These are all examples of when you, as an editor, should have as much image data as possible available.
Essentially, if Photoshop is your primary editing tool because of its power to change reality, you probably would be better off shooting RAW. If you use Lightroom from start to finish, you are in the 80%, and that makes you a photographer.
If you are so inclined let me know what you think in the comments.
-stay safe, see the world your own way, and thanks for reading.
2 thoughts on “RAW vs JPEG!”
I have been shooting for over 45 years. I learned to shoot in the analogue era with everything that went with it, developing and printing included.
So, back then you had a camera with real film, and the (better) photographer, who developed his or her own, determined the end result in the darkroom.
Today the camera is digital, and the darkroom has become a computer.
Raw is the “film” equivalent in my opinion, not that Jpeg is bad, but you can’t “develop” it without damaging it.
The famous photographers of the analog era also spent a lot of time in the darkroom developing their masterpiece, were they “editors” or “photographers”?
I think Raw or Jpeg doesn’t make you an “editor” or a “photographer” – both can work. I think the real question should be: suppose everything had to go back to analog, would you have your film developed at the camera shop, or would you do it yourself in the darkroom?
Thanks for your comment. With all due respect, we have been shooting for about the same amount of time. I learned the basics on a K1000 (a nod to Pentaxians everywhere), and I also learned everything that went with it. RAW really is not equivalent to film. When you process a RAW file, you are doing everything that you would have done with a film negative to get the print to your taste. Developing film is a rather straightforward process; the only variables are chemicals and time.
As far as the “famous photographers of the analog (film) era”: if you are talking about art photographers, which I assume you are given “masterpiece,” then yes, they spent time in the darkroom – Ansel Adams comes to mind immediately. Adams even referred to himself as a printer first and foremost. If you are talking about other types of photography, not so much. Remember, Robert Capa’s D-Day photos were allegedly damaged by a careless darkroom tech. Dennis Stock’s famous photo of James Dean in Times Square is iconic for the printing notes on the proof; as Stock was a Magnum photographer, these notes were for their darkroom tech (a position once held by Cornell Capa).
To your last question – I would send the film out for processing. I have shot thousands of E6 and K14 slides and would continue to do so; there is no point in “souping” them yourself. As far as printing goes, I’ll leave the Cibachrome/Ilfochrome to the pros.
In the end, it’s all just my opinion though. As they say, Your Mileage May Vary.