Just got an FS7. Dumb question ahead related to color correction, so forgive me. I am new to grading LOG footage and don't have much experience with SpeedGrade or Resolve. I'm a DP, not an editor. I am only grading footage for my own reels and short things I produce myself.
I cut with Premiere CC, and I installed FilmConvert and the VisionColor Impulz pack of LUTs along with the Osiris Cinema LUTs. I watched a few tutorials on how to apply them, so I'm familiar with the basics. But there are some elements I'm confused about.
Can anyone explain the benefits and drawbacks of using FilmConvert and/or the VisionColor LUTs directly in a Premiere timeline vs. manually grading the timeline later in Resolve?
I find that I have enough flexibility with FilmConvert as far as choosing negative stocks, adjusting exposure levels, color, grain, etc. I can stop there and have a great-looking image, but I can also get a more polished, finished Hollywood look by dropping an Osiris LUT into an adjustment layer above the timeline, whose strength I can change using opacity. So it would seem that I get a finished look without doing a ton of manual grading.
But I know there's a lot more to color correction, so if anyone out there can post a more traditional workflow and its general benefits, I'd love to know. Color correction is an art, and I wouldn't want to insult pro colorists by suggesting that it's easy. I am honestly interested in knowing what degree of control I'm giving up by using this shorthand method.
I have the Impulz LUTs but I haven't used them much. They seem slightly trickier to apply, as there are various versions of each stock representing different stages of application, whereas the Osiris LUTs pretty much have one version to apply in either LOG or Rec.709. Simple.
So my question, for anyone who knows: if someone can explain how to apply the Impulz LUTs, I could try them against Osiris and see the difference. I'd also love a brief explanation of why doing all this manually in SpeedGrade or Resolve would result in a significantly better-looking image.
I haven't done my own post, but I assume you have to figure out how to apply a LUT so you can view normal Rec.709 for editing -- and output any temp cuts in Rec.709 for people to view, plus make Rec.709 dailies -- but then conform a log-edited master and create a color-corrected Rec.709 version from that.
In other words, there is a need for simply converting the log footage to Rec.709 without corrections, just for basic dailies and what you once called work-print needs, but there is also a later conversion from an edited log master that will need shot-by-shot correction and conversion to Rec.709 for the broadcast market.
That makes sense for a film or TV show's post pipeline workflow. This is for my own reel and shorter projects. I'm taking LOG footage and finishing it. But just for myself.
I'm wondering about the difference between using FilmConvert for the Rec.709 conversion, with a Kodak or Fuji stock emulation, grading, and grain built in, topped with an Osiris LUT for finishing, versus grading the LOG manually via a series of Impulz LUTs in SpeedGrade or Resolve. The former seems way easier, but I'm wondering what the tradeoff is, if any, in quality or flexibility.
I picked up an HP DreamColor display to be able to grade in Rec.709 color space, and it's a world of difference from my laptop screen. It's soooo much better. Its 14-bit engine delivers at least 10-bit color, so it's all pretty accurate, and what I'm seeing with just basic FilmConvert looks great. So I'm hard-pressed to see why I'd need to polish it in Resolve or another program.
But I know that's the usual way to do it so I'm trying to find out what I'm sacrificing with this shorthand method.
Edited by Michael LaVoie, 16 December 2014 - 08:59 PM.
You're basically asking about workflow and if you should color in the edit or color the final outputted file, correct?
If you are the guy editing and coloring, you are fine doing everything in your timeline at the time of the edit. As you said, coloring is an art form unto itself, and since the nature of filmmaking is collaborative, it would probably be best to find a good colorist to work with. Either way, you can color it in the edit, and if you later find that you can work with a colorist, just remove all the grading and LUTs so the colorist has the most information to work with.
Most editors these days apply LUTs and basic color correction to their edit just so it doesn't freak out end clients who will be reviewing it. When the edit is approved, you would generally remove the grading (sometimes transitions too!) and make it available to the colorist.
That was sort of my question. I mean, the degree of control they'll have in Resolve or SpeedGrade that I won't have via plugins. I've viewed enough tutorials online to see that it's partly about isolating different colors and shades, whereas LUTs generally affect everything. But if there is a common formula of adjustment layers and a specific order of adjustments, I'd love to know it. What are some key advantages of a Resolve workflow that you can't get in the timeline?
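To make that distinction concrete, here is a toy numpy sketch, not any product's actual implementation; the function names and the crude luma qualifier are my own illustrations. A LUT is a fixed per-pixel lookup that treats every pixel the same, while a "secondary" correction in a grading suite builds a mask and adjusts only part of the image:

```python
import numpy as np

# A 1D LUT is a fixed per-pixel lookup: every pixel with the same input
# value gets the same output value, regardless of position or context.
def apply_1d_lut(image, lut):
    """image: float array in [0, 1]; lut: 1D array of output values."""
    idx = np.clip((image * (len(lut) - 1)).astype(int), 0, len(lut) - 1)
    return lut[idx]

# A secondary correction, by contrast, builds a mask (here a crude luma
# qualifier) and applies an adjustment only where the mask is active --
# the kind of isolation a grading suite gives you that a LUT cannot.
def lift_shadows(image, threshold=0.3, amount=0.1):
    luma = image.mean(axis=-1, keepdims=True)             # crude luma proxy
    mask = np.clip((threshold - luma) / threshold, 0, 1)  # shadows only
    return np.clip(image + amount * mask, 0, 1)

img = np.random.rand(4, 4, 3)
contrast_lut = np.linspace(0, 1, 1024) ** 1.2   # a simple gamma "look"
graded_globally = apply_1d_lut(img, contrast_lut)  # affects every pixel
graded_selectively = lift_shadows(img)             # affects shadows only
```

Real products add trilinear interpolation, 3D cubes, and much better qualifiers, but the structural difference (global lookup vs. masked adjustment) is the same.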
I haven't used Premiere much. But I've done plenty of in-NLE colour correction over the years, and in my experience taking your footage into Resolve yields MASSIVELY better results. You just have much greater control over the finer points of the grade, and from what I've seen that makes a big difference to the end result.
LUTs can be a great dart on the wall as far as your grade goes, but the final tweaking of your look is what will set you apart. The fine-tuning control of a color correction suite (Resolve or SpeedGrade) vs. what is available stock in your NLE or with FilmConvert is like comparing a $2000 guitar or TV to a $200 one. It's all those subtle details that usually add up to better production value. I'm not saying I haven't just dragged and dropped LUTs onto Log footage to deliver a bit quicker, but on the projects that really matter, send them to the suite and you can flex more muscle, more delicately and with grace. You'll also start to see WHY you're shooting in Log and what benefits you're really getting from it. FilmConvert is also a bit of a resource hog. That being said, depending on your codec/framerate, sometimes a round trip to Resolve can get tricky. And finally, take my advice with a grain of salt!
FilmConvert is also a bit of a resource hog.
Yes it is! Applying it to lots of clips usually means going to sleep for the night.
I did, however, run into one small thing that makes it 10x faster. There's a little pull-down at the bottom to render with the CPU or GPU. If you have an OpenGL graphics card, set it to GPU and watch it rip through rendering CRAZY FAST.
The main difference between an NLE and Resolve is the maths. Resolve does them in 32-bit float, which permits more detailed processing and finer results.
My only NLE experience is with Final Cut Pro version 7, which is a mixed bag in regard to arithmetic precision. Some of its filters do 32-bit floating-point arithmetic and some don't. Its Color Corrector 3-way filter is a wholesome filter while its RGB filter isn't. Its Gamma filter is a wholesome filter while its Gamma Correction filter isn't. Etc. One must take great care in such an NLE not to blow one's 10-bit image fineness. (There are some funky partial workarounds like this and this.)
32-bit floating-point arithmetic should preserve an image's 10-bit gradation. So far as I can determine, the sole purpose of 10-bit vs. 8-bit is the avoidance of banding. (Even 10-bit isn't quite enough when getting near black, and if your codec is 10-bit, no ultra-precise arithmetic can improve its gradation.) It's said that 10-bit allows finer color grading than 8-bit because 10-bit can produce some colors that are distinguishable from all the colors 8-bit can produce. But the real advantage of 10-bit grading lies in avoiding banding via those bridging colors, not in the representation of extra colors in themselves.
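The banding point can be shown in a few lines of numpy. This is a toy model (a dark ramp and a crude shadow lift), not any NLE's actual pipeline, but it illustrates why quantizing early destroys gradation while grading in float and quantizing late preserves it:

```python
import numpy as np

# A smooth dark ramp, near black, stored in float.
ramp = np.linspace(0.0, 0.1, 1024)

def grade(x):
    # A strong "lift the shadows" operation: multiply by 8.
    return np.clip(x * 8.0, 0.0, 1.0)

# Path A: quantize to 8-bit first (as an 8-bit pipeline would), then grade.
quantized = np.round(ramp * 255) / 255
banded = grade(quantized)

# Path B: grade in float, quantize only at the very end.
clean = np.round(grade(ramp) * 255) / 255

# The quantize-first path ends up with far fewer distinct levels,
# i.e. visible banding in what should be a smooth gradient.
print(len(np.unique(banded)), "levels vs", len(np.unique(clean)))
```

The same logic explains why even a 10-bit source benefits from 32-bit float processing: the intermediate maths never rounds away levels that a later operation would stretch apart.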
Probably more important than gradation fineness is spatial fineness (sharpness, cleanness, etc.), for which ultra-precise arithmetic doesn't help. It's the quality of the algorithms for rescaling, rotation, deinterlacing, denoising, etc. that determines this image fineness. Is Resolve top quality in all these?
That is a tough question, and one that I can't answer with actual knowledge. The fact that a legend like you asks this question of somebody as junior as me makes me very nervous...
I can guess a bit, though. I know that Assimilate Scratch converts everything to EXR and then does its calculations on those numbers whilst allowing a choice of algorithms. Resolve does its maths in a similar space, and also offers a few choices of algorithm. These algorithms must come out of academia, unless the companies are investing in maths research, so I don't see any a priori reason why Resolve should be any worse in this area. The algebra is out there and free to access, I think.
I repeat, I'm guessing at what passes for an answer to your question, so please bear with my ramblings.
It's good to be a little nervous about the quality of our software. The more I use FCP7, the more amateurish it looks, yet it was thought OK for editing some large professional movies just a few years ago. What has changed in video software-land? The software "looks" more confident; its bells and whistles positively gleam. But is the knowledge base so much firmer, and are the software authors so much wiser and more careful than in the past decade?
In many areas software can evolve brilliantly in very few years, but there are some big burdens on video/electronic cinema software. One is that the color science is a mess. From the camera color sensors, through the coding, through all the LUTs, through the controls afforded the colorist, there is a lack of clear thinking. For example, the ACES system is idiotic.
Another is that simply having images in pixels leads to very complicated problems of appearance unknown in film photography. Even blowup and reduction are difficult. Glance at these two discussions from one small corner of the image-processing world: ImageMagick 1, ImageMagick 2. Then glance at an analysis of the algorithms used in Photoshop. The digital image science is uncertain enough, and software writers have their own bright ideas for algorithms.
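As a tiny illustration of why the resampling algorithm matters, consider downscaling a stripe pattern by naive decimation versus box averaging. This is a toy numpy sketch, nothing to do with any particular application's resampler:

```python
import numpy as np

# A 1024-pixel pattern of alternating black and white stripes: the kind of
# high-frequency detail that exposes a bad downscaler.
stripes = np.tile([0.0, 1.0], 512)

# Naive decimation: keep every 4th pixel. Because the sampling lands on the
# same phase of the pattern each time, the result is solid black -- the
# detail has aliased away entirely.
decimated = stripes[::4]

# Box averaging: average each block of 4 pixels. The stripes are too fine
# to survive a 4x reduction, but at least the mean brightness (0.5) does.
boxed = stripes.reshape(-1, 4).mean(axis=1)

print(decimated[:8])  # all zeros: the pattern collapsed
print(boxed[:8])      # uniform 0.5: average brightness preserved
```

Real resamplers (Lanczos, Mitchell, etc.) are far more sophisticated, but they are all negotiating exactly this tradeoff between aliasing, blurring, and ringing.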
Another is that there is strong impetus for shortcuts, in order to speed up image processing.
So how good is our software? This can only be known a posteriori. Users need to use it critically -- test it hard if not pry up the bonnet.
And I have to ask juniors, for the downside of "legend" status is getting old and losing one's sharp eyes.
Edited by Dennis Couzin, 09 February 2015 - 02:03 AM.
I criticized ACES earlier: see post and the post immediately following it. ACES is just a rewarming of the Luther condition from 1927! It ignores viewing conditions. Its flare-factor calculation is primitive. Its white point is not the CIE standard illuminant it says it is. It ignores the real engineering issues that have pushed cameras away from the Luther condition, and the existing metrics for measuring by how far. Take a peek at Shuxue Quan's 2002 dissertation.
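For readers unfamiliar with the Luther condition: a camera is colorimetric exactly when its three spectral sensitivities are a linear transform of the CIE colour matching functions, and the deviation can be measured by a least-squares fit, which is the spirit of the metrics mentioned above. Here is a numpy sketch using entirely synthetic stand-in curves (not real CIE or camera data):

```python
import numpy as np

wavelengths = np.linspace(400, 700, 31)

def gaussian(mu, sigma):
    return np.exp(-0.5 * ((wavelengths - mu) / sigma) ** 2)

# Stand-in "colour matching functions" (3 x 31), purely illustrative.
cmfs = np.stack([gaussian(600, 40), gaussian(550, 40), gaussian(450, 30)])

# Camera A: an exact linear mix of the CMFs -> satisfies Luther.
mix = np.array([[0.9, 0.1, 0.0], [0.1, 0.8, 0.1], [0.0, 0.1, 0.9]])
cam_luther = mix @ cmfs

# Camera B: independent curves -> violates the Luther condition.
cam_real = np.stack([gaussian(610, 35), gaussian(540, 45), gaussian(460, 25)])

def luther_residual(cam):
    """Relative residual of the best 3x3 linear fit from camera curves
    to the CMFs; zero means the Luther condition holds exactly."""
    M, *_ = np.linalg.lstsq(cam.T, cmfs.T, rcond=None)
    fit = (cam.T @ M).T
    return np.linalg.norm(cmfs - fit) / np.linalg.norm(cmfs)

print(luther_residual(cam_luther))  # ~0: Luther satisfied
print(luther_residual(cam_real))    # clearly > 0: deviates from Luther
```

The published deviation metrics (e.g. in Quan's dissertation) are more refined, weighting the error perceptually, but this is the basic mathematical idea.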
Why-oh-why does ACES exist? It can be done better, and has been done better, in CIE XYZ. Since writing my earlier posts I have learned that while the 5 nm CIE tristimulus data are widely available, CIE charges for their 1 nm data. So is the 1 nm RICD data published in SMPTE ST 2065-1:2012 a dirty way to cheat CIE of its fee?
ACES deserves its own topic, which will be fun only if one of those responsible for it participates.