Exposure on digital
Posted 23 September 2010 - 04:46 AM
Let's say our device can handle 12 stops of latitude, and our final output medium is going to have 7 stops. In digital, since the S-log curves give a low-contrast image, we are used to working with LUTs that bring the image closer to the final look we want... My question is whether it would be sensible to expose the image according to the LUT (the white levels and mid-tones) and then check the log or RAW image to see if we stay within the range the camera can handle, or whether it's better to expose using the RAW view, or the uncompressed log image.
I know it may seem obvious to use whatever the highest-quality image we have in order to get an accurate exposure, but if we have a grey point at 35 IRE and a white point at 70 IRE in S-log, and after the necessary conversion those levels are going to be adjusted to 45 and 100 IRE, it seems reasonable to start from those values for the overall exposure and then check the sensitive areas (highlights and shadows) in our true RAW signal, or whatever we have... What do you think about it?
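The mapping described above (S-log grey at 35 IRE displayed at 45 IRE, white at 70 IRE displayed at 100 IRE) is essentially a 1D LUT. As a minimal sketch, assuming just those two anchor points from the post and simple linear interpolation (a real viewing LUT is built from many more points and is not linear):

```python
# Hypothetical 1D viewing LUT using only the two levels quoted above:
# grey 35 IRE -> 45 IRE, white 70 IRE -> 100 IRE.
def viewing_lut(log_ire, points=((35.0, 45.0), (70.0, 100.0))):
    """Piecewise-linear map from log IRE to display IRE between two anchors."""
    (x0, y0), (x1, y1) = points
    # Linear interpolation (and extrapolation) between the known anchors.
    return y0 + (log_ire - x0) * (y1 - y0) / (x1 - x0)

print(viewing_lut(35.0))  # grey point -> 45.0
print(viewing_lut(70.0))  # white point -> 100.0
```

The point of the sketch is only that the LUT view and the log signal are related by a known transform, so exposure decisions made on one can be checked against the other.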
Posted 23 September 2010 - 07:17 AM
I've had people complain that they monitored the S-log output of an F-35, put some huge light on the scene, and effectively nothing happened; this is the problem with monitoring a very flat signal. It's also very off-putting to anyone else who looks at it, because all they'll see is a murky, rather washed-out image, which would quite understandably raise eyebrows.
A monitor or some external box that does switchable LUTs is more or less essential, as is time spent defining a reasonable one for viewing that everyone can be happy with.
Of course this is the perfect time to start softening up your employers as to the final look you're going for.
Posted 23 September 2010 - 02:46 PM
First, please don't confuse LATITUDE with DYNAMIC RANGE. These terms are not interchangeable and mean very different things, both of which are crucial to your understanding of this subject.
LATITUDE is the amount an image can be over- or under-exposed and still be color-corrected into an acceptable-looking image. Obviously what's "acceptable" is subjective, but let's say you determine that you can underexpose your overall image three stops before the noise and loss of shadow detail become unacceptably ugly, and that you can overexpose the image only one stop before the highlight clipping becomes objectionable. That means you have a latitude of four stops (not 12!) -- three stops of underexposure latitude, and one stop of overexposure latitude. LATITUDE is the "wiggle room," or margin of error, you have for exposure with that system.
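The arithmetic in the paragraph above can be spelled out in a trivial sketch; the margins themselves are judgment calls found by testing, not spec-sheet numbers:

```python
# Latitude is the sum of the usable under- and over-exposure margins,
# determined by testing, not the sensor's full dynamic range.
under_stops = 3  # acceptable underexposure before noise gets ugly (from testing)
over_stops = 1   # acceptable overexposure before clipping objects (from testing)
latitude = under_stops + over_stops
print(latitude)  # 4 stops of latitude, even on a 12-stop camera
```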
DYNAMIC RANGE is the full range of luminance the camera system can capture and reproduce. That's usually measured by the ability to differentiate a gray card from pure white or pure black. Obviously that's a very different measure of exposure from "latitude," as exposures at those extremes of sensitivity won't produce a recoverable image with a full range of luminance.
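Since dynamic range is quoted in stops, and each stop is a doubling of light, it can be expressed as the base-2 log of the ratio between the brightest and darkest luminance the system can distinguish. A minimal sketch, with an example ratio chosen only to illustrate (2^12 = 4096:1 corresponds to 12 stops):

```python
import math

def dynamic_range_stops(l_max, l_min):
    """Dynamic range in stops from a max/min luminance ratio."""
    return math.log2(l_max / l_min)

print(dynamic_range_stops(4096.0, 1.0))  # 12.0 stops
```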
Secondly, ALL systems, digital and analogue (film), have different imaging characteristics throughout their exposure range. Dynamic range, color hues and saturation, noise, highlight clipping and gamma curves will all change along with the exposure. It's your job as cinematographer (or DIT in your case) to get to know the imaging qualities of the system you're using, primarily by TESTING. There's no inherently right or wrong way to expose an image; just the way that manipulates the technology to produce the desired results.
So to answer your original question, you determine the exposure by finding the balance of highlight range, noise, color fidelity and other qualities -- all the way through color correction -- that gives you the look you (and your supervisors) want. In some cases you might decide to significantly underexpose the RAW or LOG image in order to preserve more shadow detail in the scene and highlight range from the sensor; in other cases you might want to nail the RAW exposure dead-on in order to preserve the fullest dynamic range from the sensor with the least noise. There's no one right way to do it.
So on a practical level, you usually expose using a LUT. That's what it's for: it's a representation of what the final color-corrected image will look like, taking into account your image adjustments, so that you're NOT confused by the flattened contrast, altered luminance, and uncorrected color of the RAW or LOG image. Use light meters and view the RAW image to verify what you're getting on the sensor, but don't set your exposure by what "looks right" in RAW or LOG view.
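The two-step practice above (judge exposure through the LUT, verify against the underlying signal) can be sketched as a simple clip check on a log-encoded frame. Everything here is hypothetical for illustration: the threshold, the IRE scale, and the toy frame are assumptions, not any camera's actual values:

```python
# Hypothetical check: what fraction of a log-encoded frame (values in IRE)
# sits at or above a chosen clip threshold? You'd set exposure by eye on
# the LUT view, then run a check like this on the log/RAW signal.
def clipped_fraction(log_frame, clip_ire=100.0):
    """Fraction of pixels at or above the clip level in the log signal."""
    flat = [p for row in log_frame for p in row]
    return sum(1 for p in flat if p >= clip_ire) / len(flat)

frame = [[40.0, 70.0], [106.0, 50.0]]  # toy 2x2 "frame" in IRE
print(clipped_fraction(frame))  # 0.25 -> one of four pixels clipped
```

In practice this is what zebras, histograms, and false-color displays on set are doing for you continuously.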
Incidentally, this is essentially what film cinematographers have been doing for decades, minus the viewable LUT on a monitor. The "LUT" is essentially in the DP's head (and on paper in the form of notes); you expose your image according to the testing you've done with the film stocks, lenses and filters, and lab and printing processes you're going to be using, not necessarily by the film's actual ASA rating.