How many stops of dynamic range are enough for an HDR delivery format? For Rec.709, you can use even a 7-stop camera and it's fine, but I'm very confused about the HDR standards... 1000 nits, 10,000 nits, Dolby PQ, BBC. 10 stops? 15 stops? Will a camera like the C300, with 12 stops, still be usable? Let's say I want to buy an HDR display (the new Atomos, or any other)... if the image looks good on a 1000-nit display, is it going to look good on a 3000-nit display? Am I totally wrong? This new HDR scares me a lot! There is very little information about this new standard out there, and if anyone could explain, I think it would be very useful for many of us. Thank you!
HDR Dynamic Range
Posted 24 May 2016 - 05:50 PM
From what I've gathered, people are satisfied with over 12 stops of dynamic range when it's time to color grade. Some camera companies stretch the truth a bit with that statistic though. As for viewing the finished product, I'm not concerned with how much dynamic range there is, as long as it's not abysmal. I'm sure someone else replying to this will object to that last point.
Posted 24 May 2016 - 09:20 PM
The thing is that the standards are in flux, though we seem to be heading towards a 1000-nit brightness target. However, some OLED TVs that can achieve deeper blacks are getting away with something lower than 1000 nits at the high end, so the range is just as important as the max brightness.
In terms of grading something for 1000 nits but having it displayed on an HDR TV with 700 nits or 2000 nits of max brightness, I think it would look OK -- a bit like watching TV today, where we don't all have the same max brightness on every TV set on the market.
Right now, we don't have any cameras that deliver more than 15 stops of DR anyway; most are around 12 or 13 stops, the Alexa is 14.5 stops, and perhaps a Red camera using HDRx can achieve a bit more. And not every scene in a movie necessarily uses the maximum DR possible.
Posted 25 May 2016 - 02:53 AM
Thank you very much for the answers! So what I already shoot on a Red (not in HDRx mode) is going to be good enough for an HDR grade in the future? I suppose some old footage from a 5D (probably 9-10 stops) will not be usable for an HDR delivery, will it?
Posted 25 May 2016 - 03:56 AM
I'm not totally certain about your question, but video from a 5D would not be considered HDR (if it is, that is a really low standard for high dynamic range). Maybe if Magic Lantern were involved, having it record "RAW", it might earn that title.
Posted 25 May 2016 - 06:37 AM
Depends what sort of 5D.
Put magic lantern on a 5D mk. III and it's a very respectable camera in many ways.
Posted 25 May 2016 - 09:51 AM
A Rec.709 camera usually captures more than the 5-6 stops of brightness that a CRT monitor displays. The midtones are mostly linear, but usually some slight curve is applied at the knee and to the shadow slope to get you a bit more range on display, maybe more like 7 to 9 stops. And of course, one can show a log image with 14 stops of information on a Rec.709 display; it just looks very flat.
There is no current standard for HDR in terms of display range, but 12 stops is often tossed around.
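For what it's worth, the Dolby PQ curve mentioned in the original post has since been standardized as SMPTE ST 2084; it maps a normalized signal to absolute luminance up to 10,000 nits. A quick sketch of the EOTF (constants are from the spec; the function name is mine):

```python
def pq_eotf(n):
    """SMPTE ST 2084 (Dolby PQ) EOTF: normalized signal n in [0, 1]
    to absolute luminance in cd/m^2 (nits)."""
    m1 = 2610 / 16384          # constants defined in ST 2084
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    p = n ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_eotf(1.0))    # full signal -> 10,000 nits
print(pq_eotf(0.75))   # roughly the 1000-nit region
```

Note how most of the signal range is spent below 1000 nits, which is why a 1000-nit grade still uses the bulk of the PQ curve.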
Posted 25 May 2016 - 02:30 PM
If the dynamic range of the capture format exceeds that of the display, that's obviously best, as it gives you more flexibility in post. But I'm sure it's not a deal-breaker if it doesn't. In some ways it's quite arbitrary, since the dynamic range of capture doesn't have to map in an exact way to the dynamic range of the presentation format.
Currently we have cameras and film formats that capture a wider dynamic range than the display, and this dynamic range is compressed (through various means: gamma curves, etc.).
So the inverse is also true: if you had a camera that captured 9-10 stops of dynamic range, that could be stretched to 12 stops, or whatever the HDR display presents, if you wanted to. It would be a completely arbitrary choice decided by the colourist. It could be stretched and manipulated in a multitude of ways to best present it on an HDR screen. You'd probably have to do more work to manipulate the image to 'fit' HDR compared with a higher-dynamic-range source, but I don't see why, with care, you wouldn't get good results.
As long as the camera's dynamic range exceeds the dynamic range of the scene in front of the lens, you're not going to induce any artefacts beyond the limitations of the capture medium in general. Blacks would be mapped to blacks, whites would be mapped to whites, and contrast could potentially be stretched.
A more important element would be image bit depth: an 8-bit image is more likely to band when stretched across a 12-stop HDR display. A camera that can record 12 to 16 bits is going to give you much more room to manipulate and stretch contrast for a wider-dynamic-range display.
HDR decisions are probably going to be made in the grading suite rather than on set. I would have thought that as long as you shoot to maximise the range available to the camera, taking care not to clip the whites or crush the blacks, a well-exposed shot on a decent format (more than 8-bit) should give the colourist enough material to produce an HDR grade.
The bit depth of the HDR display is also going to have an effect: if you're mapping an 8-bit camera to a 16-bit display, it's not going to look as good as a 16-bit source. A non-Magic-Lanterned 5D would be a problem, as a high-end display would reveal the problems inherent in the codec.
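The banding point above can be made concrete. With a purely linear encoding (a simplifying assumption for illustration; real cameras use log or gamma curves precisely to soften this), the code values available per stop halve as you go down, so an 8-bit signal has almost nothing left in the shadows to stretch:

```python
# Illustrative only: count how many code values land in each stop of a
# 12-stop range for a purely linear 8-bit vs 10-bit encoding.
def codes_per_stop(bits, stops=12):
    levels = 2 ** bits
    counts = []
    for s in range(stops):
        # stop s spans linear values [2**-(s+1), 2**-s] of full scale
        hi = int(levels * 2.0 ** -s)
        lo = int(levels * 2.0 ** -(s + 1))
        counts.append(hi - lo)
    return counts

print(codes_per_stop(8))   # 128 codes in the top stop, ~0 in the shadows
print(codes_per_stop(10))  # 512 codes in the top stop, more left below
```

Stretching those bottom stops across a bright HDR display is exactly where visible banding appears, which is the argument for 12-16-bit capture (or log encodings that redistribute codes more evenly per stop).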
Posted 25 May 2016 - 02:55 PM
I've seen movies in Dolby theaters and the moment the little HDR demo is done, the black levels go up and the movie looks normal. It's the same with IMAX Laser projection, they promote it as having greater dynamic range, but I haven't seen it.
Honestly, the only reason you need high dynamic range is in camera, which gives you more options for correction in post. If you had an HDR camera, you couldn't distribute in HDR, because in the coloring process you'd be limiting the dynamics of the image; that's kinda the whole point. You bring up the blacks and lower the highlights so you can see details.
Posted 25 May 2016 - 06:14 PM
But if we have HDR TV sets, the viewer will see a difference.. much more than having a 4K TV.. Rec.2020.. The F55 has this option now, although any 12-14-stop-capable camera should do it.. nearly all cameras can shoot beyond Rec.709 these days.. the problem is the TV/display..
Posted 25 May 2016 - 07:01 PM
I'm not aware there's any particular reason laser projection should have higher dynamic range. Current laser projection techniques are best described as laser illuminated, where solid-state lasers are used as a high power light source which is diffused and then modulated using a conventional micromirror device. There are conveniences in colorimetry and optical setup because the red, green and blue channels can be directly driven by coloured light, the wavelength of which can be finely controlled. This also facilitates wavelength-division multiplexing of stereoscopic content, which is about the best and most refined way ever of doing something that's a tremendously bad idea.
But I'm not sure how any of that would affect dynamic range.
Posted 25 May 2016 - 07:14 PM
IMAX laser projection, for instance, is a double system, right? But the 2nd projector only projects a blurry image on top of the 1st projector's. This helps remove the black lines between the mirrors (pixels), which are so evident in digital projection.
I just find it funny that to make digital look pleasing, they have to blur the image! What a joke!
Posted 10 June 2016 - 07:16 AM
I've worked with colorists experimenting with "HDR" on film over 10 years ago...where you scan one pass for blacks and one for highlights then combine in much the same way as is done with still photography. Results could be impressive but also a ton of work. That was before the latest round of scanners which probably negate much or all of the gains with that method.
Would it be possible to build a camera that basically had two sensors and controlled the light to each sensor, so one would expose for the blacks and one for the highlights? Or perhaps one sensor where every other pixel is tuned to a different exposure, combining in software for greater range at lower resolution? (Or perhaps this is already being done.)
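The software side of that dual-exposure idea can be sketched in a few lines. This is just an illustrative merge of two aligned frames shot a few stops apart, with made-up names and a naive blend weight, not any real camera's pipeline:

```python
import numpy as np

# Illustrative sketch: merge a short (dark) and a long (bright) exposure
# of the same scene, assumed perfectly aligned and N stops apart.
# Pixel values are scene-linear floats, with 1.0 = the bright frame's
# clip point; the output can exceed 1.0, i.e. extended highlight range.
def merge_exposures(dark, bright, stops_apart=3.0, clip=0.95):
    gain = 2.0 ** stops_apart
    # Bring the short exposure up to the bright exposure's scale
    dark_scaled = dark * gain
    # Trust the bright frame except where it nears clipping; there,
    # hand off smoothly to the scaled dark frame
    w = np.clip((bright - clip) / (1.0 - clip), 0.0, 1.0)
    return (1.0 - w) * bright + w * dark_scaled

frame = merge_exposures(np.array([0.0625, 0.3]), np.array([0.5, 1.0]))
print(frame)   # midtone pixel kept, clipped pixel recovered above 1.0
```

In practice the hard part is the temporal or spatial offset between the two exposures (motion artifacts, halved resolution), which is the trade-off the post is asking about.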
Posted 10 June 2016 - 09:25 AM
With a 4K DLP projector showing a black field, we could see the person talking to the group lit by the light coming from the projector and throwing his shadow onto the screen, and the shadow was black compared to the black of the projector. When they switched to the 4K Dolby projector, the presenter more or less disappeared into darkness with no visible shadow cast onto the screen, the only way you could see him at all was from room ambience.
Posted 10 June 2016 - 12:40 PM
I think the issue for cinematographers is about lighting and exposure for HDR display. I imagine that one might use a different style of lighting knowing that the release would be HDR. Of course this is complicated by the fact that much of the viewing for the foreseeable future will be on "old school" non-HDR displays as well.
I think the trend lately has been to shoot HDR-style for standard-DR display. David Fincher comes to mind, beginning with "Zodiac". Here HDR capture has been squished into standard DR in color grading and adjusted to present an attractive result on a limited-DR display.
The challenge is more for colorists who will need to create a different grade for HDR vs. Standard DR. And perhaps, more than one HDR version for each HDR system as there is no standard yet.
It's also possible to create a pseudo HDR from limited DR material by selectively controlling highlights and shadows in color grading to make a version that will work on an HDR display. Even if in a limited way.
There is another issue here that I'm thinking about though: imagine a daytime interior scene with windows to the outside daylight. The old-school tradition is to let the windows roll off towards the clip point (either film or digital), diminishing the detail that's viewable through the window. This helps place the viewer's attention on the interior scene being filmed, while achieving the mood of a bright day outside.
If the HDR version has the same bright window, but contains all the detail outside, what happens to the viewer's attention? Does this lead to confusion in story telling? Can too much DR be a bad idea for story telling?
HDR demos can be kind of like stereo 3D demos. In each case the image is more like real life, and that certainly gets our attention and looks pretty cool for a few minutes. I think it's possible that 3D becomes fatiguing to watch in its own way, and I think it's possible that HDR, which resembles "real life", could be fatiguing to view as well over extended periods.
There is also the issue of screen size, and how much it fills the viewer's peripheral vision. If an HDR display fills only a small angle of view, the viewer's eyes won't be able to adjust to the bright parts of the image and it will become uncomfortable to view. In this case, a bias light might be needed to surround the screen with some ambient lighting.
I guess my point is that there are a lot of perceptual, physical, and even psychological issues involved in HDR presentation. And also, artistic issues to be dealt with. When creating the new standards for HDR, I hope that there might be a different standard for home viewing, theater viewing, and theme park ride viewing. Whatever happens, it's sure to be complicated.