Stupid HDR question


12 replies to this topic

#1 George Ebersole

  • Sustaining Members
  • 1435 posts
  • Industry Rep
  • San Francisco Bay Area

Posted 10 April 2017 - 09:41 PM

So, is HDR really all that "good"? Is anybody here authoring their projects for 4K HDR?

 

I bought one recently, and even though I've built a new rig with the latest video card and a Dell super-widescreen 4K monitor, I can't get HDR to run. And every site I go to, I get either "It's great!" or "It sucks!"

 

Anybody here have any insight?



#2 Stuart Brereton

  • Basic Members
  • 2767 posts
  • Cinematographer
  • Los Angeles

Posted 11 April 2017 - 10:43 AM

The last few movies I've shot have all had an HDR color-timing pass done on them, mostly for future-proofing rather than any immediate demand. The images did look incredible on a $30k Sony monitor in a dark room, but how well any of it would translate to a consumer HDR TV in someone's home, I have no idea.



#3 George Ebersole

  • Sustaining Members
  • 1435 posts
  • Industry Rep
  • San Francisco Bay Area

Posted 11 April 2017 - 01:00 PM

Thanks.  It seems to me that proper color adjustment can be done on the end-user / consumer side to get the desired image. Maybe I'm being naive in that, but for all the visual tweaks I've seen done to source material, I've yet to come across any image enhancement that actually makes a poor or so-so film a good film.



#4 Jack OGara

  • Basic Members
  • 52 posts
  • Cinematographer
  • Derby - UK

Posted 11 April 2017 - 05:27 PM

What CPU have you got?



#5 George Ebersole

  • Sustaining Members
  • 1435 posts
  • Industry Rep
  • San Francisco Bay Area

Posted 12 April 2017 - 02:34 AM

An i7 at 3.3 GHz with 5800 onboard cache, and 32 GB of RAM.

 

I built it no more than 8 months ago.  It was going to be my editing rig as well as my gaming rig, but if HDR is the new thing, and I can't edit or author HDR ... well.



#6 Phil Rhodes

  • Sustaining Members
  • 11645 posts
  • Other

Posted 12 April 2017 - 04:45 AM

There is no particular set of hardware specifications associated with the ability or inability to create HDR material, beyond the minimum requirements for running your NLE of choice. Rendering might take fractionally longer for some techniques, at most.

 

The problem with HDR at the moment is that the (very impressive) trade show demos given by outfits like Dolby and Sony are vastly better than what's actually likely to be sold to the public. The Sony BVM-X300 is an OLED, so it enjoys a zero black level, and it offers 1,500 nits of peak brightness; Dolby's Pulsar monitors go all the way up to 4,000 nits using an LCD panel with local backlight dimming. There's no denying that the resulting pictures are clearly and obviously a cut above standard dynamic range. It looks good. It's really nice.

 

Unfortunately, even the targets, let alone the products, for consumer-level devices are nowhere near as capable. They call for either 1,000 nits of peak brightness with a 0.05-nit black level, a target clearly aimed at LCDs, or 540 nits of peak brightness with a 0.0005-nit black level, clearly aimed at the emerging consumer-level OLEDs. This looks better than SDR, but you really have to see them side by side to be able to tell. Bear in mind that consumer TVs have been massively exceeding the hundred-or-so-nit standard associated with Rec. 709 for years, because it looks better and sells more readily. Computer monitors regularly go up to 500 or 600 nits, and consumer HDR sets may fail to even hit the 1,000-nit target, so most consumer HDR is less than a stop brighter, if at all, than conventional displays.
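For what it's worth, those targets can be compared directly in stops, since a stop is just a doubling of luminance. A quick back-of-the-envelope sketch, counting contrast ratio alone and ignoring viewing flare and ambient light:

```python
import math

def stops(peak_nits, black_nits):
    """Dynamic range in photographic stops: each stop doubles luminance."""
    return math.log2(peak_nits / black_nits)

# The consumer HDR targets quoted above
lcd_hdr  = stops(1000, 0.05)    # LCD target: 1000 nits peak, 0.05 nit black
oled_hdr = stops(540, 0.0005)   # OLED target: 540 nits peak, 0.0005 nit black
sdr      = stops(100, 0.05)     # a rough Rec. 709-era display for comparison

print(f"LCD HDR:  {lcd_hdr:.1f} stops")   # ~14.3
print(f"OLED HDR: {oled_hdr:.1f} stops")  # ~20.0
print(f"SDR:      {sdr:.1f} stops")       # ~11.0
```

The OLED tier trades a lot of peak brightness for its black level, which is why the two targets exist at all; either way, the consumer numbers sit well below the 1,500-4,000 nit monitors in the trade show demos.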

 

This is a shame, because really good HDR is, well, really good. It can be clearly and obviously better than preexisting pictures, and in a way that doesn't really cause too many problems. It's hard to dislike, even for a dedicated curmudgeon such as yours truly. Even the much less capable consumer-level implementations of it are not a bad thing: a thousand-nit TV with really decent blacks is great. It's just not as great as it could be.

Partly this is the economics of domestic TVs; partly it's the keenness of the TV industry to have a new badge to put on things without having to do more than the absolute minimum amount of work. In this respect I blame the TV manufacturers for making up a new acronym every week, which serves to dilute the impact of genuinely interesting things. Still, the bottom line is that it's generally good, and it will get better as TVs improve, particularly the peak brightness of OLEDs, but the actual deployment of it could be seen as lacking a little in ambition.

 

P



#7 Jack OGara

  • Basic Members
  • 52 posts
  • Cinematographer
  • Derby - UK

Posted 12 April 2017 - 04:50 AM

Doesn't 4K require the new Intel Kaby Lake CPUs to work? Does that apply to authoring too?

 

Might be totally wrong here; I haven't really done authoring before.



#8 Phil Rhodes

  • Sustaining Members
  • 11645 posts
  • Other

Posted 12 April 2017 - 05:01 AM

Depends what software you're using, but no, it's just a matter of processing power. 

 

Now, particular pieces of software may have specific requirements, but in the general case there's no reason a given frame size should call for a specific CPU.

 

P



#9 George Ebersole

  • Sustaining Members
  • 1435 posts
  • Industry Rep
  • San Francisco Bay Area

Posted 12 April 2017 - 12:51 PM

Thanks, Phil.

 

Honestly speaking, as one curmudgeon to another (and again, this is just for me), it seems like if any DP worth his salt is getting the image the director wanted, then there really isn't a need for new technology like HDR. I've seen demos, and my thinking is that a good image comes down to resolution technology, because you can always tweak blacks and colors in post, whether you're shooting digital or film.

 

That may sound naive, or like heresy, but I think at some point you have to decide what and how you want to present your final product to your audience. If you can get that, then you're done.

 

I won't call HDR a gimmick, but in spite of all the impressive image comparisons I've seen, it doesn't strike me as something that can't be achieved through other means.

 

Besides, fourteen grand for a 30" monitor? No thank you.



#10 Stuart Brereton

  • Basic Members
  • 2767 posts
  • Cinematographer
  • Los Angeles

Posted 12 April 2017 - 04:10 PM

> I've seen demos, and my thinking is that a good image comes down to resolution technology, because you can always tweak blacks and colors in post, whether you're shooting digital or film.
>
> I won't call HDR a gimmick, but in spite of all the impressive image comparisons I've seen, it doesn't strike me as something that can't be achieved through other means.

 

 

It can't be achieved through other means, because SDR displays don't have the dynamic range to show the full range of luminance values that the original images have. You can color time for greater detail in the highlights, but you lose the shadows, and vice versa. You are always trying to squeeze 14+ stops of detail into an 8-10 stop range.

 

It may be true that perceptually there is little difference, but that can come down to something as simple as viewing in too bright an environment.
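That squeeze can be made concrete with a toy sketch, not anything from an actual grading tool, using the extended Reinhard operator, a standard global tone-mapping curve for fitting wide-range scene luminance onto a limited display. The `white` parameter here is an assumed choice, placing the clip point four stops above diffuse white:

```python
def reinhard(x, white=16.0):
    """Extended Reinhard tone map: scene-linear luminance x (1.0 = diffuse
    white) mapped into display range [0, 1]; 'white' maps exactly to 1.0."""
    return x * (1.0 + x / white ** 2) / (1.0 + x)

# Scene values one stop apart, from 2 stops under white to 4 stops over
for ev in range(-2, 5):
    x = 2.0 ** ev
    print(f"{ev:+d} EV  scene {x:7.2f}  ->  display {reinhard(x):.3f}")

# The top stop of scene range (8.0 -> 16.0, a full doubling of luminance)
# lands between roughly 0.92 and 1.00 of display range: highlight detail
# is compressed so shadows and midtones can keep their separation.
```

A grade that instead preserved those highlights would have to steepen the curve somewhere lower down, crushing shadow separation, which is exactly the trade described above.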



#11 George Ebersole

  • Sustaining Members
  • 1435 posts
  • Industry Rep
  • San Francisco Bay Area

Posted 12 April 2017 - 07:53 PM

> It can't be achieved through other means, because SDR displays don't have the dynamic range to show the full range of luminance values that the original images have. You can color time for greater detail in the highlights, but you lose the shadows, and vice versa. You are always trying to squeeze 14+ stops of detail into an 8-10 stop range.
>
> It may be true that perceptually there is little difference, but that can come down to something as simple as viewing in too bright an environment.

 

Well, that's the whole thing: if you can't tell, then what good is it?



#12 Phil Rhodes

  • Sustaining Members
  • 11645 posts
  • Other

Posted 12 April 2017 - 08:01 PM

Well, you can tell, even on the consumer gear. But it's not as smack-you-in-the-eye gorgeous as the very high-end stuff.



#13 George Ebersole

  • Sustaining Members
  • 1435 posts
  • Industry Rep
  • San Francisco Bay Area

Posted 12 April 2017 - 09:37 PM

I don't know; my gut feeling is that watching "The Empire Strikes Back" in HDR isn't going to make me relive my boyhood.

 

When I used to watch movies on TV, I noticed they mostly had better image quality than home video.  But now I'm just not seeing the real benefits for more recent films.  Just my opinion, though.  Maybe I'll change my mind by the end of the year or something.


