Need help understanding artifacts in film originated HD


8 replies to this topic

#1 cole t parzenn

  • Basic Members
  • 288 posts
  • Other

Posted 09 May 2014 - 08:03 PM

Hi, all. Been lurking for a while - love what you've done with the place.

 

I saw the 4K restoration of Citizen Kane and it was as gorgeous as you could hope for (not having the OCN). I bought the Blu-ray and it's grainy:

 

[screenshots: Kane 1, Kane 2]

 

I saw 2001: A Space Odyssey in 4K and it was... only OK looking (is it just me or does Warner Brothers over-compress their DCPs?). But it wasn't as grainy as the Blu-ray:

 

[screenshots: 2001 1, 2001 2]

 

The seams between the "Dawn of Man" sets and screen were much less conspicuous, too.

 

I haven't seen Apocalypse Now on the big screen yet but I expect it's less grainy than the Blu-ray:

 

[screenshots: A. Now 1, A. Now 2]

 

What's going on? I know that film can look grainier than it is when scanned at low resolutions, but this is film looking grainier than it is when shown at a low resolution. Additionally, I've seen this in S16-originated HD video online, compared to S16-originated HDTV - same stocks and display resolution (give or take a little compression). It's the kind of grain I hate most, too - analog noise, basically. I can live with soft but I hate noisy.

 

A few weeks ago, I saw something else that I don't understand. I was watching the (according to IMDB) S35-originated True Detective pilot and saw moire-ing on a piece of wardrobe. I myself can't think of a particular reason film couldn't moire, but conventional wisdom holds that it won't and, despite every darn exhibition being digital now, I've never seen it from film-originated material before.

 

It occurs to me that IMDB could have been wrong about the format (with all the compression, it's hard to be certain, but the images did look pretty Alexa-y) but, assuming it wasn't, why would I see moire and why hadn't I seen it before? (If you don't mind a tangent, does the Alexa moire, and if so, why? Shouldn't oversampling prevent that?)

 

Thanks for the knowledge.



#2 David Mullen ASC

  • Sustaining Members
  • 19759 posts
  • Cinematographer
  • Los Angeles

Posted 10 May 2014 - 03:53 AM

Film can moire when transferred to video and shown on TV if fine grid patterns in the film image interact with the fine lines of the TV.  It's just that at least with film, it is possible to use anti-aliasing filters in the telecine to reduce moire.
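
To make that mechanism concrete, here is a minimal numpy sketch (a toy 1-D stripe pattern with made-up numbers, not any real telecine): subsampling a fine pattern without filtering lets a coarse low-frequency beat appear, while low-pass filtering first largely suppresses it.

```python
# Toy illustration of moire/aliasing: a fine stripe pattern (3 px per
# cycle) subsampled by 4 aliases into a coarse beat; a crude box filter
# applied first (a stand-in for a telecine anti-aliasing filter) tames it.
import numpy as np

x = np.arange(1024)
pattern = 0.5 + 0.5 * np.sin(2 * np.pi * x / 3)   # fine "fabric" stripes

step = 4                                          # sample below Nyquist
naive = pattern[::step]                           # aliased: moire beat appears

kernel = np.ones(step) / step                     # crude low-pass filter
filtered = np.convolve(pattern, kernel, mode="same")[::step]

print("alias contrast, naive subsample:", round(naive.max() - naive.min(), 3))
print("alias contrast, filtered first :", round(filtered.max() - filtered.min(), 3))
```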

 

Modern digital restorations of film material look grainy because they are sharper; film printing and projection softens the image enough to reduce visible grain but a straight scan of a film negative shown digitally sees every bit of grain if done at a high enough resolution.
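
As a rough numerical illustration of that point (invented grain statistics, with a Gaussian blur standing in for the softening of the optical printing and projection chain):

```python
# Sketch: white-noise "grain" on a mid-grey frame. A straight scan keeps
# its full amplitude; a Gaussian blur standing in for optical printing and
# projection knocks the RMS down several-fold. All numbers are invented.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
frame = 0.5 + 0.08 * rng.standard_normal((2048, 2048))

straight_scan = frame                             # every grain preserved
print_chain = gaussian_filter(frame, sigma=1.5)   # optical softening stand-in

print("grain RMS, straight scan   :", round(float(straight_scan.std()), 4))
print("grain RMS, print + project :", round(float(print_chain.std()), 4))
```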

 

In fact, for most purists, a grainy image on blu-ray is considered preferable to one that has used digital noise reduction to smooth out some of the grain.



#3 Keith Walters

  • Sustaining Members
  • 2219 posts
  • Other
  • Sydney Australia

Posted 10 May 2014 - 04:35 AM

Another factor is that if a grainy negative is transferred to fine-grain intermediate stock, the "big" grains on the camera negative tend to get broken up into smaller pieces on the intermediate stock, which disguises the original grain somewhat.

 

Regarding Blu-Ray releases, as with so-called HDTV, just because the movie is delivered at 1920 x 1080 by no means guarantees that you're getting the resolution the format is capable of.

 

I used to do work for a large electrical retailer that sold both DVDs and Blu-Rays, and we got to look at a lot of new releases. On a frightfully large number of them, there was little obvious difference between the DVD and Blu-Ray versions of the same film. With A-B testing you could spot the difference, but not on casual viewing.

 

It's also surprising what a difference there can be between a DVD played via HDMI and the same MPEG-2 files copied (but not re-encoded) onto a USB drive and played directly by a TV with USB playback. With USB playback, the MPEG-2 data stream is decoded directly to match the TV's display; with HDMI, the data stream is first decoded to HDMI (digital RGB) and then has to be re-mapped to match the display panel. Taking out that one step makes a surprising amount of difference.
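
Whatever is actually happening inside any given TV, the general point that every extra fixed-point colour-space conversion rounds away a little information is easy to demonstrate. A sketch, using full-range BT.601-style coefficients purely for illustration (not any particular player's pipeline):

```python
# Round-trip YCbCr -> 8-bit RGB -> YCbCr with full-range BT.601-style
# coefficients. Quantising the intermediate RGB to 8 bits means the trip
# back does not land exactly where it started; an extra conversion step
# in a playback chain costs precision in the same way.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
y  = rng.integers(16, 236, n).astype(float)
cb = rng.integers(16, 241, n).astype(float)
cr = rng.integers(16, 241, n).astype(float)

# YCbCr -> RGB, then quantise to 8 bits (the lossy step)
r = y + 1.402 * (cr - 128.0)
g = y - 0.344136 * (cb - 128.0) - 0.714136 * (cr - 128.0)
b = y + 1.772 * (cb - 128.0)
r8, g8, b8 = np.clip(np.stack([r, g, b]).round(), 0, 255)

# RGB -> YCbCr again and measure what the detour cost
y2  = 0.299 * r8 + 0.587 * g8 + 0.114 * b8
cb2 = (b8 - y2) / 1.772 + 128.0
cr2 = (r8 - y2) / 1.402 + 128.0

err = np.abs(np.stack([y2 - y, cb2 - cb, cr2 - cr]))
print("mean abs error from one extra conversion:", round(float(err.mean()), 3))
```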



#4 cole t parzenn

  • Basic Members
  • 288 posts
  • Other

Posted 10 May 2014 - 11:32 AM

> Film can moire when transferred to video and shown on TV if fine grid patterns in the film image interact with the fine lines of the TV. It's just that at least with film, it is possible to use anti-aliasing filters in the telecine to reduce moire.

 

Interesting.

 

 

> Modern digital restorations of film material look grainy because they are sharper; film printing and projection softens the image enough to reduce visible grain but a straight scan of a film negative shown digitally sees every bit of grain if done at a high enough resolution.
>
> In fact, for most purists, a grainy image on blu-ray is considered preferable to one that has used digital noise reduction to smooth out some of the grain.

 

 

My Blu-rays are sharper than the DCPs I saw? Could you elaborate on this?



#5 cole t parzenn

  • Basic Members
  • 288 posts
  • Other

Posted 10 May 2014 - 11:34 AM

> Another factor is that if a grainy negative is transferred to fine-grain intermediate stock, the "big" grains on the camera negative tend to get broken up into smaller pieces on the intermediate stock, which disguises the original grain somewhat.

 

Interesting.

 

> Regarding Blu-Ray releases, as with so-called HDTV, just because the movie is delivered at 1920 x 1080 by no means guarantees that you're getting the resolution the format is capable of.
>
> I used to do work for a large electrical retailer that sold both DVDs and Blu-Rays, and we got to look at a lot of new releases. On a frightfully large number of them, there was little obvious difference between the DVD and Blu-Ray versions of the same film. With A-B testing you could spot the difference, but not on casual viewing.
>
> It's also surprising what a difference there can be between a DVD played via HDMI and the same MPEG-2 files copied (but not re-encoded) onto a USB drive and played directly by a TV with USB playback. With USB playback, the MPEG-2 data stream is decoded directly to match the TV's display; with HDMI, the data stream is first decoded to HDMI (digital RGB) and then has to be re-mapped to match the display panel. Taking out that one step makes a surprising amount of difference.

 

I haven't tried USB but I've noticed similar things.

 

Thanks again.


Edited by cole t parzenn, 10 May 2014 - 11:37 AM.


#6 KH Martin

  • Basic Members
  • 246 posts
  • Other
  • Portland, Oregon

Posted 10 May 2014 - 07:06 PM

> Film can moire when transferred to video and shown on TV if fine grid patterns in the film image interact with the fine lines of the TV. It's just that at least with film, it is possible to use anti-aliasing filters in the telecine to reduce moire.
>
> Modern digital restorations of film material look grainy because they are sharper; film printing and projection softens the image enough to reduce visible grain but a straight scan of a film negative shown digitally sees every bit of grain if done at a high enough resolution.
>
> In fact, for most purists, a grainy image on blu-ray is considered preferable to one that has used digital noise reduction to smooth out some of the grain.

I remember Harrison Ford's suit in LAST CRUSADE, when they were in Venice, doing crazy moire when I saw it on cable. I remember a lot of that on the PATTON laserdisc as well.

 

Maybe my eyes are going, but I don't see issues with grain on most of these. Issues with DNR, yeah (one reason I still don't have PATTON on BR.)

 

Doug Trumbull said he screened 2001 and BLADE RUNNER via Blu-Ray in his own deluxe home theater and that they looked as good as they ever did in any theatrical venue. That statement kinda stunned me (I remember seeing 2001 in L.A. when I was 7-1/2 and I don't think ANYTHING has ever looked so awesome ever again), but I do kind of want to defer to his judgement, since he might be the only guy still around who has seen 2001 more times than me (outside of Tom Hanks, I'd guess.) I am SO pissed that 2001 screened at the Seattle Cinerama this weekend when I couldn't get away to go up there; I haven't seen it in 70mm since 1989, and haven't seen it in 35mm in over 12 years. (really needed a fix.)



#7 KH Martin

  • Basic Members
  • 246 posts
  • Other
  • Portland, Oregon

Posted 10 May 2014 - 07:12 PM

 

> My Blu-rays are sharper than the DCPs I saw? Could you elaborate on this?

Not to step on anybody's answer here, but until the laser projection stuff starts happening, I don't imagine any current theatrical viewings are going to have the zest that a high-lambert true academy standard 35mm projection of past years would feature. Maybe dual 4K would do it? I remember talking to some IMAX guys about this stuff a couple years ago for ICG (might be the story SIZE MATTERS on their site, not sure now... yeah, towards the end there's a bit about this at: http://www.icgmagazi...3/size-matters/ )



#8 David Mullen ASC

  • Sustaining Members
  • 19759 posts
  • Cinematographer
  • Los Angeles

Posted 11 May 2014 - 11:32 AM

> My Blu-rays are sharper than the DCPs I saw? Could you elaborate on this?

 

Are you doing an apples to apples comparison -- blu-ray and DCP being projected on the same device to the same sized screen?  Or are you comparing a blu-ray seen on a hi-def monitor to a DCP being projected digitally onto a theater screen?  Generally if the blu-ray and DCP are made from the same master, the only difference is pixel resolution and color space.



#9 cole t parzenn

  • Basic Members
  • 288 posts
  • Other

Posted 11 May 2014 - 11:52 AM

 

> Are you doing an apples to apples comparison -- blu-ray and DCP being projected on the same device to the same sized screen? Or are you comparing a blu-ray seen on a hi-def monitor to a DCP being projected digitally onto a theater screen? Generally if the blu-ray and DCP are made from the same master, the only difference is pixel resolution and color space.

 

True. Maybe I'm not understanding how they're getting from the digital masters to the HD release. Taking 2001 as our example (a 1960s 65mm negative probably resolved about 4K, I figure), a pixel from my Blu-ray represents four from the DCP, before chroma subsampling. Why isn't the grain smoothed out?
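
For what it's worth, the averaging only takes you so far. A back-of-envelope numpy sketch with uncorrelated synthetic grain (made-up amplitude) shows that averaging 2x2 blocks divides the grain's RMS by sqrt(4) = 2 rather than removing it, so a 4K-to-HD downconversion still leaves plenty visible; any sharpening or compression applied afterwards works on what remains.

```python
# Averaging four uncorrelated samples divides noise RMS by sqrt(4) = 2.
# So a 2x2 box downsample from a "4K" master to HD halves the grain's
# amplitude but nowhere near eliminates it. Numbers are invented.
import numpy as np

rng = np.random.default_rng(2)
grain_4k = 0.1 * rng.standard_normal((2160, 4096))

# 2x2 box downsample: each HD pixel is the mean of a 2x2 block of 4K pixels
grain_hd = grain_4k.reshape(1080, 2, 2048, 2).mean(axis=(1, 3))

print("grain RMS at 4K:", round(float(grain_4k.std()), 4))   # ~0.100
print("grain RMS at HD:", round(float(grain_hd.std()), 4))   # ~0.050
```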

