Hi, all. Been lurking for a while - love what you've done with the place.
I saw the 4K restoration of Citizen Kane and it was as gorgeous as you could hope for (not having the OCN to compare against). I bought the Blu-ray, though, and it's grainy.
I saw 2001: A Space Odyssey in 4K and it was... only OK-looking (is it just me, or does Warner Brothers over-compress their DCPs?). But it wasn't as grainy as the Blu-ray.
The seams between the "Dawn of Man" sets and the projection screen were much less conspicuous, too.
I haven't seen Apocalypse Now on the big screen yet, but I expect it's less grainy than the Blu-ray.
What's going on? I know that film can look grainier than it is when scanned at a low resolution, but this is film looking grainier than it is when shown at a low resolution. I've also seen this with S16-originated HD video online compared to S16-originated HDTV - same stocks and display resolution (give or take a little compression). And it's the kind of grain I hate most - analog noise, basically. I can live with soft, but I hate noisy.
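To put rough numbers on the "grainier at a low resolution" part, here's a toy numpy sketch of my mental model. It uses plain white noise as a crude stand-in for grain and a made-up 4x downscale, so it only illustrates the sampling math, not any actual scanner or encoder:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for film grain: white noise with unit standard deviation
# on a 4K-ish frame (values are arbitrary; only the statistics matter).
grain = rng.standard_normal((2160, 4096))

# "Bad" downscale: just drop pixels, with no low-pass filter first.
# Every surviving pixel keeps the full grain variance, so the noise
# looks just as strong - but coarser - at the lower resolution.
naive = grain[::4, ::4]

# "Good" downscale: average each 4x4 block before decimating, which
# acts as a crude low-pass filter. Averaging 16 independent samples
# cuts the noise standard deviation by about 4x.
h, w = grain.shape
filtered = grain.reshape(h // 4, 4, w // 4, 4).mean(axis=(1, 3))

print(naive.std())     # close to 1.0 - the grain survives intact
print(filtered.std())  # close to 0.25 - the grain largely averages away
```

If a transfer (or a sloppy resize somewhere in the chain) behaves more like the first case than the second, the low-resolution copy ends up carrying the full grain energy in fewer, bigger pixels - which would match what I'm seeing.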
A few weeks ago, I saw something else that I don't understand. I was watching the (according to IMDb) S35-originated True Detective pilot and saw moiré on a piece of wardrobe. I can't think of a particular reason film couldn't moiré, but conventional wisdom holds that it won't, and, despite nearly every darn exhibition being digital now, I'd never seen it in film-originated material before.
It occurs to me that IMDb could be wrong about the format (with all the compression, it's hard to be certain, but the images did look pretty Alexa-y) but, assuming it isn't, why would I see moiré, and why hadn't I seen it before? (If you don't mind a tangent: does the Alexa moiré, and if so, why? Shouldn't oversampling prevent that?)
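For the moiré part, here's the 1-D toy I have in my head - the pattern frequency and sensor width are made up, just to show the frequency-folding arithmetic:

```python
import numpy as np

# A fine repeating pattern, like a herringbone jacket: 60 cycles
# across the frame width.
f_pattern = 60

# A "sensor" with only 100 photosites across that width. Its Nyquist
# limit is 50 cycles, so a 60-cycle pattern is too fine for it.
n = np.arange(100)
sampled = np.sin(2 * np.pi * f_pattern * n / 100)

# The unrepresentable detail folds down to |100 - 60| = 40 cycles:
# broad, slow banding that was never in the scene. That's the moiré.
spectrum = np.abs(np.fft.rfft(sampled))
peak = int(np.argmax(spectrum[1:]) + 1)
print(peak)  # 40
```

As I understand it, an optical low-pass filter or genuine oversampling (capture well above the delivery resolution, filter, then downscale) blurs away the too-fine detail before it can fold down like this - which is why I'm puzzled about seeing it at all.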
Thanks for the knowledge.