
At what resolution is film scanned for a D.I.?


22 replies to this topic

#1 Reuel Gomez

  • Basic Members
  • 257 posts
  • Other

Posted 22 July 2013 - 05:48 AM

I read an article from a while back (I'd say about 2002-2003) where some guy from EFilm said that they scan at a higher resolution, I think it was 4K, before doing a film-out at 2K. Is this true? And would that mean that a 4K D.I. is scanned at 8K resolution?

#2 David Cunningham

  • Basic Members
  • 1049 posts
  • Cinematographer

Posted 22 July 2013 - 09:33 AM

Most films are DI scanned at 2K despite the availability of 4K projection and 35mm being much better than 2K.  I was VERY disappointed to find that Les Misérables was a 2K DI.  In this day and age, it's such a waste.  Especially up close to a big screen, 2K just doesn't cut it.  So much information is lost.  Les Misérables almost looked like a digital movie to me because of it.  The grain was blocky and smudgy because it was so fine and not resolved correctly.

 

The debate rages on about what resolution scans should be done at.  But my opinion is that Super 8 needs 2K or at least HD.  Super 16 definitely needs at least 2K, but probably 4K, especially with fine-grained films.  And 35mm needs 8K, with 4K as an absolute minimum.  The reason isn't just "resolution" or "information".  Even if the film itself does not have as much resolution as a 4K or 8K scan can provide, modern film stocks have crazy fine grain (especially Vision3 50D, etc).  If you don't scan the film at a resolution that can properly resolve the grain, you lose a major part of the "film look".  The grain aliases, smudges and blocks up, and just doesn't look like it was meant to.  Even if you aren't seeing more "detail" in the image, you are getting a better image that makes full use of the soft "organic" look that is largely the reason for shooting film in the first place.

 

Depending on the scanner, many of them scan the data at a higher resolution and then output to a lower one.  For example, the Arriscan scans at 3K for a 2K output and 6K for a 4K output.  A major part of the reason for this is the limitations of area sensors like those used in the Arriscan.  Grain aliasing is also handled better: because the actual scan is 6K and then down-rezzed to 4K, the grain is cleaner.  There is also "overscanning" of the frame.  For example, the ScanStation's sensor is more than 2K wide so that it can correctly overscan the Super 8 frame and still deliver a full 2K image area.  The area sensor at Cinelab similarly scans at 3K for down-rezzing to a 2K format.
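A quick way to see why the down-rezzed grain is "cleaner" is to model it numerically. This is a toy 1-D Python sketch, not any real scanner's pipeline: grain is modeled as random noise on a flat exposure, and a crude 3:1 box-filter downsample is compared against simply discarding samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "6K" scan line: flat mid-grey exposure plus random grain noise.
fine = 0.5 + 0.1 * rng.standard_normal(6144)

# Naive decimation: keep every 3rd sample, no filtering.
decimated = fine[::3]

# Filtered down-rez: average each group of 3 samples (a crude box filter),
# standing in for the way a scanner's oversampled capture is downsampled.
filtered = fine.reshape(-1, 3).mean(axis=1)

# The averaged version has "cleaner grain": noticeably lower noise.
print(decimated.std(), filtered.std())
```

Averaging before discarding suppresses the noise energy instead of letting the high-frequency grain fold down into false coarser structure; real scanners use better filters than a box average, but the principle is the same.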

 

The Sound of Music was re-scanned from the original camera negatives a few years ago.  It was scanned at 8K for Blu-ray.  Why, you ask?  Primarily grain aliasing, plus wanting to master in a format that could be used for 4K projection down the road and for future "restorations" and re-issues.  The Blu-ray is fantastic.  But, considering it was 70mm Todd-AO, it's amazingly grainy.  There probably isn't a whole lot more real information than 2K on the film itself.  But at least they have it archived away with no question about the full detail being in the digital image.

 

Sorry, that was a ramble.  :)



#3 David Cunningham

  • Basic Members
  • 1049 posts
  • Cinematographer

Posted 22 July 2013 - 09:38 AM

Oh, and to get back to your original question...

 

Yes, most film is scanned at a higher resolution than its final output.  Much of this is for grain resolution/aliasing.  This can be especially important in a "film-out", where you are now adding another layer of film grain.  You don't want to put smudgy, muddy, funky grain on top of more grain.

 

Also, scanning at a higher resolution and then down-rezzing allows you to control a bit more of what information is lost or compressed.  And if you scan at 4K for an eventual 2K output, you can re-frame, zoom, etc.  If you scan at the final resolution from the get-go, you are stuck at that resolution OR LOWER.



#4 Phil Rhodes

  • Sustaining Members
  • 11937 posts
  • Other

Posted 22 July 2013 - 10:06 AM

It's worth being clear that the four-generation 35mm photochemical process was never capable of sharpness (properly, "acutance") equivalent to a 2K digital projection mastered from a good 2K digital source. More resolution may be required for scanning 35mm because of the need to ensure the grain is properly imaged. Likewise, filming out HD material to 35mm prints will produce results that, while entirely watchable, are not as sharp as either format can achieve on its own.

 

DI is a much abused term, often mistakenly taken to mean simply "grading" these days, but a true DI was never very kind to the sharpness of 35mm, much as filming out digitally-originated material was never very kind to it, either. It's still true, though, that film transferred to film, transferred to film, transferred to film, was never a 2K medium.

 

P



#5 David Mullen ASC

  • Sustaining Members
  • 19759 posts
  • Cinematographer
  • Los Angeles

Posted 22 July 2013 - 11:45 AM

This is an old article but shows the limitations of scanning 35mm at 2K:

http://www.creativep....com/node/44657

 

Generally there is a principle that you should sample an image at a higher resolution than the detail it contains.  A 4K scan barely allows this; 35mm film generally seems to resolve around 3K in detail, which means it is best to scan it at 4K to 6K to avoid aliasing.  Essentially you want to oversample it. Since the ARRISCAN can do 6K, a lot of archival work is now done starting with 6K scans, which are then downsampled to 4K.

 

A number of D.I.'s involve a 4K scan that is downsampled to 2K for the rest of the work.  The idea here is that even though the finished master is in 2K, a 4K scan ensures that every bit of grain in the original is faithfully reproduced.

 

There was an even bigger reason to finish at 4K if one was going to film-out the results, because there is some sharpness loss, especially if you were recording a 4K dupe negative and then striking an IP and multiple IN's to make mass release prints.  If you had started with a 2K film-out and then gone through multiple generations to release print, the results would be a bit softer (unfortunately a lot of movies did it this way.)  But now most movies are released as a 2K DCP, so the quality doesn't drop further down.

 

And some studios (Warner Bros. mainly, perhaps Sony) are pushing more for a 4K finish because they want movies that went through a D.I. archived at 4K and there are plans for 4K distribution in the future.

 

Most D.I. facilities can work at 4K; it's just that it is 4X the data to handle, and they charge more for dealing with it.  The other limitation has been that visual efx companies are averse to doing the work at 4K, particularly if the movie is 3D, because that's already 2X the work.



#6 Dirk DeJonghe

  • Basic Members
  • 605 posts
  • Industry Rep
  • Kortrijk,Belgium

Posted 22 July 2013 - 01:09 PM

I never understood why some labs absolutely wanted to make an IP/DN from a recorded negative, losing even more quality on the way. Your recorded negative is not an original, and you can make more if you need to.

 

If you sit in a seat in a theatre where you can see the entire image at once, only a few percent of the human population would be able to see the difference between 'good' 2K and 4K. Using this parameter, a professor from the Fraunhofer Institute once explained that the human eye is really no more than 3K (for someone with very good eyesight).
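That "no more than 3K" figure can be sanity-checked with a back-of-envelope visual-acuity calculation. The sketch below assumes roughly 60 resolvable pixels per degree for 20/20 vision and a seat where the screen fills the view; both numbers are illustrative assumptions, not figures from Fraunhofer.

```python
import math

# 20/20 vision resolves about one arc-minute of detail, i.e. roughly
# 60 pixels per degree of visual angle (an assumed, illustrative figure).
PIXELS_PER_DEGREE = 60

def resolvable_pixels(screen_width_m, viewing_distance_m):
    """Horizontal pixel count the eye can distinguish at this geometry."""
    angle_deg = math.degrees(2 * math.atan(screen_width_m / (2 * viewing_distance_m)))
    return angle_deg * PIXELS_PER_DEGREE

# A 12 m screen viewed from 12 m away (a seat where you see the whole image):
print(round(resolvable_pixels(12.0, 12.0)))  # -> 3188, i.e. about "3K"
```

From a seat twice as far back the subtended angle roughly halves, which is one reason 2K versus 4K is so hard to tell from the rear of the house.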



#7 David Mullen ASC

  • Sustaining Members
  • 19759 posts
  • Cinematographer
  • Los Angeles

Posted 22 July 2013 - 01:51 PM

The reason is money.  It costs something like $10,000 or less to strike a one-light IP or IN, but more to record out another copy on an Arrilaser.  I don't remember the cost -- something like $40,000 for a film-out; my prices are out of date -- but it doesn't matter: even if it only cost $10,000 more to make each IN off of the digital master rather than dupe an IP, distributors see that as spending money that they don't have to. They would question you even if it cost only $2,000 more.

 

I believe in oversampling and personally feel that it's a plus when you reach the point where you can't see the improvement -- that's the whole point: you need to get beyond the point where people can spot artifacts, pixels, etc.  2K looks decent on a modest-sized screen if you don't sit too close, but go closer and larger and you start to be aware of stair-stepped edges, etc.  Certainly I think 2K projection is on par with 35mm release print film projection -- worse for blacks, better for steadiness and sharpness -- but now we have an opportunity to make a bigger step forward in quality.  However, I'm not much of a fan of the current 4K Sony projector in theaters -- the blacks are grey compared to a film print, and even a 2K DLP projector has better blacks and contrast.



#8 Freya Black

  • Basic Members
  • 4161 posts
  • Other
  • Went over the edge... Central Europe

Posted 22 July 2013 - 02:34 PM

Most films are DI scanned at 2K despite the availability of 4K projection and 35MM being much better than 2K.

 

Just to be clear tho, while 4K projection is available, it's very rare. Even the places that have a 4K projector often aren't set up for a full 4K workflow and so wouldn't actually be able to project a 4K DCP.

 

4K DCP's are even rarer; even movies shot on Red cameras in 4K usually end up as a 2K DCP, because of the VFX or just because.

 

There is presently something of a small move towards 4K so that material will be ready for UHD TV.

 

Freya



#9 David Cunningham

  • Basic Members
  • 1049 posts
  • Cinematographer

Posted 22 July 2013 - 03:55 PM

The reason is money.  It costs something like $10,000 or less to strike a one-light IP or IN, but more to record out another copy on an Arrilaser. ... However, I'm not much of a fan of the current 4K Sony projector in theaters -- the blacks are grey compared to a film print, and even a 2K DLP projector has better blacks and contrast.

 

 

Ah!  Sony 4K is the reason the shadows in Les Misérables looked so milky.  I figured I was missing something.  I sat in this beautiful 4K "awesome" theater and watched it with disgust.  (This is ignoring the obnoxiously close-up shoulder-mount method the director used, which frequently made me sick.)  Maybe I will download the Blu-ray to see if it looks better.  (The shadow detail, that is.)



#10 David Cunningham

  • Basic Members
  • 1049 posts
  • Cinematographer

Posted 22 July 2013 - 03:57 PM

 

Just to be clear tho, while 4K projection is available, it's very rare. Even the places that have a 4K projector often aren't set up for a full 4K workflow and so wouldn't actually be able to project a 4K DCP. ...

 

Most of the major theaters around the greater Boston area are 4K Sony projection now.  There ARE some 2K theaters and some 35mm film theaters, but both are pretty rare around here now.



#11 Reuel Gomez

  • Basic Members
  • 257 posts
  • Other

Posted 22 July 2013 - 04:04 PM

 
 
Ah!  Sony 4K is the reason the shadows in Les Misérables looked so milky. ...

I have the "Les Misérables" Blu-ray and, let me tell you, it looks terrible.

#12 David Mullen ASC

  • Sustaining Members
  • 19759 posts
  • Cinematographer
  • Los Angeles

Posted 22 July 2013 - 05:27 PM

Reuel, you should probably familiarize yourself with this:

http://en.wikipedia.org/wiki/Nyquist–Shannon_sampling_theorem

 

 

The rough concept when applied to images is that you should scan them at twice the level of detail in the image to avoid aliasing.  However, since images rarely contain nothing but a field of fine lines in a grid pattern, rarely does 100% of the image reach peak resolution, and a moving image has random grain patterns that change with each frame, aliasing artifacts in a scan happen less often than they do with something shot digitally, assuming the scan was done at sufficient resolution.  Hence why, even though 35mm is more like 3K in terms of fine detail recorded, and therefore should be scanned at 6K to satisfy the Nyquist criterion, the truth is that often even a 2K scan is OK in terms of not having problems with aliasing, though it can happen (however, you can apply anti-aliasing filters to the process to minimize the problem, though at the risk of softening fine detail too much.)   In general, it's better to oversample, the main problems just being cost, time, and the unwieldiness of large files... there comes a point of diminishing returns, which is one reason why it has become more common to scan at a higher resolution and then downsample for the rest of the process.  I mean, film has no pixels, no pixel resolution, and you could scan 35mm at 15K or 150K if you wanted; it's just that above a certain point you aren't capturing any more detail, so it's a waste.
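The folding that the sampling theorem describes is easy to demonstrate in one dimension. In this Python sketch (a generic illustration, not a scanner simulation), a 40-cycle pattern sampled well above the Nyquist rate keeps its true frequency, while undersampling folds it down to a false, coarser pattern:

```python
import numpy as np

FREQ = 40  # cycles of fine "grain-like" detail across the frame

def sample(n):
    """Sample the 40-cycle pattern with n evenly spaced samples."""
    x = np.linspace(0, 1, n, endpoint=False)
    return np.sin(2 * np.pi * FREQ * x)

def dominant_cycles(samples):
    """Apparent cycles across the frame = strongest FFT bin."""
    spectrum = np.abs(np.fft.rfft(samples))
    spectrum[0] = 0  # ignore the DC term
    return int(np.argmax(spectrum))

# 128 samples (> 2 * 40): the pattern keeps its true frequency.
print(dominant_cycles(sample(128)))  # -> 40
# 64 samples (< 2 * 40): the pattern aliases down to 64 - 40 = 24 cycles.
print(dominant_cycles(sample(64)))   # -> 24
```

That false low-frequency pattern is the same mechanism behind "blocky, smudgy" grain: detail the scan cannot represent doesn't disappear, it comes back at the wrong scale.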

 

Because film detail exists on grains, and those grains are in different spots on each frame, there is a feeling that perceptible film resolution during projection/display has to be considered as some sort of vague average of detail over a couple of the frames, which is why often a freeze frame not only looks grainier but feels a little softer.  I don't know if this is a very scientific theory, what some might call temporal resolution, because I don't know if the brain really averages detail on multiple frames flashed at them in succession at only 24 fps.  But there is some feeling that merely looking at single frames as a way of measuring resolution doesn't give us the whole picture.  

 

There is also the other factor that when our eyes see grain in sharp focus on the screen, we think the image has more detail and resolution than it really has because our eyes see something tiny and sharp -- it's just that it's not detail in the subject, it's grain.  It's one reason why something shot on 50 ASA film can feel softer than something shot on 500 ASA film, or why some things shot on film look sharper than something shot digitally when measurements using line resolution charts don't show this.



#13 Reuel Gomez

  • Basic Members
  • 257 posts
  • Other

Posted 22 July 2013 - 06:29 PM

Reuel, you should probably familiarize yourself with this:
http://en.wikipedia....ampling_theorem
...

Is there any way to calculate film resolution?

#14 David Mullen ASC

  • Sustaining Members
  • 19759 posts
  • Cinematographer
  • Los Angeles

Posted 22 July 2013 - 07:32 PM

People shoot tests of line resolution charts and then scan the frames at a higher resolution than the film could probably achieve, then count the lines that are resolvable on the chart (I suppose you could try holding the negative under a loupe, or just project the negative onto a big screen and count the lines resolved.)

 

In theory, I suppose resolution is limited by the size of the smallest grain but they are too randomly scattered about and vary too much in size to count how many grains there are across a frame to determine horizontal resolution.

 

I think Kodak was the first to say that 35mm negative was 4K across, back when they developed the Cineon scanner and file format in the late 1980's, but again, the problem is that you have to differentiate between optimal scanning resolution and image detail, the first should be higher than the second.
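A rough way to connect resolving power to a digital "K" figure: under the sampling principle discussed above, each line pair needs at least two pixels, so an equivalent pixel count is 2 x (lp/mm) x (frame width in mm). The figures below -- an 80 lp/mm real-world resolving power and a ~24.9 mm Super 35 aperture width -- are ballpark assumptions for illustration, not numbers from Kodak:

```python
def pixels_across(lp_per_mm, frame_width_mm):
    """Nyquist-minimum horizontal pixel count: 2 pixels per line pair."""
    return 2 * lp_per_mm * frame_width_mm

# ~80 lp/mm resolved over a ~24.9 mm wide Super 35 frame:
print(pixels_across(80, 24.9))  # -> 3984.0, i.e. roughly a "4K" scan
```

The same arithmetic with the much higher lp/mm figures sometimes quoted for high-contrast test charts lands well above 6K, which is one way to see why "optimal scanning resolution" and "real-world image detail" are different numbers.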

 

The first digital restoration of a 35mm movie was done at 4K, of Disney's "Snow White and the Seven Dwarfs", in 1993 by Kodak Cinesite.  The movie had been shot in successive-frame 3-color (not 3-strip) Technicolor onto b&w panchromatic negative.

 

Take it with a grain of salt if you want, but Red has shot resolution tests of 35mm and said that the most amount of detail they ever saw resolved was maybe 3.5K (?) -- I can't remember the exact figure, maybe it was 3.2K.

 

I saw a 4K test comparing 2.88K ArriRaw from the Alexa to a 6K scan of Super-35 250T Kodak negative, both shots finished at 4K and digitally projected at 4K on a large screen on the Sony lot... and they looked rather similar in terms of fine detail (it was of a scene, not a chart).  If anything, the Alexa shot was marginally finer-detailed but that was just a rough impression.

 

I've done some D.I.'s using 4K scans of 35mm anamorphic versus 2K scans, filmed them out at 4K to 35mm, and projected them next to 35mm anamorphic contact-printed... my general impression was that there was resolution loss from the 2K D.I. when it was recorded out to film compared to contact-printed 35mm, but the 4K film-out looked closer to the contact print.  And since then, looking at my own footage on 2K projectors, my gut tells me that 35mm is more like a 3K format in terms of visible resolution, and it can fall below that easily when using softer lenses or the shot is washed out or the focus is slightly off, etc.

 

The numbers can be misleading; I don't think this is an exact science, and there are so many factors at work in terms of the perception of resolution or sharpness.  All that matters to me is working with a ballpark figure; exact figures are mostly misleading marketing tools that have little to do with real-life shooting conditions.

 

Keep repeating to yourself: FILM DOES NOT HAVE PIXELS SO IT DOES NOT HAVE PIXEL RESOLUTION.  Real resolution is probably measured by MTF or something but I can never quite understand that subject. You can take a crack at it:

http://en.wikipedia....ansfer_function



#15 Phil Rhodes

  • Sustaining Members
  • 11937 posts
  • Other

Posted 22 July 2013 - 08:10 PM

Ah!  Sony 4k is the reason Les Miserable shadows were so milky looking.  

 

The Sony 4K projection most people seem to be talking about uses what they call "SXRD", for something like Silicon X-tal (for "crystal") Reflective Display, which the rest of the world calls LCOS, or liquid crystal on silicon. JVC calls it D-ILA, etc, etc. This is effectively a TFT display much the same as the one you're looking at as you read these words, but bonded onto a mirror and designed to operate as a reflective rather than transmissive device. This allows it to be aggressively heatsinked and to deal with much higher power levels than a transmissive TFT, because the light doesn't have to pass through it (or at least, not very far through it).

 

As such, it suffers much the same problems achieving a low black level as any other TFT. That said, the JVC HD2K projector was probably the first affordable projector that I ever saw the Truelight colour calibration system agree had a sufficiently good black level to bother calibrating, so there isn't anything too atrociously wrong with it as a technology.

 

It is certainly much cheaper than DLP, which can in theory achieve an absolute zero black level.

 

P



#16 Charles Zuzak

  • Basic Members
  • 74 posts
  • Other
  • Pittsburgh, PA

Posted 07 August 2013 - 08:11 PM

Take it with a grain of salt if you want, but Red has shot resolution tests of 35mm and said that the most amount of detail they ever saw resolved was maybe 3.5K (?) -- I can't remember the exact figure, maybe it was 3.2K.

 

Sadly, with most things Red, you do have to take it with a grain of salt.

 

I read the same statement: they claim that the sharpest 35mm stock only resolves to 3.2K.

 

The problem, however, is not so much the statement, but rather, the lack of any evidence or empirical data.

 

They never stated what stock was used, what telecine/data scanner was used to scan it, or what lenses were used on the camera body.

 

I'd like to believe that what Red has said is true, but without that data it's not possible to know.


Edited by Charles Zuzak, 07 August 2013 - 08:11 PM.


#17 steve waschka

  • Basic Members
  • 204 posts
  • Industry Rep
  • Indian Harbour Beach, FL

Posted 11 December 2013 - 09:35 AM

I think it may be important to add that it also depends on the gauge of the film stock you are using. The grain size does not change from gauge to gauge of the same formulation, yet the size of the same shot will; i.e. a close-up of an object that fills a 35mm frame will cover less film area on 16mm. So you can get away with a bit less scanner resolution for 16mm than you would choose based on a linear graph of the resolution required for 35. Couple that with the larger grain size and distribution in the older stocks used for small gauges, and it's no wonder 8mm has gone away. Which is sad, because I much prefer the visual texture of the film stock to that of a handheld video camera I can carry around. Give me Vision3 for my Bolex H8 and I'd be way happier than using a little electronic whizbang. It's not like we do that stuff every day, and it didn't really cost that much. We really blew it voting with our dollars, consumers! In the outdoor industry, it's what has happened to flyfishing and traditional archery: it's a bit of work, so we don't do it.
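Steve's point -- grain size is fixed by the stock while the frame shrinks with the gauge -- can be put into numbers with the two-pixels-per-line-pair rule discussed earlier in the thread. The frame widths and the 80 lp/mm figure below are ballpark assumptions for illustration only:

```python
# Approximate image widths in mm (assumed nominal values).
FRAME_WIDTH_MM = {
    "Super 8": 5.79,
    "16mm": 10.26,
    "Super 16": 12.52,
    "Super 35": 24.9,
}

LP_PER_MM = 80  # assumed resolving power of the stock; same for every gauge

for gauge, width in FRAME_WIDTH_MM.items():
    pixels = 2 * LP_PER_MM * width  # two pixels per line pair (Nyquist)
    print(f"{gauge}: ~{pixels:.0f} pixels across")
```

Under these assumptions Super 8 lands near 1K, Super 16 near 2K, and Super 35 near 4K, which lines up with the scan recommendations earlier in the thread.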


Edited by steve waschka, 11 December 2013 - 09:39 AM.


#18 steve waschka

  • Basic Members
  • 204 posts
  • Industry Rep
  • Indian Harbour Beach, FL

Posted 11 December 2013 - 09:55 AM

And back to David's point about moving grain... I shoot sports, soccer predominantly. It seems I am often happier with full-size-sensor still shots than I am with scanned 35mm. Now, I'm usually pushing the light limits, using fast film and needing high shutter speeds, but the point is that's the worst-performing aspect of film, so it exaggerates the effect to make it more obvious. The OPPOSITE holds true for motion for me, so it has to have something to do with the fact that the grain structure is moving. A line is broken by grain only for 1/24th of a second, and then it's all in another place, and the edges are sharp without the grain. So what you perceive seems to be a homogeneous blend of broken sharpness that the brain averages out. Moving 35 looks way better than still 35, and I have NO idea how you quantify that. BUT I would be very careful how we vote with our dollars. Kodak seems to be OK right now, but it is a hard world for them to survive in.


Edited by steve waschka, 11 December 2013 - 09:56 AM.


#19 Will Montgomery

  • Sustaining Members
  • 2030 posts
  • Producer
  • Dallas, TX

Posted 11 December 2013 - 01:58 PM

I would love to see a 70mm (65mm) first-generation print projected against a 4K projection of the same material scanned at 8K and downsampled to 4K. Curious what the difference would be. I've seen 70mm prints, but they were usually pretty beat up by the time I saw them years ago.



#20 Stuart Brereton

  • Basic Members
  • 3057 posts
  • Cinematographer
  • Los Angeles

Posted 11 December 2013 - 03:46 PM

I shoot a lot of 6x7 medium format stills, which are fairly close in size to a 70mm neg. I usually scan at 6-8K resolution, but my feeling is that I could easily go as high as a 12K scan and still be retrieving useful information.
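For what it's worth, the two-pixels-per-line-pair arithmetic discussed earlier in the thread supports that feeling. With a nominal 6x7 image area of about 56 x 67 mm and an assumed ~80 lp/mm for a fine-grained still stock (both illustrative numbers, not measurements):

```python
def pixels_across(lp_per_mm, width_mm):
    """Nyquist-minimum pixel count: two pixels per line pair."""
    return 2 * lp_per_mm * width_mm

# The ~67 mm long edge of a 6x7 negative at an assumed 80 lp/mm:
print(pixels_across(80, 67))  # -> 10720, on the order of an 11-12K scan
```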



