Playback frame rates for films on digital media.


16 replies to this topic

#1 George Ebersole

George Ebersole
  • Sustaining Members
  • 1265 posts
  • Industry Rep
  • San Francisco Bay Area

Posted 09 October 2014 - 02:32 PM

Another dumb question; but now that older films can be digitally scanned and played back at their intended speed, is it still necessary to transfer them for playback at 29p?  Can they not be encoded at the initial 24fps or 25fps so they can play back at the same frame rate on a multimedia monitor?

 

Or do newer HD TVs still have the same refresh rate, which dictates films be shown at a higher frame rate than what they were shot at?

 

I ask because I saw an old Western TV series this morning on my new TV, and it still looked sped up even though it was broadcast over a digital channel.  






#2 Phil Rhodes

Phil Rhodes
  • Sustaining Members
  • 11234 posts
  • Other

Posted 09 October 2014 - 03:52 PM

Sorta, kinda. This actually gets quite complicated.

 

It's not so much about the video being at the same rate as the display - the display will invariably sync to the signal it's fed by the computer or decoder. It's about the video being at the same rate as the signal.

 

On a computer, for instance, you pick a display mode when setting the thing up; the computer then works at that frame rate regardless of what you do with it, and all the video you then watch is forced into the resulting frame rate with frame duplication. This is sort of OK when you're showing (say) 25 Hz material on a 50 Hz or 100 Hz display, or 30 Hz material on a 60 Hz display, but it gets a bit messy if you have (as is very common) a 60 Hz display with 25 Hz content on it. This happens all the time and it isn't usually that objectionable. Mostly. Er. It's not great.
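To make the mismatch concrete, here's a minimal Python sketch (mine, not from the thread) of which source frame a fixed-rate display ends up repeating on each refresh; `cadence` is a hypothetical helper name:

```python
# Sketch of the frame duplication described above: on each display
# refresh, the player shows the most recent content frame available.

def cadence(content_fps, display_hz, refreshes=12):
    """Index of the content frame shown on each of the first N refreshes."""
    return [int(r * content_fps / display_hz) for r in range(refreshes)]

# 25 fps on a 50 Hz display: every frame repeated exactly twice -- clean.
print(cadence(25, 50))  # [0, 0, 1, 1, 2, 2, 3, 3, 4, 4, 5, 5]

# 25 fps on a 60 Hz display: an uneven 3-2-3 repeat pattern -- the
# "bit messy" judder.
print(cadence(25, 60))  # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3, 4, 4]
```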

 

As such, if you're watching Netflix on a computer they can do what they like, as precision per-frame display isn't really possible. I'm not sure what they actually do, mind you.

 

TV decoders (whether built in or external) tend to set up their output signal to suit what the currently-viewed signal is doing, meaning that the content, the signal to the display, and the display are all doing the same thing. A TV channel will have picked a frame rate at some point. I assume that, for instance, a channel dedicated to movies would broadcast at 24p (or 48i segmented frame, more likely) and the receiver chain should follow that. There are 720p 60 Hz sports channels and 1080p 29.97 Hz channels in the US.

 

I'm not sure whether current broadcast standards allow channels to switch frame rates on the fly, or how most consumer gear would react if they did. Certainly the tech to do it exists.

 

Bear in mind that NTSC-region DVDs of "cinema" content were always encoded at (strictly speaking) 23.976fps, with the DVD player doing the 3:2 pulldown, to avoid wasting bandwidth encoding duplicated fields. PAL-region DVDs were always encoded at 25fps, so no special treatment was required, and there were 29.97fps modes for NTSC DVDs of content originated at video frame rates.
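A side note on that PAL figure, since it bears on the "sped up" question: 24fps film on a 25fps PAL disc is conventionally just run about 4% fast. A quick back-of-envelope sketch (the 100-minute runtime is a made-up example):

```python
# PAL speed-up arithmetic: 24 fps film played at 25 fps runs ~4% fast,
# so the runtime shrinks proportionally.

film_minutes = 100                    # hypothetical 100-minute feature
speedup = 25 / 24                     # ~1.042, i.e. about 4% fast
pal_minutes = film_minutes / speedup
print(round(pal_minutes, 1))          # 96.0 -- four minutes shorter
```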

 

Things like YouTube and Vimeo support a huge range of frame rates, depending what you fancy (though last time I checked, which admittedly was a year or so ago, YouTube didn't support much more than 30fps, so you couldn't do HFR on it).

 

There is a whizzy new technology called G-sync which is intended for video games, where frames may be rendered by the games machine or computer with variable per-frame timing depending on the complexity of the scene. With this technology the monitor will accept updates whenever they're available, often up to a maximum above 100fps, which pretty much solves all your problems...

 

This is a bit of a crapshoot right now.

 

P



#3 George Ebersole

George Ebersole
  • Sustaining Members
  • 1265 posts
  • Industry Rep
  • San Francisco Bay Area

Posted 09 October 2014 - 04:29 PM

Well, I read the tech-heavy link that David Mullen posted on a similar topic in the General Discussion section, and it struck me that NTSC was still using the same 29 or 30 fps video frame rate. But since there's technology to adjust the refresh rate on more advanced multimedia monitors through software, it just seemed like the know-how would have caught up and been built into more recent televisions.

 

But all the old shows I saw in the 70s, a lot of which were reruns from shows from the 60s and 50s, all of which were US shows shot at 24fps, were essentially running at either 29 or 30 to match the video refresh rate.  They had been intentionally sped up to match the video refresh rates, and it's pretty annoying.

 

Something tells me I'm going to have to reread Mullen's link again.



#4 Perry Paolantonio

Perry Paolantonio
  • Basic Members
  • 349 posts
  • Other
  • Boston, MA

Posted 09 October 2014 - 05:23 PM

Another dumb question; but now that older films can be digitally scanned and played back at their intended speed, is it still necessary to transfer them for playback at 29p?  Can they not be encoded at the initial 24fps or 25fps so they can play back at the same frame rate on a multimedia monitor?

 

Or do newer HD TVs still have the same refresh rate, which dictates films be shown at a higher frame rate than what they were shot at?

 

I ask because I saw an old Western TV series this morning on my new TV, and it still looked sped up even though it was broadcast over a digital channel.  

 

How old was this western? Like, silent era? Because even if it's an old transfer of a sound film, it shouldn't appear to be running fast. If it was shot at 24fps, it would have been transferred to NTSC at 29.97 with pulldown - apparent speed will remain the same, but there will be repeated fields. Same *could* be the case for a transfer to 1080i.

 

If the film was not shot at 24fps, then what you may be seeing is a 1:1 frame mapping of the original film to 24fps. That would give it the Keystone Kops effect of speeding everything up.

 

-perry



#5 David Mullen ASC

David Mullen ASC
  • Sustaining Members
  • 18789 posts
  • Cinematographer
  • Los Angeles

Posted 09 October 2014 - 10:18 PM

I found a region-free blu-ray of "Smash" in France but when I tried to play it here on a blu-ray player it wouldn't play. Put it into my computer and it played but I discovered that it was running at 25 fps.

#6 George Ebersole

George Ebersole
  • Sustaining Members
  • 1265 posts
  • Industry Rep
  • San Francisco Bay Area

Posted 10 October 2014 - 02:43 AM

 

How old was this western? Like, silent era? Because even if it's an old transfer of a sound film, it shouldn't appear to be running fast. If it was shot at 24fps, it would have been transferred to NTSC at 29.97 with pulldown - apparent speed will remain the same, but there will be repeated fields. Same *could* be the case for a transfer to 1080i.

 

If the film was not shot at 24fps, then what you may be seeing is a 1:1 frame mapping of the original film to 24fps. That would give it the Keystone Kops effect of speeding everything up.

 

-perry

 

It was from 1966, or thereabouts.  I wish I could remember the name of it.



#7 George Ebersole

George Ebersole
  • Sustaining Members
  • 1265 posts
  • Industry Rep
  • San Francisco Bay Area

Posted 10 October 2014 - 02:45 AM

I found a region-free blu-ray of "Smash" in France but when I tried to play it here on a blu-ray player it wouldn't play. Put it into my computer and it played but I discovered that it was running at 25 fps.

 

That's strange.  I wonder why that would be.  I thought a player just read a continual stream of data via the laser.  Why would the frame rates matter?



#8 David Mullen ASC

David Mullen ASC
  • Sustaining Members
  • 18789 posts
  • Cinematographer
  • Los Angeles

Posted 10 October 2014 - 02:56 AM

It doesn't matter on my computer monitor but it appears to matter between my blu-ray player and my TV set through my HDMI cable.



#9 Phil Rhodes

Phil Rhodes
  • Sustaining Members
  • 11234 posts
  • Other

Posted 10 October 2014 - 05:35 AM

I'm slightly surprised that a US-purchased DVD player wouldn't play an unlocked 25p disc. There's almost certainly no technical, hardware-based reason why not. The disc reader part of it wouldn't care, and I'd be astonished if they made different versions globally with different HDMI output circuitry. It'd be unusual to find a TV that wasn't multi-standard.

 

I would assume it was deliberately deciding not to do it, based on its firmware, but I can't fathom why that would help anyone.

 

P



#10 George Ebersole

George Ebersole
  • Sustaining Members
  • 1265 posts
  • Industry Rep
  • San Francisco Bay Area

Posted 10 October 2014 - 10:33 AM

It doesn't matter on my computer monitor but it appears to matter between my blu-ray player and my TV set through my HDMI cable.

 

Why would the HDMI cable have anything to do with it?



#11 David Mullen ASC

David Mullen ASC
  • Sustaining Members
  • 18789 posts
  • Cinematographer
  • Los Angeles

Posted 10 October 2014 - 10:40 AM

I don't know why, I'm just giving all the clues I have.

 

As Phil suggests, it probably has something to do with the player being locked to Region 1/A specs.

 

My point is that if you plan on showing your transfer on a TV set through a disc player, pick a broadcast standard for the frame rate. If this is for the internet, it probably matters less what frame rate the transfer is in.



#12 David Mullen ASC

David Mullen ASC
  • Sustaining Members
  • 18789 posts
  • Cinematographer
  • Los Angeles

Posted 10 October 2014 - 10:45 AM

But all the old shows I saw in the 70s, a lot of which were reruns from shows from the 60s and 50s, all of which were US shows shot at 24fps, were essentially running at either 29 or 30 to match the video refresh rate.  They had been intentionally sped up to match the video refresh rates, and it's pretty annoying.

 

 

 

24 fps movies aren't "sped up" to be shown on 30 fps NTSC, they get a 3:2 pulldown added to spread out 48 fields over 60 fields.
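The cadence David describes can be sketched in a few lines of Python (illustrative only; `pulldown_32` is my name for it):

```python
# 3:2 pulldown: alternate film frames contribute 2 and 3 video fields,
# so 4 film frames fill 10 fields and 24 frames/sec fill 60 fields/sec,
# with no change in apparent speed.

def pulldown_32(frames):
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    return fields

print(pulldown_32(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D'] -- 10 fields
```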



#13 George Ebersole

George Ebersole
  • Sustaining Members
  • 1265 posts
  • Industry Rep
  • San Francisco Bay Area

Posted 10 October 2014 - 11:06 AM

David;

 

Yeah, I know that's the standard practice, but on the local WB affiliate (COZI TV?) a couple of the shows actually look sped up in the telecine transfer. Remember all those old Star Trek and Mission: Impossible reruns you saw; now imagine them sped up to 30fps, only with the sound adjusted so that it doesn't go up in pitch.

 

That's kind of what I'm talking about.  The odd thing is that it isn't every program, just a select few.  And I had actually seen this technique used in the 90s.

 

I don't know, maybe it's my imagination or something.  I'll have to keep an eye out for it.



#14 David Mullen ASC

David Mullen ASC
  • Sustaining Members
  • 18789 posts
  • Cinematographer
  • Los Angeles

Posted 10 October 2014 - 11:25 AM

I've never seen that, but I suppose it's possible that some TV station with an old film chain and a print could run it at 30 fps in order to make the episode run shorter and squeeze more commercials in.  But if the station didn't own the print, then that would mean they got a 30 fps video copy with a 3:2 pulldown and were just speeding up the video, which doesn't make much sense.
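The airtime arithmetic in that scenario is simple enough to sketch (the 50-minute episode length is a made-up figure):

```python
# Running a 24 fps print at 30 fps plays everything 25% fast, so a
# given episode occupies proportionally less airtime.

episode_minutes = 50                    # hypothetical episode length
sped_up_minutes = episode_minutes * 24 / 30
print(sped_up_minutes)                  # 40.0 -- ten minutes freed for ads
```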



#15 Bruce Greene

Bruce Greene
  • Basic Members
  • 393 posts
  • Cinematographer
  • Los Angeles

Posted 10 October 2014 - 11:26 AM

I have seen the same effect, I think. The old shows are sped up to allow time for more commercials. But not all the way to 30fps. I think they just drop frames here and there...
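What Bruce describes (dropping the odd frame rather than resampling everything) might look like this in a rough Python sketch; `drop_every_nth` is a hypothetical helper:

```python
# Trimming runtime by discarding every nth frame: the overall pace
# barely changes, but the programme gets shorter.

def drop_every_nth(frames, n):
    """Keep all frames except every nth one."""
    return [f for i, f in enumerate(frames) if (i + 1) % n != 0]

one_second = list(range(24))              # one second of 24 fps material
kept = drop_every_nth(one_second, 12)     # drop 2 of 24 frames (~8% shorter)
print(len(kept))                          # 22
```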

#16 George Ebersole

George Ebersole
  • Sustaining Members
  • 1265 posts
  • Industry Rep
  • San Francisco Bay Area

Posted 10 October 2014 - 12:13 PM

I've never seen that, but I suppose it's possible that some TV station with an old film chain and a print could run it at 30 fps in order to make the episode run shorter and squeeze more commercials in.  But if the station didn't own the print, then that would mean they got a 30 fps video copy with a 3:2 pulldown and were just speeding up the video, which doesn't make much sense.

 

That's kind of what I was thinking.  I've heard of it being done at radio stations (compressing talk show hosts' audio to fit in more sponsors), but I hadn't heard of it being done for TV.  But, thinking about it, I'm guessing that's the reason for it.

 

I thought it was simply to make the transfer easier, and that the side effect was a shorter show that allowed more commercials.



#17 George Ebersole

George Ebersole
  • Sustaining Members
  • 1265 posts
  • Industry Rep
  • San Francisco Bay Area

Posted 10 October 2014 - 12:15 PM

I have seen the same effect, I think. The old shows are sped up to allow time for more commercials. But not all the way to 30fps. I think they just drop frames here and there...

 

Yeah, the other reason I thought it might be done was to improve the image quality.  When you see film that's slightly sped up, the dust and other artifacts that might be on the print don't register with your eye.  I thought it might be an effort to make those shows look a little more presentable when broadcast, and thereby attract more viewers.  The upshot of which, I guess, is more commercial time.



