Could someone please confirm my math, and if I'm wrong, show me where I'm wrong. I'm having this debate with someone.

My argument is that a 720p scan of a Regular 8 frame is approximately a 5200 dpi scan. A Regular 8 frame is 0.137 inches high, so if there are 720 pixels per 0.137 inches, that works out to about 5200 pixels per full inch, making a 720p scan of a Regular 8 frame a scan resolution of roughly 5200 DPI. This would make scanning Regular or Super 8 film at 720p and upscaling to 1080p virtually indistinguishable from scanning directly at 1080p from such a tiny frame.

Thanks

The math might be right but the logic is not.

So yes:

720 pixels / 0.137 inches ≈ 5255 pixels/inch

But then:

1080 pixels / 0.137 inches ≈ 7883 pixels/inch

So in what way would it follow from the maths that scanning at 5255 pixels/inch (and upscaling) would be virtually indistinguishable from scanning at 7883 pixels/inch?

In fact the contrary conclusion follows: scanning at 5255 pixels/inch is not virtually indistinguishable from scanning at 7883 pixels/inch (and upscaling doesn't change anything, as no new information is being added to the mix). The numbers are significantly different.
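For the record, the two calculations above can be sketched in a few lines of Python (the 0.137 in frame height is the figure quoted in the original question):

```python
# Scan resolution implied by fitting N pixels across the frame height.
FRAME_HEIGHT_IN = 0.137  # Regular 8 frame height from the question, in inches

dpi_720 = 720 / FRAME_HEIGHT_IN    # about 5255 pixels/inch
dpi_1080 = 1080 / FRAME_HEIGHT_IN  # about 7883 pixels/inch

print(round(dpi_720), round(dpi_1080))
```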

Note that, mathematically, you could also have just ignored DPI altogether.

The DPI ratio of 5255:7883 is the same ratio as the pixel ratio 720:1080, because the frame height cancels out of the comparison.

So whatever difference you find (or not) between the numbers 5255 and 7883, you would also find (or not) the same difference using the numbers 720 and 1080.
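A small sketch of that cancellation: dividing the two DPI figures, the 0.137 in frame height drops out, leaving just the ratio of the pixel counts:

```python
import math

FRAME_HEIGHT_IN = 0.137  # Regular 8 frame height, inches

dpi_720 = 720 / FRAME_HEIGHT_IN
dpi_1080 = 1080 / FRAME_HEIGHT_IN

# The 0.137 cancels: dpi_720 / dpi_1080 reduces to 720 / 1080 = 2/3.
assert math.isclose(dpi_720 / dpi_1080, 720 / 1080)
print(dpi_720 / dpi_1080)
```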

The issue of distinguishability (or not) requires some further considerations.

Carl

**Edited by Carl Looper, 09 December 2013 - 06:18 PM.**