With film, sensitivity is normally rated in ISO, ASA or DIN.
In video, the baseline is usually quoted as a minimum-illumination lux figure at 0 dB, and any increase in gain is measured in dB.
Are there any standards for dB ratings?
The reason I ask is because on a film I made yonks ago the DP had a light meter, and I now regret never asking him how he got it to work with the digital camera.
What good would it be if my light meter started asking for an ISO or ASA rating?
And if the sensitivity of different cameras at 0 dB differs, then what the light meter says may well be wrong. Unless the light meter has presets for different cameras, which I highly doubt.
So, how do you get a light meter to work with a digital video camera?
Ohh... dammit... it just occurred to me: do you input the lux sensitivity into the light meter, and then input the amount of gain (i.e. 3 dB, 6 dB, 18 dB)?
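One common workaround, as I understand it, is to treat the camera as if it had a film speed: find (or test for) the camera's effective ISO at 0 dB, then use the rule of thumb that every +6 dB of gain is roughly one stop, i.e. a doubling of sensitivity. Here's a minimal sketch of that arithmetic; the base ISO of 320 is a hypothetical value for illustration, not a figure for any specific camera:

```python
# Rule of thumb: +6 dB of gain ~ one stop ~ a doubling of sensitivity.
# base_iso is the camera's effective ISO at 0 dB (hypothetical here;
# in practice you'd get it from the manufacturer or by testing
# against a grey card).

def effective_iso(base_iso: float, gain_db: float) -> float:
    """Return the ISO-equivalent sensitivity for a given gain setting."""
    return base_iso * 2 ** (gain_db / 6.0)

if __name__ == "__main__":
    base = 320  # hypothetical native ISO at 0 dB
    for db in (0, 3, 6, 18):
        print(f"{db:>2} dB -> ISO {effective_iso(base, db):.0f}")
```

You'd then dial the resulting ISO number into the light meter as if you were shooting film at that speed.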
Edited by Daniel Ashley-Smith, 28 December 2006 - 07:13 PM.