Posted 11 June 2009 - 11:48 PM
Posted 13 June 2009 - 03:46 AM
+12dB has visible noise - I wouldn't use it unless I really had no other way of getting the shot.
Remember that because you are shooting RAW there is a lot of information captured in the dark areas that can be 'graded up' later so use the higher gain settings sparingly.
When shooting in low light situations I find it very useful to turn off the .Look (touch the screen centre right) so you can see the RAW and have a better idea of what you are capturing.
The one thing you MUST do when using the SI-2K (& especially in dark situations) is black balance, otherwise your image will have a lot of 'fixed pattern noise' which is impossible to get rid of.
I now try to do this every half hour or so when shooting, as temperature changes can affect the noise levels (usually the blue channel) in the blacks.
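The black balance described above amounts to black-frame subtraction: with the lens capped, each pixel's fixed offset is measured and later subtracted from every frame. A minimal sketch of the idea in Python, using a hypothetical 6-pixel sensor (the values and sensor model are made up for illustration, not SI-2K internals):

```python
# Sketch of black-frame subtraction ("black balance") on a toy sensor.
# fixed_pattern models per-pixel offsets (fixed pattern noise).

def capture_black_frame(fixed_pattern):
    # With the lens capped, the only signal is each pixel's fixed offset.
    return list(fixed_pattern)

def black_balance(raw_frame, black_frame):
    # Subtract the stored per-pixel offsets; clamp at zero.
    return [max(raw - black, 0) for raw, black in zip(raw_frame, black_frame)]

fixed_pattern = [2, 0, 5, 1, 3, 0]                   # hypothetical offsets
scene = [10, 10, 10, 10, 10, 10]                     # a flat grey scene
raw = [s + f for s, f in zip(scene, fixed_pattern)]  # what the chip records

black = capture_black_frame(fixed_pattern)
clean = black_balance(raw, black)
print(clean)  # the flat scene is recovered: [10, 10, 10, 10, 10, 10]
```

This also shows why re-balancing matters: if temperature drift changes the real offsets after the black frame was captured, the subtraction no longer cancels them and a residual pattern creeps back into the blacks.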
Posted 14 June 2009 - 11:53 AM
The gain is probably the most complicated issue in SI.
The first thing to remember is that this gain is not a linear (digital) gain: the voltage to the chip is changed, and therefore anything that is within "normal" exposure is "grain" free.
1. If you choose not to mess with the chip gain and use a LUT for linear gaining (like RED's ISO), then you mathematically multiply all the values within the linear image, meaning the darks get the worst degradation because they have the least information recorded. This is what we have grown to expect from video (yes, I used the horrible word here) cameras.
2. Changing the gain of the chip will give you a cleaner image, provided that you get decent exposure (keep the gain at 0dB and stop way down, take off the LUT, and you will see some "gain" as well), yet underexposure will give worse quality than less gain. There are three problems with this method:
A. CMOS chips have a lot of non-uniform variance from pixel to pixel - in non-geek: they are grainy as hell, so this is tackled by black frame subtraction (black balance, black calibration...). Now if the chip is pushed hard and the ambient temperature of the chip itself starts to wiggle, you will get static noise in the image. For example, I was shooting in 0-3C at full 12dB gain and the "black noise" would creep up all the time, sometimes in 30 seconds, sometimes in 30 minutes. The real pain is to spot the black noise in a documentary setting where you do not have a full-resolution screen and all the time in the world.
B. It will start to limit the dynamic range of the recorded data - I would say even up to 2 stops at 12dB, and you will get smearing in very bright overexposure (10+ stops). Sometimes it is better to get 8 usable stops than 10 full of grain.
C. More gain means the footage is harder to compress: more CPU load and lower CineForm quality.
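Point 1 above can be made concrete with a little arithmetic. In the shadows only a few code values are recorded, so multiplying them by a LUT gain spreads those few levels apart instead of adding real information. A small sketch (the code values are illustrative, not measured SI-2K data):

```python
# Sketch of why digital (LUT) gain degrades the darks most: the shadows hold
# only a handful of recorded levels, and multiplying stretches those levels
# apart rather than recovering detail.

def digital_gain(codes, db):
    g = 10 ** (db / 20)          # dB to linear multiplier (voltage convention)
    return [round(c * g) for c in codes]

dark_codes = [1, 2, 3, 4]        # only a few recorded levels in the shadows
boosted = digital_gain(dark_codes, 12)
print(boosted)  # [4, 8, 12, 16] - adjacent shadow levels are now 4 codes apart
```

The four original levels are still four levels after gaining; they are just pushed up into the visible range with bigger gaps between them, which is exactly the shadow degradation described in point 1.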
In documentary settings I have used 12dB extensively with about a 50/50 success rate; because of the harsh weather conditions I have had problems with static noise. (What seems weird to me is that older heads, with serial numbers under 200, seem to handle the black noise much better - gut feeling.)
I have used 6dB in commercial and feature production with 100% success; nobody has noticed the difference.
And never use -3dB: it only reduces dynamic range, meaning 100% white becomes something like 93% white, even in overexposure. Use shutter or ND instead.
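For reference, the dB figures above translate to linear multipliers and stops like this, assuming the usual voltage convention for sensor gain (20·log10, so roughly 6dB per stop) - which also lines up with the "up to 2 stops" figure quoted for 12dB:

```python
import math

def db_to_linear(db):
    # Voltage/sensor-gain convention: +6.02 dB doubles the signal.
    return 10 ** (db / 20)

def db_to_stops(db):
    # Stops are powers of two of the linear multiplier.
    return math.log2(db_to_linear(db))

for db in (-3, 0, 6, 12):
    print(f"{db:+d} dB -> {db_to_linear(db):.2f}x, {db_to_stops(db):+.2f} stops")
```

So 12dB is about a 3.98x multiplier (almost exactly 2 stops), 6dB is one stop, and -3dB throws away about half a stop of headroom.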
Hope that helps a bit
Posted 17 June 2009 - 10:12 PM
Roughly what were you rating the SI at with 6dB?