Hi,
I would like to know how engineers set the output voltage of a signal in RAW. Is the maximum output voltage set according to the full-well capacity, or to the amplifier in the pixel or column?
Also, after that, in the A/D converter, is the voltage reference of the converter something that can be set, or does its resolution lock it into a certain range?
I'm Belgian, so sorry for my English; I'm not sure I'm using the right terms. I'm a student at a film school, but I'm very curious about sensors. I'm not a pro, so I'm just trying to learn.
I tuned out of video processing long before the advent of modern sensors... But most signals are amplified to levels chosen for some sort of optimum, and signal-to-noise ratio is a popular figure of merit.
Whatever that voltage is from 'no photons hitting the sensor cell' to 'overflow of the sensor' gives the total available range... There are often errors at the extremes, such as noise when only a few photons interact with the sensor, and again at the high end. The temperature of the system also affects things like noise...
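To put rough numbers on that range: a common figure of merit is dynamic range, the ratio of full-well capacity to the noise floor. Here is a back-of-the-envelope sketch; the full-well and read-noise figures below are made-up assumptions for illustration, not any particular sensor's specs.

```python
import math

# Illustrative dynamic-range estimate; both numbers are assumptions.
full_well = 40_000   # electrons at saturation (assumed)
read_noise = 4.0     # electrons RMS in the dark (assumed)

dr_ratio = full_well / read_noise       # ratio of brightest to dimmest usable signal
dr_db = 20 * math.log10(dr_ratio)       # same ratio expressed in decibels
dr_stops = math.log2(dr_ratio)          # same ratio in photographic stops

print(f"dynamic range ~ {dr_ratio:.0f}:1, {dr_db:.1f} dB, {dr_stops:.1f} stops")
```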
So all of that goes into the design 'box', where everything works with whatever trade-offs are required.
An A/D converter will produce a binary number for a given analog input. Take the span between the high and low analog voltages, divide it by the number of codes the converter can output (2 to the power of its bit count), and you get the volts, millivolts, or microvolts of each step, which corresponds to how many photons must interact with the sensor to move up one step.
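A minimal sketch of that arithmetic, assuming a hypothetical 1.0 V reference, a 12-bit converter, and a 40,000-electron full well mapped onto the full ADC range (all assumed numbers, not real specs):

```python
# Rough ADC step-size arithmetic for an image sensor pipeline.
# All numbers here are illustrative assumptions, not real sensor specs.

v_ref = 1.0          # ADC reference voltage span in volts (assumed)
bits = 12            # ADC resolution in bits (assumed)
full_well = 40_000   # full-well capacity in electrons (assumed)

steps = 2 ** bits                       # number of output codes: 4096
lsb_volts = v_ref / steps               # volts per step (one LSB)
electrons_per_step = full_well / steps  # photoelectrons per ADC code,
                                        # assuming the gain maps full well
                                        # onto the full ADC range

print(f"one LSB = {lsb_volts * 1e6:.1f} microvolts")
print(f"~{electrons_per_step:.1f} electrons per ADC step")
```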
There are any number of problems... for example, faster sampling leaves less time for filtering, which means more noise...
In any case, the fact that one gets a usable image... let alone an image that could be called a 'piece of art'... is something of a modern miracle.