Setting the Record Straight


8 replies to this topic

#1 Ted Johanson

  • Basic Members
  • 46 posts
  • Student
  • Wisconsin

Posted 30 August 2007 - 02:36 AM

I noticed that some person on this forum (who will remain unnamed) made the following ridiculous statement:

Modern debayering...easily extracts a good 75-90% of the cmos array resolution.


I intend to put that lie to death with the truth of real-world examples.

Please visit the site below to see resolution tests comparing a Bayer-based sensor to a true resolution sensor.

http://www.ddisoftware.com/sd14-5d/

These resolution tests involved a Canon 20D (8.2 megapixels) and a Sigma SD14 (4.6 megapixels). The difference between these two cameras is that the SD14 uses a Foveon sensor, which is capable of full-color sampling at each pixel site. The 20D can only sample one color at each pixel site.

Mathematics would suggest that the resolution of a Bayer-filtered image is only worth half of the advertised pixel count. That is because half of the "pixels" on the chip are green. Why are there more green than red or blue? Because green is more important to the human visual system when perceiving detail. That's why Bryce Bayer chose to give more resolving power to the green component. However, green pixels aren't always "stimulated", thereby leaving the red or blue "pixels" to fend for themselves. In that case, the resolution can be as low as 1/4 the advertised pixel count. Can the human visual system tell the difference in that case? Yes!
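
To make those fractions concrete, here's a minimal sketch in Python, assuming an RGGB Bayer tiling on the 20D's commonly quoted 3504 x 2336 photosite grid, that counts how many samples each color channel actually gets:

    # Minimal sketch: count samples per color channel on an RGGB Bayer mosaic.
    # The 3504 x 2336 grid is the 20D's commonly quoted photosite count.
    WIDTH, HEIGHT = 3504, 2336

    def bayer_channel(x, y):
        # RGGB tiling: even rows alternate R, G; odd rows alternate G, B
        if y % 2 == 0:
            return 'R' if x % 2 == 0 else 'G'
        return 'G' if x % 2 == 0 else 'B'

    counts = {'R': 0, 'G': 0, 'B': 0}
    for y in range(HEIGHT):
        for x in range(WIDTH):
            counts[bayer_channel(x, y)] += 1

    total = WIDTH * HEIGHT
    for ch in 'RGB':
        print(f"{ch}: {counts[ch]:,} of {total:,} photosites ({counts[ch] / total:.0%})")

Green gets half the photosites, red and blue a quarter each, which is where the "half" and "1/4" figures above come from.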

Even if the human visual system couldn't tell the difference in resolution between a two megapixel red image and a four megapixel green image, that still doesn't negate the fact that such poor sampling resolution is a farce for something that is supposedly so perfect. What would happen if you wanted to use the red component only for a black and white conversion? Or what if you wanted to enlarge the image, whether digitally or by physically stepping closer to it? The decreased resolution would stick out like a sore thumb. It doesn't matter whether a human can notice the difference or not, because certain other (computer) processes need all the resolution they can get.

So, according to the methodology outlined above, an 8 "megapixel" Bayer sensor really only has 4 megapixels of true resolution at best. Furthermore, Bayer pattern sensors require low-pass (blur) filters to reduce color artifacts on edges. This further reduces the ability of the sensor to resolve fine detail.

This all seems to ring true judging by the test images. At the site linked above, scroll down to the resolution pinwheels. Note that the Canon 20D, which supposedly has almost twice as many pixels, really only has resolution equivalent to the SD14 in the white and green quadrants of the pinwheel. Meanwhile, the Sigma SD14 - in spite of its lower pixel count - clearly outresolves the 20D in the red and blue quadrants of the pinwheel! Also of note is that the 20D's images are significantly softer across the board!

With all that said, anyone with a decent pair of eyes should be able to see that an image from a Bayer-based digital camera - when viewed at 1:1 pixel magnification - is nowhere near as sharp as the same image viewed at 1:2 pixel magnification (50% zoom). Doesn't that tell them something? It should be obvious that real pixels can do a lot better.

#2 Stephen Williams

  • Sustaining Members
  • 4708 posts
  • Cinematographer
  • Europe

Posted 30 August 2007 - 03:36 AM

I noticed that some person on this forum (who will remain unnamed) made the following ridiculous statement:
I intend to put that lie to death with the truth of real-world examples.


Hi Ted,

Thank you for that. I have a feeling the original poster may not look at this part of the forum.

Stephen

#3 Paul Bruening (deceased)

  • Sustaining Members
  • 2858 posts
  • Producer
  • Oxford, Mississippi

Posted 30 August 2007 - 02:03 PM

My, that was enthusiastic.

#4 Mitch Gross

  • Basic Members
  • 2873 posts
  • Cinematographer

Posted 30 August 2007 - 02:39 PM

Not all Bayer sensors are alike, not all de-bayering software programs are alike, not all of ANYTHING is alike. To make this statement based on a single test is inaccurate at best. I have seen two Bayer pattern sensors of exactly the same stated resolution with vastly different MTF response. And I have also seen 3-chip sensor blocks with poor prism alignment that had a significantly lower final resolution than a Bayer pattern sensor of the same stated resolution. Theory is one thing, practice is another.

#5 Phil Rhodes

  • Sustaining Members
  • 11939 posts
  • Other

Posted 30 August 2007 - 04:37 PM

I think you can certainly say that the 75-90% claim is complete drivel. It's not just a case of making up the pixels which aren't the right colour, it's also necessary to LPF the pixels you do have to avoid it crawling.

It really isn't an ideal way to proceed unless you can outresolve the target resolution at least two or three times over. The idea that Red have a real, honest 4K camera is laughable.

Phil

#6 Graeme Nattress

  • Basic Members
  • 145 posts
  • Other

Posted 30 August 2007 - 05:44 PM

Sigma cameras avoid using the necessary optical low-pass filter. This makes their images "sharper" by allowing the sensor to alias and produce fake detail in the form of aliasing artifacts. This is probably because of the "low" resolution of the sensor; if they'd properly filtered it, it would be lower resolution still.

Aliases are bad enough in a still, but at least you could go in by hand and pixel-paint them out if you so desire. Doing that on a motion picture would be terminal. There's no way to automatically tell an alias from real detail, and even downsampling the image won't remove the aliases, so you've got to avoid letting them into the system in the first place.
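
To see why an alias can't be removed after the fact, here's a minimal sketch with arbitrary illustrative numbers: a 7-cycle signal sampled at 10 samples per unit produces exactly the same sample values as a genuine 3-cycle signal, so nothing downstream can tell them apart.

    # Minimal sketch of aliasing; the sample rate and frequencies are
    # arbitrary illustrative numbers. The Nyquist limit here is FS / 2 = 5.
    import math

    FS = 10                # samples per unit (think: photosite pitch)
    F_REAL = 7             # real detail, above the Nyquist limit
    F_ALIAS = FS - F_REAL  # the fake frequency it folds down to

    for n in range(FS):
        t = n / FS
        real = math.cos(2 * math.pi * F_REAL * t)
        alias = math.cos(2 * math.pi * F_ALIAS * t)
        print(f"sample {n}: real = {real:+.4f}   alias = {alias:+.4f}")

    # Every pair of values matches, so once the sensor has sampled the
    # scene there is nothing left to distinguish. The filtering has to
    # happen optically, in front of the chip.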

Graeme

#7 John Sprung

  • Sustaining Members
  • 4635 posts
  • Other

Posted 30 August 2007 - 06:16 PM

These resolution tests involved a Canon 20D (8.2 megapixels) and a Sigma SD14 (4.6 megapixels).


The first and foremost mistake -- and I've been guilty of it too -- is this sloppy business of talking about pixel counts on camera chips. No matter what camera, no matter what chip, the correct answer to how many pixels it has is: Zero.

Chips have photosites, not pixels. The difference is important.

A pixel is a set of brightness and color data for one position in a theoretical rectangular grid.

A photosite is a light sensitive region on the surface of a chip.

Can there be a 1:1 correspondence between photosites and pixels? Maybe on the Foveon chip. No way on a Bayer or any other single-chip color system, like the vertical stripe array of the Genesis.

How about a three-chip camera? Guess what -- the photosites on one chip don't exactly line up with the ones on the other chips. On a 2/3" chip, the center distance from one photosite to the next is about 0.0002". Tell a machinist that you have to make some objects 0.0002" square and stack them up so they're lined up. Not gonna happen in the real world.
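
As a sanity check on that 0.0002" figure, here's a minimal sketch, assuming a 2/3" format sensor with about a 9.6 mm active width and 1920 photosites across (both numbers are assumptions; they vary by camera):

    # Minimal sketch checking the quoted photosite pitch. The 9.6 mm active
    # width and 1920-photosite row are assumed, typical of a 2/3" HD sensor.
    SENSOR_WIDTH_MM = 9.6
    PHOTOSITES_ACROSS = 1920

    pitch_mm = SENSOR_WIDTH_MM / PHOTOSITES_ACROSS
    pitch_in = pitch_mm / 25.4

    print(f"pitch: {pitch_mm * 1000:.1f} microns = {pitch_in:.5f} inches")
    # -> 5.0 microns, about 0.0002 inches

But it doesn't matter; the next section tells why: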


Mathematics would suggest that the resolution of a Bayer-filtered image is only worth half of the advertised pixel count. That is because half of the "pixels" on the chip are green. Why are there more green than red or blue? Because, green is more important to a human's visual system when perceiving detail. That's why Bryce Bayer chose to give more resolving power to the green component.


Not quite that simple. The human visual system resolves brightness much better than it does color. That's why it was possible to add color to NTSC using a very small piece of the spectrum for the color subcarrier. That's why 4:2:2 and all that stuff works. Green is more important to brightness than the other primaries; luminance equations generally are something like Y = 0.6G + 0.3R + 0.1B. That's probably what Bayer was thinking about.
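
For reference, here's a minimal sketch using the exact Rec. 601 luma weights (Rec. 709 uses slightly different values):

    # Minimal sketch: luma as a weighted sum of the primaries, using the
    # Rec. 601 weights. Rec. 709 differs slightly.
    REC601 = {'R': 0.299, 'G': 0.587, 'B': 0.114}

    def luma(r, g, b):
        # Y' from gamma-corrected R'G'B' values in the range [0, 1]
        return REC601['R'] * r + REC601['G'] * g + REC601['B'] * b

    print(f"pure green: Y = {luma(0, 1, 0):.3f}")  # ~0.59
    print(f"pure red:   Y = {luma(1, 0, 0):.3f}")  # ~0.30
    print(f"pure blue:  Y = {luma(0, 0, 1):.3f}")  # ~0.11

Green alone carries nearly 60% of the brightness signal, which is presumably why Bayer gave it half the photosites.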


However, green pixels aren't always "stimulated", thereby leaving the red or blue "pixels" to fend for themselves. In that case, the resolution can be as low as 1/4 the advertised pixel count.


The idea you're kinda heading towards here is called "undersampling". It happens on all chips, because they need some room between photosites. It's what causes moire patterns, among other things.


It should be obvious that real pixels can do a lot better.


What's a real pixel? ;-)



-- J.S.

#8 Phil Rhodes

  • Sustaining Members
  • 11939 posts
  • Other

Posted 30 August 2007 - 06:34 PM

> Not quite that simple. The human visual system resolves brightness much better than it does color. That's why
> it was possible to add color to NTSC using a very small piece of the spectrum for the color subcarrier. That's
> why 4:2:2 and all that stuff works. Green is more important to brightness than the other primaries; luminance
> equations generally are something like Y = 0.6G + 0.3R + 0.1B. That's probably what Bayer was thinking
> about.

Yes and that's absolutely fine - until someone moves closer to the screen, or even just starts concentrating on it harder, or tries to key it, or...

All these things are crutches and workarounds.

Phil

#9 John Sprung

  • Sustaining Members
  • 4635 posts
  • Other

Posted 30 August 2007 - 07:46 PM

Yes and that's absolutely fine - until someone moves closer to the screen, or even just starts concentrating on it harder, or tries to key it, or...

All these things are crutches and workarounds.

Phil

Yes, they're workarounds for limited bandwidth or limited bitrate.

Suppose, for instance, that you have enough capacity to send a 1280 x 720 picture in 4:2:0. If you keep the bit depth the same but change to 4:4:4, you'll have to use fewer pixels.

In 4:2:0, you'd have 1280 x 720 samples for luminance, and 640 x 360 samples each for the red and blue color differences. Do the math, it works out to 1,382,400 samples.

Now consider a 4:4:4 picture that's 906 x 510. It would have 906 x 510 samples for each component, which comes to 1,386,180, just slightly more than in the 4:2:0 example. Which one breaks down first as you move closer to the screen?
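
Here's a minimal sketch of that arithmetic, with chroma at half resolution in both dimensions for 4:2:0:

    # Minimal sketch verifying the sample counts above.
    def samples_420(w, h):
        # one full-resolution luma plane + two half-by-half chroma planes
        return w * h + 2 * (w // 2) * (h // 2)

    def samples_444(w, h):
        # three full-resolution planes
        return 3 * w * h

    print(f"4:2:0 at 1280 x 720: {samples_420(1280, 720):,}")  # 1,382,400
    print(f"4:4:4 at  906 x 510: {samples_444(906, 510):,}")   # 1,386,180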

All these things are good engineering solutions to real world problems. The trouble is, sometimes they get applied where they shouldn't, such as in pulling keys. There you're using chroma data to make luma decisions. Bottom line, these ideas are for looking at pictures, not manipulating them.



-- J.S.

