Modern debayering...easily extracts a good 75-90% of the CMOS array resolution.
I intend to put that lie to rest with real-world examples.
Please visit the site below to see resolution tests comparing a Bayer-based sensor to a true resolution sensor.
These resolution tests involved a Canon 20D (8.2 megapixels) and a Sigma SD14 (4.6 megapixels). The difference between these two cameras is that the SD14 uses a Foveon sensor, which is capable of full-color sampling at each pixel site. The 20D can only sample one color at each pixel site.
Mathematics suggests that the resolution of a Bayer-filtered image is worth only half of the advertised pixel count, because half of the "pixels" on the chip are green. Why more green than red or blue? Because green is the most important component to the human visual system when perceiving detail; that's why Bryce Bayer chose to give the green channel the most resolving power. However, the green photosites aren't always "stimulated", leaving the red or blue "pixels" to fend for themselves. In that case, the resolution can fall as low as 1/4 the advertised pixel count. Can the human visual system tell the difference in that case? Yes!
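The half-green, quarter-red, quarter-blue arithmetic is easy to check. Here's a small sketch (the function name and layout are my own, assuming the common RGGB arrangement of the Bayer pattern) that builds a mosaic and counts the photosites of each color:

```python
import numpy as np

def bayer_color_counts(height, width):
    """Count R, G, and B photosites in an RGGB Bayer mosaic."""
    colors = np.empty((height, width), dtype="<U1")
    colors[0::2, 0::2] = "R"  # even rows, even columns
    colors[0::2, 1::2] = "G"  # even rows, odd columns
    colors[1::2, 0::2] = "G"  # odd rows, even columns
    colors[1::2, 1::2] = "B"  # odd rows, odd columns
    return {c: int((colors == c).sum()) for c in "RGB"}

# A "24-pixel" sensor: half the sites are green, a quarter each red and blue.
print(bayer_color_counts(4, 6))  # {'R': 6, 'G': 12, 'B': 6}
```

Scale that up to an "8 megapixel" chip and you get 4 million green samples but only 2 million red and 2 million blue.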
Even if the human visual system couldn't tell the difference in resolution between a two-megapixel red image and a four-megapixel green image, that still doesn't negate the fact that such poor sampling resolution is a farce for something that is supposedly so perfect. What would happen if you wanted to use the red component alone for a black-and-white conversion? Or what if you wanted to enlarge the image, whether digitally or by physically stepping closer to it? The decreased resolution would stick out like a sore thumb. It doesn't matter whether a human can notice the difference, because certain other (computer) processes need all the resolution they can get.
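To make the red-channel point concrete, here's a quick sketch (my own illustration, again assuming an RGGB layout): pulling only the genuine red samples out of a raw mosaic leaves an image with half the width and half the height, i.e. a quarter of the advertised pixel count.

```python
import numpy as np

# Stand-in for raw sensor data from an 8x8 ("64-pixel") RGGB mosaic.
raw = np.arange(8 * 8, dtype=float).reshape(8, 8)

# In an RGGB mosaic, red photosites sit at even rows and even columns.
red_only = raw[0::2, 0::2]

print(raw.shape)       # (8, 8)  -> 64 advertised pixels
print(red_only.shape)  # (4, 4)  -> only 16 true red samples
```

Everything else in the red channel of the final image has to be interpolated from those sixteen real samples.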
So, by the reasoning above, an 8 "megapixel" Bayer sensor really has only 4 megapixels of true resolution at best. Furthermore, Bayer-pattern sensors require low-pass (blur) filters to reduce color artifacts on edges, which further reduces the sensor's ability to resolve fine detail.
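A one-dimensional toy model (my own, not from the test site) shows what a low-pass filter does to fine detail: averaging each sample with its neighbors spreads a hard black/white edge across several pixels, just as an anti-aliasing filter softens the finest detail a sensor could otherwise record.

```python
import numpy as np

edge = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])  # a hard step edge
kernel = np.array([0.25, 0.5, 0.25])             # simple blur kernel
blurred = np.convolve(edge, kernel, mode="same")

print(blurred)  # the step is now smeared over several pixels
```

The original edge transitioned in a single pixel; after the blur, the transition occupies three. That is resolution thrown away before demosaicing even begins.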
This all seems to ring true judging by the test images. At the site linked above, scroll down to the resolution pinwheels. Note that the Canon 20D, which supposedly has almost twice as many pixels, really only matches the SD14's resolution in the white and green quadrants of the pinwheel. Meanwhile, the Sigma SD14 - in spite of its lower pixel count - clearly outresolves the 20D in the red and blue quadrants! Also of note is that the 20D's images are significantly softer across the board.
With all that said, anyone with a decent pair of eyes should be able to see that an image from a Bayer-based digital camera - when viewed at 1:1 pixel magnification - is nowhere near as sharp as the same image viewed at 1:2 pixel magnification (50% zoom). Doesn't that tell you something? It should be obvious that real pixels can do a lot better.
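One plausible reading of the 50%-zoom observation can be sketched in code (my own illustration, not a claim from the article): downsampling 2:1 pools each 2x2 block - one full RGGB quad - into a single output pixel, so every pixel in the half-size view is backed by genuine samples of all three colors rather than interpolated guesses.

```python
import numpy as np

def downsample_2to1(img):
    """Average each 2x2 block of a (H, W) image; H and W must be even."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

img = np.arange(16, dtype=float).reshape(4, 4)
small = downsample_2to1(img)
print(small.shape)  # (4, 4) in, (2, 2) out: one pixel per 2x2 quad
```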