From: John Sheehy on
Steve <steve(a)example.com> wrote in
news:ms4084hnjc51h1jqqdgfml2m8sa91hahuf(a)4ax.com:

> Or Nikon's D3 vs. Canon's 1DsMkIII. Both have very similar sensor
> sizes but very different pixel densities. Both are pretty much the
> top of the heap as far as what either manufacturer is capable of in
> every other respect so it's a fair comparison of what pixel density
> does to noise and DR.

> The problem there is that just like comparing almost identical low-end
> cameras with different pixel densities, the results would not support
> his hypothesis that increasing pixel density increases DR and
> decreases noise for a given sensor.

The quantum efficiencies and the per-pixel low-ISO read noise are similar,
though, so you can still see what the noise is like at the image level:
crops blown up to the same size, and the same downsample ratios applied to
both. I know the 1DsMkIII will do better, but you might need proof.
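A minimal NumPy sketch of that kind of image-level comparison, using made-up
sensor dimensions and noise figures rather than anything measured from the D3
or the 1DsMkIII: simulate a uniform exposure on two sensors of the same size
but different pixel density, give them the same quantum efficiency and the
same per-pixel read noise, and downsample both to one common output size
before measuring noise.

import numpy as np

rng = np.random.default_rng(0)

def flat_frame(width, height, signal_e, read_noise_e):
    # Uniform exposure: Poisson photon noise plus Gaussian read noise (electrons).
    photons = rng.poisson(signal_e, size=(height, width)).astype(float)
    return photons + rng.normal(0.0, read_noise_e, size=(height, width))

def block_average(img, out_w, out_h):
    # Downsample by block-averaging; integer ratios assumed for simplicity.
    fy, fx = img.shape[0] // out_h, img.shape[1] // out_w
    return img[:out_h * fy, :out_w * fx].reshape(out_h, fy, out_w, fx).mean(axis=(1, 3))

# Same sensor area, same exposure: the coarse pixels cover 4x the area of the
# fine ones, so each collects 4x the photons; read noise per pixel is equal.
coarse = flat_frame(1500, 1000, signal_e=1600, read_noise_e=5)  # low pixel density
fine   = flat_frame(3000, 2000, signal_e=400,  read_noise_e=5)  # high pixel density

for name, img in (("coarse", coarse), ("fine", fine)):
    small = block_average(img, 750, 500)                        # common output size
    print(f"{name}: per-pixel SNR {img.mean() / img.std():.1f}, "
          f"image-level SNR {small.mean() / small.std():.1f}")

The per-pixel SNRs differ by roughly a factor of two, but once both frames
are brought to the same output size the figures land close together, which
is the level at which the comparison is being proposed.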

--

<>>< ><<> ><<> <>>< ><<> <>>< <>>< ><<>
John P Sheehy <JPS(a)no.komm>
><<> <>>< <>>< ><<> <>>< ><<> ><<> <>><
From: John Sheehy on
John O'Flaherty <quiasmox(a)yeeha.com> wrote in
news:8lh2841k4vmfcdt6u1qgdv1tkokactlbne(a)4ax.com:

> I did try measuring an area of the sky (which presumably has uniform
> illumination) in photoshop histogram, and found a pixel level of about
> 45 and a standard deviation of about 22, on both magnified images.
> Does that mean an equal p-p noise level, but with the noise on the
> higher resolution image at higher spatial frequencies?

You can't directly take the standard deviation of an upsampled image to
gauge noise; it is meaningless, because standard deviation only works as an
indicator of pixel-level noise when the noise spectrum is concentrated at
the Nyquist frequency. When the distributions differ, the noise sits at
different frequencies, and frequency is everything with noise. That is why
I am emphasizing the visual comparison here.
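A small NumPy sketch of that point, using synthetic white noise with a
standard deviation of about 22 (as in the sky measurement): a 3x
nearest-neighbour enlargement leaves the standard deviation unchanged while
moving the noise power away from Nyquist and down to coarser spatial
frequencies.

import numpy as np

rng = np.random.default_rng(1)
noise = rng.normal(0.0, 22.0, size=(256, 256))   # white noise, sigma ~22
upsampled = np.kron(noise, np.ones((3, 3)))      # 3x nearest-neighbour enlargement

def high_freq_fraction(img):
    # Fraction of the noise power above half of Nyquist on either axis.
    spec = np.abs(np.fft.fftshift(np.fft.fft2(img - img.mean()))) ** 2
    fy = np.abs(np.fft.fftshift(np.fft.fftfreq(img.shape[0])))  # cycles/pixel
    fx = np.abs(np.fft.fftshift(np.fft.fftfreq(img.shape[1])))
    mask = (fy[:, None] > 0.25) | (fx[None, :] > 0.25)
    return spec[mask].sum() / spec.sum()

print("original : std", round(noise.std(), 1),
      " high-freq power fraction", round(high_freq_fraction(noise), 2))
print("upsampled: std", round(upsampled.std(), 1),
      " high-freq power fraction", round(high_freq_fraction(upsampled), 2))

Equal standard deviations, very different spectra: that is why two
enlargements can measure the same while the grain looks completely
different.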

--

<>>< ><<> ><<> <>>< ><<> <>>< <>>< ><<>
John P Sheehy <JPS(a)no.komm>
><<> <>>< <>>< ><<> <>>< ><<> ><<> <>><
From: John Sheehy on
Steve <steve(a)example.com> wrote in
news:2qm2841aknsu6un0n7hovnar8pe73lbbs1(a)4ax.com:

> On Sat, 19 Jul 2008 00:37:19 GMT, John Sheehy <JPS(a)no.komm> wrote:

>>Steve <steve(a)example.com> wrote in
>>news:r14084pd0bfcgb14qej7eiquttaua3f4tn(a)4ax.com:

> Yes. Give each sensor the same image, same ISO, same exposure, same
> everything. Your test does not do that. You are making the 400D
> capture an image that's almost 300% the size of the FZ50 and then
> pixel peeping. That's not a fair test at all.

It is totally fair, because I am demonstrating the effects of PIXEL
DENSITY. Are you illiterate? Big pixels lose here, not because *I* am
cheating, but because they *SUCK* at resolving what the lens delivers per
unit of area (or per linear unit) of the focal plane. That is one of the
problems of low pixel density!

>>My FZ50 crop shows what the resolution and noise would look like with
>>the same exposure, same scene, etc, etc, with FZ50 pixels filling the
>>same

> Your test doesn't do that at all. Both cameras are not capturing the
> same scene. You're giving the 400D a 289% larger scene to capture.

The crops are from the same-size area of the focal plane, with the same
exposure and the same f-stop, and they are displayed at the same size. That
is *EXACTLY* what I am demonstrating, and it is *TOTALLY* relevant to that
demonstration.
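A back-of-the-envelope sketch of what "the same-size area of the focal
plane" means in pixels; the pitches below are approximate published figures,
used only for illustration.

# Approximate pixel pitches (microns), for illustration only.
FZ50_PITCH_UM = 1.97    # Panasonic FZ50, ~10 MP on a 1/1.8" sensor
EOS400D_PITCH_UM = 5.7  # Canon 400D, ~10 MP on an APS-C sensor

crop_mm = 1.0           # side of a square focal-plane crop
for name, pitch in (("FZ50", FZ50_PITCH_UM), ("400D", EOS400D_PITCH_UM)):
    pixels_per_side = crop_mm * 1000.0 / pitch
    print(f"{name}: {pixels_per_side:.0f} x {pixels_per_side:.0f} pixels on a "
          f"{crop_mm:.0f} mm x {crop_mm:.0f} mm patch")

Displayed at one common size, the FZ50 crop needs far less magnification per
pixel; the linear ratio between the two pitches is about 2.9x, which is
presumably where the 289% figure quoted above comes from.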


--

<>>< ><<> ><<> <>>< ><<> <>>< <>>< ><<>
John P Sheehy <JPS(a)no.komm>
><<> <>>< <>>< ><<> <>>< ><<> ><<> <>><
From: John Sheehy on
rfischer(a)sonic.net (Ray Fischer) wrote in
news:48815f97$0$17159$742ec2ed(a)news.sonic.net:

> Here's what really counts: Noise per pixel.

Oh, really? So a 4-pixel image with a pixel-level SNR of 101:1 has lower
image noise than a 4-billion-pixel image with a pixel-level SNR of 100:1?
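A quick numeric check, assuming uncorrelated per-pixel noise so that
averaging N pixels improves SNR by a factor of sqrt(N): bring both images to
the same four-pixel output size and compare.

import math

small_pixels, small_snr = 4, 101.0          # 4-pixel image, per-pixel SNR 101:1
big_pixels, big_snr = 4_000_000_000, 100.0  # 4-billion-pixel image, 100:1

inputs_per_output = big_pixels / small_pixels  # pixels averaged per output pixel
big_snr_at_common_size = big_snr * math.sqrt(inputs_per_output)

print(f"small image at the common size: {small_snr:.0f}:1")
print(f"big image at the common size  : {big_snr_at_common_size:,.0f}:1")

Over three million to one against 101:1: per-pixel SNR on its own says
nothing about noise at a common viewing size.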

And what happens if you open a new canvas in Photoshop or equivalent, fill
it with 50% gray, add 15% gaussian chromatic noise, and then open the
"Pixelate|Mosaic" dialogue and preview it with cell sizes of 2, 3, 4, and so
on? Watch the standard deviation go down in the histogram dialogue while the
noise you see remains at the same level.
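A minimal recreation of that experiment in NumPy rather than Photoshop, with
a plain Gaussian stand-in for the "50% gray plus 15% noise" canvas: mosaic it
at 2x2, 3x3 and 4x4, then compare the per-pixel standard deviation against
the standard deviation seen at a coarse common scale (a crude stand-in for
stepping back from the image).

import numpy as np

rng = np.random.default_rng(2)
canvas = rng.normal(128.0, 0.15 * 255, size=(240, 240))  # gray + Gaussian noise

def mosaic(img, n):
    # Replace every n x n cell with its mean, keeping the image size.
    h, w = img.shape
    cells = img.reshape(h // n, n, w // n, n).mean(axis=(1, 3))
    return np.kron(cells, np.ones((n, n)))

def coarse_view(img, n=12):
    # Block-average to 1/12 size: a crude stand-in for viewing the image small.
    return mosaic(img, n)[::n, ::n]

print(f"original   std {canvas.std():5.1f}  coarse-view std {coarse_view(canvas).std():4.2f}")
for n in (2, 3, 4):
    m = mosaic(canvas, n)
    print(f"mosaic {n}x{n} std {m.std():5.1f}  coarse-view std {coarse_view(m).std():4.2f}")

The per-pixel figure drops with the cell size, but the coarse-scale figure
does not move, because the mosaic only removes variation inside each cell;
the low-frequency noise that you actually see is left alone.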

--

<>>< ><<> ><<> <>>< ><<> <>>< <>>< ><<>
John P Sheehy <JPS(a)no.komm>
><<> <>>< <>>< ><<> <>>< ><<> ><<> <>><
From: John O'Flaherty on
On Sat, 19 Jul 2008 04:39:21 GMT, John Sheehy <JPS(a)no.komm> wrote:

>John O'Flaherty <quiasmox(a)yeeha.com> wrote in
>news:8lh2841k4vmfcdt6u1qgdv1tkokactlbne(a)4ax.com:
>
>> I did try measuring an area of the sky (which presumably has uniform
>> illumination) in photoshop histogram, and found a pixel level of about
>> 45 and a standard deviation of about 22, on both magnified images.
>> Does that mean an equal p-p noise level, but with the noise on the
>> higher resolution image at higher spatial frequencies?
>
>You can't directly take the standard deviation of an upsampled image to
>gauge noise; it is meaningless, because standard deviation only works as an
>indicator of pixel-level noise when the noise spectrum is concentrated at
>the Nyquist frequency. When the distributions differ, the noise sits at
>different frequencies, and frequency is everything with noise. That is why
>I am emphasizing the visual comparison here.

The reason I made the measurement was that the color variation
appeared the same to me, visually, in the two enlargements; the
measurement confirmed what I saw. In the non-enlarged frames, of
course the finer image shows less noise, because it is filtered out,
by the eye or by the display device. I don't see how subjective
perception of noise can be separated from resolution. Is your test
intended to show a subjective effect, or something quantitative?
--
John