From: Roger N. Clark (change username to rnclark) on
acl wrote:
> On Mar 12, 2:11 am, "Bart van der Wolf" <bvdw...(a)no.spam> wrote:
>> "John Sheehy" <J...(a)no.komm> wrote in message
>>
>> news:Xns98F06D6F99D10jpsnokomm(a)130.81.64.196...
>>
>>> "Roger N. Clark (change username to rnclark)" <usern...(a)qwest.net>
>>> wrote
>>> innews:45F160FC.5020001(a)qwest.net:
>>>> The problem is that our eyes plus brain are very good at
>>>> picking out patterns, whether that pattern is below random
>>>> noise, or embedded in other patterns.
>> What's worse, we see non-existing patterns (e.g. a triangle in the
>> following link) because we want to:
>> <http://www.xs4all.nl/~bvdwolf/temp/Triangle-or-not.gif>.
>>
>>> Yes, that is a problem, and that is exactly why you can't evaluate
>>> noise by standard deviation alone.
>> That depends on what one wants to evaluate. Standard deviation
>> (together with mean) only tells us something about pixel-to-pixel (or
>> sensel-to-sensel) performance. It doesn't allow one to make valid
>> judgements about anything larger.
>
> As a matter of fact, they tell you literally nothing about
> pixel-to-pixel behaviour. If I tell you that a signal has mean zero
> and a given standard deviation, what else can you tell me about it?
> Nothing. It could be anything from a random time series to a sine
> wave to a train of square waves to anything else. It's like knowing
> only the first two moments of an infinite sequence of moments (and
> that is exactly what mean and standard deviation are).
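
(acl's point is easy to check numerically; a minimal illustrative
Python/NumPy sketch, not from either poster:)

    # Two signals with identical mean and standard deviation but
    # completely different sample-to-sample structure.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1024

    white = rng.normal(size=n)                       # random time series
    sine = np.sin(2 * np.pi * np.arange(n) / 64.0)   # pure tone

    # Normalize both to mean 0, std 1 so the first two moments agree.
    white = (white - white.mean()) / white.std()
    sine = (sine - sine.mean()) / sine.std()

    print(white.mean(), white.std())   # -> ~0.0, 1.0
    print(sine.mean(), sine.std())     # -> ~0.0, 1.0
    # The first two moments cannot tell these apart; higher moments
    # or the power spectrum can.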

And that is why people who evaluate sensors do more than simply
study the standard deviation of one image. To understand noise sources,
the standard procedure is to make a series of exposures and analyze
the results from the different test conditions, e.g. (a sketch of the
idea follows the links below):

The Nikon D50 Digital Camera:
Sensor Noise, Dynamic Range, and Full Well Analysis
http://www.clarkvision.com/imagedetail/evaluation-nikon-d50

http://www.clarkvision.com/imagedetail/long-exposure-comparisons/index.html

and more at:
http://www.clarkvision.com/imagedetail/index.html#sensor_analysis

other:
http://www.astrosurf.org/buil/20d/20dvs10d.htm
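
(The core of that series-of-exposures procedure is a mean-variance, or
"photon transfer", fit; a minimal Python/NumPy sketch of the idea, with
hypothetical inputs, not the actual code behind those pages:)

    # Estimate gain (electrons/DN) and read noise from pairs of
    # identical flat-field exposures at several signal levels,
    # using linear raw data.
    import numpy as np

    def pair_stats(a, b):
        # Differencing two identical flats cancels fixed pattern noise;
        # var(a - b) is twice the temporal (random) variance.
        mean_signal = 0.5 * (a.mean() + b.mean())
        var_temporal = np.var(a.astype(float) - b.astype(float)) / 2.0
        return mean_signal, var_temporal

    def photon_transfer(pairs):
        # pairs: list of (frame_a, frame_b) arrays, increasing exposure.
        means, variances = zip(*(pair_stats(a, b) for a, b in pairs))
        # Shot-noise model: variance = mean/gain + read_noise_DN**2
        slope, intercept = np.polyfit(means, variances, 1)
        gain = 1.0 / slope                                  # e-/DN
        read_noise_e = gain * np.sqrt(max(intercept, 0.0))  # electrons
        return gain, read_noise_e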

Roger

> The reason people use the first two moments (mean and std) is that
> the noise under consideration is often assumed to be Gaussian, in
> which case these two quantities completely characterise the noise.
> This is usually a good approximation when the noise comes from many
> different sources.
>
>> Banding could either be calibrated out of the larger structure, or
>> an analysis of systematic noise should be done (taking care not to
>> mistake Raw-converter effects for camera or sensor-array effects).
>
>
From: acl on
On Mar 12, 2:53 pm, "Roger N. Clark (change username to rnclark)"
<usern...(a)qwest.net> wrote:

> And that is why people who evaluate sensors do more than simply
> study the standard deviation of one image. To understand noise sources,

Never claimed otherwise! By the way, why don't people study the full
power spectrum of the noise (i.e. of a black frame)? That would give
quite a lot of information (it should allow distinguishing between the
white part of the noise and things like banding). And it should not be
too hard (e.g. with IRIS, split the channels and Fourier transform
them). And if you do that to an average of many frames, you'll be
studying repeatable noise only. Is there some particular reason nobody
does this?
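
(Something like the following, say; a rough Python/NumPy sketch of what
I mean, with a hypothetical `black` array holding one channel of a dark
frame:)

    # 2-D power spectrum of a black frame: white noise gives a flat
    # spectrum, horizontal banding concentrates power along the
    # vertical-frequency axis.
    import numpy as np

    def noise_power_spectrum(black):
        frame = black - black.mean()               # remove the DC offset
        power = np.abs(np.fft.fft2(frame)) ** 2    # 2-D power spectrum
        return np.fft.fftshift(power)              # put DC at the center

    # FFT-ing an average of many black frames instead isolates the
    # repeatable (fixed pattern) part of the noise.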
>
> The Nikon D50 Digital Camera:
> Sensor Noise, Dynamic Range, and Full Well Analysis
> http://www.clarkvision.com/imagedetail/evaluation-nikon-d50
>

That's quite interesting; why don't you include dark frames from more
cameras? I'd think this would be quite useful for people intending to
do very low-light work.

> http://www.clarkvision.com/imagedetail/long-exposure-comparisons/inde...
>
> and more at: http://www.clarkvision.com/imagedetail/index.html#sensor_analysis
>
> other: http://www.astrosurf.org/buil/20d/20dvs10d.htm
>
> Roger
>

From: acl on
On Mar 12, 12:15 am, "Bart van der Wolf" <bvdw...(a)no.spam> wrote:
> "John Sheehy" <J...(a)no.komm> wrote in message
>
> news:Xns98F06DCDB2811jpsnokomm(a)130.81.64.196...
>
> > "Roger N. Clark (change username to rnclark)" <usern...(a)qwest.net>
> > wrote in
> >news:45F21915.5090409(a)qwest.net:
>
> >> I too agree that pattern noise is more obvious than random noise.
> >> Probably by at least a factor of ten. It is our eye+brain's
> >> ability to pick out a pattern in the presence of a lot
> >> of random noise that makes us able to detect many things
> >> in everyday life. It probably developed as a necessary
> >> thing for survival. But then it becomes a problem when we try
> >> to make something artificial and we see the defects in it.
> >> It gives the makers of camera gear quite a challenge.
>
> > How does that co-exist with your conclusion that current cameras are
> > limited by shot noise?
>
> Shot noise is a physical limitation, not a man-made one. The man-made
> limitations can be improved upon.
>

The speed of light is also a physical limitation. Would you therefore
agree with the statement that the top speeds of current spaceships are
limited by the speed of light, and that we must therefore work on
finding ways to circumvent it (rather than on finding some better
propulsion system than semi-controlled explosions) :)? (I'm not
claiming that banding really is the main limitation, by the way; I
actually agree with Roger, and presumably with you.)

From: Roger N. Clark (change username to rnclark) on
acl wrote:
> On Mar 12, 2:53 pm, "Roger N. Clark (change username to rnclark)"
> <usern...(a)qwest.net> wrote:
>
>> And that is why people who evaluate sensors do more than simply
>> study the standard deviation of one image. To understand noise sources,
>
> Never claimed otherwise! By the way, why don't people study the full
> power spectrum of the noise (i.e. of a black frame)? That would give
> quite a lot of information (it should allow distinguishing between the
> white part of the noise and things like banding). And it should not be
> too hard (e.g. with IRIS, split the channels and Fourier transform
> them). And if you do that to an average of many frames, you'll be
> studying repeatable noise only. Is there some particular reason nobody
> does this?

Time and effort--remember, most of us are doing this for free, out of
curiosity. I started doing this to try to get the best camera for
astrophotography. Then, after seeing the trends, it became clear to
me that, because the photon noise limit had been reached, one can
model and predict performance pretty closely. Now I find it
interesting to see claims coming out in some press releases that seem
to ignore physical reality ;-).

I and other astrophotographers tend to ignore fixed pattern noise
because we can calibrate most of it out of our images. If that is an
issue for other people, then I suggest they learn how to take dark
frames, average them, and subtract them from their images. It is
really pretty easy, but for best results it needs to be done on
linear data. Another calibration that can improve images is flat
field calibration, which not only corrects for pixel-to-pixel
variations but also for light fall-off from lenses.
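
(In outline, on linear data; a minimal Python/NumPy sketch of the idea,
not an actual processing chain:)

    # Average dark frames into a master dark, subtract it, then divide
    # by a normalized flat field. All arrays are linear float data.
    import numpy as np

    def calibrate(light, darks, flats):
        master_dark = np.mean(darks, axis=0)   # averaging suppresses random noise
        master_flat = np.mean(flats, axis=0) - master_dark
        master_flat /= master_flat.mean()      # normalize flat to ~1.0
        # Dark subtraction removes most fixed pattern noise; the flat
        # corrects pixel-to-pixel sensitivity and lens light fall-off.
        return (light - master_dark) / master_flat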

But if someone wants to pay me to run more tests......

>> The Nikon D50 Digital Camera:
>> Sensor Noise, Dynamic Range, and Full Well Analysis
>> http://www.clarkvision.com/imagedetail/evaluation-nikon-d50
> That's quite interesting; why don't you include dark frames from more
> cameras? I'd think this would be quite useful for people intending to
> do very low-light work.

Again, time. I do have a fair amount of additional data for a number
of cameras but I have not had time to write it up.

Roger

>> http://www.clarkvision.com/imagedetail/long-exposure-comparisons
>>
>> and more at: http://www.clarkvision.com/imagedetail/index.html#sensor_analysis
>>
>> other: http://www.astrosurf.org/buil/20d/20dvs10d.htm
>>
>> Roger
>>
>
From: Doug McDonald on
Roger N. Clark (change username to rnclark) wrote:
> acl wrote:
>> On Mar 12, 2:53 pm, "Roger N. Clark (change username to rnclark)"
>> <usern...(a)qwest.net> wrote:
>>
>>> And that is why people who evaluate sensors do more than simply
>>> study the standard deviation of one image. To understand noise sources,
>>
>> Never claimed otherwise! By the way, why don't people study the full
>> power spectrum of the noise (i.e. of a black frame)? That would give
>> quite a lot of information (it should allow distinguishing between the
>> white part of the noise and things like banding). And it should not be
>> too hard (e.g. with IRIS, split the channels and Fourier transform
>> them). And if you do that to an average of many frames, you'll be
>> studying repeatable noise only. Is there some particular reason nobody
>> does this?
>
> Time and effort--remember, most of us are doing this for free, out of
> curiosity. I started doing this to try to get the best camera for
> astrophotography. Then, after seeing the trends, it became clear to
> me that, because the photon noise limit had been reached, one can
> model and predict performance pretty closely.


I have the Canon 30D. I took a bunch of very underexposed shots
recently (no tripod at the critical time) and found that background
subtraction didn't help much. The annoying noise is some sort of
horizontal banding or streaking (these are landscape shots). It looks
as if they scan the image TV-style and this is 1/f noise in the
amplifiers.
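
(A quick check, sketched in Python/NumPy with hypothetical arrays: if
the streaks are row-correlated offsets that change from frame to frame,
per-row statistics should show them, and an averaged background frame
could not remove them.)

    # Probe for horizontal banding: per-row medians of a dark frame.
    import numpy as np

    def row_offsets(dark):
        # Large, slowly varying excursions here indicate row banding.
        return np.median(dark, axis=1)

    def remove_row_banding(image):
        # Crude fix: subtract each row's median offset. (On a real
        # scene this also removes scene content; masked dark borders
        # would be better in practice.)
        return image - np.median(image, axis=1, keepdims=True)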

Comments?

Doug McDonald