
Stacking frames


butlermike

Recommended Posts

I'm fairly familiar with stacking software and what darks, bias and flats do and why they're wanted/needed.

Lights are obviously the actual 'picture'.

What isn't obvious to me is what stacking does to the lights.

If, for example, you're tracking or guiding and taking exposures (or even if you're using an automatic program to line up), you're taking exposures of exactly the same picture (near enough).

So why does stacking these same pictures make a difference? Why not take one picture, duplicate it and stack the copies?

As far as I'm aware, the stacking software (DSS or similar) doesn't work in the same way as Registax and discard unsharp frames... does it?

Hope that makes sense and I'm not asking a stupid question!


The whole idea of stacking is that it averages out any random noise in the pictures. The end result is a picture with a much better signal-to-noise ratio. If you just duplicated frames you would duplicate the noise as well as the signal, and stacking would have no overall effect.
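That averaging effect is easy to demonstrate with a toy simulation (a Python sketch with invented numbers, not anything DSS actually does internally): stacking independent noisy frames beats the noise down by roughly the square root of the frame count, while stacking copies of one frame changes nothing.

```python
import numpy as np

rng = np.random.default_rng(42)

true_signal = 100.0     # invented target brightness, arbitrary units
noise_sigma = 20.0      # invented random noise per frame
n_frames = 64

# 64 independent light frames: same signal, fresh random noise in each.
frames = true_signal + rng.normal(0.0, noise_sigma, size=(n_frames, 100, 100))

# Averaging independent frames reduces the noise by ~sqrt(64) = 8x.
stacked = frames.mean(axis=0)

# Averaging 64 copies of ONE frame: the noise is identical in every copy,
# so nothing improves.
duplicated = np.mean([frames[0]] * n_frames, axis=0)

print("single frame noise:", frames[0].std())   # ~20
print("stacked noise:     ", stacked.std())     # ~2.5
print("duplicated noise:  ", duplicated.std())  # still ~20
```

The signal is the same in every frame, so averaging leaves it untouched; only the noise, being different in each frame, cancels down.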


Dark frames are there to get rid of noise generated from within the camera. For example, the slight heat from the camera's own electronics can cause parts of the sensor to show a slight signal. To oversimplify: they are taken at the same exposure and temperature to generate "the same amount of noise" as the camera generates during each "light" exposure. This is then subtracted from the image by the stacking software.
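As an illustration of that subtraction, here is a minimal Python sketch with a made-up dark-current pattern; real dark frames also vary with temperature and exposure length in ways this toy ignores.

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (50, 50)

# Hypothetical fixed dark-current pattern: some pixels run "warmer" than others.
dark_pattern = rng.uniform(5.0, 15.0, size=shape)

sky = 100.0  # invented uniform sky signal

# A light frame contains the sky signal plus the camera's own dark signal.
light = sky + dark_pattern + rng.normal(0.0, 3.0, size=shape)

# A dark frame (same exposure and temperature, cap on) has only the dark signal.
dark = dark_pattern + rng.normal(0.0, 3.0, size=shape)

# Subtracting the dark leaves (approximately) just the sky.
calibrated = light - dark

print("light mean:     ", round(light.mean(), 1))       # ~110: sky + dark
print("calibrated mean:", round(calibrated.mean(), 1))  # ~100: sky alone
```

Note that the subtraction removes the dark *signal* but adds the dark frame's own random noise, which is why master darks are built by stacking many dark frames.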


The random noise is thermal noise. It appears as random noise in the signal and is different from the dark/readout noise, which is sensor/electronics dependent. It is better to take one long-exposure image, but this may not be practical because of guide errors or because the build-up of noise swamps the image, although this would only really be an issue for very faint, very long exposures. Thermal noise can be reduced by cooling the sensor. When you have to stack, the SNR is proportional to the square root of the number of exposures. This is worse than exposing for an equivalent longer time period, but it is quite late and I can't remember why. However, as you point out, the dark/readout noise has to be subtracted from each image. What you are aiming for is maximum SNR by reducing noise effects as much as possible, e.g. only reading out once (as each readout adds noise) and cooling to reduce thermal noise.


Thanks for that.

So basically stacking the lights is another way to get rid of noise, along with subtracting the darks, rather than actually using them to get more 'light' information into the image?

Technically, then, you should be able to stretch a single image to get the same amount of detail as in a stacked image, but with loads of noise. Until the details get lost in the noise, I suppose.

That might just answer why you stack images...?

Thanks for the explanations Bizibilder and gkec!


Quote: "The random noise is thermal noise. It appears as random noise in the signal and is different from the dark/readout noise which is sensor/electronics dependent. [...]"

No sorry that is wrong. 'Thermal Noise' is just another name for 'Dark Current Noise'. Broadly speaking the image consists of the following:

- The signal, i.e. the photons from the sky, which includes photons from the target, photons from light pollution (man made or lunar) and photons from natural sky glow.

- Noise in the signal from the sky. Photons do not arrive in a constant stream, they arrive randomly and thus your image will contain something called photon shot noise.

- Dark current "signal". This is a build up of unwanted electrons over time due to thermal effects in the sensor.

- Dark current noise. Again the dark current doesn't build up constantly, it is also generated randomly and contains noise.

- Readout noise, caused by the electronics of the camera.

You remove readout noise by using a bias frame, which is basically a dark frame with the shortest possible exposure. (If you are using a CCD camera, this removes the bias from the image as well, as the camera does not start with 'black' pixels, they are pre-set to a certain voltage before the exposure starts. In a DSLR, the bias voltage is removed automatically by the camera before you get the data, but you still want to remove the readout noise).

A single bias frame will be noisy and not perfectly represent the readout noise. So you would take a number of bias frames and stack them to increase the signal to noise ratio of your master bias frame. Readout noise typically consists of a fixed pattern element that is statistically repeatable over multiple exposures, and random noise in that pattern. You are aiming to measure and remove the fixed pattern element.

You remove dark current by means of subtracting a dark frame. Since the dark frame contains dark current noise, you would usually average together a number of dark frames to create a master dark.

It is important to realise that when we subtract a frame (bias, dark) from the light, or divide the light by one (flat), we add noise to the resulting image as well as removing it. The objective is to ensure that the net effect is that you end up with less noise than you started with. That is why we stack biases, darks and flats to create master versions. This increases the signal-to-noise ratio of the bias, dark or flat, but in this case the 'signal' is the unwanted contribution that we want to remove from our light frame, and the 'noise' is the noise in the calibration frame that we want to minimise.
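The whole calibration chain (stacked master dark subtracted, stacked and normalised master flat divided out) can be sketched as a toy Python simulation. The bias level, dark pattern and vignetting model are all invented, the bias is folded into the master dark for brevity, and real packages handle each calibration frame type with more care than this.

```python
import numpy as np

rng = np.random.default_rng(7)
shape, n_cal = (40, 40), 32
read_sigma = 2.0                             # invented read noise per exposure

bias_level = 100.0                           # invented camera pedestal
dark_pattern = rng.uniform(2.0, 8.0, shape)  # invented per-pixel dark current

# Toy vignetting model: sensitivity falls off toward the corners.
y, x = np.mgrid[0:shape[0], 0:shape[1]]
r2 = (y - shape[0] / 2) ** 2 + (x - shape[1] / 2) ** 2
vignette = 1.0 - 0.4 * r2 / r2.max()

sky = 500.0  # invented uniform sky signal

def expose(content):
    """One exposure: the deterministic content plus fresh random read noise."""
    return content + rng.normal(0.0, read_sigma, shape)

# Stack many calibration frames so their own random noise averages down.
# (This toy folds the bias into the master dark.)
master_dark = np.mean([expose(bias_level + dark_pattern) for _ in range(n_cal)], axis=0)
flat_stack = np.mean([expose(bias_level + dark_pattern + 1000.0 * vignette)
                      for _ in range(n_cal)], axis=0)
master_flat = flat_stack - master_dark
master_flat /= master_flat.mean()            # normalise the flat to mean 1

# Calibrate one light frame: subtract the master dark, divide by the master flat.
light = expose(bias_level + dark_pattern + sky * vignette)
calibrated = (light - master_dark) / master_flat

print("raw light std:  ", light.std())       # large: vignetting dominates
print("calibrated std: ", calibrated.std())  # small: the frame is now nearly flat
```

The calibrated frame is nearly uniform because the deterministic dark and vignetting patterns cancel; what remains is mostly the light frame's own random noise.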

So after that what are you left with? The photon shot noise in your light frame. This is one of the reasons why a long exposure or a stack of shorter exposures is needed. By stacking you increase the signal (from the sky) relative to the noise (the photon shot noise), thus improving the signal to noise ratio of the image.
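The square-root behaviour of photon shot noise is easy to check numerically: a Poisson-distributed count has a standard deviation equal to the square root of its mean, so a pixel's SNR grows as the square root of the photons collected. (The photon counts below are invented for illustration.)

```python
import numpy as np

rng = np.random.default_rng(3)
pixels = 100_000

# Photon arrivals are Poisson-distributed: a pixel that should collect N
# photons on average actually gets a random count with std sqrt(N).
for mean_photons in (100, 400, 1600):
    counts = rng.poisson(mean_photons, size=pixels)
    snr = counts.mean() / counts.std()
    print(f"mean {mean_photons:5d}  shot noise {counts.std():6.1f}  SNR {snr:5.1f}")
```

Quadrupling the exposure (or stacking four times as many subs) quadruples the collected photons but only doubles the shot noise, so the SNR doubles.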

The other reason you expose longer or stack is to overcome readout noise. If your image is insufficiently exposed, the signal level will be at or below the level of the readout noise, meaning (roughly speaking) that you can't separate the two elements during processing, so anything that is insufficiently exposed is indistinguishable from the background sky.

Even if parts of the image are exposed enough to separate the signal from readout noise, there may be other (fainter) parts of an object that are still lurking around the readout noise level. This is why you can only stretch a given image so far before it starts getting grainy. E.g. If you have an Orion or Andromeda shot, you may be able to stretch the image so that you can see more of the outer reaches of the object, but find that those outer reaches are much more grainy than the more exposed inner parts.

In some ways, photon shot noise and readout noise are two sides of the same coin. If you have insufficient exposure, light from the sky has too much photon shot noise and looks exactly the same as the random elements of readout noise that are left after subtracting the bias frame.

Finally you might also use flat frames to correct vignetting and uneven response to light of the individual sensor elements. These are basically the same as light frames in terms of their signal and noise contents. You should stack and process flats in the same way as lights to remove all the same noise factors and end up with a higher signal to noise ratio.

Quote: "So basically stacking the lights is another way to get rid of noise, along with subtracting the darks, rather than actually using them to get more 'light' information into the image? [...]"

I think the confusion here is the same one everyone suffers from at first in AP. What you are interested in is the final 'Signal to Noise' ratio in your image. When people talk about light or brightness in an image, it leads to mistaken conclusions about the process and your objectives.

What you want to do is increase signal more than you increase noise at every stage of the process. Fortunately, when imaging, signal increases much faster than noise. So the solution is to expose for longer. Ideally you do this by making one long exposure, but there are limits to what is possible. Firstly, the camera pixels may saturate at some point (i.e. contain the maximum number of electrons they can hold, so exposing for longer actually destroys detail in the image). Secondly, you may not be able to track for long enough, depending on the quality of your kit. Thirdly, there is a risk of each exposure being ruined by aircraft, satellite trails, etc.

The usual solution is to take multiple exposures and stack them. This also increases signal faster than noise, but because each sub-exposure is subjected to more noise-inducing processes (read-out noise, and noise from calibration with bias, dark and flat frames, all have the potential to increase noise in multiple subs more than in a single exposure), it is not as efficient as a single exposure of the same length as the total of the subs.
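A rough numerical sketch of that trade-off (the per-second flux and read-noise figure are invented, and calibration noise is ignored): splitting a fixed total exposure into more subs pays the read-noise penalty once per sub, so the final SNR drops as the sub count rises.

```python
import numpy as np

rng = np.random.default_rng(5)
pixels = 100_000
read_sigma = 10.0        # invented read noise added once per exposure
flux = 2.0               # invented photons per second per pixel
total_seconds = 3600     # one hour total, split into different numbers of subs

def session(sub_seconds, n_subs):
    """Simulate n_subs exposures and add them up: each sub gets Poisson
    shot noise on its photons plus one dose of read noise."""
    subs = [rng.poisson(flux * sub_seconds, pixels).astype(float)
            + rng.normal(0.0, read_sigma, pixels)
            for _ in range(n_subs)]
    return np.sum(subs, axis=0)

snrs = {}
for n_subs in (1, 12, 120):
    img = session(total_seconds // n_subs, n_subs)
    snrs[n_subs] = img.mean() / img.std()
    print(f"{n_subs:4d} subs of {total_seconds // n_subs:5d} s  SNR {snrs[n_subs]:6.1f}")
```

The total photons collected are identical in every case; only the number of read-noise doses differs, which is why one long exposure wins when it is practical.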


I just wanted to add another note onto this thread to complement IanL's informative post above. This short explanation makes the process of stacking lights clear to the 'newbie', I think, and I thought posting it here might help others out too. It comes from this website, which I'm sure many have already seen, but in case anyone hasn't, here is the extract.

(I hope this is ok with the author of the website, Doug German, who has some great beginner photoshop tutorials on YouTube for astrophotography as well)

http://www.budgetastro.net/

Noise is caused by the electrical circuitry in the camera (among other things) and is affected by the ISO setting (higher settings create more noise - much like film), and the exposure length. As we generally use high ISO and long exposures in astrophotography, we can't win! The one useful thing about noise is that it is randomly generated, i.e. a pixel that appears dark in one sub may appear lighter (or darker) in the next sub (the "tonal value" varies).

We can use that characteristic to our advantage: if we stack multiple images, eventually the noise will disappear (almost), simply because the tonal value of the noise pixels varies in each image. Imagine six columns of 100 randomly generated numbers between 0 and 255. If we average each column of 100 numbers we should find that the averages will be much the same, as the numbers were randomly generated. The more numbers we use, the closer the match will be when the columns are averaged. The same goes for the tonal values of the noise pixels - the more subs we stack, the closer the tonal values of the noise pixels will become, and the noise will lessen. I hope that's clear :).

As regards the "signal" (jargon for the bit of the image that we want to keep), in this case the star, the value of each pixel will be more or less the same in each sub, so the signal will remain in the final image. It's important to understand that stacking doesn't "boost" the signal as such, it just makes the signal clearer - less swamped by the noise.
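The columns-of-random-numbers thought experiment in the extract can be tried directly in Python:

```python
import random
import statistics

random.seed(0)

# Six columns of 100 random "noise" values between 0 and 255,
# as in the extract above.
columns = [[random.randint(0, 255) for _ in range(100)] for _ in range(6)]
print([round(statistics.mean(col), 1) for col in columns])   # all near 127.5

# With more numbers per column, the column averages agree even more closely.
big_columns = [[random.randint(0, 255) for _ in range(10_000)] for _ in range(6)]
print([round(statistics.mean(col), 1) for col in big_columns])
```

The six averages cluster around the same value, and the clustering tightens as the sample size grows, just as the extract describes for noise pixels across many subs.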


Quote: "Noise is caused by the electrical circuitry in the camera (among other things) and is affected by the ISO setting (higher settings create more noise - much like film), and the exposure length"

It is not really true that higher ISO creates more noise; it just amplifies the noise that is already present. But it also amplifies the signal, and the net result is no change in the signal-to-noise ratio, which is what matters for astro imaging.
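A quick simulation of that point: scaling a frame by a gain factor (a crude stand-in for ISO, with invented numbers) multiplies signal and noise equally, so the signal-to-noise ratio is unchanged. Real cameras have second-order effects this toy ignores.

```python
import numpy as np

rng = np.random.default_rng(9)
pixels = 100_000

signal = 200.0                            # invented signal level
noise = rng.normal(0.0, 20.0, pixels)     # noise present before amplification
frame = signal + noise

snrs = []
for gain in (1, 4, 16):                   # crude stand-ins for rising ISO
    amplified = gain * frame              # the gain amplifies signal AND noise
    snrs.append(amplified.mean() / amplified.std())
    print(f"gain {gain:2d}  mean {amplified.mean():7.1f}  "
          f"std {amplified.std():6.1f}  SNR {snrs[-1]:5.2f}")
```

The printed mean and standard deviation both grow with the gain, but the SNR column is the same at every setting.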

NigelM

