
Using hundreds or thousands of short narrowband subs?


Recommended Posts

I've been reading up on deep sky "lucky" imaging, where people take hundreds or even thousands of 1-5 second exposures of deep sky objects, stack them, and wind up with really nice images. Here's someone getting amazing results with an uncooled planetary camera on a 16" dob with an unstable eq platform...  http://www.astrokraai.nl/viewimages.php?t=y&category=7

The M51 on that page is one of, if not the best, M51s I've seen from an amateur. What I'm wondering about is narrowband imaging. 1 second exposures in monochrome look like they work. 4 or 5 second exposures in RGB look like they work. If someone were to attempt this type of imaging with ordinary 12nm narrowband filters and used maybe 300 or 400 subs, do you think perhaps 10, 20, or 30 second exposures on an f/4 scope would provide enough signal without the noise taking over? Remember, we're talking about a $500 uncooled CMOS.

Thanks.


  • 2 weeks later...

The problem with stacking thousands of short exposures is that camera read noise can become a significant contributor to overall noise in the stacked image. It's not an issue with bright objects such as planets, where the S/N ratio in individual frames is still high at short exposures. Some brighter deep sky objects are feasible - I got good results on M57 way back in 2005 using a modified webcam and SCT at f/10: 180 x 12 seconds - but faint deep sky objects are more challenging. The key to success seems to be a very high QE camera with extremely low read noise, and a fast, large-aperture telescope.

Narrow band filters are a different matter.  They provide improved S/N ratio when the main noise source is the sky background, but they do nothing to mitigate the effects of read noise. I think you would be struggling!  

But if you have the right equipment, why not have a go anyway?  I would stick to targets that are relatively 'bright' in Ha.

Adrian


1 hour ago, opticalpath said:

The problem with stacking thousands of short exposures is that camera read-noise can become a significant contributor to overall noise in the stacked image.

I would have thought the read noise would have averaged out to a constant with very little variation after a thousand estimates?

Regards Andrew


21 hours ago, andrew s said:

I would have thought the read noise would have averaged out to a constant with very little variation after a thousand estimates?

Regards Andrew

It doesn't work quite that way. Read noise is the same for any length of exposure and accumulates with each exposure added to the stack; it doesn't in any sense cancel out or level out - it diminishes the S/N ratio of every individual frame and the S/N of the stack. So 3600 one-second exposures will contain SQRT(3600), i.e. 60 times, the read noise of a single 3600-second exposure. High-speed cameras have to read out at high speed, and that generally increases the read noise of every frame.

A good quality image generally means aiming for a high S/N ratio, and you get that by increasing signal and/or reducing noise, so the key question is: how does 60x read noise compare with the accumulated signal from my faint object? Often sky background is the major source of noise, and the usual advice is to expose each sub-exposure long enough that camera read noise is very small compared to sky background and does not start to make a significant contribution. So you can see why, if going for 3600 x 1 sec, you need to be shooting with a high-efficiency, very low-noise imaging system.
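A quick Python sketch makes the square-root accumulation concrete (the 2 e- read noise figure and frame counts below are made-up illustrative values, not any particular camera's spec):

```python
import numpy as np

rng = np.random.default_rng(0)
read_noise = 2.0   # e- RMS per frame read (illustrative assumption)
n_subs = 3600

# Analytic: total read noise in a summed stack grows as sqrt(N) reads.
stack_read_noise = read_noise * np.sqrt(n_subs)   # 2 * 60 = 120 e-

# Monte Carlo check: add an independent read error to each of n_subs frames,
# sum them (stacking), and measure the spread of that sum over many trials.
trials = 2000
summed = rng.normal(0.0, read_noise, size=(trials, n_subs)).sum(axis=1)
print(stack_read_noise, round(summed.std(), 1))   # both close to 120 e-
```

The same 3600 seconds captured as a single exposure pays the 2 e- read penalty only once, which is the 60x difference described above.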

Very short exposures with NB filters will have a relatively low sky background noise level, so camera read noise could become a significant contributor to total noise in the image.

This article explains it more formally: http://www.hiddenloft.com/notes/SubExposures.pdf 
and the first part of this article on lucky imaging underlines the read noise issue: http://www.ast.cam.ac.uk/research/instrumentation.surveys.and.projects/lucky.imaging/lucky.imaging.methods 

I'm not discouraging you from trying, but I'd suggest having a go first with L/R/G/B and see how that works out with your imaging setup before attempting it with NB filters.

 

Adrian


Interesting Adrian, I did not consider the read noise being higher, as I assumed the OP had a standard commercial CCD where you could not change the readout time.

On the other point, I don't disagree with your view or the linked paper, but is this way of looking at it correct? You don't have the option of a long exposure in Lucky Imaging, so a direct comparison is not possible.

I was thinking along the lines of estimating the bias (offset) of a CCD, where the only noise term is the read noise. Here averaging many estimates reduces the contribution of the read noise, and if you did it for many hundreds of estimates it would soon be negligible. I was thinking on the same lines with lucky imaging, where you are doing a similar thing with the image.

Not sure which approach is right. I will have to ponder on it more.

Regards Andrew


I think you have to distinguish between two things here. Averaging many frames, such as bias frames, is a way of minimising the effect of random variation in the noise of a large sample. But stacking does not reduce noise as such; read noise and sky background noise still accumulate as you add images to the stack - they just do so at a slower rate than the wanted signal increases (square-root law). The fact that random noise accumulates more slowly than the desired (non-random) signal is why stacking produces a higher S/N ratio than a single frame. But it does not mean that noise does not accumulate as you stack more frames! Every additional frame adds some read noise, and hundreds or thousands of short-exposure frames will add much more read noise than one long-exposure frame. 1000 x 1 second produces essentially the same total signal as 1 x 1000 sec... but it produces much more read noise because read noise is the same for each of the 1000 short frames as it is for the single 1000 sec frame.
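The square-root law in numbers (the signal and noise figures are arbitrary illustrative values, chosen only to show the scaling):

```python
import math

signal_per_frame = 100.0   # e- of object signal per sub (illustrative)
noise_per_frame = 10.0     # e- RMS of random noise per sub (illustrative)

for n in (1, 100, 10000):
    total_signal = signal_per_frame * n           # signal adds linearly with n
    total_noise = noise_per_frame * math.sqrt(n)  # random noise adds as sqrt(n)
    print(n, total_signal / total_noise)          # S/N: 10, 100, 1000
```

Note that the total noise does keep growing with every frame added; it just grows more slowly than the signal, which is exactly the point above.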

 

Adrian


33 minutes ago, opticalpath said:

but it produces much more read noise because read noise is the same for each of the 1000 short frames as it is for the single 1000 sec frame.

 

...But it is not the same: it is random, with a variance. If you averaged the frames rather than just stacking them...?

Regards Andrew s


50 minutes ago, andrew s said:

...But it is not the same: it is random, with a variance. If you averaged the frames rather than just stacking them...?

Regards Andrew s

Stacking already uses the average or median of the individual subs. I thought the whole point was that the variance of the independent noise terms (e.g. sky noise) increases as the square root of the number of exposures, whereas read noise is a dependent variable and, as Adrian says, increases linearly with the number of subs.

Please correct me if I'm wrong, but this is my understanding of how it works. :)


I didn't actually mean that it increased linearly. As I understand it, any noise that is random in nature in an individual exposure increases as the square root of the number of exposures when you stack multiple exposures. When we increase total exposure time, we do so to increase the object signal and so increase the S/N ratio. One 3600-second exposure will contain the noise of ONE read and the object signal of 3600 seconds of exposure. 3600 exposures of one second will contain the same object signal level but 60 times the read noise of the single long exposure. All other things being equal, stacking multiple short exposures cannot produce a better S/N than a single long exposure of the same total duration, and it could be much worse. If read noise is extremely low compared to the object signal, it may not matter much, but if the read noise in one exposure is somewhat significant compared to the object signal level then it WILL matter when you stack 3600 exposures, and the S/N ratio will be degraded compared to one long exposure.
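A sketch of that comparison using the standard CCD S/N equation. All the rates and the read noise below are made-up illustrative values, not measurements from any real setup:

```python
import math

obj_rate = 0.5     # object e-/s per pixel (illustrative)
sky_rate = 0.2     # sky background e-/s per pixel (illustrative)
read_noise = 3.0   # e- RMS per read (illustrative)

def snr(sub_length_s, total_time_s):
    """S/N of a stack of total_time_s / sub_length_s subs, each sub_length_s long."""
    n_subs = total_time_s / sub_length_s
    signal = obj_rate * total_time_s
    variance = (obj_rate * total_time_s       # object shot noise
                + sky_rate * total_time_s     # sky shot noise
                + read_noise**2 * n_subs)     # one read-noise penalty per sub
    return signal / math.sqrt(variance)

print(round(snr(3600, 3600), 1))  # one 3600 s exposure: ~35.8
print(round(snr(1, 3600), 1))     # 3600 x 1 s: same signal, far more read noise: ~9.6
```

With these numbers the single long exposure wins by more than a factor of three; raise the object or sky rate and the gap shrinks, which is why lucky imaging is viable on bright targets.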

It's for this reason that lucky imaging works well for planetary imaging with ordinary cameras - object signal is very high compared to single-frame read noise. It can work with relatively 'bright' DSOs if using a fast optical system and a high-QE, low-noise camera. But for faint DSOs (or, I would presume, for brighter objects imaged through NB filters) it becomes much more problematic as read noise starts to look significant. Hence the use of high-tech EMCCD cameras for this application in professional setups.

I think the clearest explanation I've seen is in the first paragraph of the research article on lucky imaging I mentioned earlier:
http://www.ast.cam.ac.uk/research/instrumentation.surveys.and.projects/lucky.imaging/lucky.imaging.methods 

 

Adrian


OK I will do some simulations. However, if read noise was a constant (say 5e) you could just subtract it off. Now if it varies you could subtract an estimate so what error you then got would depend on how good the estimate was.

I am well aware of the standard analysis of CCD noise and multiple vs single long exposures, so no need to repeat it again for me.

As I say, I will do some tests.

 

Regards Andrew
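A quick simulation of the subtract-an-estimate idea (the offset and noise figures are arbitrary illustrative values): subtracting a well-estimated bias removes the fixed offset, but the frame-to-frame random scatter of the read noise survives the subtraction.

```python
import numpy as np

rng = np.random.default_rng(1)
bias_level = 500.0   # fixed readout offset in e- (illustrative)
read_noise = 5.0     # random scatter per read in e- (illustrative)

# 1000 frames of 4096 pixels each: fixed offset plus a fresh random read error.
frames = bias_level + rng.normal(0.0, read_noise, size=(1000, 4096))
master_bias = frames.mean(axis=0)   # an excellent estimate of the constant part

calibrated = frames - master_bias
print(round(calibrated.mean(), 2))  # ~0: the offset is gone
print(round(calibrated.std(), 2))   # ~5: the random scatter is still there
```

Only the constant (offset) component can be subtracted away; the random component per read can't be removed after the fact, which is the crux of the disagreement above.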


I stand corrected :)

  • Sky noise increases as the square root of exposure time;
  • Read noise increases as the square root of the number of exposures;

It is the trade-off between these variables (amongst others) that is important.

Thanks for the clarification :)

 


Can't see it working so well for narrowband, especially when you're trying to capture a weak signal. If your exposure time is so short that nothing of that weak signal gets captured, 1000 x nothing still equals nothing.


2 hours ago, andrew s said:

OK I will do some simulations. However, if read noise was a constant (say 5e) you could just subtract it off. Now if it varies you could subtract an estimate so what error you then got would depend on how good the estimate was.

I am well aware of the standard analysis of CCD noise and multiple vs single long exposures, so no need to repeat it again for me.

As I say, I will do some tests.

 

Regards Andrew

Sorry if I was going on a bit :wink2:

But if it were just a matter of subtracting a master 'read frame', I think we would all be doing multiple short exposures and enjoying much easier guiding! Bear in mind that if you calibrate every sub by subtracting a master dark, that dark already includes the (averaged) read noise. So in a real sense, dark-frame calibration has already subtracted a 'master read' from every sub - you don't want to do it again. It's analogous to not subtracting a master bias during calibration unless you plan to scale your darks, because the bias is also in the dark and will be subtracted when you dark-subtract.

Whilst on calibration, another consideration is the unavoidable small amount of noise that's added to every sub-exposure when you subtract a master dark and divide by a master flat.  Although small for each sub-exposure, that too could start to become significant when multiplied by hundreds or thousands of exposures.

It's all moot if the object signal is relatively strong - all these noise sources become much less significant - but not so for low level signal sources, which is why I think NB-filtered images would be problematic.

 

Adrian 


On 4/20/2016 at 01:44, Skyhat said:

I've been reading up on deep sky "lucky" imaging, where people take hundreds or even thousands of 1-5 second exposures of deep sky objects, stack them, and wind up with really nice images. Here's someone getting amazing results with an uncooled planetary camera on a 16" dob with an unstable eq platform...  http://www.astrokraai.nl/viewimages.php?t=y&category=7

The M51 on that page is one of, if not the best, M51s I've seen from an amateur. What I'm wondering about is narrowband imaging. 1 second exposures in monochrome look like they work. 4 or 5 second exposures in RGB look like they work. If someone were to attempt this type of imaging with ordinary 12nm narrowband filters and used maybe 300 or 400 subs, do you think perhaps 10, 20, or 30 second exposures on an f/4 scope would provide enough signal without the noise taking over? Remember, we're talking about a $500 uncooled CMOS.

Thanks.

Looking over Emil Kraaikamp's web site I can see why you are impressed: outstanding deep sky, planetary and solar images (thanks for linking). I am thinking Emil is a very gifted amateur using a very special 16" dob and clearly has an understanding of software.

These new CMOS cameras seem set to add another option for deep sky astrophotographers to play with. It will be interesting to see what the new cooled versions will produce. Luis Campos has posted some images in this thread.


So I have reread the post and done some simulations. I think we got at cross purposes, but here is a summary of what I believe to be true.

1) The standard formula for CCD S/N calculation is correct. 

2) In view of this you should use as long an exposure as possible, taking practical considerations and your science/imaging goals into account.

3) Given the above you should then average as many exposures as possible to further reduce the noise.

So when there is a constraint on the length of the exposure, e.g. bias and flat frames, or Lucky Imaging where you wish to freeze the seeing, the best you can do is reduce noise by averaging many exposures. (I agree this is not as good as a single long exposure, but it is the best you can do.)

The biggest challenge is when you want high cadence images of a variable or transient phenomenon.

I hope we can all agree on the above, and thanks for the interesting debate, which seems to have been about confusing points 2 and 3.

Regards Andrew


On 03/05/2016 at 22:48, opticalpath said:

If read noise is extremely low compared to the object signal, it may not matter much, but if the read noise in one exposure is somewhat significant compared to the object signal level then it WILL matter

Strictly speaking what matters is if it is significant compared to other forms of (random) noise.

NigelM


On 06/05/2016 at 13:38, dph1nm said:

Strictly speaking what matters is if it is significant compared to other forms of (random) noise.

NigelM

Agreed, and for most of us sky background is the biggest source of noise in practice; in average UK conditions it tends to overwhelm every other source. What I meant was, in this special case of relatively very short exposures (and especially with narrowband filters), sky background is much lower in each exposure, so read noise could become a more significant issue, especially if the object signal level is very low too.

However, I have to agree that the examples referred to in the posts above, taken using the newer very low noise CMOS cameras, speak for themselves: very impressive.
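One way to put numbers on "read noise becomes a more significant issue": a common rule of thumb is to pick sub lengths long enough that sky shot noise comfortably exceeds read noise. With made-up but plausible figures, the required sub length explodes once a 12nm filter cuts the sky background:

```python
read_noise = 3.0   # e- RMS per read (illustrative assumption)

def sky_limited_sub_s(sky_rate, factor=3.0):
    """Shortest sub (seconds) for which sky shot noise >= factor x read noise,
    i.e. sqrt(sky_rate * t) >= factor * read_noise."""
    return factor**2 * read_noise**2 / sky_rate

print(sky_limited_sub_s(2.0))    # broadband, bright sky: ~40 s subs are enough
print(sky_limited_sub_s(0.05))   # narrowband sky: ~1600 s - short subs stay read-noise limited
```

The sky rates here are hypothetical; the point is the scaling: cutting the sky background by a factor of 40 pushes the sky-limited sub length up by the same factor, which is why 1-30 s narrowband subs sit deep in read-noise-limited territory unless the camera's read noise is extremely low.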

Adrian


  • 4 months later...

I thought 'Lucky Imaging' was to do with VERY fast frame rates that only video capture can acquire, i.e. freezing the seeing. Not sure I would want to stack a lot of very short exposures myself on DSOs.

Longer subs go deeper & collect more data, why try to fix summit that aint broke :-)


  • 2 weeks later...
On 06/10/2016 at 15:19, Ewan said:

 

Longer subs go deeper & collect more data, why try to fix summit that aint broke :-)

Because you can start to do DSO imaging with what would normally be called unsuitable equipment. For example Emil's M51 was shot with a 16" Dobsonian on an equatorial platform.


I have quite a few friends who stack their images. I have yet to find success with it myself, and I do not know why. DSS stalls and/or runs out of memory, RegiStax can't seem to complete the tasks, and AutoStakkert is befuddling. :help:

So, as a friend told me, "Sonny, you just stack with time." Because I can take long exposures. And that is how I have been able to capture my Nebulae images. I have my mounts guiding down to a tight art now. I'm getting smaller and smaller stars, and tighter and tighter focusing.

But I've only gotten one program to actually stack my images, Nebulosity 4. So, if I want to stack images, it appears I'm going to have to pay for a program.

I also have a fundamental dislike for what I consider "Manufactured" photography. Yeah, so I'm a dinosaur. I don't like the re-manufactured women doing the Weather and News either.

So I guess I fit in a small niche of odd-balls who would rather get it right in the camera, and save the sugary coatings for the donuts. :rolleyes2:

Last night's donuts are over here: https://stargazerslounge.com/topic/279955-then-and-now-helix-nebula-revisited/#comment-3064934

 


We stack because stacking is a tool to reduce noise and improve the signal-to-noise ratio. It's no more or less "manufactured" than using a slice of silicon to convert photons into electricity that can be manipulated into an image on a screen.


22 hours ago, SonnyE said:

I have quite a few friends who stack their images. I have yet to find success with it myself, and I do not know why. DSS stalls and/or runs out of memory, RegiStax can't seem to complete the tasks, and AutoStakkert is befuddling. :help:

So, as a friend told me, "Sonny, you just stack with time." Because I can take long exposures. And that is how I have been able to capture my Nebulae images. I have my mounts guiding down to a tight art now. I'm getting smaller and smaller stars, and tighter and tighter focusing.

But I've only gotten one program to actually stack my images, Nebulosity 4. So, if I want to stack images, it appears I'm going to have to pay for a program.

I also have a fundamental dislike for what I consider "Manufactured" photography. Yeah, so I'm a dinosaur. I don't like the re-manufactured women doing the Weather and News either.

So I guess I fit in a small niche of odd-balls who would rather get it right in the camera, and save the sugary coatings for the donuts. :rolleyes2:

Last night's donuts are over here: https://stargazerslounge.com/topic/279955-then-and-now-helix-nebula-revisited/#comment-3064934

 

To be fair, stacking just reduces the deficiencies inherent in the camera anyway. Calibration frames also reduce/remove deficiencies in the camera/optics and are definitely part of post-processing.

In my view these processes just remove errors introduced by the camera, I don't see how that could be considered 'manufactured'.

