

About Altais

  1. I don't think CCD manufacturers typically state the gain of their sensors. But you might find numbers from others who have measured it. Or you can measure it yourself using the Astrophotography Lab program.
  2. Here is a version with a wider field of view.
  3. Oslet, I would actually suggest trying a higher ISO (e.g. 1600) if you aim to use short exposures. The ISO value is basically a tradeoff between read noise and dynamic range. With higher ISO comes lower read noise, which is always nice, but this comes at the cost of lower dynamic range.

     Actually, ISO is just a way of describing the concept of gain, which is the number of electrons (think photons) that must be accumulated in a pixel in order for its digital value to increase by one. High ISO gives low gain, meaning that the pixel value will increase more per electron, and hence you get a "high sensitivity". In this case, fewer electrons are required to reach the maximum pixel value (typically around 2^14 or 2^16, depending on the camera), and hence the pixels saturate more quickly. Faster saturation gives a smaller difference between the minimum and maximum number of photons that can be registered, corresponding to a lower dynamic range.

     A sufficiently high dynamic range is required to avoid saturating important parts of your target (like the core of a galaxy) during a single exposure. So if saturation is a problem, you should lower your ISO. But with short exposures you are not likely to reach saturation even at a relatively high ISO, so it might then be better to use a higher ISO and benefit from the lower read noise.

     Anyway, this is my (non-expert) take on it. The best thing is probably to experiment to see what works best for your equipment and conditions in practice.
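     The tradeoff above can be sketched numerically. This is only an illustration: the full-well capacity and the (ISO, gain, read noise) triples below are made-up placeholder values, not measured figures for any real camera. "Gain" follows the post's definition of electrons per digital step.

     ```python
     # Illustrative ISO tradeoff sketch; all sensor numbers are assumed, not measured.
     import math

     FULL_WELL_E = 25000   # physical full-well capacity in electrons (assumed)
     ADC_MAX = 2**14 - 1   # maximum pixel value of a 14-bit ADC

     # Hypothetical (ISO, gain in e-/step, read noise in e-) combinations
     modes = [(400, 2.0, 5.0), (800, 1.0, 3.5), (1600, 0.5, 2.8)]

     for iso, gain, read_noise in modes:
         # Electrons needed to saturate the ADC, capped by the physical full well
         effective_full_well = min(FULL_WELL_E, ADC_MAX * gain)
         # Dynamic range in stops: saturation level over read noise
         dr_stops = math.log2(effective_full_well / read_noise)
         print(f"ISO {iso}: saturates at {effective_full_well:.0f} e-, "
               f"DR = {dr_stops:.1f} stops")
     ```

     With these assumed numbers, the higher ISO modes have lower read noise but saturate at fewer electrons, so their dynamic range in stops comes out lower, which is exactly the tradeoff described above.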
  4. Well, even though short exposures through narrowband filters will probably be read noise dominated, that is just a result of the reduced photon noise from the (broadband) background. The total noise level will actually be lower, since you have the same read noise but lower photon noise. So for targets with significant narrowband emission, you should get better SNR using narrowband filters than RGB filters. But the fact that the subframes are dominated by read noise means that you could in principle improve the SNR further for a given total integration time by increasing the duration of the individual exposures.
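     A small numeric sketch of this point, with all electron counts invented purely for illustration: the target's line emission passes both filter types roughly equally, but the broadband sky background is heavily suppressed by the narrowband filter, so the per-sub SNR ends up higher even though read noise now dominates.

     ```python
     # Per-subframe SNR of a narrowband-emitting target; all values are assumed.
     import math

     def sub_snr(target_e, sky_e, read_noise_e):
         """SNR of the target in one sub: shot noise from target + sky, plus read noise."""
         return target_e / math.sqrt(target_e + sky_e + read_noise_e**2)

     read_noise = 5.0   # e- per subframe (assumed)
     target = 400.0     # e- of Ha line emission collected either way (assumed)
     sky_rgb = 2000.0   # broadband sky background through a red filter (assumed)
     sky_ha = 50.0      # much lower background through a narrowband Ha filter (assumed)

     print(f"RGB sub SNR: {sub_snr(target, sky_rgb, read_noise):.1f}")
     print(f"Ha  sub SNR: {sub_snr(target, sky_ha, read_noise):.1f}")
     ```

     With these placeholder numbers the narrowband sub comes out well ahead, despite the read noise term being a larger fraction of its total noise.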
  5. Yes, you can obtain an estimate of the brightness in mag/arcsec^2 from a light frame. But there are a lot of uncertainties in this calculation, so you should take it as a rough guideline rather than a precise measurement. The use of filters doesn't really affect the reasoning around signal to noise. But you should keep in mind that the narrower the filter, the longer you need to expose in order for the photon noise to dominate the read noise, since photons are hitting your sensor at a lower rate. This is why narrowband imaging tends to require much longer exposures than RGB imaging.
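     The "narrower filter, longer subs" point can be made concrete with a rough rule of thumb: a sub is sky-limited once the sky's shot-noise variance exceeds the read-noise variance by some factor. The read noise and the per-filter sky rates below are assumptions chosen only to show the trend.

     ```python
     # Rough minimum sub length for sky-limited imaging; all rates are assumed.

     def min_sky_limited_exposure(read_noise_e, sky_rate_e_per_s, factor=10.0):
         """Exposure (s) at which the sky shot-noise variance (sky_rate * t)
         exceeds the read-noise variance (read_noise**2) by `factor`."""
         return factor * read_noise_e**2 / sky_rate_e_per_s

     read_noise = 5.0  # e- (assumed)
     # Narrower bandpass -> lower sky rate -> longer subs needed
     for label, sky_rate in [("luminance", 10.0), ("RGB", 3.0), ("Ha 7nm", 0.2)]:
         t = min_sky_limited_exposure(read_noise, sky_rate)
         print(f"{label}: sky rate {sky_rate} e-/s -> sub length of roughly {t:.0f} s or more")
     ```

     The required sub length scales inversely with the photon rate reaching the sensor, which is why the narrowband case comes out an order of magnitude longer than the broadband ones.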
  6. Craig Stark has a really nice series of articles which I would strongly recommend for anyone interested in understanding signal and noise and how they depend on equipment and conditions. The articles can be found here: https://www.cloudynights.com/articles/cat/column/fishing-for-photons/

     One thing he covers is how you can measure the gain of your sensor. I was inspired by this to write a program to do it automatically, which evolved to do a lot of other things as well. If you're interested, the software can be found here: http://lars-frogner.github.io/Astrophotography-Lab/ It can be used to experiment with various imaging parameters and see how they will affect the SNR and dynamic range of the final result.

     As an example, here is a plot showing how the SNR of a spiral arm of M51 in a 6-hour subframe stack will vary as a function of subframe exposure time with an Atik 460EX camera (same read noise as your 428EX). The amount of skyglow roughly corresponds to what you would have during a full moon.
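     One common way to measure gain, the photon-transfer method, needs only two matched flat frames: differencing them cancels fixed-pattern variation, and the shot-noise variance in digital units is the mean divided by the gain. The sketch below is not the author's program; it simulates the measurement with synthetic flats whose true gain and signal level are made up, so the estimate can be checked against the known answer.

     ```python
     # Photon-transfer gain estimate from two flats, tested on synthetic data.
     import numpy as np

     def measure_gain(flat1, flat2):
         """Estimate gain (e- per digital step) from two matched flat frames.
         Var(flat1 - flat2) is twice the shot-noise variance in digital units,
         and that variance equals mean / gain (read noise neglected)."""
         mean = 0.5 * (flat1.mean() + flat2.mean())
         diff_var = np.var(flat1 - flat2)
         return 2.0 * mean / diff_var

     true_gain = 2.3                      # e- per step (assumed, to be recovered)
     electrons = 20000.0                  # mean flat signal in electrons (assumed)
     rng = np.random.default_rng(42)
     flat1 = rng.poisson(electrons, (512, 512)) / true_gain
     flat2 = rng.poisson(electrons, (512, 512)) / true_gain

     print(f"measured gain: {measure_gain(flat1, flat2):.2f} e-/step")
     ```

     With a quarter-million pixels the variance estimate is tight, so the recovered gain lands very close to the value used to generate the flats.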
  7. I think there is a lot of sense in what you're doing. If our cameras had no read noise, there would in principle be no difference between the SNR of a single 10-minute exposure and a stack of 20 half-minute exposures. Then it would basically always be better to do many shorter exposures, since you would never over-expose parts of your target, you wouldn't need to guide, you could get rid of artefacts like cosmic ray hits and plane trails by stacking, and the cost of a failed exposure would be very low. (Of course, if taken too far, the sheer number of subframes to pre-process would eventually become prohibitive.)

     Unfortunately, the unavoidable presence of read noise in every subframe means that the stacking approach will always yield more noise. But this effect is basically negligible when the read noise is much smaller than the noise in the photons coming from the direction of your target. The photon noise increases as the square root of the number of photons, so the exposure time required to make read noise negligible is proportional to the square of the read noise divided by the combined intensity of the sky and the target. (I'm ignoring dark noise here since it doesn't really affect the argument.) In other words, low read noise and/or a bright sky background allows for shorter exposures without any penalty on the SNR.

     Astronomical CMOS sensors with very low read noise (e.g. the ZWO ASI1600) are starting to gain popularity, and I suspect we will see more and more of this kind of short exposure imaging in the future.

     I didn't mean to turn this into an essay, sorry about that. But it is an interesting discussion.
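     The single-exposure-vs-stack comparison above reduces to one formula: read noise is paid once per subframe, photon noise once per photon. The photon rates and read noise below are placeholder values, chosen only to show the shape of the penalty.

     ```python
     # SNR of a fixed total integration time sliced into n subframes; rates assumed.
     import math

     def stack_snr(n_subs, sub_time_s, target_rate, sky_rate, read_noise):
         """SNR of the target after combining n_subs subframes.
         Photon noise depends only on total time; read noise enters once per sub."""
         signal = n_subs * sub_time_s * target_rate
         noise = math.sqrt(n_subs * (sub_time_s * (target_rate + sky_rate)
                                     + read_noise**2))
         return signal / noise

     target, sky, rn = 0.5, 5.0, 5.0   # e-/s, e-/s, e- (all assumed)
     total = 600.0                     # 10 minutes total, however it is sliced
     for n in (1, 20, 100):
         snr = stack_snr(n, total / n, target, sky, rn)
         print(f"{n:3d} x {total / n:5.0f} s: SNR = {snr:.2f}")
     ```

     Setting the read noise to zero makes the result independent of how the total time is sliced, which is the "no difference in principle" claim; with nonzero read noise, finer slicing costs SNR, and the cost shrinks as the subs become long enough to be sky-limited.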
  8. Thanks! I should probably have cropped the old one to match the new one so they could be compared directly. The old one is quite close to the full FOV of the camera, but with the new one I prefer a closer crop to better reveal the details. Here is a better comparison of new (first) vs old:
  9. Thank you very much! It's 274 frames of 6 minutes at ISO800 and 21 frames of 3 minutes at ISO1600 (so actually just 28.45 hours, but 30 sounds cooler). The latter is the data set from 2015 which I reused in the new image in addition to the new data. The reason I changed from ISO1600 to 800 is that I think the additional dynamic range you get at 800 is well worth the slight increase in read noise for this target. Yeah, it's quite amazing what digital sensors have done for amateur astrophotography. But then it's really humbling to look back at what people were able to achieve without them!
  10. Thanks Ole Alexander, much appreciated. I live a bit north of Lillestrøm (Lindeberg i Sørum). It's sufficiently remote that light pollution from the cities is not a big issue (and also we have no street lights), so I get reasonably dark skies.
  11. Great work! Nice and round stars, a lot better than anything I've managed to do unguided. Let's just hope the great weather doesn't go away anytime soon.
  12. M51 is nicely positioned in the sky these days, so I've spent quite a lot of time imaging it lately. My previous (and first) attempt at M51 was four years ago. At the time I was struggling a lot with getting decent guiding, and had to discard most of the data. With my newly acquired iOptron CEM60 this is basically no longer an issue, so I wanted to see how much I could improve upon the old result by adding a lot of integration time. So here is what I have obtained from a bit less than 30 hours of integration time:

      For comparison, here is the result I got back in 2015, which consists of 1.1 hours of integration time (under very good conditions): Both were captured with a modded Canon EOS 1100D through a SW Explorer 150PDS.

      While I'm quite happy with the new result, it looks like I'm starting to reach the point of diminishing returns in terms of integration time, considering that much of the improvement is probably due to more accurate guiding as well as better post-processing. This actually makes a lot of sense; most of M51 has a high surface brightness, so it is relatively easy to get a decent signal to noise ratio in most parts of the galaxy. More data will still be beneficial for the diffuse outer regions, but for improving details it's probably better to focus on quality over quantity (e.g. only using frames captured during good seeing conditions).

      Anyway, feedback and opinions are as always appreciated. Clear skies, Lars
  13. Thank you! Yes, I use the Baader MPCC III. I get very noticeable coma without it, so a coma corrector is definitely a very worthwhile accessory, at least if you want to be able to use most of the field of view. I cannot comment on how the SW compares to the Baader, as I've only tried the Baader. But I'm very happy with it and think it works really well.
  14. Thanks Miguel! That's right. The original field of view was quite a bit larger than the image suggests, but the overlap between the two data sets was not perfect. :)
  15. Hello everyone. Here is an image of M81 and M82 that I just completed. This is a combination of some data I captured three years ago and some that I captured a few days ago. Roughly 5 hours for each data set, so a bit less than 10 hours in total. The old data consists of 6-minute exposures under a quite dark sky, while the new subframes are 3 minutes and with more light pollution.

      I use a full-spectrum modded Canon EOS 1100D, and ideally I should use a UV/IR blocking filter to get correct colours. For the old frames I did not do that, since I couldn't get a sharp image with the filter. This time around I got it to work with the filter (it seems that it simply becomes more sensitive to slightly incorrect focus). So to exploit the best of each data set, I created a luminance image from all the subs, processed only the new frames with better colour as RGB, and then merged them using LRGB combination.

      I'm quite happy with the result, although the background is a bit messy and I might have overdone the noise reduction slightly. I think the background is quite tricky, since it is hard to judge what is real signal from the IFN and what is just unwanted unevenness. Comments and suggestions are appreciated. Clear skies, Lars