
M1 test picture


Datalord


During the few nights I've had to test out my new scope, I let it spend some time on M1. I did it before I had it properly collimated, which shows as a bit of fuzz on the stars. I also need to tune the focus quite a bit, which I find to be a right PITA.

15 x 1200s bin1 Ha and 13 x 1200s bin1 OIII. CFF 12" RC, G3-16200 with Astrodon 3nm Ha and Baader 7nm OIII filters. Stacked in DSS (PI gave me nonsense about not being able to find star pairs), processed in PI with flats, darks and bias.

One of the things I'm considering for the future is whether I should reduce the exposure time and get more frames instead. The stars become saturated and bloated quite fast, and the scope seems to pick up plenty of detail anyway. 600s? 300s?

Another option is to do bin2 and drizzle the data afterwards, combined with going down to 300s. Any advice?

Finally, should I open the wallet and invest in a 3nm OIII filter to match the Ha? Will that make a notable difference in the ease of processing?

 

[Attached image: M1]


If your stars are saturating then a drop in exposure time makes sense - say to 600 - and see how things go as you are capturing plenty of data. Combined with more exposures, this would have the benefit of cleaning up the background sky somewhat.

Matching the filters for bandwidth will make it easier to avoid star halos. I generally prefer not to bin 2x2 but so much depends on your sampling rate. Do you know what your current sampling rate is?


11 minutes ago, steppenwolf said:

If your stars are saturating then a drop in exposure time makes sense - say to 600 - and see how things go as you are capturing plenty of data. Combined with more exposures, this would have the benefit of cleaning up the background sky somewhat.

Matching the filters for bandwidth will make it easier to avoid star halos. I generally prefer not to bin 2x2 but so much depends on your sampling rate. Do you know what your current sampling rate is?

0.52"/pixel. From what I can gather, that is ok if I have good seeing, but not good enough for anything below that. Although even with good seeing, that doesn't take into account the saturation of the well and the exposure time reduction. On the other hand, I do like the resolution of bin1. Bloody hobby...


0.52"/pixel is a bit too high in my opinion. You really need perfect conditions to exploit that.

What is your star FWHM in a single sub (measured in arc seconds)? For that kind of resolution to be fully exploited you need sub-1" FWHM stars, or enough SNR to deconvolve (or wavelet process) and recover some resolution.

 


Just now, vlaiv said:

0.52"/pixel is a bit too high in my opinion. You really need perfect conditions to exploit that.

What is your star FWHM in a single sub (measured in arc seconds)? For that kind of resolution to be fully exploited you need sub-1" FWHM stars, or enough SNR to deconvolve (or wavelet process) and recover some resolution.

 

Wait, too high? You mean the resolution is too high, so I should do 2x2? Or the number is too high?

Where I am now, I have [removed word] seeing all the time, so I really should only do bin2, but at the remote site I hope to have better seeing.


1 minute ago, Datalord said:

Wait, too high? You mean the resolution is too high, so I should do 2x2? Or the number is too high?

Where I am now, I have [removed word] seeing all the time, so I really should only do bin2, but at the remote site I hope to have better seeing.

Selecting the correct sampling resolution is not an easy topic - there are many variables involved.

If we choose a subset of the most important variables and approximate them with Gaussian profiles (seeing PSF, Airy disk, guide error, etc.), we can come up with a figure above which frequencies are so attenuated that they make no significant contribution to the image. Depending on your choice of significance threshold (let's say attenuated by more than 90%) you can choose the correct sampling value. The Nyquist criterion strictly applies to band-limited signals, but we are effectively limiting the bandwidth of our signal by stating that we are not interested in frequencies that are tiny in magnitude and contribute little detail to the image.

I've run some numbers on that and found that a good value for the optimal sampling rate is FWHM x 0.622 (with this simplified model). According to this, for a resolution of 0.52"/pixel not to oversample, you want a star FWHM close to 0.52 / 0.622 = ~0.84".
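As a quick sketch of that rule of thumb in Python (the 0.622 factor is taken from the post above; everything else is just the arithmetic):

```python
# Rule of thumb from above: optimal sampling rate ~ star FWHM * 0.622.
FACTOR = 0.622

def optimal_sampling(fwhm_arcsec: float) -> float:
    """Sampling rate (arcsec/pixel) this simplified model considers optimal."""
    return fwhm_arcsec * FACTOR

def required_fwhm(sampling_arcsec_per_px: float) -> float:
    """Star FWHM needed for a given sampling rate not to oversample."""
    return sampling_arcsec_per_px / FACTOR

print(required_fwhm(0.52))    # ~0.84" - the figure quoted above
print(optimal_sampling(1.7))  # ~1.06"/pixel for 1.7" FWHM stars
```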

Now, that number is very hard to achieve with amateur equipment. Since all the errors add together to produce the star FWHM, you need the atmospheric seeing to be below about 0.8" for it to come even close.

So it will depend on the seeing, but also on aperture (you are good there, since you have a large 12" scope, so its impact will be very small), guiding, ...

Even if the seeing is 1" but your guiding is 0.5" RMS, you can end up with a setup FWHM of around 1.7", and if your sampling resolution is finer than about 1"/pixel (a smaller number of arc seconds per pixel) you will be oversampling.
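A minimal sketch of that error budget, adding the Gaussian-approximated contributions in quadrature (the 2.355 RMS-to-FWHM factor and the Airy-disk term are assumptions for illustration, not figures from the post):

```python
import math

RMS_TO_FWHM = 2.355  # Gaussian RMS (sigma) to FWHM

def setup_fwhm(seeing_fwhm, guide_rms, aperture_fwhm=0.0):
    """Combine seeing, guiding and (optionally) aperture blur in quadrature, all in arcsec."""
    guide_fwhm = guide_rms * RMS_TO_FWHM
    return math.sqrt(seeing_fwhm**2 + guide_fwhm**2 + aperture_fwhm**2)

# 1" seeing, 0.5" RMS guiding, small assumed Airy contribution for a 12" aperture at ~550 nm
fwhm = setup_fwhm(1.0, 0.5, aperture_fwhm=0.38)
print(fwhm)          # ~1.6", in the same ballpark as the ~1.7" quoted above
print(fwhm * 0.622)  # ~1.0"/pixel suggested sampling under this model
```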

Oversampling is not necessarily a bad thing. You can always bin in hardware or in software (depending on the camera), and you can also use special binning methods that don't impact resolution (while still improving your SNR). Regular binning has an impact on the final resolution of the image even when oversampled, due to something not considered above called pixel blur. I've read somewhere that the BBC used high-resolution cameras back when their TV was analogue and used 576 lines, precisely because of pixel blur - the image just looked sharper when oversampled.
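For what it's worth, plain 2x2 software (average) binning is only a few lines of NumPy; this is just a generic sketch of regular binning, not the special resolution-preserving methods mentioned above:

```python
import numpy as np

def bin2x2(img: np.ndarray) -> np.ndarray:
    """Average 2x2 pixel blocks: roughly 2x SNR gain, half the sampling rate."""
    h, w = img.shape
    h, w = h - h % 2, w - w % 2  # trim odd edges if present
    return img[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

sub = np.random.default_rng(0).normal(1000.0, 50.0, (4500, 3600))  # fake bin1 frame
print(bin2x2(sub).shape)  # (2250, 1800)
```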


So, I set up the scope this afternoon with the promise of a full night, pointed it at NGC 2146, a 10.6 mag galaxy, and started a 12-hour plan with 300s bin1 lum and 180s bin2 colour. Obviously the weather here gave the forecast the middle finger and I managed to get only 56 minutes. In desperation I processed what I had. I can only dream of what I could get out of this with 12 hours of data.

[Attached image: NGC 2146]

Since these are broader spectrum filters, I guess I end up in the same conundrum with the stars, but I think I can tame them somewhat with masked stretching. Or maybe I should just go even shorter on the colour filters and star fields.

