

120, 240 or 480 seconds. Which is better?


jsmoraes


I haven't read every single post in this thread, but thought I'd also add that using filters, specifically very narrowband Ha, SII or OIII filters, will necessitate longer exposures. As Olly has stated above, the target will dictate the length of the subs and the use of different-length subs combined into an HDR-style image for correct exposure. M42 is the perfect and obvious example.
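As a rough sketch of why narrowband subs need to be so much longer: a common rule of thumb is to expose each sub until the sky background swamps the camera's read noise. The sky rates, read noise and swamp factor below are illustrative assumptions, not measured values.

```python
# Rule-of-thumb "sky-limited" sub length: expose until the sky background
# contributes roughly 10x the read-noise variance per pixel.
# All numbers below are assumptions for illustration only.

def min_sub_length(sky_rate_e_per_s, read_noise_e, swamp_factor=10.0):
    """Seconds needed for sky shot noise to swamp the read noise."""
    return swamp_factor * read_noise_e ** 2 / sky_rate_e_per_s

read_noise = 7.0       # e- RMS, assumed for an older DSLR
sky_broadband = 2.0    # e-/s/pixel with light pollution and a skyglow filter (assumed)
sky_ha_narrow = 0.05   # e-/s/pixel through a narrow Ha filter (assumed)

print(min_sub_length(sky_broadband, read_noise))   # ~245 s
print(min_sub_length(sky_ha_narrow, read_noise))   # ~9800 s, i.e. many long subs
```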

Interesting read though. And good thread.

Phil



I was just looking at your galaxies. One common theme, I would say, is noise from insufficient data accumulation. You also seem to be using very short subs. 5-10 minute subs seem to be the norm for galaxies, with integration times running into many hours. Binning would be good too. I wish I had a CCD.
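On the binning point, here is a minimal sketch of what software 2x2 binning does to a shot-noise-limited frame; the signal level and frame size are made-up numbers, not anyone's real data.

```python
import numpy as np

rng = np.random.default_rng(0)
signal = 100.0                                    # mean e- per pixel, assumed
frame = rng.poisson(signal, size=(1000, 1000)).astype(float)

def bin2x2(img):
    """Sum 2x2 blocks of pixels (software binning)."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

binned = bin2x2(frame)
print(frame.mean() / frame.std())    # per-pixel SNR ~ 10
print(binned.mean() / binned.std())  # per-binned-pixel SNR ~ 20, about a 2x gain
```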


Well, let's see what I got.

1) About noise:

I took 60 frames of 180 seconds at ISO 800 and made three stacks: 60 frames, 30 frames and 10 frames. The image below shows the three stacks without any processing. The area after the red line has a brightness adjustment to make the noise easier to see. The crops are from the 32-bit Autosave.tif from DSS.

post-43725-0-89240400-1433959996_thumb.j

I see little difference. I don't know if it is worth doing 60 frames just to reduce noise.

I also see little difference in shape and colors.
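That matches what the square-root law predicts: going from 10 to 60 frames only reduces the background noise by a factor of about √6 ≈ 2.4. A quick simulation of the idea, with the background level, per-sub noise and frame size all made up:

```python
import numpy as np

rng = np.random.default_rng(1)
sky, sigma = 50.0, 12.0                       # per-sub background level and noise, assumed
subs = sky + sigma * rng.standard_normal((60, 200, 200))

for n in (10, 30, 60):
    stack = subs[:n].mean(axis=0)
    print(n, round(float(stack.std()), 2))    # ~3.8, ~2.2, ~1.5: noise falls as 1/sqrt(N)
```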

2) About quality and amount of signal:

First, the 10-frame stack from the DSS Autosave.tif without any processing. This image is expected to have the least data and the lowest quality.

post-43725-0-87192100-1433959999.jpg

And now, the 60-frame stack without any processing and the 10-frame stack with HDR adjustment.

post-43725-0-58957600-1433960382.jpg

And last, the 60-frame stack with auto levels in Photoshop before conversion to 16 bits, and the 10-frame stack with HDR adjustment (as above).

post-43725-0-28788800-1433960462.jpg

The seeing was very good during this session.

It seems that for a cluster, using 10 frames or 60 frames doesn't make much difference in the final result. Perhaps 30 frames would be more than enough.

It seems I can continue with my 10 or 15 frames for clusters.

I will process the full FOV. I want to know whether 60 frames makes it harder to pick the cluster out against the background. I think that when you have a lot of signal, the faint stars become very prominent and the cluster seems to disappear.

I am also testing with a nebula, NGC 3324 (Gabriela Mistral). I think that with a nebula a large number of frames will make a big difference. I have already done 26 frames of 4 minutes: 104 minutes. I would like 120, 180 minutes or more.

When it is ready I will publish it here.


Phillyo, up to now I am testing only with a skyglow filter to reduce light pollution. Working with narrowband filters is very different, mainly because narrowband filters cut down the light, so you normally need much more exposure time.

I am trying to find a starting point for choosing the best ISO and exposure time for different types of DSO, at my site, with my optical equipment and environment.

I think there isn't a single recipe for time and ISO. What works well for me may not work well for other people.

ollypenrice, you are right about the need to perform a stretch. The question, perhaps, is how strong that stretch should be. If we have a lot of signal we can make the stretch lighter and still get a better quality image, without aberrations.

Kalasinman, you are right about short exposures for galaxies. I had a problem with light pollution, with street light shining straight into my OTA! Now I have built a new observatory and I am better protected. I need to go back to some galaxies that I imaged in past years.
I still have a good sky.


Well, here are the 60 frames processed. I didn't use HDR.

n3766-60f-wide-k9.jpg

n3766-crop-60f.jpg

The result from DSS reduced the colors of the stars. See a screen capture from Canon ZoomBrowser of a single CR2 frame.

colorOfstarsRAW.jpg

Therefore, for the crop (detail) version of the cluster I had to adjust the color artificially.

n3766-crop-60f-blue.jpg


This is very interesting. Congrats on the observatory. I'm sure it will make a big difference.

About the star color, I'd attribute what you are seeing as far as color loss to DSS. Although I can't afford it, many people get better results all around with PixInsight or other paid software.

I agree that the criteria for clusters are different from those for nebulae. Yes, I believe long integrations (maybe into days) bring out the best of dim objects with fine detail.

I've been doing lots of work on my kit: integrating ASCOM and AstroTortilla, and Baader modding my DSLR. I hope that when it's finished in a few weeks I can get results like yours.

I like dim DSOs, so AT will give me the capability of collecting data over many sessions for each target.


And now, 10 frames processed with HDR, as I usually do.

n3766-10f-wide.jpg

n3766-crop-10f.jpg

With HDR I "see" more stars and more colors. The edges of the stars are quite blurred, though. I processed the 10 frames as I normally do. Maybe with less stretch in the HDR I would get better-looking stars.

It seems that with fewer frames DSS didn't wash out the colors of the stars. I have always had the impression that DSS saturates the stars to full RGB values when there is a lot of signal, so all the stars become white.

I will try a lighter HDR with 10 frames and with 60 frames (the one above didn't use HDR). If they look better ... I will publish them here for comparison.


Well, here are the 60 frames processed. I didn't use HDR.

The result from DSS reduced the colors of the stars. See a screen capture from Canon ZoomBrowser of a single CR2 frame.

Therefore, for the crop (detail) version of the cluster I had to adjust the color artificially.

This is a common misconception on SGL; recently many new imagers have complained about DSS colour.

DSS doesn't reduce the colour. It doesn't apply a tone curve, so we have to do this ourselves in post.

You can adjust the tone curve and saturation in DSS but most imagers prefer to use other software like PS.

Any software like ZoomBrowser, DPP or Camera Raw will apply a tone curve, so the colour will no doubt look better than in DSS.
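A minimal sketch of the kind of tone curve that has to be applied in post, assuming the DSS output is a 32-bit float RGB stack scaled to [0, 1]: stretching the luminance and rescaling the channels together keeps the R:G:B ratios (the star colour), whereas stretching each channel separately pushes bright star cores towards white. The function name and gamma value are only illustrative.

```python
import numpy as np

def stretch_preserving_colour(stack, gamma=0.35):
    """Apply a simple gamma-style tone curve to the luminance only.

    stack: float RGB image, shape (h, w, 3), values in [0, 1] (assumed).
    """
    lum = stack.mean(axis=2, keepdims=True)      # crude luminance estimate
    safe_lum = np.clip(lum, 1e-6, 1.0)
    stretched = safe_lum ** gamma                # the tone curve itself
    out = stack * (stretched / safe_lum)         # rescale all channels together
    return np.clip(out, 0.0, 1.0)
```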


wxsatuser, is it possible that the stacking process pushes the RGB values to their maximum? How does the stacking program limit this saturation? I always have this impression with DSS. Am I wrong?


If you don't perform a stretch in order to extract faint detail, then the benefits of long exposure will not readily present themselves, and their drawbacks (including the need for better guiding, better seeing and fewer aircraft  :grin: ) will make them look unfavourable. Stars may be sharper in short exposures, but a stack of long ones allows you firstly to stretch, and then software-sharpen, faint details. The image processing process is, then, an extended one in which the merits of longer and shorter subs will favour different parts of the image. Where possible, for this reason, I make starfields out of shorter RGB-only subs but use longer LRGB subs for the faint fuzzies themselves.

This is not to criticise your post. Far from it, it's a good one, but I'm just inviting us to consider the whole process of processing.

Olly

I totally agree. Longer exposures are a prerequisite for extracting faint objects by selective stretching to raise them above the noise level. With a star field, shorter exposures are at times preferable. Many great DSO images that I have seen were processed by removing the star field first to allow heavy stretching of the faint targets.

I am no expert on any of this, but I would have thought the test would be better carried out using a mono sensor to remove the effect of the Bayer matrix. As far as DSLRs are concerned, all the so-called raw data is manipulated internally to reduce noise. This is a very interesting comparison nonetheless.

A.G
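A minimal sketch of the "different subs for different parts of the image" idea described above: keep the long-exposure stack everywhere except the burned-out star cores, and fill those from a short-exposure stack scaled to the same flux. The stack names, exposure times and clip level are assumptions for illustration, not anyone's actual workflow.

```python
import numpy as np

def blend_long_short(long_stack, short_stack, long_exp=240.0, short_exp=30.0,
                     clip_level=0.9):
    """Replace nearly saturated pixels of the long stack with scaled short-stack data.

    Both stacks are float arrays on a 0-1 scale, already registered to each other (assumed).
    """
    scaled_short = short_stack * (long_exp / short_exp)   # put both on the same flux scale
    saturated = long_stack >= clip_level                  # star cores clipped in the long stack
    out = long_stack.copy()
    out[saturated] = scaled_short[saturated]
    return out
```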


wxsatuser, is it possible that the stacking process pushes the RGB values to their maximum? How does the stacking program limit this saturation? I always have this impression with DSS. Am I wrong?

I suppose it could be possible.

What stacking method do you normally use, or do you vary the method of stacking?


I use DSS. I don't use white balance (neither from the camera nor auto), standard mode. I adjust the threshold to detect at least 55 stars. For lights I use kappa-sigma clipping (kappa 5.0, 3 iterations); for darks, flats, dark flats and bias I use median. Automatic alignment, and cosmetic detection of hot and cold pixels at 75%, 1 px.

For the other parameters I accept the suggestions that DSS shows via its recommended settings button.
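For reference, a rough sketch of what a kappa-sigma clipped average does with settings like those above (kappa = 5.0, 3 iterations): per pixel, values beyond kappa sigma from the mean are rejected and the rest are averaged, so the result stays within the range of the input subs rather than being pushed up to saturation. This only illustrates the principle, not DSS's actual implementation.

```python
import numpy as np

def kappa_sigma_stack(subs, kappa=5.0, iterations=3):
    """subs: float array of shape (n_frames, height, width)."""
    keep = np.ones(subs.shape, dtype=bool)
    for _ in range(iterations):
        kept = np.where(keep, subs, np.nan)
        mu = np.nanmean(kept, axis=0)                 # per-pixel mean of surviving values
        sigma = np.nanstd(kept, axis=0)               # per-pixel scatter
        keep &= np.abs(subs - mu) <= kappa * sigma    # reject outliers (satellites, hot pixels)
    return np.nanmean(np.where(keep, subs, np.nan), axis=0)
```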


I was thinking of doing the same test with a nebula. I took 66 frames of 240 seconds: 4 hours and 24 minutes. I am not happy with the result, because I compared it with another photo from 2013 of only 40 minutes, and I can see that my seeing here has become much worse over these two years.

The large photo of 2013 can be seen at: http://www.astrobin.com/41885/

The current large photo is at http://www.astrobin.com/186662/

Despite the 4 hours, the nebula was very faint. To enhance it I will need to add some hours with a UHC filter. I don't know if it is worth doing the comparison with shorter stacks, as I had planned.

To get what you can see at http://stargazerslounge.com/topic/246323-ngc-3324-gabriela-mistral-nebula/ I needed to stretch the signal very hard. Since I have published the photos there, I won't publish them here. Please take a look via the link.

So, the first law: if the seeing isn't good ... give up, go to bed and dream of Hubble images!

