Stub Mandrel

NGC1333 Balrog Nebula & Dust Clouds


They said it couldn't be done!

I won't pretend this is perfect, but it's 90% of 106 two-minute subs on an astro-modded Canon 450D DSLR on an EQ3 mount at a suburban location, and the transparency last night was poorer than the previous evening. NGC1333 and its dust clouds are NOT supposed to be the sort of target you can get with this kit.

I know it isn't perfect and doesn't stand up to images with double-figure hours of narrowband subs, but it does have the dusty bits in the right places when compared to Olly's recent image. What might this image look like with 20 hours of subs at a dark-sky site?

ngc1333.png


Knew you could do it :) and it's the sort of challenge that's going to have you addicted and adding more subs every clear night. 


I guess my name has some sticking power: "The Balrog Nebula". Nice work for such short exposures. GET MORE!

 

Rodd


Great image. You are certainly stretching the limits of this setup/mount. If your setup can handle it, try longer exposures. At the moment, you are close to stretching the noise.

If you have PixInsight, you can try chroma noise reduction using MultiscaleMedianTransform with a lightness mask, protecting the stars and the reflection nebula. This will kill the colour mottle in the dust, allowing you to stretch more. Just an idea.

Thanks for sharing.

10 minutes ago, ollypenrice said:

Well done - but I haven't done this recently. Someone else, I suspect?

Olly

Someone did it!

I'm having a devil of a struggle to get a better result - every re-run shows less detail!

Just now, Stub Mandrel said:

Someone did it!

I'm having a devil of a struggle to get a better result - every re-run shows less detail!

Was it Rodd?

Olly

6 minutes ago, ollypenrice said:

Was it Rodd?

Olly

That's who I got the name from, but I think someone else did it as well... maybe....


Methinks Olly is one of the culprits

But Rodd was in on it :icon_biggrin:


BTW, Neil, colour mottle (chromatic noise, whatever you want to call it) is definitely your biggest enemy here. This is rather large-scale noise that sits mainly in the colour data (there is not much intensity variation).

As I wrote before, if you have PixInsight, you can kill it with the following procedure before stretching:

1. Create a luminance mask by extracting the Luminance channel from the image, and stretch it with a histogram stretch.

2. Apply the luminance as a mask and invert it (to protect the stars and the bright nebula).

3. Open MultiscaleMedianTransform and set the target to Chrominance (restore CIE Y).

MMT_Chroma.png
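For anyone without PixInsight, the principle behind the steps above can be sketched in a few lines of NumPy. This is only an illustration of the idea (separate luminance from chroma, median-smooth the chroma, and blend the smoothed chroma back in under an inverted luminance mask), not PixInsight's actual MMT implementation; the function name, the plain sliding-window median, and the mask threshold are all made up for the example.

```python
import numpy as np

def chroma_median_smooth(rgb, size=5, mask_threshold=0.5):
    """Median-smooth only the chroma of an RGB image, leaving
    luminance untouched and protecting bright regions (stars,
    bright nebula) with an inverted luminance mask.

    rgb: float array of shape (H, W, 3), values in [0, 1].
    """
    # Luminance (Rec.709 weights); this is the channel we preserve
    # and also the basis of the protection mask.
    lum = 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]

    # Chroma = colour signal relative to luminance.
    chroma = rgb - lum[..., None]

    # Sliding-window median filter on each chroma channel.
    pad = size // 2
    padded = np.pad(chroma, ((pad, pad), (pad, pad), (0, 0)), mode="edge")
    windows = np.lib.stride_tricks.sliding_window_view(
        padded, (size, size), axis=(0, 1))
    smoothed = np.median(windows, axis=(-2, -1))

    # Inverted luminance mask: ~1 in the faint dust (full smoothing),
    # falling to 0 on bright stars and nebula (no smoothing).
    protect = np.clip(1.0 - lum / mask_threshold, 0.0, 1.0)[..., None]

    # Blend smoothed chroma in where the mask allows, then restore
    # the untouched luminance.
    out = lum[..., None] + protect * smoothed + (1.0 - protect) * chroma
    return np.clip(out, 0.0, 1.0)
```

The key design point, same as in the PixInsight recipe, is that the noise reduction never touches the lightness data, so star shapes and nebula detail survive while the large-scale colour mottle in the dust is flattened.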

Hope this helps,


Thanks Wim, I'm afraid PI is a bit far down my to-get list, but I will do a reprocess using Noel's Colour Blotch Reduction, which seems to do the same sort of thing, and rather well.


That will probably do the trick as well.

Here's a quick before and after on a crop of your image (which isn't a good example of the capabilities of the process, since your image is an 8-bit PNG and already stretched).

With more attention to mask generation, the colour loss in the red and in the stars near the bright nebula would have been smaller. But it shows the general idea.

ngc1333_Before.png

ngc1333_After.png


