
Blotchy or mottled background nebulosity. PixInsight.



On 29/10/2023 at 10:01, Ouroboros said:

Several selective applications of TGV Denoise applied using masks (to protect brighter areas).  

That's your main culprit. TGV denoise mainly removes small scale noise but can leave blotches, which is probably what we see here. As others have written, use noise reduction when you need it, after stretching. The best noise reduction, and the only way to reveal weak detail, is to have more data.

To get rid of larger scale colour noise, use MMT applied to chrominance with 7 or 8 layers activated, and a mask that protects the main target and partly even the background. (This is the same as mixing the treated image with the untreated image). Also, use curves stretch to control the stretch applied to the faint areas.
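In case it helps to see that "mix treated with untreated" idea written out, it is just a per-pixel linear blend through the mask; a minimal numpy sketch (not PixInsight code, and the array names are mine):

```python
import numpy as np

def masked_blend(original, denoised, mask):
    """Blend a noise-reduced image back into the original through a mask.

    Where mask is 1 the denoised pixel is used, where it is 0 the original
    is kept, and intermediate values mix the two linearly. Applying a
    process through a mask is equivalent to this kind of mix.
    """
    mask = np.clip(mask, 0.0, 1.0)
    return mask * denoised + (1.0 - mask) * original
```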

SCNR is the best way to remove a green cast. But I seldom use it at full strength, because it can leave a purple cast where the sky is supposed to be neutral. Try it at 0.4 - 0.6 strength.
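For what it's worth, average-neutral SCNR amounts to capping green at the mean of red and blue. A rough numpy sketch of running it at partial strength follows; this is my reading of what the amount setting does, not the actual PixInsight implementation:

```python
import numpy as np

def scnr_green(rgb, amount=0.5):
    """Average-neutral SCNR sketch for an (..., 3) RGB array in [0, 1]:
    cap green at the mean of red and blue, then blend the result with the
    original green channel by 'amount' (0 = no change, 1 = full strength)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    g_neutral = np.minimum(g, 0.5 * (r + b))           # remove the green excess
    g_out = amount * g_neutral + (1.0 - amount) * g    # partial-strength blend
    return np.stack([r, g_out, b], axis=-1)
```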


1 hour ago, wimvb said:

SCNR is the best way to remove a green cast. But I seldom use it at full strength, because it can leave a purple cast where the sky is supposed to be neutral. Try it at 0.4 - 0.6 strength.

Yes. I often use a mask with it too, to target the background for example.  Those residual green casts can be quite subtle, can't they?  It's not until you remove them that it becomes obvious they're there. 


I assert my membership of the barbarian horde!  Here is my effort (a bit quick and dirty - should have suppressed star haloes), and my own narrow fov image oriented the right way round to compare.  Both using PI's Foraxx palette.  Hope you like it.

As someone else said, the image is noisier (and a lot bigger!) than I am used to.  But I think your data is all real, just a little noisy, and that can be fixed.  My processing route on your data:

PI-> channel extract->BXT->linfit to G->starnet2 ->starless masters-> RGB combo ->export xisf

Graxpert-> gradient removal -> export fits32 

Affinity Photo -> import fits ->curves sequences -> inpaint (a few bad star haloes) ->Topaz Denoise (severe noise, 31,11) -> Topaz Denoise (standard noise,15,12) -> export tiff16

PI -> import tf16 -> channel extract -> foraxx utility -> export tif16

Affinity Photo -> import tf16 ->curves tweak -> import star mask (PI, RGB combo with stars, starnet2 star mask) + gaussian blur and level tweaks -> overlay star mask on nebula (screen)

Simon

NGC281_dc_RGB_xstars_combo_GXPT_fts32_cvs_inpnt_tpzSN_31_11_ST_13_12__tf16_FRX_tf16_cvs_smsk(gb_lvl).thumb.jpg.76e04e700b2295863eb73f3d25ddd41c.jpg

 

15E_Cave_Foraxx_cvs_lvl_cvs_smsk_tpzST(22)_frxstarmask_14Aclrblend_flpV.thumb.jpg.8b1117db2ebffe4ffe37b59db8d1597e.jpg



On 31/10/2023 at 08:37, Fegato said:

Ha ha - yes, the weather has been so bad - nice to have a go at processing something! My effort below...

Observations:

1. The integrated image was noisier than I'm used to dealing with, so I think, as has already been said, the best improvement can be made with more data. I applied only NoiseX to this, after the initial stretches, and quite strongly. I think the blotchiness is not really there, but I guess it's a bit "plastic" from the noise reduction.

2. I got tiling artifacts with StarX, so had to apply the "Large Overlap" option to prevent this.

3. I've probably not stretched as much as yours. This felt like a bit of a limit to me. I had to deal with some nice purple and green hues in the background dust, and used a range mask to push the saturation a little in the nebula and dampen it elsewhere.

Having said all that, processing this reminded me that I did find my own effort on the Cave a tricky one to process - I think maybe it's an area with quite a lot of obscuring dust that makes it difficult to pull out detail?

Edit: looking at that, I've perhaps not got enough contrast - a bit too dusty?

NGC281_dc stretch.jpg

 

I like the colour balance and the way you've brought out all the surrounding gas clouds, which fill the whole field.  Very nice stars too - do you have a short-form recipe for that?

 


33 minutes ago, windjammer said:

 

I like the colour balance and the way you've brought out all the surrounding gas clouds, which fill the whole field.  Very nice stars too - do you have a short-form recipe for that?

 

Thanks!  re: stars -

I use PI

I use BlurXTerminator after DBE and SPCC (just with Auto PSF; BX is a tool I should explore in more detail and experiment with a bit more!). RASA stars can be hard to control and often tend to flare a bit. As well as tightening them up, BX can help with this a bit.

I remove stars with StarXTerminator after an initial stretch with Histogram Transformation. This stretch will be just enough to get the stars about where I want them (generally I like to see plenty of stars, but not overly prominent - it depends on the image obviously).  I only ever use HT (although Curves would be OK) when stretching stars. I find other tools aimed at retaining star colour (MaskedStretch, ArcSinh etc.) give my stars a profile I just don't like, and I'd rather have a nicer star profile than bright colours. As with everything - all a matter of taste. 

I give the stars-only image a limited push on the colour saturation (using Curves), and then add them back in at the end of processing the non-star image.
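If it helps to picture what a limited saturation push does, one way to think of it is scaling each pixel's colour away from its own luminance; a small numpy sketch of that idea (not how CurvesTransformation is actually implemented):

```python
import numpy as np

def boost_saturation(rgb, amount=1.2):
    """Gentle saturation push on an (..., 3) RGB array in [0, 1]: scale each
    channel's deviation from the pixel luminance. amount = 1 leaves the image
    unchanged; values a little above 1 give a limited, hue-preserving push."""
    lum = rgb @ np.array([0.2126, 0.7152, 0.0722])           # Rec.709 luminance
    out = lum[..., None] + amount * (rgb - lum[..., None])   # push chroma only
    return np.clip(out, 0.0, 1.0)
```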

 


On 29/10/2023 at 09:01, Ouroboros said:

Any suggestions of possible solutions? 

So I have enjoyed working with your data. Yes, it's 3 hrs and yes, it's got noise, but in the UK getting 3 hrs of broadband is a result....


I initially used GraXpert in its new AI mode to remove gradients from the original linear image you posted, then brought that file into PI and used PixInsight with the Russ Croman XT suite and GHS for all the processing; the History Explorer is listed beside the images.

As per the suggestions above, I used the RangeMask operation to generate a mask that I used when applying the NoiseXT and LHE operations. The final mask is a colour mask for blue, as there was a blue cast across the bottom of the image. However, I did apply NoiseXT to the linear image after StarXT, as it was clear that the image has a lot of noise, and it seems that for this particular image applying NoiseXT before the stretch helps; I applied it at 0.4 strength.
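For anyone who hasn't built one, a luminance range mask is essentially a brightness-band selection with softened edges and a final blur; a rough Python stand-in for the idea (the parameter names and defaults here are illustrative, not RangeSelection's own):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def range_mask(lum, low=0.0, high=0.25, fuzz=0.05, smooth_sigma=8.0):
    """Build a soft mask selecting pixels whose luminance lies between
    'low' and 'high'. 'fuzz' ramps the selection edges instead of cutting
    them hard, and the final Gaussian blur makes transitions gradual."""
    if fuzz > 0:
        mask = np.clip((lum - low) / fuzz, 0.0, 1.0) * \
               np.clip((high - lum) / fuzz, 0.0, 1.0)
    else:
        mask = ((lum >= low) & (lum <= high)).astype(float)
    return gaussian_filter(mask, smooth_sigma)
```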

The final image is a PixelMath op_screen() of the starless and stars images.
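Assuming op_screen() is the usual screen blend, the combine reduces to a one-liner; a numpy sketch of the arithmetic:

```python
import numpy as np

def screen(starless, stars):
    """Screen blend of two [0, 1] images: result = 1 - (1 - a)(1 - b).
    Bright stars are added back without clipping the nebulosity beneath."""
    return 1.0 - (1.0 - np.asarray(starless)) * (1.0 - np.asarray(stars))
```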

This is by no means definitive; as above, there are many different ways to process the same data. This is my contribution on a wet and windy afternoon.

Bryan

NGC281_dc_GraXpert_XT_GHS_Curves_Starless_V2_0_0.thumb.jpg.f0750a5fb2b0960e75a8f7275f535c76.jpg       image.png.978870cbe5b1591a55fc48d0a1f50942.png
NGC281_dc_GraXpert_stars.thumb.jpg.d5462f6f03fb3779ce9150c436852174.jpg        image.png.441fb3100b73ecd5c772373a46ebab68.png 

 

Cave_and_Stars_Combined_1.thumb.jpg.4cf614f13df16f4a21a76760dfaaec7a.jpg

 


Hey!  Lots of responses. Thanks all. :) 

On 31/10/2023 at 18:11, wimvb said:

That's your main culprit. TGV denoise mainly removes small scale noise but can leave blotches, which is probably what we see here.

Yes, my experiments with TGVDN last night seem to confirm this.

9 hours ago, windjammer said:

One could tweak this around forever - a fascinating field of view.  And very good base data.

Yes. In fact I’ve played around with this image so much I’m having difficulty seeing it anymore.

Thanks for the breakdown of your workflow. I very much like your second attempt particularly, along with @Fegato’s, for the way you’ve brought up the background nebulosity. 
 

3 hours ago, bdlbug said:

So I have enjoyed working with your data, yes its 3hrs and yes its got noise, but in UK getting 3 hrs broadband is a result....

Indeed it is. My recollection is that I got three hours of good data and then the mist rolled in off the Atlantic.  Thanks for taking the trouble to provide your workflow; I shall study that.  You're right - there are any number of ways to skin a cat, er, process an image.  It's one reason I take so long to process an image. Three steps forward, two steps back. :) Regarding your comment about giving the data a bit of noise reduction at the linear stage: I think I have that stuck in my head as the recommended procedure from when I first started learning PI.  Warren Keller even has a section on it.  "Kill the noise before you stretch" seemed to be the idea ... but not to the extent that you kill the important details. Anyway, maybe that rule of thumb has been superseded.

I also like the way you've developed the background. You've lost the purple colour and made it look more misty. Nice. And you've brought up the Cave Neb nicely too. 


13 hours ago, Ouroboros said:

Yes. In fact I’ve played around with this image so much I’m having difficulty seeing it anymore.


Hi O

The only issue I have with your data is the star haloes - there are a lot of them, and they largely defeated me.  Load the two attached images (xstars and +stars) into the PI Blink process (0.3 s blink interval) and you will see that a lot of the circular nebula blobs are associated with the brighter stars.  The images are basically your data after PI's Starnet2 (RGB combos with and without stars) with an automatic Histogram/STF, so pretty much not my meddling!  Are the haloes a filter issue?  Might be worth investigating further.  It could be that StarXTerminator would do a better job than Starnet2 and get rid of them - unfortunately my SXT trial sub has expired, so I can't tell.

I take your point about not being able to see what one is looking at any more.  One reason why I like the PI Foraxx Palette utility script is that it is an algorithm: press the button and it does the job, with no user judgement required.  No sliders.  The amazing thing is that you can feed it quite differently colour-balanced images and it always produces more or less the same answer, which adds to confidence.

Simon

NGC281_dc_RGB_stars_combo_hgstf_small.thumb.jpg.b0cd5d3f40ab3c9c8ed410a4ed172ef1.jpg

 

NGC281_dc_RGB_xstars_combo_hgstf_small.thumb.jpg.95e355bfcd1e95e7c8172c46e0eef7ca.jpg


23 minutes ago, windjammer said:

The only issue I have with your data is the star haloes - there are a lot of them, and they largely defeated me.

I see what you mean. I’ve only recently started using StarXTerminator. Perhaps a bit slapdash of me, but I have so far naively assumed that, unless the residues left behind are really bad, since I’m going to put the two images back together they’ll be hidden again*.  Previously I used Starnet2 for a short while; I thought StarXT did a slightly better job on balance. I didn’t use filters with this image - it’s straight OSC. The haloes could be caused by faint mist I suppose. Perhaps I should reblink the original subs, weed out if necessary and reprocess.  

* I can see this doesn’t necessarily work because the starless image is probably going to get stretched more than the stars image.  


55 minutes ago, Ouroboros said:

The haloes could be caused by faint mist I suppose. Perhaps I should reblink the original subs, weed out if necessary and reprocess.  

I would recommend that you review other images of this region, as I would not really describe the areas around the bright stars as haloes: https://astrob.in/1yj0zd/B/

I think there are two things here. First, the diffuse areas around the bright stars are real: this is a region of space with large amounts of gas and dust, which the stars irradiate, the very definition of an emission nebula. Stars that are obscured by that gas and dust, or that lie behind it (as in light years behind), have their light diffused and scattered as it passes through this region on its way to us, which produces the diffuse glow around them as viewed from Earth.
Second, yes, you may also have some effect from atmospheric seeing and high humidity; you mentioned that mist rolled in and you had to delete 6 hrs of image subs, so that could also have contributed. But looking at other broadband images, these stars do have nebulosity around them.

To your point on StarXT - it is 'trained' to remove bright point objects (stars) and leave behind nebulosity, which it has done in this image. It is not able to tell whether the diffuse areas around bright objects are space-based nebulosity or an Earth-based atmospheric effect due to high cloud/high humidity.  In my view StarXT is performing its job 100% on this image.

 


I was interested and intrigued to see @bdlbug how you returned to generalised hyperbolic  stretch again at steps 18/19. I don’t expect you to recall why you did that here but is that something you do regularly?  I’ve come across GHS as a means to stretch in a more controlled way from linear to nonlinear than is achieved by using the STF + histogram transform method. But I haven’t seen it used in the later stages as you have here. 

IMG_1816.png.797afeec5cbe2865087da4380c475030.png

Edited by Ouroboros

2 hours ago, Ouroboros said:

how you returned to generalised hyperbolic  stretch again at steps 18/19. I don’t expect you to recall why you did that here but is that something you do regularly?

Ah, yes, GHS is an incredibly powerful tool; it does a lot. At the stage of processing you highlight, I use GHS to adjust colour saturation, make some localised contrast adjustments, and set the final black point.

Adam Block did a great free video on his YouTube channel about using GHS: Pixinsight : GHS Explained.

Adam also provided another free video called Pixinsight in Focus Stretch Academy.

I would recommend both, as they give a lot of in-depth teaching on what stretching does to an image, the tools used, and when they can be applied.

Bryan


OK. So this is my most recent effort after reviewing some of your comments.  I think I’m still struggling, but at least the blotchiness in my initial attempt has largely disappeared, and I’ve managed to pull up some of the background, not unlike your various examples.  I could go on fiddling with it, for example by attenuating the noise a tad more. Incidentally, I didn’t start applying noise reduction until towards the end.  Anyway, thanks for all your help everyone.

NGC281_jpg.thumb.jpeg.78369503197d5358496a8cc6b6839038.jpeg

Edited by Ouroboros

Yes, much better.  Still a lot of noise and grain when you zoom in.  I ran your latest version through Topaz : ) - which I think makes an improved version (ignore the star reduction; I should have extracted the stars before the denoise and put them back in, but you get the general idea...).

 

Ofinal_NGC281_tpz.thumb.jpg.541449f56c4599a40549e3fc153509ff.jpg


19 hours ago, Ouroboros said:

the STF + histogram transform method

The STF is much too aggressive for a permanent stretch. Its function is to show as much detail in a linear image as possible. It does this by bringing in the black point and white point and moving the midpoint to achieve maximum contrast. If an STF is applied permanently, you suddenly need to fix things that didn't need fixing before.
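For anyone curious what the auto-stretch is actually doing, the commonly cited recipe clips the shadows a few deviations below the median and then picks a midtones balance that lands the background at about 0.25; a rough numpy sketch follows (illustrative only, not the exact PixInsight routine):

```python
import numpy as np

def mtf(m, x):
    """Midtones transfer function: rational curve with mtf(m, m) = 0.5."""
    return (m - 1.0) * x / ((2.0 * m - 1.0) * x - m)

def autostretch(img, shadows_clip=-2.8, target_bg=0.25):
    """Sketch of an STF-style auto-stretch: move the black point up to
    median + shadows_clip * sigma (sigma estimated from the MAD), then
    choose a midtones balance that maps the clipped median to target_bg."""
    med = float(np.median(img))
    sigma = 1.4826 * float(np.median(np.abs(img - med)))            # robust spread
    c0 = float(np.clip(med + shadows_clip * sigma, 0.0, 1.0))       # new black point
    med_norm = float(np.clip((med - c0) / (1.0 - c0), 1e-6, 1.0))   # median after clipping
    m = mtf(target_bg, med_norm)                     # balance that hits the target
    x = np.clip((img - c0) / (1.0 - c0), 0.0, 1.0)   # apply the black point
    return mtf(m, x)                                 # apply the midtones stretch
```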

Edited by wimvb

22 hours ago, wimvb said:

The STF is much too aggressive for a permanent stretch.

It is too aggressive.  It took me a long time to realise this fact.  Unfortunately a lot of introductory videos use the STF/Histogram as the standard method to stretch to non-linear.  So you sort of assume that's the correct way. They should really carry a health warning.  For a while now I've been using GHS as my go to stretcher. 


14 hours ago, Ouroboros said:

Unfortunately a lot of introductory videos use the STF/Histogram as the standard method to stretch to non-linear.

That says a lot about the quality of those videos. I normally use arcsinh stretch for RGB and histogram stretch for L. I apply the latter in small increments.
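For context, a colour-preserving arcsinh stretch computes one asinh-based factor from the pixel luminance and scales all three channels by it, which is why star and nebula colour survives; a small numpy sketch of the idea (not the PixInsight ArcsinhStretch tool itself):

```python
import numpy as np

def arcsinh_stretch(rgb, stretch=50.0):
    """Colour-preserving arcsinh stretch of an (..., 3) RGB array in [0, 1]:
    stretch the per-pixel luminance with asinh and scale all channels by the
    same factor so the R:G:B ratio (hue) is retained. 'stretch' sets how hard
    the faint end is lifted; the value here is purely illustrative."""
    lum = np.maximum(rgb.mean(axis=-1, keepdims=True), 1e-6)   # avoid divide-by-zero
    factor = np.arcsinh(stretch * lum) / (np.arcsinh(stretch) * lum)
    return np.clip(rgb * factor, 0.0, 1.0)
```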


I started to have a play with these data but ran into a problem with ABE creating contour lines. I don't have the time to try to sort this out because a robotic shed issue is all-consuming at the moment.

However, I saw no reason for the background to become blotchy and I really saw no vast amount of noise to deal with. Is this because I may be stretching differently?  I've seen lots of stretching videos where the imager keeps on stretching and then pulling in the black point, over and over. This seems crazy to me. Once the darkest part of your image has reached the brightness value that you want it to have, why stretch it any further? Why not pin it in Curves and stretch only above that? Why pull your dark signal up above the noise floor? In the case of this image, there is dark, sooty nebulosity well below the brightness of the more transparent background sky so, once this sooty stuff was up around a Value of 17 in Photoshop, I pinned it and stretched only above it. It all looked pretty good to me.

I don't go for any of the bought-in, sliced-cheese stretches, which began with DDP years ago. I'll use a standard log stretch till the darkest signal is where it needs to be, then I'll only stretch above that.
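In code terms, pinning the black point and stretching only above it might look something like the sketch below (numpy, with an arbitrary gamma-style stretch standing in for whatever curve is applied above the pin):

```python
import numpy as np

def stretch_above_pin(img, pin=17 / 255, gamma=0.7):
    """Leave everything at or below 'pin' untouched and stretch only the
    range above it, so the darkest (sooty) signal is never lifted further.
    pin = 17/255 mirrors the Photoshop value of 17 mentioned above; gamma
    is just an illustrative stretch strength."""
    img = np.asarray(img, dtype=float)
    out = img.copy()
    above = img > pin
    t = (img[above] - pin) / (1.0 - pin)            # normalise the upper range
    out[above] = pin + (1.0 - pin) * t ** gamma     # stretch above the pin only
    return out
```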

Olly


I've been on a trip for the past few days, so a bit late to this party. Here's my version: PixInsight histogram stretch and curves stretch, with a pinch of MMT to get better definition in the nebula structures.

NGC281_dc.thumb.jpg.9b56cc765b1860d01d900c1b8b82e15f.jpg

Cropped just for balance

NGC281_dc2.thumb.jpg.2a0532f984743d535c646c834cadbb78.jpg

Edited by wimvb
