


Posts posted by ngc1535

  1. 3 hours ago, Richard_ said:

    It's been a while since I've posted an image, only because it's taken so long to get enough subs to process :)

    I couldn't quite fit the Rosette nebula into my field of view, so rather than swap from my FLT120 to my Redcat51, I decided it would be a good time to try out a mosaic. This image was a simple 1x2 mosaic captured with Antlia 4.5nm SHO narrowband filters and my QHY268M monochrome camera. There's an almost 50/50 split between panels for total imaging time:

    • Panel 1 = 8h50m
    • Panel 2 = 8h25m

The OIII data was quite noisy, so it really helped to wait through a month of rain and cloud until I could dedicate an entire night to capturing more. It was well worth the wait, as the final OIII data looked much better and the noise between panels was more balanced.

I used the "PhotometricMosaic" script from John Murphy to assemble each monochrome panel, which resulted in three master lights (S, H and O). I then created the SHO image using the "NB Colour Map" script within PixInsight, with guidance from Adam Block's YouTube channel. Hue and intensity values were assigned to each monochrome channel and a histogram transformation applied to control the amount of blend between the SHO channels. After running colour calibration, the resulting image can be considered somewhat colour accurate (well, more so than traditional narrowband palettes!), which is great as you can see more natural red (Hydrogen) and blue (Oxygen) hues in the Rosette nebula. Sulphur is shown as light orange, which can be seen on the tips of the Hydrogen gas but is more visible towards the centre of the image around NGC 2244 and NGC 2246.

I may have another play around with the process, but otherwise I'm really pleased with how this project has come out! Comments and constructive criticism are welcome :) Image acquisition details are on my AstroBin per the link below:

    https://www.astrobin.com/z36geu/

     

     

    Beautiful work on this image!

    -adam

     

     

    • Thanks 1
  2. On 15/01/2024 at 05:55, Ouroboros said:

     

Even so, the star de-emphasising script appears a bit clunky from my current position of ignorance. I am a bit surprised that in these days of AI-driven processes someone hasn't come up with a script or process to make this a simple one-step tool, i.e. leave the top 10, 20, 30% of stars (user-set) untouched and de-emphasise the rest.

    I appreciate you guys mentioning my method for star de-emphasis. Yes, I came up with it back in 2018... well before BXT. 

    In my most recent video (NGC 1491, Fundamentals) I leverage BXT to do this job. If you understand my original methodology... you can probably figure out what I did. However, being a member of my site will save you from guessing. :)

    -adam

     

    • Like 1
  3. 1 minute ago, tomato said:

    My experience to date is I see more noticeable improvements on 'deep' data sets, maybe the AI has more signal to work with? Here are NGC 7331 crops (20 hrs of OSC data), one with SPCC and BlurXterminator (default settings).

[Attached image: Image12PMCC_crop.jpg]

[Attached image: Image05_BXT_crop.jpg]

     

    Wow, you better put the coffee on!😄

Regarding the information to work on (Tomato) - this is not a BXT-specific or special requirement... it has always been true for deconvolution of any kind.

    • Like 2
  4. 11 hours ago, Lee_P said:

    @ngc1535 I'd love to get your thoughts on why I'm not seeing much improvement using BlurX on the nebulosity in these stacks -- is it because I need a higher SNR, as was suggested earlier in this thread? Thanks in advance if you're able to help!

    Correct. You do not have the S/N (in most areas) necessary to enjoy the benefits of deconvolution (of any type). 
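As a rough illustration of what "having the S/N" means, here is a minimal numpy sketch that estimates per-pixel signal-to-noise against the sky background (purely illustrative - the function name is made up and this is not how BXT measures anything):

```python
import numpy as np

def snr_map(img):
    """Rough per-pixel S/N against the sky background.

    Uses the frame median as the background level and the MAD
    (median absolute deviation) as a robust noise estimate.
    """
    bg = np.median(img)
    noise = 1.4826 * np.median(np.abs(img - bg))  # MAD -> Gaussian sigma
    return (img - bg) / noise
```

Regions where this map hovers near zero carry mostly noise, and deconvolution of any kind has little real signal there to sharpen.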

    -adam

    • Like 1
    • Thanks 1
  5. On 18/12/2022 at 09:48, wimvb said:

    I think a 14" BXT processed image should beat a 6" APO BXT image, because what BXT does is take care of blurring by atmosphere, guiding, and focus issues. All that is left after that are telescope optics (read: diffraction limit or spot size), imaging scale, and SNR in the unprocessed masters (BXT is done before any magical noise reduction).

    I don't think that a direct shoot out is fully valid. Differences in images of the same target are at least as much due to differences in processing style and processing decisions as they are to differences in equipment and data acquisition.

    Case in point: Adam Block recently published an image of M51, processed with "xXT".

    https://www.astrobin.com/3ehcpo/?nc=all

    The addition of Ha, and different processing, makes this a completely different image from the one Olly linked to earlier in this discussion.

Just to clarify... the latest image I published on AstroBin (linked above) is my own effort. It is not the same as the M51 data in the video, which is from a friend - I used his data as an example in my tutorials, and it was taken with a 16-inch telescope. My latest version is with a telescope twice that size. I do think my latest version is pretty good, since it leverages many of today's tools and techniques. The original image I produced from this data is from 2011.

    -adam

    • Like 3
  6. On 16/12/2022 at 08:27, gorann said:

    I saw the whole video, but maybe I got it wrong. However, near the end he demonstrates the artifact that it can produce, exemplified by the Crab nebula. What worries me then is that if you apply it early on and then much later realize that it has produced artifacts, it will be a bit frustrating. It suggests that you need to do two processes, one with and one without BXT.

    Goran,

To clear up any doubt: in the video I operate on linear images that were created after RGB combination (having been DBE'd and SPCC-corrected).

Concerning the artifacts - all deconvolution algorithms produce them. The extra work you suggest is something you should have been doing all along if you use deconvolution of any sort, so this is nothing new and no more frustrating than before. There is a good argument that it is less frustrating: just compare traditional deconvolution to BXT. BXT with modest settings minimizes a number of artifacts that traditional deconvolution produces.

     

    -adam

    • Like 2
    • Thanks 1
It might be because WBPP is now using Average Sigma Clipping for a frame count of 10... did you set the sigma thresholds for this? The defaults will not reject much, if anything... which is likely why your artefacts are showing up.

    If you want to experiment, lower the Sigma Low to 3 (or maybe even 2.5) from a default of 4 and see what happens. 

    (most people do not use Average Sigma Clipping... )

I will send a note to Roberto... perhaps it should use regular Sigma Clipping, in my opinion.
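To make the effect of those sigma thresholds concrete, here is a minimal numpy sketch of iterative sigma-clipped average stacking (a stand-in for the idea only, not PixInsight's ImageIntegration code):

```python
import numpy as np

def sigma_clip_average(stack, sigma_low=3.0, sigma_high=3.0, iters=2):
    """Average registered frames, rejecting pixels outside mean +/- k*sigma.

    stack: array of shape (n_frames, height, width).
    """
    stack = np.asarray(stack, dtype=float)
    keep = np.ones(stack.shape, dtype=bool)
    for _ in range(iters):
        kept = np.where(keep, stack, np.nan)
        mu = np.nanmean(kept, axis=0)   # per-pixel mean over the frames
        sd = np.nanstd(kept, axis=0)    # per-pixel dispersion
        keep &= (stack >= mu - sigma_low * sd) & (stack <= mu + sigma_high * sd)
    return np.nanmean(np.where(keep, stack, np.nan), axis=0)
```

With loose thresholds almost nothing gets rejected, which is exactly why trails and outlier pixels survive into the integration; tightening the thresholds rejects more outliers at the cost of discarding a little good data.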

     

    -adam

    • Like 1
  8. 9 hours ago, Northernlight said:

    Hi Adam,

Just slowly working through your videos when time permits and the kids aren't constantly harassing me. I ordered a new high-quality Newt today, which will be ready in around 10 weeks - so it's motivation to keep working through the tutorials to get my PI skills in order.

For me, I've always found the automatic sigma method works very well.

     

I agree. I use Cosmetic Correction (auto) as an incremental step. Anything it does not take care of is likely to be handled during rejection in ImageIntegration.

    -adam

  9. 9 hours ago, Northernlight said:

No, I think that Adam is saying we do the usual calibration, then cosmetic correction can be performed in a number of ways:

1) You can use a master dark with Cosmetic Correction as one method of removing the hot pixels.

2) The other method is not to use a dark and just use the auto-detection option for removing the hot pixels, which is what Adam does in his Cosmetic Correction videos as far as I remember from looking at it last week; that allows you to use a single process icon for BPP/WBPP.

     

    Cheers,

    Rich.

It is true that once you have calibrated data in hand, you can apply Cosmetic Correction. There are THREE different methods of using Cosmetic Correction. I use the Sigma method, which is relatively automatic and quite robust. You can be particular about it and determine hot pixels somewhat empirically by measuring one of the dark frames. (All of this is explained in my lessons. :) ) I don't like method 1) above as much because it doesn't work in every case.
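For intuition, here is a minimal numpy sketch of a sigma-based hot-pixel filter in the same spirit (purely illustrative - the function names are made up, and CosmeticCorrection itself is more sophisticated):

```python
import numpy as np

def local_median3(img):
    # median of each pixel's 3x3 neighbourhood (edges padded by replication)
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    shifts = [p[i:i + h, j:j + w] for i in range(3) for j in range(3)]
    return np.median(np.stack(shifts), axis=0)

def hot_pixel_filter(img, k=5.0):
    # flag pixels that stand k robust sigmas above their neighbourhood median
    med = local_median3(img)
    resid = img - med
    noise = 1.4826 * np.median(np.abs(resid))  # robust sigma via the MAD
    out = img.copy()
    hot = resid > k * noise
    out[hot] = med[hot]
    return out
```

Because the threshold is expressed in sigmas of the image's own residual noise, the method adapts to each frame - which is why it feels largely automatic in practice.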

    Thanks!

    -adam

  10. 10 hours ago, MarkAR said:

    Hi Adam, if I understand you correctly then Cosmetic Calibration is applied to the calibrated frames (if necessary) before they are integrated.

    Also wish to thank you for all the PI videos, they are very helpful.

    Yes! 

This is something of a pitfall of "automation" through scripts - it isn't always obvious what the order of operations is.

    Great job on working through the videos and I am very glad they are helpful!

     

    Sincerely,

    Adam Block

    • Thanks 1
Hi all,

     

There may be some confusion here. Cosmetic Correction is a kernel filter that is run on data after they are calibrated with biases, darks, and flats. Cosmetic Correction (basically a hot pixel filter, though it can also be used for column defects) is not connected to darks in the sense of calibration, scaling, etc. I do demonstrate that it is possible to scale darks - which has some benefits for maintaining a dark library for a cooled CCD camera - but it comes with some pitfalls involving hot pixels and occasional scaling errors through dark frame optimization. It is almost always better to subtract a dark of the same duration if you have a high-quality master in hand.
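The order of operations described above can be sketched as follows (a hypothetical helper, assuming a same-duration master dark, which already contains the bias signal, so no scaling or separate bias subtraction appears):

```python
import numpy as np

def calibrate_frame(light, master_dark, master_flat):
    """Basic calibration that precedes Cosmetic Correction:
    subtract the master dark, then divide by the normalized flat."""
    cal = light - master_dark             # removes dark current + bias
    flat_norm = master_flat / np.mean(master_flat)
    return cal / flat_norm                # removes vignetting / dust shadows
```

Cosmetic Correction then runs on the output of this step, before registration and integration.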

There is a subtle connection between dark frames and Cosmetic Correction, in that one of the methods of using Cosmetic Correction is to use a dark frame to generate a hot pixel map. But again, this isn't related to the calibration process.

    Feel free to connect with me through my website at AdamBlockStudios.com if this particular issue is confusing. 

    Thanks,

    Adam

     

    • Like 1
    • Thanks 1
  12. Alan,

     

Great job on the use of the technique. I appreciate you taking it to heart. As I mentioned in my tutorials on this - even if you do not like the outcomes of combinatorial methods of processing an image (I used this "trick" as part of a larger workflow to de-emphasize stars), sometimes the individual small methods are useful in and of themselves.

    -Adam

  13. 14 hours ago, alan4908 said:

    Hi Vlaiv

    Yes, the colour is definitely interesting. 

Here's the image from Adam Block taken through a 20-inch RC at the top of a mountain. I think he won an APOD for this, so I'm assuming that the colours are accurate.

I've used Registar to align it with my own attempt and marked the location of the green blob. As you can see, very green. :hello:

     

     

I would not assume that just because an image is published as a NASA APOD the colors are "accurate." The astronomers (Nemiroff and Bonnell) do not check for this, though they are as knowledgeable as anyone in our active community at spotting the weird stuff. I do not believe this image was published as an APOD... but I could be wrong. A *better* assumption is that I took great care in the fidelity of the details and color for the images I publish. 😁

    Indeed, small green blobs that look like HII regions are, though uncommon, not really rare. It is all too easy to "remove green" blindly (or nowadays SCNR it out) and miss out on some interesting astrophysical things! This is where examining the data closely and letting it be is a good skill (one you have shown with your image!).  

     

    Another good example of these green blob things is this image:

    http://www.caelumobservatory.com/gallery/n6240.shtml

It happens to be in this galaxy that I discovered my own supernova... but it was not related to these blobs!

     

    -Adam Block

    • Like 1