Everything posted by Xiga

  1. Wow, wasn't expecting this now! That's cheered me rightly up, on what has otherwise been a horrid week of work. Massive thanks to IKI, FLO, and everyone who entered. With not being able to do any of my own imaging at the moment, these data sets have been an absolute lifesaver for keeping the processing skills sharpened, so thanks again for sharing! 🙏
  2. That's a very nice M81 indeed! I especially like the colours, they look absolutely spot on. Impressive work for just ~90 mins of Lum. Did you create a Super Lum from all 4.4 hrs of subs?
  3. I totally missed this! It's a bit late obviously, but FWIW I would have also chosen the Double Cluster (it's just stunning), with the Coathanger a very close second. PS - your image of the Horsehead Nebula will take some beating imho
  4. Thanks Roy! I've already got a few prints up on the wall; I think this one might be a step too far though lol. Thanks Adam. I got an email tonight with a tracking number, so fingers crossed I'm nearing the end of it now. Thanks Richard! 😀
  5. All of us unfortunate enough to suffer the usual dreadful UK weather know just how bad it has been for months on end now. That, coupled with my ongoing mount issues (as in, I currently don't have one! - long story, which I won't go into) means I have had to find other ways to keep the hobby going, and what better way than to make use of old data! 😀
Last Autumn I did a mosaic of Cygnus in NB, and at the time I tried, unsuccessfully, to make use of some old higher-res data from the trusty old SW 80ED and Nikon D5300. Well, not being the type who likes to give up easily (and sure, what else was there to do), I decided to revisit this and see if I could get it to work. The plan was not to re-process the original image (waaaaay too much work to even contemplate) but rather just to try and incorporate the higher-res Ha data as Luminance in certain areas.
Thankfully, I got it to work in the end, but it certainly wasn't easy! Normally, registering images from different optical systems isn't an issue, and APP has no problem with this. The problem was that I was trying to register a single image taken with a telescope against an already-created large 3x3 mosaic taken with a camera lens, so the two images differed massively not just in image scale but also in overall size, and APP just kept failing to register them. So in the end, I had to register the high-res images with the appropriate single panel of the mosaic. After this, APP was then able to register this newly registered image with the overall mosaic (see the sketch below).
The original image had about 27 hrs of exposure with an Atik 383L+ and two cheap vintage lenses, with the bulk of the work (i.e. the Ha) being done at 135mm. The new (i.e. old!) data now being used is from a SW 80ED and Nikon D5300 with a 2" Baader Ha filter. I was able to add about 7 hrs of Ha data to the NAN, and even after it was scaled back to match the image scale of the 135mm lens, I think it has still made a noticeable difference. I also added 100 mins of Ha data to the Pelican Nebula, taken with a QHY9m - thanks Adam! I was also able to make use of my previous Veil Nebula mosaic data - 10 hrs of Ha and 7 hrs of Oiii - which was used as Ha with the Oiii added in Blend Mode Lighten. Finally, I trawled out some 7 hrs worth of Ha data on the Crescent and added that too, although it made very little difference, as the Crescent is just too small at this image scale.
All in all, a worthwhile use of a couple of nights in front of the computer, I think! 🤪 New image is first, with the original one second, and a couple of gifs added to show the improvements. Hoping to be back in the imaging game before long. CS all!
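For anyone curious, the two-stage registration trick looks something like this in Python, using astroalign as a stand-in for APP's star-based registration (file names are made up, and APP's actual algorithm will differ):

```python
import astroalign as aa
from astropy.io import fits

# Stage 1: register the high-res Ha frame to the single mosaic panel
# that covers the same patch of sky (similar fields, so matching succeeds).
ha_highres = fits.getdata("ha_80ed.fits").astype("float64")
panel = fits.getdata("mosaic_panel_nan.fits").astype("float64")
ha_on_panel, _ = aa.register(ha_highres, panel)

# Stage 2: the result now shares the panel's scale and orientation,
# so registering it against the full 3x3 mosaic becomes tractable.
mosaic = fits.getdata("mosaic_full.fits").astype("float64")
ha_on_mosaic, _ = aa.register(ha_on_panel, mosaic)
```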
  6. Wow, what an image. I think I can safely say that's the best starless image I've ever come across. Those Bok globules at middle left look amazing. I could honestly look at this for hours, and the colour palette is gorgeous as well. Congrats on the IOTD, well deserved! Oh, and welcome to SGL too! 😀
  7. M33 was the 2nd ever DSO I imaged (a meagre 2 hrs of data with a DSLR). With such a low surface brightness, I can remember just how hard a target it is to process. Having this much quality data to work with is a joy, and a world apart from what I had to work with 4 years ago!
I worked on this, here and there, over the course of a few nights, so I'll try and summarise the main steps I took as best I can, although with it mostly being in PS, it's impossible to replicate exactly. I used APP to create the RGB image, correct any gradient and do star colour calibration. For the Lum, I bucked the trend, it seems, and didn't actually use the real Lum channel at all. I found it to be a bit too overpowering, and the stars were a bit on the big side, so instead I created 2 stacks: a synthetic Lum using the RGB channels, and another using the synthetic Lum and the Ha. Then I used a blend of 33/67 (or 67/33, I can't quite remember!) of the two to create the final Base Lum (sketched below). This had the advantage of much smaller stars, with all the lovely Ha detail showing through too (the Ha data was truly a thing of beauty!).
I stretched the final Lum in APP, but took the colour into PS to stretch manually. For this, I took a new approach for me. I delved into my copy of 'Lessons from the Masters' (which I've had for a couple of years now, but realised there were sections I had overlooked) and used Adam Block's method of stretching colour - first use Levels to bring in the white point, then do several iterations of the Shadows and Highlights tool (Shadows only). I figured if it's good enough for him, it'll do for me! I then added the Ha again in Blend Mode Lighten (to the Reds only) to bring them out more (it seemed a shame not to!). Then it was just the usual array of many, many small adjustments, too many to list. No star reduction was needed either, due to not using the Lum channel.
Thanks for sharing. Although I'm not sure how I'm going to be able to go back to my own crummy data after this lol 😋
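To make that Lum recipe concrete, here's roughly what it amounts to in numpy (the equal weighting inside the SynLum+Ha stack is my assumption, and as I said, the 33/67 split might have been the other way around):

```python
import numpy as np

# r, g, b, ha: registered linear stacks, normalised to the 0-1 range
syn_lum = (r + g + b) / 3.0           # synthetic Lum from the RGB channels
syn_lum_ha = 0.5 * (syn_lum + ha)     # assumed equal-weight combine of SynLum + Ha
base_lum = 0.33 * syn_lum + 0.67 * syn_lum_ha   # the 33/67 blend (or 67/33!)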
  8. Beautiful, Richard. I'm so used to seeing this in pure NB that I forget how good it looks when broadband data is included too. Interesting, too, how the RHS seems to be more densely populated with stars than the LHS. I've never noticed that before.
  9. Seriously good M51 Peter! Amazing to think that image wasn't guided! 😲 Quick question - most LRGB imagers seem to go for parity between their L and RGB, but in this case you have gone for quite a drastic reversal, a 4:1 ratio in favour of RGB. Any particular reason why? Is it because dark skies favour RGB over LRGB for quality, rather than speed? I know that creating a synthetic Luminance from RGB rather than using a true Luminance can be beneficial in terms of smaller stars and perhaps slightly sharper details (at the expense of faint detail). Is that anything to do with it?
  10. Amazing how sharp that looks at full size. Given what I'm seeing, I'd say you probably don't need to bin or downscale; the resolution is unreal. It doesn't have that soft look that can come from oversampling. Bring on galaxy season! 😀
  11. That looks amazing Adam! Especially for just 3 hrs. It looks like you have a killer galaxy setup there. Can I ask - judging by the image dimensions, it looks like you have downscaled or binned, which I think is smart, as otherwise the image scale would be too high at ~0.62". Which did you do - capture at Bin x2, or capture at full res and just downscale afterwards? I suspect just downscaling will be no different.
  12. Very nice Adrian. I really like the subtle colour palette in this. PS - to avoid the stars looking like discs when adding them back in, the key is to use 'Screen' Blend Mode and not 'Lighten' (the maths behind the difference is sketched below). I recently put up a guide on how to do this, see below:
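For anyone wondering why Screen behaves so differently to Lighten, the two blend modes boil down to very simple maths (a sketch assuming 0-1 float images):

```python
import numpy as np

def screen(base, top):
    # Screen: 1 - (1-a)(1-b). It always brightens, so a star's faint
    # outer profile fades smoothly into the nebulosity underneath it.
    return 1.0 - (1.0 - base) * (1.0 - top)

def lighten(base, top):
    # Lighten: per-pixel max. Wherever the star layer drops below the
    # nebulosity it contributes nothing at all, which is what produces
    # the dark rings/discs around stars on bright extended backgrounds.
    return np.maximum(base, top)
```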
  13. Thanks David! That star (Gamma Cas) was a real challenge wasn't it?! I'll try and show below how I went about adding in the 'lesser-stretched Ha stars', because this stage alone made a huge difference. From then on, it was just a case of making smaller adjustments. Unfortunately this will all be done in PS (I know you use PI) but pretty much anything that can be done in one program can be done in another (once you work out how), so hopefully it will still be of some use to you. I'll try and keep the waffling to a minimum, so here goes:
1. Save 2 versions of the Ha stack using your favourite astro program. I use APP myself, so I do a DDP stretch and save it as a 16-bit tiff. For the 2nd one, I save a linear unstretched version, also in 16-bit tiff format. Obviously the stars are way too big in the stretched version, but we can still use it for all the nebulosity. Now bring the stretched version into PS and convert it to RGB (even though it's mono), as this lets us view the pixel brightness in the usual 0-255 range. Now take a note of the Mean readout on the Histogram panel (for later). DDP:
2. Now run Starnet++ on it, clean up any remaining star artefacts, and then denoise it. These days, I'm liking Topaz Denoise AI, but you can use whatever you fancy. Starless Denoised:
3. Now put the starless layer on top of the linear version in PS, and set the starless layer to Blend Mode Screen (note, normally I would put the stars on top in Screen mode, but I had to do it the other way around in this image due to Gamma Cas, which needed a mask). Also, when adding stars to a starless image, don't be tempted to use 'Lighten' mode; it doesn't work very well on extended nebulae, as you will usually end up with dark rings around a lot of stars (Lighten does work quite well on galaxy images though). Now select the linear layer and start to manually stretch it, stopping when the stars are coming through at just the right size. I usually do about 4 stretches (I don't like the stars too small) and I like to use a mixture of both Arcsinh and regular stretches for this, as it keeps the cores down while still retaining their natural profile. Once done, select the starless layer and add a feathered layer mask to exclude Gamma Cas from the starless layer. Now Flatten (or do a Stamp Visible). Lightly stretched stars: Gamma Cas Mask 1: Results after Flattening:
4. Now the stars should look good, but everything is obviously way too bright, so we need to do a downwards Curve adjustment. For this, I place a point as high as possible in Curves, which is at point 251, and then use the down arrow to lower it until it gets close to the Mean level from the earlier DDP image. I usually stop on the high side, then place one further point mid-way and lower that slowly until it's closer. This helps to preserve the very faintest of stars. We should now have an image that is very close to the DDP image from earlier, but with greatly reduced stars. Downward Curve:
5. Even though Gamma Cas is already looking a lot better, it still needed more work. This is the part of the processing I'm struggling to remember exactly, but I think I used an even lesser-stretched version again and blended that in. I have the mask that I used (you can see it below), but I think I also blurred the outer halo a bit, and used the Shadows & Highlights tool as well. Gamma Cas Mask 2:
6. Some minor star bloating tweaks to a couple of smaller stars, and that was the Luminosity layer done.
After I merged it with the tonemapped colour layer, I then did a further bit of Shadows & Highlights to bring down the main nebulosity, but also on Gamma Cas again, to help tame it a bit more. Final Lum: Here's a screengrab of what the layers looked like in PS: And finally, a gif showing the before (original DDP stretch) and after. Sorry, I ended up waffling on more than I intended to lol. I'm sure there are other/better ways to go about this, but this is how I do it at least 🤪 Hope others find it useful too, and if anyone has any ideas for improvements I'd love to hear them! 😀
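If anyone wants to experiment with the star-stretching part of steps 3 and 4 outside of PS, the core of it boils down to something like this (a rough numpy sketch; the stretch factor, target mean and gamma stand-in for the Curves step are my assumptions, not exact values):

```python
import numpy as np

def arcsinh_stretch(img, factor=100.0):
    # arcsinh lifts faint signal hard but compresses star cores,
    # which keeps the stars small while retaining their natural profile
    return np.arcsinh(factor * img) / np.arcsinh(factor)

# stars_linear: the unstretched Ha stack; starless: the DDP-stretched,
# starless, denoised layer - both as 0-1 float arrays
small_stars = arcsinh_stretch(stars_linear)               # stop when star size looks right
combined = 1.0 - (1.0 - starless) * (1.0 - small_stars)   # Blend Mode Screen

# step 4's downward Curve pulls the brightness back towards the Mean
# noted from the DDP image; a crude stand-in is a gamma adjustment
# that maps the current mean onto that target:
target_mean = 0.35   # hypothetical Mean read off the PS Histogram panel (0-1)
combined = combined ** (np.log(target_mean) / np.log(combined.mean() + 1e-9))
```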
  14. Thanks everyone for the kind words. I feel truly honoured! ☺️ I have to say, it came as a pleasant surprise when I found out last night. It's great seeing so many different takes on the same data. When I see what other people are able to do with their versions, it certainly gives me impetus to try out new things myself, so I find it a great motivation for trying to improve one's skills. My mount has been away for servicing for over 13 weeks now 🤨 so it's been nice to have some data to play with during that time, especially data as nice as this, so thanks again Grant for providing us with it. And a big thanks to FLO too! 😀
  15. The part about throwing away information - what I meant to say was, wouldn't binning x2 early in the workflow (i.e. right after calibration) have a detrimental effect on star analysis and registration? Surely it would be better to keep the full resolution for this part of the workflow, and then reduce the resolution afterwards? So it looks like, for any APP users out there with oversampled CMOS data, it's a simple operation to effectively bin the data by just changing the scale factor at the integration stage, which is good news 😀
  16. A pretty barren year for me, with just 5 images to show for it.
Cygnus SHO Mosaic - 27 hrs, Atik 383L+, Tamron Adaptall II 135mm F2.8 shot at F4.5, Zuiko 50mm F1.8 shot at F4:
Double Cluster - ED80, Atik 383L+ Lum (60 mins), Nikon D5300 colour (60 mins):
Pelican Nebula - 100 mins each of SHO, ED80, QHY9m (thanks Adam!):
M106 - ED80, just under 7 hrs with a QHY163C:
Comet Neowise - about 40 mins with a Nikon D5300 and a very dewy ED80!
  17. Vlaiv, I went ahead and did a few test stacks using the only mono data I have, which was shot at 2.13", so more under-sampled than over-sampled, but it should hopefully still be of some use for comparison's sake. The data set was just 6 subs (20 mins each) in Sii on the Pelican. I thought about using Ha, but decided on Sii instead, as it was much fainter, so I thought it would accentuate the noise levels more.
Firstly, I used ImageJ to bin each calibrated sub x2 and then stacked them in APP on default settings (which uses Lanczos-3 at the integration stage). Note, I would hope not to ever actually go down this route, as it would mean throwing away useful information before the registration phase of stacking, which is definitely not something I'd want to do. Plus, there's the actual conversion of all the files!
Next, I simply loaded all the calibrated subs into APP and let it stack them with exactly the same settings as above, except I chose Bilinear and a scale factor of 0.5 at the final integration stage. I'm no expert at analysing files, but as far as I can tell, the APP stack had less noise but also noticeably less detail (even in a stack that was now effectively at 4.26"). Basically it looked a lot blurrier and not as good as the ImageJ stack, so to my eye, Bilinear with a scale factor of 0.5 in APP does not appear to be the same as binning x2. I do note that the level of stretch is not the same in the jpg below - the Bilinear one is stretched more (these are just DDP stretches straight from APP) - but when I compared them visually, I got them a lot closer and the Bilinear one still looked blurrier to me.
Finally, I changed the interpolation back to Lanczos-3 and let APP do another stack at a scale of 0.5. To me, this now looks very close to the ImageJ stack. Would be interested to hear your analysis Vlaiv, but from what I'm seeing, I think I'd be happy to just go with this method when I shoot over-sampled data.
ImageJ Binned Stack: ImageJ_Binned_Stack-fy-0degCW-1.0x-LZ3-NS.fits
APP Bilinear Scale 0.5: APP_BilinearAtIntegrationStage.fits
APP Lanczos-3 Scale 0.5: APP_Lanczos3AtIntegrationStage.fits
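For reference, the ImageJ 'bin x2' step is just a sum (or average) over each 2x2 pixel block, which is a few lines of numpy (a sketch - the file name is hypothetical, and ImageJ's Bin command may average rather than sum depending on the option chosen):

```python
import numpy as np
from astropy.io import fits

def bin2x2(img):
    # Crop to even dimensions, then sum each 2x2 block of pixels.
    h, w = img.shape
    img = img[: h - h % 2, : w - w % 2]
    return img.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

sub = fits.getdata("pelican_sii_001.fits").astype(np.float64)
binned = bin2x2(sub)   # 4x the signal per pixel; image scale doubles to 4.26"
```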
  18. Yes, I believe that's all possible. I've no idea how to use ImageJ though. Is there a quick way to just point it at a folder and have it bin all the files in it? It would still be my preference to keep everything within APP, but it would be interesting to see if there was a noticeable difference between APP's version of binning (Bilinear and a 0.5 scale factor) and the method of feeding APP actual binned subs.
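And in case it helps with the 'point it at a folder' part - rather than learning ImageJ macros, a short Python loop would batch the job (re-using the bin2x2 helper from the sketch above; the folder name is hypothetical):

```python
from pathlib import Path
from astropy.io import fits

for path in Path("calibrated_subs").glob("*.fits"):
    data, header = fits.getdata(path, header=True)
    binned = bin2x2(data.astype("float64"))
    # note: the header's plate-scale keywords will no longer match,
    # but APP re-analyses and registers the subs itself anyway
    fits.writeto(path.with_name(path.stem + "_bin2.fits"),
                 binned, header, overwrite=True)
```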
  19. Thanks for this Vlaiv, I think this clarifies things for me now. Your approach of creating sub-groups of smaller files does sound good, but APP is my tool of choice for calibrating and stacking, so I just need to find the best workaround using it, and by the sounds of things, choosing Bilinear and a scale factor of 0.5 at the integration stage is the way to go. I'd still like to run some tests on real-world data though, just to see the improvements with my own eyes. @discardedastro sorry for derailing your thread! 🙏
  20. Thanks Vlaiv. What do you make of the 2nd post from the thread below: https://www.astropixelprocessor.com/community/main-forum/binning/ Mabula (the creator of APP) seems to suggest that it's possible to bin x2 in APP by using the nearest neighbour debayering algorithm and a scale factor of 0.5, but that he actually recommends using Cubic B-Spline instead. I take it in your example above, the binned image is using nearest neighbour? What would the result look like if it were to use Cubic B-Spline? I'd like to do some testing in APP, trying out various approaches, to see what produces the best results for oversampled images that one wants to bin after the fact. I would need some oversampled data though. @discardedastro if you would be willing to share some of your calibrated Lum subs with me for this, drop me a PM. It would be greatly appreciated.
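In the meantime, for anyone wanting to eyeball the interpolation difference on their own data, scipy makes the comparison easy (its order-3 spline is a cubic B-spline, though possibly not the exact variant APP implements; the file name is hypothetical):

```python
from astropy.io import fits
from scipy import ndimage

img = fits.getdata("lum_sub_001.fits").astype("float64")
half_nearest = ndimage.zoom(img, 0.5, order=0)  # nearest neighbour at scale 0.5
half_bspline = ndimage.zoom(img, 0.5, order=3)  # cubic B-spline at scale 0.5
```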
  21. @vlaiv is there any difference between software binning and just choosing to reduce the scale of one's stack? Case in point - when I get my RC6 up and running with my 268M at some point, the image scale will be about 0.58", so obviously oversampled. My plan is just to capture as normal (Bin x1), but then in APP, when the stacking process gets to the final stage (integration), I was planning on choosing a scale factor of 0.5 (it uses Lanczos-3 by default) to basically do the same thing as Bin x2 and bring the image scale up to 1.16". Do you think this method would actually be better than capturing binned x2 data at source? Apart from smaller file sizes, there shouldn't be much, if any, difference, right? Although what about FWC? Wouldn't that be much higher if binning at source?
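For what it's worth, my back-of-envelope understanding of the read-noise/FWC side of that question looks like this (all numbers illustrative, not actual 268M specs, and corrections welcome):

```python
import math

read_noise = 1.5   # e- per pixel per read (illustrative)
fwc = 50_000       # e- full well per pixel (illustrative)

# Software bin x2 after capture: four independently-read pixels get summed,
# so signal adds linearly (x4) while read noise adds in quadrature.
rn_software = math.sqrt(4) * read_noise   # = 2x single-pixel read noise
fwc_software = 4 * fwc                    # the sum can hold 4x one pixel's well

# True CCD hardware binning combines charge *before* the single read,
# so read noise stays at 1x. On a CMOS sensor like the 268M's, though,
# on-camera binning is done digitally after readout, so as far as I
# understand it behaves like the software case anyway.
rn_hardware_ccd = read_noise
```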
  22. The results you get are amazing Richard; I really hope the LED lights don't have too much of an effect on your imaging. It's good that they will turn them down after midnight - that's something at least. We bought our new house 2 years ago. My heart sank when we came to view it, near the end, and I saw the LED light literally smack bang right at the front (see below) 😥 Out the back (south) isn't much better, with tall trees blocking everything below 60 degrees. Annoying, as my LP isn't too bad either - Bortle 5, not far off 4. I might try phoning the council to see if they can turn the light off after midnight. Not holding out much hope though.
  23. While I wait for my 268M to arrive, I've been ordering all the necessary bits to make up the 55mm of backfocus I need for the 80ED. I'm planning on using the 5mm M48 plate that comes with the camera, which I now understand brings the camera's backfocus to 19.5mm. My TS filter drawer is T2, so I've ordered the ZWO M48-M42 adapter below. Has anyone used this before? It's fairly minimal-looking, to say the least, but it only has to support the weight of the camera itself, so I'm hoping it's ok. I did think of going with the M54 plate and ordering an M54-M42 adapter, but nowhere had any in stock. https://www.firstlightoptics.com/zwo-accessories/zwo-m48-to-m42-adapter-ring.html As I have to pack away after each session, I just ordered a Baader T2 dust cap from FLO as well.