
BlurXTerminator - Wow!


Xiga


Came across this last night. Russ Croman has done it again, this time using AI to help with deconvolution/sharpening. Adam's video is excellent, I really recommend watching the whole thing. The improvements he was able to achieve are remarkable, so much so that I think I'm now going to have to finally give in and get PixInsight! Up to now I've been happy using StarXT and NoiseXT in Photoshop, but this one only works in PixInsight as it needs linear data. It's a bit pricey mind, $100 (or $90 if you already own one of Russ's prior tools), but once I saw how it can even fix bad corner stars (or even stars affected by differential flexure throughout an image), I realised it was something I need to have in my life.

And the best part is, like all of Russ' tools, it looks easy to use. Deconvolution is a total faff to try and do, so this simplicity is a big plus in my book. 

 


Has to be worth looking into. I do have Pixinsight but consider it a barbaric environment of tyrants fresh from the Spanish Inquisition. If you do something of which they disapprove (which I do all the time in Photoshop) there is a roar from the heavens of 'Die, heretic!'

:grin: Olly


5 minutes ago, ollypenrice said:

Has to be worth looking into. I do have Pixinsight but consider it a barbaric environment of tyrants fresh from the Spanish Inquisition. If you do something of which they disapprove (which I do all the time in Photoshop) there is a roar from the heavens of 'Die, heretic!'

:grin: Olly

It does, I agree, but I can't help thinking the tool is just adding information that was not there to start with, which is just wrong. This tool was trained on Hubble data, so it's adding what it thinks should be there based on what the AI has been trained on. I'm not sure this is right, personally. I'm on the fence... 🤔


6 minutes ago, Stuart1971 said:

It does, I agree, but I can't help thinking the tool is just adding information that was not there to start with, which is just wrong. This tool was trained on Hubble data, so it's adding what it thinks should be there based on what the AI has been trained on. I'm not sure this is right, personally. I'm on the fence... 🤔

We might be able to test this by taking a low res and high res image of the same object and running BlurX on the low res one before comparing the result with the high res.

Olly


8 minutes ago, ollypenrice said:

We might be able to test this by taking a low res and high res image of the same object and running BlurX on the low res one before comparing the result with the high res.

Olly

I think the low res one with BlurX will be the winner, as the other one will be "real data", but it would be interesting to see. Agreed.


Curious to see also, although it's APP and Photoshop for me. I did see a luminance image comparison on Astrobin and my first thought was that it seemed to have some artefacts similar to Topaz AI, which I do not trust. Irrespective of training, I need to know how it does the process, since adding what it determines to be correctly scaled HST data(?), or another process based on data extraction from some source other than the image being processed, is dubious to me (but I do not know how it works inside). Let's see in time. It might well be a cleverly trained sharpening, blurring and erosion option, where it learns the sharpening or related processes from ideally captured data, rather than adding other data. That would be interesting. Would like to know either way out of curiosity.

https://www.astrobin.com/full/ylvwu6/0/?utm_source=astrobin&utm_medium=email&utm_campaign=notification&from_user=106258

Mind you, that astrobin process may have fallen foul of the blurxt settings?


I've given this a try. I must be missing something, because although it does a good job tightening the stars in my images, for the life of me I can't see much difference at all in the nebulosity. I want the same "wow, what a difference!" reaction as in the videos 😂


I have been so pleased with Russ's filters/actions (or whatever you would call them) that I bought it immediately earlier today. With the default settings I think it tightened the stars a bit too much, but it did a good job on the nebulosity. So if I use it, it will be selectively, adding it as a layer in my PS processing. Here is my result, first without and then with a bit of selective BlurXT. Not a huge difference, but a difference that I liked:

Cheers, Göran

20221212 LBN438 RASA1+2 PS8smallSign.jpg

20221212 LBN438 RASA1+2 PS9smallSign.jpg


As a pragmatist I'd be happy to take an image from a professional scope and compare the effect BlurX had on my image of the same target. If it made mine look more like the professional image, I'd consider it valid. If it took my image in some random direction, I wouldn't.

One thing I don't need is tightened stars! Since using StarXTerminator, I frequently find myself needing to give them a gentle blur.* Say I'm at the stage of having the extracted stars placed over the starless image in blend mode Screen. Since I'm working, at this stage, with a moderately or fully-stretched image, I then de-stretch the stars using Levels in order to reduce them. In order to give them more 'pop', especially in the case of the now very tiny ones, I simply increase the contrast on the star layer. At this point I often make two adjustments, the first being to blur the stars (Gauss 0.2 to 0.6 or so). This makes them blend into the image more naturally, as does reducing the opacity of the star layer by a very small amount. The other tweak is for any stars which are still 'standing out' from the nebulosity around them in a sharp, unnatural way. Here I activate the starless layer and give the area beneath the stars a dab with the burn tool to give a slight glow around the stars, as happens naturally with optics.

So I don't want my stars tightened, but having the corner stars made rounder would be nice for the RASA.

Olly

* Edit: Is this what they call a first world problem?  :grin:


13 hours ago, Stuart1971 said:

It does, I agree, but I can't help thinking the tool is just adding information that was not there to start with, which is just wrong. This tool was trained on Hubble data, so it's adding what it thinks should be there based on what the AI has been trained on. I'm not sure this is right, personally. I'm on the fence... 🤔

I agree. If it adds resolution to an image that exceeds the capabilities of the scope and equipment used to capture it, then what's the point of the hobby? Might as well just invest in a powerful PC, software and a load of processing tools and make it up as you go along. I'm surprised, though, that it's available in PixInsight if this is the case, knowing how the developers feel about "real astrophotos".


7 minutes ago, david_taurus83 said:

I agree. If it adds resolution to an image that exceeds the capabilities of the scope and equipment used to capture it, then what's the point of the hobby? Might as well just invest in a powerful PC, software and a load of processing tools and make it up as you go along. I'm surprised, though, that it's available in PixInsight if this is the case, knowing how the developers feel about "real astrophotos".

If you have a way of measuring something to a certain degree of accuracy, and you also have a good knowledge of what your measurement errors are, might you not, in principle, measure with an accuracy which exceeds that of your system?

Olly


6 minutes ago, ollypenrice said:

If you have a way of measuring something to a certain degree of accuracy, and you also have a good knowledge of what your measurement errors are, might you not, in principle, measure with an accuracy which exceeds that of your system?

Olly

Not quite sure I understand your point, Olly? To me it's like taking RGB data with an amateur scope and then using Hubble lum data to tidy it up? I know it's not exactly like that, but perhaps the same principle?


42 minutes ago, ollypenrice said:

If you have a way of measuring something to a certain degree of accuracy, and you also have a good knowledge of what your measurement errors are, might you not, in principle, measure with an accuracy which exceeds that of your system?

I'm not sure you can; I guess it depends on the application/data.

Say you have a star that lights up a single pixel without affecting the surrounding pixels in any way. It's impossible to determine the exact position of the star within that pixel.

Although when you start image stacking/super-resolutioning, you can start to extract its exact position, because each image is ever so slightly different (due to telescope/atmosphere wobble), and so each separate image contains a little more of the missing jigsaw puzzle.
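That jigsaw-puzzle point can be sketched numerically. This toy example (pure Python; the star position and frame count are made up) simulates a star that registers in only one whole pixel per frame, with a random sub-pixel shift between frames, and shows the average of many frames recovering a sub-pixel position that no single frame contains:

```python
import random

def pixelate(true_pos, shift):
    # The camera reports only the whole-pixel index the star fell in
    # for this frame's shifted pointing; subtracting the shift maps
    # that reading back into the reference frame.
    return round(true_pos + shift) - shift

random.seed(1)
true_pos = 10.3   # true star position in pixels (hypothetical)

# Each frame is dithered by a random sub-pixel amount, as telescope
# and atmosphere wobble would do naturally.
frames = [pixelate(true_pos, random.uniform(-0.5, 0.5))
          for _ in range(5000)]

estimate = sum(frames) / len(frames)
print(round(estimate, 2))  # converges towards 10.3 as frames are added
```

Any single frame can only ever say "pixel 10" or "pixel 11", yet the fraction of frames landing in each, combined with the known shifts, pins the star down to a fraction of a pixel.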


1 hour ago, david_taurus83 said:

I agree. If it adds resolution to an image that exceeds the capabilities of the scope and equipment used to capture it, then what's the point of the hobby? Might as well just invest in a powerful PC, software and a load of processing tools and make it up as you go along. I'm surprised, though, that it's available in PixInsight if this is the case, knowing how the developers feel about "real astrophotos".

Totally agree….👍🏻


Here's what it did at default settings to our NGC7331 data from 2019, Olly. Looks pretty good to me, certainly not as artificial-looking as the Astrobin image of the same galaxy that Colm put a link to. I've also tried it on Samyang 135 data and it does a good job there too, in particular reducing the coma on corner stars.

Dave

NGC7331_W-WO_BlurExt.thumb.jpg.155848815ef3b88418d58ae8ff022194.jpg 


1 minute ago, Laurin Dave said:

Here's what it did at default settings to our NGC7331 data from 2019 Olly.  Looks pretty good to me, certainly not as artificial looking as the Astrobin image of the same galaxy that Colm put a link to.   I've also tried it on Samyang 135 data and it does a good job there too  in particular reducing the coma on corner stars.

Dave

NGC7331_W-WO_BlurExt.thumb.jpg.155848815ef3b88418d58ae8ff022194.jpg 

Does it do anything for very slightly elongated stars, due to a tiny bit of tilt in one corner? That would be very welcome for a lot of people, and the software is cheaper than a quality tilt adjuster 👍🏻😀


5 minutes ago, Stuart1971 said:

Does it do anything for very slightly elongated stars, due to a tiny bit of tilt in one corner? That would be very welcome for a lot of people, and the software is cheaper than a quality tilt adjuster 👍🏻😀

I've found it does but my experience is so far limited to 2 images!  NGC7331 and this one with the Samyang

Dave

Orion_Belt_Samyang_Final_15Dec22_50pc.thumb.jpg.3819b6b7290d40e9cb10fdc959f71768.jpg

 


This seems like a gamechanger! The original was me messing about for an hour or so: NR, sharpening etc. To be fair, I think there was more I could have got from it, it was a quick process, but... For the new process, I just ran NoiseXTerminator then BlurXTerminator, combined the channels and stretched, that's it!

 

After:

image.png.6be4ec7cf49c1938b522aec340bd4d4f.png

 

Original:

image.png.0c50193121a180da51ecf88b131a63e5.png


21 minutes ago, Laurin Dave said:

Here's what it did at default settings to our NGC7331 data from 2019 Olly.  Looks pretty good to me, certainly not as artificial looking as the Astrobin image of the same galaxy that Colm put a link to.   I've also tried it on Samyang 135 data and it does a good job there too  in particular reducing the coma on corner stars.

Dave

NGC7331_W-WO_BlurExt.thumb.jpg.155848815ef3b88418d58ae8ff022194.jpg 

I have to admit, it does look very good! I may have to swallow my pride on this one!


1 hour ago, david_taurus83 said:

Not quite sure I understand your point, Olly? To me it's like taking RGB data with an amateur scope and then using Hubble lum data to tidy it up? I know it's not exactly like that, but perhaps the same principle?

When we hear that BlurX is trained on Hubble data, that does not mean it is going to apply Hubble data to yours. Clearly that would be cheating and wouldn't work on anything except a target imaged by Hubble, and given Hubble's tiny field of view, that's not going to cover many images. My understanding is that it has been trained to understand the point spread functions of astrophotos using ones of very high quality. From this source of information it can recognise and correct distorted point spread functions in our inferior systems. It does not take any Hubble information specifically on M31 to apply to our M31s. What it takes from Hubble, I presume, is an ideal PSF against which to assess our PSFs.

I think this kind of deconvolution is more akin to calibration, which takes me back to my point about understanding the systematic errors of our measurements. (Our photos are measurements of the amount of light, pixel by pixel, arriving from the object.) When we apply darks and flats we improve the accuracy of our measurements by adjusting them to the known errors in our camera noise and our optical illumination. We can, therefore, measure to an accuracy greatly exceeding the natural accuracy of our uncalibrated systems.  In a similar (but not identical) way, BlurX analyses our data to identify the distortions and characteristics of our PSFs so that it can correct them.

The difference between this and darks-flats is that darks and flats are direct measurements rather than computations. However, like darks and flats, BlurX's corrections are derived only from your own image. That is why the Spanish Inquisition (Pixinsight) have approved the routine.

Olly
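As an aside, the kind of PSF-driven deconvolution Olly describes can be sketched in a few lines. This is a toy 1-D Richardson-Lucy loop in pure Python (made-up signal and PSF, not BlurX's actual algorithm): the point is that it sharpens the "star" using only the blurred data plus an estimate of the PSF; no outside image data is pasted in.

```python
def convolve(signal, psf):
    """Direct 1-D convolution with a centred kernel, edges truncated."""
    half = len(psf) // 2
    out = []
    for i in range(len(signal)):
        s = 0.0
        for j, p in enumerate(psf):
            k = i + j - half
            if 0 <= k < len(signal):
                s += signal[k] * p
        out.append(s)
    return out

def richardson_lucy(blurred, psf, iters=50):
    """Classic multiplicative Richardson-Lucy update."""
    est = [1.0] * len(blurred)      # flat initial estimate
    psf_flipped = psf[::-1]         # mirrored PSF for the correction step
    for _ in range(iters):
        reblurred = convolve(est, psf)
        ratio = [b / r if r > 0 else 0.0
                 for b, r in zip(blurred, reblurred)]
        correction = convolve(ratio, psf_flipped)
        est = [e * c for e, c in zip(est, correction)]
    return est

truth = [0, 0, 0, 10, 0, 0, 0, 0]   # a single sharp "star" at index 3
psf = [0.25, 0.5, 0.25]             # symmetric blur kernel (sums to 1)
blurred = convolve(truth, psf)      # [0, 0, 2.5, 5.0, 2.5, 0, 0, 0]
restored = richardson_lucy(blurred, psf)
# the estimate re-concentrates the flux back toward index 3
```

Given a mis-shapen PSF estimate instead of this ideal one, the same loop would "correct" the distortion, which is roughly the calibration-like behaviour described above (how BlurX actually models and applies PSFs internally is not public).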


22 minutes ago, blinky said:

This seems like a gamechanger! The original was me messing about for an hour or so: NR, sharpening etc. To be fair, I think there was more I could have got from it, it was a quick process, but... For the new process, I just ran NoiseXTerminator then BlurXTerminator, combined the channels and stretched, that's it!

 

After:

image.png.6be4ec7cf49c1938b522aec340bd4d4f.png

 

Original:

image.png.0c50193121a180da51ecf88b131a63e5.png

That's the wrong way round. You should run BlurX first (Adam Block explains why in the video). Good result, even so.

Olly

