
BlurXTerminator and oversampled data


tomato


This is perhaps challenging established methodology, but can BlurXTerminator do a good job with oversampled data? I found just under 11 hrs of LHaRGB data captured with the Esprit150/ASI178, imaged at the native resolution of 0.47 arcsec per pixel (before I knew better).

I've just run it through my SPCC/BXT/SXT/NXT workflow and got a reasonable result. I tried the same thing but integer resampled x2 at the start. To my mind the first result is better, and it certainly stands zooming in better, as the stars get pixelated very quickly in the second image. Of course, the $64K question is whether the detail is real, but the consensus on BXT seems to be that it is doing an honest job in that department. Any views?

0.47 arcsec/pixel

[Image: Image06_BXT_0.47aspp.jpg]

x2 Integer resample

[Image: Image05x2RS.jpg]
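For anyone unfamiliar with the operation: an x2 integer resample combines each 2x2 block of pixels into one, halving the sampling rate from 0.47 to roughly 0.94 arcsec/pixel. A minimal numpy sketch of the averaging flavour (the array below is a stand-in, not real data):

```python
import numpy as np

def integer_resample_x2(img):
    """Average each 2x2 pixel block, halving resolution on both axes."""
    h, w = img.shape
    h, w = h - h % 2, w - w % 2               # trim odd rows/columns
    blocks = img[:h, :w].reshape(h // 2, 2, w // 2, 2)
    return blocks.mean(axis=(1, 3))           # one output pixel per block

frame = np.random.rand(2080, 3096)            # stand-in for a real frame
binned = integer_resample_x2(frame)           # shape (1040, 1548)
```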


I can't really pass judgement between the two unless we also have data that has not been treated like this, to see the differences.

I'm also not very impressed with this plugin. So far, all the examples I've seen feel too artificial and obviously treated (overprocessed).


Here are two linear images: 21 x 15 minutes at 0.9"PP using a TEC140 with an Atik 460. At 0.9"PP this is slightly oversampled, but not absurdly so.

Both have been through DBE but nothing else. Lum21X15 les granges M51 has had no further processing, while Lum BXT has been through BlurXTerminator set to a high degree of non-stellar sharpening and automatic PSF. Obviously an adjustment of BlurXTerminator could have reduced its effects, but I worked with this version.

https://www.dropbox.com/scl/fo/5r7omurjdyaxth83dwz5r/h?dl=0&rlkey=294dwqgyw44kultgl1c3n7ec7

https://www.dropbox.com/s/en9bov5uxq9igtz/LUM BXT.tif?dl=0

I think these links will give access, but Dropbox seems to change on a regular basis and leaves me baffled. If they don't work for you, please PM me a regular email address and I'll send them to that.

My own final version is in the link below. I did not push large-scale contrasts as hard as I might have done because I was looking for a result which was precise but gentle. I've already enthused about BlurX on here, so I'll leave you to try the data and see how you feel about it.

https://ollypenrice.smugmug.com/Other/Galaxies/i-d5BBZQp/A

Olly

 


1 hour ago, ollypenrice said:

Here are two linear images: 21 x 15 minutes at 0.9"PP using a TEC140 with an Atik 460. At 0.9"PP this is slightly oversampled, but not absurdly so.

Here is an example of why I'm not impressed with the plugin:

[attached comparison image]

The left image is Olly's from the last link (the fully processed version); the right one is just the luminance from Dropbox without that plugin applied - a simple levels stretch in Gimp plus Gold-Meinel deconvolution, also done in Gimp (G'mic plugins).
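(For readers without Gimp/G'mic: Gold-Meinel itself isn't in the usual Python stacks, but Richardson-Lucy, a closely related iterative deconvolution, is one call in scikit-image. A minimal sketch, assuming a Gaussian PSF estimate; the file name and PSF width are made up.)

```python
import numpy as np
from skimage import io, restoration

# Hypothetical file; normalize to 0..1 for the deconvolution.
lum = io.imread("luminance.tif", as_gray=True).astype(float)
lum /= lum.max()

# Stand-in Gaussian PSF; in practice you would measure this from stars.
size, sigma = 25, 2.0
y, x = np.mgrid[-(size // 2):size // 2 + 1, -(size // 2):size // 2 + 1]
psf = np.exp(-(x**2 + y**2) / (2 * sigma**2))
psf /= psf.sum()

# More iterations sharpen further but amplify noise - the classic
# deconvolution trade-off.
deconvolved = restoration.richardson_lucy(lum, psf, num_iter=20)
```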

 


35 minutes ago, vlaiv said:

Here is an example of why I'm not impressed with the plugin:

[attached comparison image]

The left image is Olly's from the last link (the fully processed version); the right one is just the luminance from Dropbox without that plugin applied - a simple levels stretch in Gimp plus Gold-Meinel deconvolution, also done in Gimp (G'mic plugins).

 

I think the one processed via BlurX is perceptibly better. What is it that you don't like?

Olly


As requested, I have cropped the images and, for (yet another) unscientific comparison, included my version of the HST data, with no BXT applied!

0.47 arcsec/pixel, BXT

[Image: Image06_BXT_0.47aspp_crop.jpg]

0.47 arcsec/pixel, x2 integer resample, BXT

[Image: Image05x2RS_crop.jpg]

HST

[Image: HSTM51_crop.jpg]

It's purely subjective, but I don't think BXT has created any significant structural artefacts that are not present in the HST image.


2 minutes ago, ollypenrice said:

I think the one processed via BlurX is perceptibly better. What is it that you don't like?

How so?

To my eyes the right one seems far more natural. Here is an example:

[attached image]

In the left one it is not very clear what sort of nebulosity has been captured; the detail is just not there. On the right (at least to my eyes) it is clear that we have a concentration of hot young stars.

In general, the right one seems sharper, with more defined detail.

[attached image]

The left one is the Hubble reference, the middle is Gold-Meinel sharpening, and the right one is BlurX.

 


2 minutes ago, tomato said:

It's purely subjective, but I don't think BXT has created any significant structural artefacts that are not present in the HST image.

I agree - but it did not sharpen as well as can be done, and the way it deblurs the softness just looks artificial.

What I do object to in your comparison is comparing oversampled and binned data with BlurX applied to both, and then drawing conclusions about binning/oversampling. You need the oversampled and the binned data processed differently to be able to say that BlurX brings out something in the oversampled data that is not there in the binned data (provided that the binning produces correctly sampled data).

As far as we know, the lack of detail in the binned data might just be because BlurX killed it. It is more than obvious that it kills data in the image.

Part of your original image:

[attached image]

Part of your bin x2 image from the first post upscaled to match the size:

[attached image]


But isn't there a distinction to be made between enhancing bright but blurred detail, as in M51 itself, and dim objects on the threshold of visibility above the background?

I'm not advocating oversampling as good imaging practice, but my original question was: can BXT improve an image derived from oversampled data without creating artificial detail? I would still venture that it can.


3 minutes ago, tomato said:

I'm not advocating oversampling as good imaging practice, but my original question was: can BXT improve an image derived from oversampled data without creating artificial detail? I would still venture that it can.

Ok, here is the thing.

When we capture data that is either properly sampled or oversampled, that data is still blurred, and any sharpening will bring out that blurred data.

Think of it this way:

Detail brightness is multiplied by some number less than one - that is blurring (just replace "detail" with "frequency" in the previous sentence and you get the correct statement, but people get scared when they read "frequency", so it is better to use the inaccurate but more familiar term "detail").

Large detail is, say, multiplied by 0.9.

Medium detail is multiplied by 0.5.

Fine detail is multiplied by 0.1.

Ultra-fine detail that one might hope to capture by oversampling is multiplied by 0.

Sharpening is nothing more than taking all those detail levels and dividing by the appropriate number (the inverse operation to multiplication) in order to restore the original value.

So we take the large detail and divide it by 0.9, and we get:

Some_Value * 0.9 / 0.9 = Some_Value

Similarly, we divide the medium detail by 0.5 and get:

Other_Value * 0.5 / 0.5 = Other_Value

In the same way we restore the fine detail by dividing by 0.1, but there is no number we can divide by in the following expression to get the original value back:

Original_Value * 0 = 0

Whatever number you use as the divisor, you won't get the original value back, as 0 divided by any number is still zero.

Even so, sharpening methods can't fully restore what can be captured at a certain resolution, because we have finite precision / some error in the recording (finite SNR).
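A minimal numerical sketch of the argument (numpy, with the same made-up attenuation factors): the component multiplied by 0.9 is fully restored by dividing by 0.9, but the component multiplied by 0 is gone and no divisor brings it back.

```python
import numpy as np

n = 256
x = np.arange(n)
# One coarse and one fine frequency component stand in for "detail levels".
signal = np.sin(2 * np.pi * 4 * x / n) + 0.5 * np.sin(2 * np.pi * 60 * x / n)

# Blurring: multiply each frequency by an attenuation factor (the MTF).
freqs = np.abs(np.fft.fftfreq(n, d=1 / n))   # integer cycle counts
mtf = np.ones(n)
mtf[freqs == 4] = 0.9                        # large detail x 0.9
mtf[freqs == 60] = 0.0                       # ultra-fine detail x 0
blurred = np.fft.ifft(np.fft.fft(signal) * mtf).real

# Sharpening: divide each frequency by the same factor where it is nonzero.
spectrum = np.fft.fft(blurred)
spectrum[mtf > 0] /= mtf[mtf > 0]
restored = np.fft.ifft(spectrum).real

# The 0.9 component is back exactly; the zeroed one is lost forever.
print(np.allclose(restored, np.sin(2 * np.pi * 4 * x / n)))   # True
```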

Here, have a look at this:

[attached image]

This is roughly 0.9"/px.

On the left is the Hubble reference, in the middle is your oversampled image sharpened with BlurX and presented at 50% of its original size (again at 0.9"/px), and on the right is the image binned to 0.9"/px and then treated with BlurX.

Even at 0.9"/px, with sharpening you can't reach level of detail that is supported by 0.9"/px (left image) which clearly out resolves the other two.

BlurX did not bring out those features because the oversampled version somehow captured them while the 0.9"/px version did not - it simply failed to bring out that detail in the 0.9"/px version. Both versions captured that level of detail, and it is still lower than what can be recorded at 0.9"/px.

I'm sure that other sharpening algorithms would bring out such detail even from 0.9"/px sampling, as is shown with Olly's data:

[attached image]

 


3 minutes ago, tomato said:

but hasn't the middle oversampled image in your example resolved the structure closer to the HST image than the binned image?

Yes - but not because of the sampling rate; because of how BlurX works.

It failed to resolve on the binned image; it didn't "over-succeed" on the oversampled image - if you get what I'm saying.

This is why I call for a comparison with another set of your two images sharpened by a different sharpening algorithm - one that won't fail on the binned image - much like Olly's data and the sharpening that I applied. Here it is again, but with increased contrast so you can see the cluster detail versus the background spiral arm (the above was just a quick levels stretch to bring the data out, not proper processing):

[attached image]

The HST image clearly shows what a 0.9"/px sampling rate supports - how much detail can be captured by it - and both images fail to deliver even that much detail, let alone the level of detail at 0.46"/px.

Here - look at what can be captured at 0.46"/px:

[attached image]

[attached image]

Do you still think that the oversampled version captured detail anywhere close to that?

 


Hmm, I do think the 0.47'' version looks better than the binned one in all of the above comparisons so far. Neither is comparable to HST data, but we are not comparing it to HST data, are we? At least we shouldn't be. The comparison was between @tomato's 6'' refractor data and whether or not BXT worked better unbinned.

1 hour ago, vlaiv said:

How so?

To my eyes the right one seems far more natural. Here is an example:

[attached image]

In the left one it is not very clear what sort of nebulosity has been captured; the detail is just not there. On the right (at least to my eyes) it is clear that we have a concentration of hot young stars.

In general, the right one seems sharper, with more defined detail.

[attached image]

The left one is the Hubble reference, the middle is Gold-Meinel sharpening, and the right one is BlurX.

 

The middle one from the bottom row looks sharpest here, agreed. But these are not apples-to-apples comparisons in my opinion, as they are different processes, so different choices must have been made. Also, it compares mono to colour, which hardly has a point, since the point is to present a colour image. My eye gravitates towards the colour image here first, the HST image second and your sharpened version third.


43 minutes ago, ONIKKINEN said:

Hmm, I do think the 0.47'' version looks better than the binned one in all of the above comparisons so far.

I agree - but I maintain that that is a consequence of BlurX butchering the binned version, not of the oversampled version being somehow superior to the binned version.

43 minutes ago, ONIKKINEN said:

Neither is comparable to HST data, but we are not comparing it to HST data, are we?

The HST data serves a particular purpose in these comparisons - not because it is HST data, but because it shows what level of detail can be recorded at a particular sampling rate.

When we lower the sampling rate for both the HST image and a particular image, at some point we will have two identical-looking images. Any advantage that the HST had over the other image will be negated by the sampling rate, and at that point the sampling rate will be matched to the underlying data of the non-HST image.

As long as the HST image is "beating" your image at a particular scale, you are effectively oversampled or not sharpened enough. Here - look what happens when I scale down Olly's data to 66% of its size (that makes it effectively ~1.36"/px):

[attached image]

The difference in detail between the two starts to reduce even further.
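A sketch of that scale-matching test, with hypothetical file names (the two images must already be registered and the same size): downscale both by the same factor and watch the difference shrink as the scale approaches what your data actually supports.

```python
import numpy as np
from PIL import Image

def load_gray(path, scale):
    """Load as float grayscale, optionally downscaled by a common factor."""
    img = Image.open(path).convert("F")        # 32-bit float grayscale
    w, h = img.size
    return np.asarray(img.resize((int(w * scale), int(h * scale)),
                                 Image.LANCZOS))

# Shrink both images together and compare at each scale; when the RMS
# difference stops falling, you have found the scale the data supports.
for scale in (1.0, 0.66, 0.5, 0.33):
    ref = load_gray("hst_reference.tif", scale)   # hypothetical names
    own = load_gray("my_luminance.tif", scale)
    rms = np.sqrt(np.mean((ref - own) ** 2))
    print(f"scale {scale:.2f}: RMS difference {rms:.4f}")
```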

43 minutes ago, ONIKKINEN said:

The middle one from the bottom row looks sharpest here, agreed. But these are not apples-to-apples comparisons in my opinion, as they are different processes, so different choices must have been made. Also, it compares mono to colour, which hardly has a point, since the point is to present a colour image. My eye gravitates towards the colour image here first, the HST image second and your sharpened version third.

I agree that different processing will yield different results - but that was not the point. The point was that, at 0.9"/px, sharpening that has been available in Gimp for several years (and in the G'mic collection even longer) will give you better results than BlurX - closer to what the HST data shows at 0.9"/px - but still not quite there (because we are oversampled at 0.9"/px, let alone at 0.46"/px).

Luminance contains most of the detail/sharpness information of an image, and that is why it is good to compare just the luminance. We can use the same chrominance to colour each of those luminances, and the difference in sharpness will still be the same.
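To make that concrete: one way to put the same chrominance onto two different luminance versions is via Lab space, e.g. with scikit-image (file names are hypothetical, and the colour image is assumed to be plain 8-bit RGB):

```python
import numpy as np
from skimage import io, color

rgb = io.imread("colour_version.png") / 255.0          # colour image, 0..1
lum = io.imread("sharpened_lum.png", as_gray=True)     # new luminance, 0..1

lab = color.rgb2lab(rgb)
lab[..., 0] = lum * 100.0                # L channel in Lab runs 0..100
recombined = color.lab2rgb(lab)          # same colours, swapped luminance
io.imsave("recombined.png",
          (np.clip(recombined, 0, 1) * 255).astype(np.uint8))
```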

In any case, that is my view of BlurX - I'm not particularly impressed with it. I believe that better results can be obtained with a different line of processing, and I certainly don't think it can "tease out" detail in oversampled images over properly sampled ones.

I've shown what I believe to be evidence to support my view, but of course people are free to prefer and use whatever processing workflow suits them, and this should not be viewed as an argument against that.

 


To me, in the three examples above side by side (HST, oversampled and binned), ignoring the HST one, the binned one shows more star detail (or at least brighter parts), even though the gas region in the middle is better in the oversampled version.


Maybe I still have a lot to learn about processing. I used to use the sharpening tool in GIMP when I started out, but in my hands it produced terrible artefacts around the stars. I was most probably applying too much in my efforts to improve the subject of the image, but rightly or wrongly I see BXT as a big step forward, as I believe it has demonstrated that it can improve the appearance of both the extended object and the stars.


20 hours ago, vlaiv said:

How so?

To my eyes the right one seems far more natural. Here is an example:

[attached image]

In the left one it is not very clear what sort of nebulosity has been captured; the detail is just not there. On the right (at least to my eyes) it is clear that we have a concentration of hot young stars.

In general, the right one seems sharper, with more defined detail.

[attached image]

The left one is the Hubble reference, the middle is Gold-Meinel sharpening, and the right one is BlurX.

 

I guess we're looking at different things. I'm concentrating on the filamentary structures, the curved lines roughly radial within the galaxy.

Olly


On 23/12/2022 at 11:46, tomato said:

As requested, I have cropped the images and, for (yet another) unscientific comparison, included my version of the HST data, with no BXT applied!

0.47 arcsec/pixel, BXT

[Image: Image06_BXT_0.47aspp_crop.jpg]

0.47 arcsec/pixel, x2 integer resample, BXT

[Image: Image05x2RS_crop.jpg]

HST

[Image: HSTM51_crop.jpg]

It's purely subjective, but I don't think BXT has created any significant structural artefacts that are not present in the HST image.

This crop confirms what I saw in the original images; the 0.47"/px image looks a bit oversharpened. The dark lanes in front of NGC 5195 look unnatural. Whenever deconvolution (classic or BXT) creates squiggly ridges like these, I dial the deconvolution down a notch.

Just my € 0.02


30 minutes ago, licho52 said:

It used to always be better to apply traditional (currently deprecated) deconvolution on non-binned images

Just love statements like this one.

Could you name a deconvolution method that has been "deprecated" and say what made it deprecated, and offer any reason and/or proof that deconvolution methods work differently based on sampling rate - and, more specifically, that they are less efficient on a lower-sampled version of the data?


18 hours ago, wimvb said:

This crop confirms what I saw in the original images; the 0.47"/px image looks a bit oversharpened. The dark lanes in front of NGC 5195 look unnatural.

Agreed. On the bright arm of M51 nearest to NGC 5195 there are also squiggly bright artifacts. Once you start seeing them, you can't unsee them. In my opinion, for this data and for the settings used, BXT has created structural artefacts that are not present in the HST image.


A lot of broad judgments here are based on first-time uses and little experimentation... For example, I haven't seen any mention of using an inputted PSF diameter instead of the auto setting, and I've seen that this can have a significant effect on both artefacts and the degree of deblurring. Every processing tool can be pushed too far or misused, and RC has made it clear that BX has its limitations. How it's used and when it's used are just as important as whether it's used.

Cheers,
Scott


4 hours ago, Scott Badger said:

A lot of broad judgments here are based on first-time uses and little experimentation... For example, I haven't seen any mention of using an inputted PSF diameter instead of the auto setting, and I've seen that this can have a significant effect on both artefacts and the degree of deblurring. Every processing tool can be pushed too far or misused, and RC has made it clear that BX has its limitations. How it's used and when it's used are just as important as whether it's used.

Cheers,
Scott

I've been inputting PSFs generated by PSFImage but haven't yet come to any conclusions worth sharing.

Olly

