M13 wide-field, OSC, Bortle 8


Lee_P


Still waiting for Summer nebulae to rise, so I put a few weeks into this image of M13. More info on my website, and imaging details below the picture. Big thanks to @mike1485 and @Gunshy for making the excellent Generalised Hyperbolic Stretch (GHS) plug-in for PixInsight, which I used for this image.

[Image: M13, full resolution]

 

* March to April 2022
* Bristol, UK (Bortle 8)
* Telescope: Askar FRA400 f/5.6 Quintuplet APO Astrograph
* Camera: ZWO ASI 2600MC-PRO
* Filter: none
* Mount: Orion Sirius EQ-G
* Guide: William Optics 32mm; ZWO ASI 120MM Mini
* Control: ASIAIR Plus, ZWO EAF
* Software: PixInsight, Lightroom
* 480 x 120 seconds

Total integration time: 16 hours

By Lee Pullen

 

 


There's a lot to like but the core is saturated or, at least, sufficiently stretched to have lost stellar resolution. Is the core like this in the linear data? I'd suspect it probably isn't so I think a more differentiated stretch would preserve more stellar separation in the core.

Personally, I'm not a fan of any ready-made stretches that I've tried but I don't know the one you used.

Olly


3 hours ago, ollypenrice said:

There's a lot to like but the core is saturated or, at least, sufficiently stretched to have lost stellar resolution. Is the core like this in the linear data? I'd suspect it probably isn't so I think a more differentiated stretch would preserve more stellar separation in the core.

Personally, I'm not a fan of any ready-made stretches that I've tried but I don't know the one you used.

Olly

Thanks for the comment Olly, maybe I did over-stretch the core a tad. Attached is the integrated file, if you've time to have a crack at it yourself! By the way, GHS isn't a ready-made stretch, you've got full control over it. It's worth a look!

M13 integrated.xisf


5 minutes ago, Lee_P said:

Thanks for the comment Olly, maybe I did over-stretch the core a tad. Attached is the integrated file, if you've time to have a crack at it yourself! By the way, GHS isn't a ready-made stretch, you've got full control over it. It's worth a look!

M13 integrated.xisf 299.41 MB · 0 downloads

I'd love to look at the data but I've no chance of downloading that file other than via something like Dropbox. We communicate largely by pigeon here in rural France... 🤣

I'll have a look at the stretch though.

Olly


14 minutes ago, ollypenrice said:

I'd love to look at the data but I've no chance of downloading that file other than via something like Dropbox. We communicate largely by pigeon here in rural France... 🤣

I'll have a look at the stretch though.

Olly

Maybe your pigeons can handle Google Drive? No worries if not! https://drive.google.com/drive/folders/1FIaH7Tx1p5OekLac4fne8W4PzUQetux6?usp=sharing

 


1 hour ago, Lee_P said:

Thanks for the comment Olly, maybe I did over-stretch the core a tad. Attached is the integrated file, if you've time to have a crack at it yourself! By the way, GHS isn't a ready-made stretch, you've got full control over it. It's worth a look!

M13 integrated.xisf 299.41 MB · 0 downloads

Is this a variation on Mark Shelley's arcsinh hyperbolic stretch?


I'd have a look but, typical of PI's arrogance and determination to be unlike anything else, I cannot open a wretched XISF file. Why can't it save a FITS like everyone else?


47 minutes ago, newbie alert said:

Is this a variation on Mark Shelley's arcsinh hyperbolic stretch?

I think it's a bit more than just that, but I'm not totally sure. https://ghsastro.co.uk/

39 minutes ago, DaveS said:

I'd have a look but, typical of PI's arrogance and determination to be unlike anything else, I cannot open a wretched XISF file. Why can't it save a FITS like everyone else?

FITS version attached 😁

M13 integrated FITS.fit


46 minutes ago, vlaiv said:

Oh, this is so depressing :D

Whenever I do true color processing, it just does not look attractive :D

What do you prefer in the image: the true color rendition or the uber-saturated version? :D

[Image: comparison of true color and saturated versions of M13]

"Uber saturated" for me - I want to make something I like the look of, first and foremost. For me, scientific accuracy is important, but secondary. 


Just now, Lee_P said:

"Uber saturated" for me - I want to make something I like the look of, first and foremost. For me, scientific accuracy is important, but secondary. 

I think the internet is to blame: everyone expects, and is used to, over-processed and over-saturated images, and regular images look uninteresting because of that.


5 minutes ago, vlaiv said:

I think the internet is to blame: everyone expects, and is used to, over-processed and over-saturated images, and regular images look uninteresting because of that.

Let's think about this. If we post a linear image we get a black background and a few stars. Not very interesting. If we extract the image's information in a way adapted to what our eyes can see, we see an object in much of its complexity. Now you can't possibly object to this. The stretching process is a process of exaggeration and lies at the heart of astrophotography. We exaggerate the faint signal by a greater amount than we exaggerate the bright, and that is the primary skill of the astrophotographer (in my view). In a similar way we exaggerate the colour information we have captured. Sure, some imagers impose pre-determined notions of colour on their images, but I don't see a difference in kind between your processing and the original. Have any stars turned from blue to red or vice versa? I'd have thought not.

Olly


31 minutes ago, ollypenrice said:

Let's think about this. If we post a linear image we get a black background and a few stars. Not very interesting. If we extract the image's information in a way adapted to what our eyes can see, we see an object in much of its complexity. Now you can't possibly object to this. The stretching process is a process of exaggeration and lies at the heart of astrophotography. We exaggerate the faint signal by a greater amount than we exaggerate the bright, and that is the primary skill of the astrophotographer (in my view). In a similar way we exaggerate the colour information we have captured. Sure, some imagers impose pre-determined notions of colour on their images, but I don't see a difference in kind between your processing and the original. Have any stars turned from blue to red or vice versa? I'd have thought not.

Olly

We don't have to exaggerate in the stretch, either.

We can split stretching into two distinct parts, with two purposes.

Purpose one is to truthfully represent the image. This is required by the standards and by our display devices. The sRGB standard incorporates a gamma stretch of roughly 2.2 (technically a bit different, but very close to this value). We perceive daylight images from our cameras correctly because the data linearity, the sRGB gamma, and the response of our display devices (if properly calibrated) are all matched. Linear data coming from the camera can't be displayed directly as an image in our browsers because it is not adjusted in the way the standard requires (or it can be, but it won't match reality).
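For reference, the sRGB transfer function mentioned above (roughly gamma 2.2, but technically a piecewise curve with a linear segment near black) can be sketched as:

```python
def srgb_encode(x: float) -> float:
    """Map linear light in [0, 1] to sRGB-encoded [0, 1]."""
    if x <= 0.0031308:
        return 12.92 * x                      # linear toe near black
    return 1.055 * x ** (1 / 2.4) - 0.055     # close to a 1/2.2 power overall

def srgb_decode(v: float) -> float:
    """Inverse: sRGB-encoded [0, 1] back to linear light."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4
```

Mid-grey in linear light comes out much brighter after encoding (0.5 maps to about 0.735), which is exactly the "stretch" that matched displays and browsers expect.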

You can show even very faint data using only this level of stretch. Here is the above image's luminance data, without any "special" stretching:

[Image: luminance data with only black/white point and sRGB gamma applied]

The only things done to the image are the black and white point and the standard sRGB gamma (properly encoded as sRGB).

This does not mean we can't show faint detail in the image. Here is the same thing done to the same image, but made to show fainter detail:

[Image: the same data, stretched further to show fainter detail]

But here we have to sacrifice the core; we have effectively "over-exposed" the image.

When we have a much larger dynamic range than our device is capable of showing (or, for that matter, than our eyesight is capable of viewing at the same time), we need to compress this dynamic range in order to fit both highlights and shadows into the same image.

This is the second type of stretch: compressing the dynamic range into the dynamic range our device is capable of showing.

But we don't need to do it. We can do what nature does: if you look at a bright torch next to a computer screen, you won't be able to see what is on the computer screen. There is a limit to the range of intensities we can observe at any given time (about 1000:1 for human vision; you might think that is more than 256 levels, but remember the gamma is there for a reason: it compresses the very bright and stretches the very faint, so it effectively encodes what our vision can perceive).

There is also a fundamental difference between stretching and saturation. Stretching is changing exposure at the pixel level, exposing different pixels by different amounts. By doing this you alter the amount of received light, but you don't change its nature.

Changing saturation is changing the type of light that you recorded, not only its intensity.
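The distinction can be illustrated with a toy numpy example (an illustration only, not anyone's actual workflow): multiplying all three channels of a pixel by one coefficient changes its intensity but leaves the channel ratios alone, while a saturation boost changes the ratios themselves.

```python
import numpy as np

pixel = np.array([0.04, 0.03, 0.02])    # linear r, g, b of one pixel

# "Stretching" in this sense: a per-pixel exposure change. The channel
# ratios, i.e. the nature of the recorded light, are unchanged.
c = 5.0
stretched = c * pixel
assert np.allclose(stretched / stretched.sum(), pixel / pixel.sum())

# A saturation boost instead pushes each channel away from the mean,
# which changes the ratios, i.e. the colour itself.
boosted = pixel.mean() + 1.5 * (pixel - pixel.mean())
assert not np.allclose(boosted / boosted.sum(), pixel / pixel.sum())
```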


This is my effort (sorry it took so long). Imported into AstroArt, split the RGB channels so I could recombine them with a white-balance adjustment, then DDP on the recombined RGB, followed by Richardson-Lucy deconvolution and a slight saturation boost.

[Image: M13 processed from the shared data]


On 22/04/2022 at 17:12, Lee_P said:

Big thanks to @mike1485 and @Gunshy for making the excellent Generalised Hyperbolic Stretch (GHS) plug-in for PixInsight, which I used for this image.

Thanks for the mention Lee - it's great to see GHS being used to good effect in your image.

On 23/04/2022 at 19:38, vlaiv said:

There is also a fundamental difference between stretching and saturation. Stretching is changing exposure at the pixel level, exposing different pixels by different amounts. By doing this you alter the amount of received light, but you don't change its nature.

Changing saturation is changing the type of light that you recorded, not only its intensity.

I think this oversimplifies things vlaiv. Stretching an image will generally affect the saturation. If (r, g, b) are the red, green and blue values for a given pixel then saturation is generally measured as 1 - (min(r, g, b) / max(r, g, b)). If an intensity transformation (stretch) increases all pixel values by the same proportionate amount, this formula clearly shows that saturation would be preserved. However, typically most initial stretches apply an intensity transformation that stretches lower values (e.g. min(r, g, b)) proportionately more than higher values (e.g. max(r, g, b)). Hence generally you will actually lose saturation as you stretch from linear. That is why images tend to look a bit washed out when you first move to the non-linear state.

One way round this would be to stretch all three channels by the same proportional amount. This is the approach used by the arcsinh stretching process. Effectively, instead of stretching each r, g, b channel separately, a single stretch ratio is calculated by reference to the average of the r, g and b values, and this same ratio is applied to all three channels. In this way saturation is maintained. Many image processors like using the arcsinh process precisely because of this feature.

However, it is not the arcsinh intensity transformation that preserves colour; rather, it is the way it is applied, i.e. in a way that ensures each channel is stretched by the same ratio. This same approach has been incorporated as an option within the GHS script as the "Colour stretch" option. GHS gives the benefit of using this approach alongside a range of different intensity transformations, including arcsinh and MTF (as used in Histogram Transformation), but also with the generalised hyperbolic equations which give the script its name (a family of extremely flexible equations well adapted to stretching astronomical images).
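The two behaviours can be checked numerically. The sketch below is hypothetical (it is not the GHS or PixInsight arcsinh code): it applies a simple arcsinh intensity transformation first per channel, then as a single ratio computed from the channel mean, and compares the saturation measure 1 - min/max defined above.

```python
import numpy as np

def sat(p):
    """Saturation measured as above: 1 - min(r, g, b) / max(r, g, b)."""
    return 1.0 - p.min() / p.max()

def arcsinh_stretch(x, k=100.0):
    """A simple arcsinh intensity transformation mapping [0, 1] to [0, 1]."""
    return np.arcsinh(k * x) / np.arcsinh(k)

pixel = np.array([0.010, 0.006, 0.002])      # faint linear r, g, b values

# Per-channel stretch: lower values are lifted proportionately more,
# so min/max rises and saturation drops (the washed-out look).
per_channel = arcsinh_stretch(pixel)

# "Colour stretch": one ratio from the channel mean, applied to all
# three channels, so min/max, and hence saturation, is preserved.
ratio = arcsinh_stretch(pixel.mean()) / pixel.mean()
colour_stretch = pixel * ratio

print(sat(pixel), sat(per_channel), sat(colour_stretch))
```

Running this shows the per-channel version ends up less saturated than the linear pixel, while the common-ratio version keeps the original saturation exactly.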


7 hours ago, mike1485 said:

Thanks for the mention Lee - it's great to see GHS being used to good effect in your image.

I think this oversimplifies things vlaiv. Stretching an image will generally affect the saturation. If (r, g, b) are the red, green and blue values for a given pixel then saturation is generally measured as 1 - (min(r, g, b) / max(r, g, b)). If an intensity transformation (stretch) increases all pixel values by the same proportionate amount, this formula clearly shows that saturation would be preserved. However, typically most initial stretches apply an intensity transformation that stretches lower values (e.g. min(r, g, b)) proportionately more than higher values (e.g. max(r, g, b)). Hence generally you will actually lose saturation as you stretch from linear. That is why images tend to look a bit washed out when you first move to the non-linear state.

One way round this would be to stretch all three channels by the same proportional amount. This is the approach used by the arcsinh stretching process. Effectively, instead of stretching each r, g, b channel separately, a single stretch ratio is calculated by reference to the average of the r, g and b values, and this same ratio is applied to all three channels. In this way saturation is maintained. Many image processors like using the arcsinh process precisely because of this feature.

However, it is not the arcsinh intensity transformation that preserves colour; rather, it is the way it is applied, i.e. in a way that ensures each channel is stretched by the same ratio. This same approach has been incorporated as an option within the GHS script as the "Colour stretch" option. GHS gives the benefit of using this approach alongside a range of different intensity transformations, including arcsinh and MTF (as used in Histogram Transformation), but also with the generalised hyperbolic equations which give the script its name (a family of extremely flexible equations well adapted to stretching astronomical images).

I probably was not clear enough in what I said, or rather, I did not say everything I meant.

I have a completely new workflow that I'm currently using, and I was thinking in terms of it. I will describe it briefly so you can see that what I said makes sense.

My current workflow involves working in the XYZ color space rather than RGB, as XYZ gives nice feedback on brightness: the Y component is in effect the luminance information.

The steps are as follows:

1. RAW RGB -> XYZ color space (a matrix multiplication; the matrix is derived from measurements with a calibrated device or of a calibrated source, for example using a DSLR to shoot a tablet displaying a number of color patches, or using a calibrated computer screen. In the first case we have a calibrated device, in the second a calibrated source.)

2. Y is extracted as mono data and saved.

3. The saved Y is stretched with visual feedback (say, we open it in PS/GIMP and stretch it using levels or curves), or some alternative is used, like DDP or whatever one likes. At this stage noise reduction and sharpening can also be performed.

4. We save the stretched Y (let's call it st_Y from now on).

5. We apply the inverse sRGB gamma to st_Y to move it back into linear space.

6. For each pixel we compute the value st_Y/Y; this will be some coefficient c.

7. We multiply the original XYZ by c. This is what I meant by changing exposure time at the pixel level: multiplying by a constant is the same as exposing for c times longer/shorter (depending on whether c is larger or smaller than 1; it is usually larger).

8. With the new XYZ values we perform the standard XYZ -> sRGB transform to get the final image (the standard matrix transform defined by the sRGB standard, plus the sRGB-defined gamma).

XYZ should be viewed as a standardized sensor used for true color processing, much like the standardized filter responses used in photometry (like ugriz / UBVRI), with the benefit that Y is luminance. That makes it convenient for processing, and also handy for easily mixing in L from LRGB: we would perform the stretch on L, derive XYZ from the RGB, discard its Y, and replace it with L. It also opens up whole new ways of imaging, like LRG, as you only need three components to derive a transform matrix to the XYZ color space.

The only trick is to use the appropriate matrix to transform the RAW camera values to XYZ; from then on you don't really care which camera was used to record the data, the workflow is the same.
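The steps above can be sketched in numpy. This is a minimal illustration under stated assumptions, not vlaiv's actual code: the XYZ-to-sRGB matrix is the standard D65 one from the sRGB specification, the RAW-to-XYZ matrix is a stand-in (a real one comes from camera calibration), and for simplicity the stretch is applied directly to linear Y, folding steps 3-5 into one function.

```python
import numpy as np

# Standard XYZ -> linear sRGB matrix (D65), per the sRGB specification.
XYZ_TO_SRGB = np.array([[ 3.2406, -1.5372, -0.4986],
                        [-0.9689,  1.8758,  0.0415],
                        [ 0.0557, -0.2040,  1.0570]])

# Stand-in for step 1's camera matrix: a real RAW -> XYZ matrix comes
# from calibration; here we simply pretend the camera is an sRGB device.
RAW_TO_XYZ = np.linalg.inv(XYZ_TO_SRGB)

def workflow(raw_rgb, stretch):
    """raw_rgb: (N, 3) linear pixels; stretch: a function applied to linear Y."""
    xyz = raw_rgb @ RAW_TO_XYZ.T            # step 1: RAW RGB -> XYZ
    y = xyz[:, 1]                           # step 2: Y is the luminance
    st_y = stretch(y)                       # steps 3-5, folded into one call
    c = np.divide(st_y, y, out=np.ones_like(y), where=y > 0)  # step 6: c = st_Y / Y
    xyz_out = xyz * c[:, None]              # step 7: per-pixel "exposure" change
    srgb_linear = xyz_out @ XYZ_TO_SRGB.T   # step 8: XYZ -> linear sRGB
    return np.clip(srgb_linear, 0.0, 1.0)   # sRGB gamma encoding would follow

pixels = np.array([[0.02, 0.015, 0.01],
                   [0.50, 0.45, 0.40]])
out = workflow(pixels, lambda y: y ** 0.5)  # toy stretch, for illustration only
```

Because c multiplies X, Y and Z together, each pixel's chromaticity is untouched and only its brightness changes, which is exactly the claim made in step 7.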


On 23/04/2022 at 17:30, vlaiv said:

Oh, this is so depressing :D

Whenever I do true color processing, it just does not look attractive :D

What do you prefer in the image: the true color rendition or the uber-saturated version? :D

[Image: comparison of true color and saturated versions of M13]

Somewhere between the two. The star at 4 o'clock from the cluster should appear white with a very slight blue tint from its spectral class. I personally don't like it when people push it to be a bright blue star. On my monitor yours is looking ever so slightly green-tinted, so I think you could stand to boost the blue channel a little.

Adam


27 minutes ago, Adam J said:

Somewhere between the two. The star at 4 o'clock from the cluster should appear white with a very slight blue tint from its spectral class. I personally don't like it when people push it to be a bright blue star. On my monitor yours is looking ever so slightly green-tinted, so I think you could stand to boost the blue channel a little.

Adam

Yes, it does look a bit greenish; I've noticed that too, but the data is not color calibrated and I worked with it as if it were.

I just performed color scaling on a B-V 0.46 star. Any slight color cast is due to the missing proper color calibration step (not just scaling, but a 3x3 matrix multiplication).

Stellarium reports that star to have a B-V of 0.26.

According to this formula:

T = 4600 K × ( 1 / (0.92 (B−V) + 1.7) + 1 / (0.92 (B−V) + 0.62) )

https://en.wikipedia.org/wiki/Color_index

It has a temperature of ~7726 K. Similarly, Gaia DR2 gives it a temperature of 7448 K.
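That formula is Ballesteros' approximation from the linked Wikipedia page; a quick sketch shows it reproduces the quoted figure:

```python
def bv_to_kelvin(bv: float) -> float:
    """Ballesteros' formula: effective temperature from the B-V colour index."""
    return 4600.0 * (1.0 / (0.92 * bv + 1.7) + 1.0 / (0.92 * bv + 0.62))

print(round(bv_to_kelvin(0.26)))   # ~7726 K, as quoted above
```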

In any case, the RGB colors corresponding to 7450 K and 7725 K are (236, 237, 255) and (231, 235, 255) respectively

(source http://www.brucelindbloom.com/index.html?ColorCalculator.html)

Magnified star from my image:

[Image: magnified star from my image]

and the RGB color obtained from the temperature, for comparison:

[Image: RGB color swatch derived from the temperature]

So yes, a tiny bit towards the green, but the overall color is good: quite close to what it is supposed to be.

Note that a 7700 K star is really not as blue as people make it out to be. You need 20,000 K+ to really start to see a bluish color, and even then it is always grey-blue, never a deep or ocean blue.


For what it’s worth, I shot this target a while ago, only because it was there, not because I particularly wanted to. I find star clusters pretty uninspiring as a visual experience, so I won’t be doing any more. Before anyone jumps on me, my point is that we all like different stuff, use different kit and drive different cars. How someone decides to represent any target is down to them. As Olly said, all of us amplify in so many ways what is really there. If we were limited by a science-first approach and not our innate creativity then Picasso would have been out of a job, and of course many thought he should have stuck with fixing bicycles. Just saying.

Love the science in the thread though, so hats off to those smarter fellows who understand that stuff and for bringing it out here.

Wishing you all clear skies and a bit more darkness.

