
M101 data set


Rodd


11 minutes ago, ollypenrice said:

This set me thinking. How close can I get to this with my data and processing?

In theory, if two different people were to process different data using the same colorimetric calibration, they should get fairly close, if not exactly the same, colors in the image.

I'm now going to try two different approaches to colorimetric calibration: a rather simple one, single-star calibration, which won't be as precise, and a more involved multi-star colorimetric calibration. The latter takes more work because I don't have software in place to do it for me and need to do much of it manually (select and measure stars, look up their temperatures in an online catalog, then solve for the transformation matrix in LibreOffice Calc).
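For the "look up their temperature" step, there is a closed-form shortcut when a star's B-V color index is known: the Ballesteros (2012) approximation. This is a sketch of that idea, not the exact catalog lookup described above:

```python
def bv_to_temperature(bv):
    """Ballesteros (2012) approximation: B-V color index -> effective
    temperature in kelvin; a reasonable fit for main-sequence stars."""
    return 4600.0 * (1.0 / (0.92 * bv + 1.7) + 1.0 / (0.92 * bv + 0.62))

# A Sun-like star (B-V about 0.65) should land near 5800 K:
t_sun = bv_to_temperature(0.65)
```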

I'll post my results later so we can discuss how well it matches above Hubble reference.

Btw - I'm not 100% sure that the Hubble team used proper color calibration, but it does look like it once we take into account the colors of the different spectral classes, their luminosity, and their abundance in the universe.


5 minutes ago, gorann said:

How do we know that Hubble got the colours right?

I'm not really sure like I mentioned in my previous post, but here is a good reference:

image.png.2b2d52b7154027d2667c9ae5bfb8c8b5.png

On the left we have the closest color to what starlight of each particular stellar class looks like to us. We also have luminosity and frequency of occurrence.

source: https://en.wikipedia.org/wiki/Stellar_classification

Put this color table next to the Hubble image above and I think you'll find that they match pretty well.

 


In fact, I was intrigued - to me it looked like the colors are good, but I had not done a direct comparison. Here is a direct comparison, and now I'm even more convinced they've done proper color calibration on that image:

image.png.81ef651abadeb80e947443d7c96fd279.png

What do you all think?

We could even adopt this as a way to test the color calibration of images: just put this little color bar in the image and you should be able to tell whether the colors are proper.


Rodd, this is my attempt at your image. Nice data btw!

m101rodd.thumb.jpg.3900102f1553da08e392cd4a70d8f5fc.jpg

DBE on all channels

linear fit

RGB combination with LRGB (L unchecked)

Ha addition with PixelMath

Photometric color calibration

Histogram transformation on RGB and L

LRGB combination

masks + curves


Here is the first set of comparison renditions:

1. Naive RGB composition + Luminance transfer:

naive2.thumb.png.aa7648f67dc2ac5f4b4174f7e26e94fb.png

This image was done in the following manner:

- Luminance was processed separately and stretched

- RGB data had the background wiped and single-star calibration was performed - a B-V 0.48 star was set to a 1:1:1 linear RGB ratio

- RGB was then loaded into GIMP, composed into an RGB image, and stretched. After that the luminance layer was added as a separate layer with mode set to luminance (simple color transfer)

The image suffers from background color gradients due to the way the color was processed. It looks like there were some high-altitude clouds in the color data, so the background is uneven, and I don't use any background neutralization algorithms other than a linear light-pollution wipe.
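The single-star calibration step can be sketched as follows. The star flux values here are hypothetical stand-ins; the real workflow measures the reference star photometrically first:

```python
import numpy as np

def single_star_white_balance(rgb, star_rgb):
    """Scale each channel so the measured reference star (here a
    B-V ~ 0.48 star assumed to be white) comes out 1:1:1.

    rgb      -- H x W x 3 array of linear, background-wiped data
    star_rgb -- measured linear (R, G, B) flux of the reference star
    """
    star_rgb = np.asarray(star_rgb, dtype=float)
    scale = star_rgb[1] / star_rgb   # normalize so green stays unchanged
    return rgb * scale

# Hypothetical measurement of the reference star in the frame:
frame = np.random.rand(4, 4, 3)      # stand-in for real linear data
star = (1.25, 1.0, 0.8)              # measured R, G, B flux
balanced = single_star_white_balance(frame, star)
```

Normalizing to the green channel is one common convention; any channel (or the mean) works, since only the ratios matter.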

2. Linear RGB ratio transfer:

linear.thumb.png.b753378a316d9388f398afcc88744f20.png

The second image was processed more in line with proper processing, but a "common mistake" was made on purpose - the color data was left linear, without the gamma adjustment required by sRGB color coding:

- Luminance same as above

- RGB data background wiped, single-star calibration as above, and the RGB ratio calculated. The RGB ratio was applied to the luminance data

- GIMP used just for color composing and export to 8-bit PNG

This is how most images are processed. People seem to add even more saturation on top of this type of processing, regardless of the fact that the colors are already stronger than they should be (people like vibrant images, it seems).
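The ratio transfer itself is simple. A minimal sketch (the scaling convention, a mean ratio of 1 for a gray pixel, is my choice, not necessarily the exact one used above):

```python
import numpy as np

def ratio_transfer(lum, rgb):
    """Color a stretched luminance with per-pixel linear RGB ratios.

    lum -- H x W stretched luminance in [0, 1]
    rgb -- H x W x 3 linear, calibrated color data
    """
    total = rgb.sum(axis=-1, keepdims=True)
    total = np.where(total == 0, 1.0, total)  # guard empty pixels
    ratio = rgb / total * 3.0                 # gray pixel -> ratio 1, 1, 1
    return np.clip(lum[..., None] * ratio, 0.0, 1.0)

# A gray color pixel leaves the luminance unchanged in all channels:
out = ratio_transfer(np.array([[0.5]]), np.array([[[2.0, 2.0, 2.0]]]))
```

Note that applied directly to a nonlinearly stretched luminance, as here, this reproduces the "common mistake" described above: the ratios are linear but the luminance is not.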

3. Gamma corrected RGB ratio transfer:

srgb.thumb.png.ce873c9daf60cb59c9513b990d8ad015.png

- luminance as above

- RGB wiped, then calibrated on a single star

- RGB ratio transfer was applied to inverse-gamma luminance and the result was then gamma corrected (as per the sRGB standard)

- final color composing and PNG export done in GIMP

Of the three approaches, the third is the most correct.
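A sketch of the third approach, using the standard sRGB transfer functions (the piecewise curves are from the sRGB specification; the ratio convention is my assumption):

```python
import numpy as np

def srgb_encode(linear):
    """Linear light -> sRGB-encoded values (both in [0, 1])."""
    a = np.asarray(linear, dtype=float)
    return np.where(a <= 0.0031308, 12.92 * a,
                    1.055 * np.power(a, 1.0 / 2.4) - 0.055)

def srgb_decode(encoded):
    """sRGB-encoded values -> linear light (inverse gamma)."""
    a = np.asarray(encoded, dtype=float)
    return np.where(a <= 0.04045, a / 12.92,
                    np.power((a + 0.055) / 1.055, 2.4))

def gamma_correct_ratio_transfer(lum_stretched, rgb_linear):
    """Apply linear RGB ratios in linear light, then re-encode as sRGB."""
    lin_lum = srgb_decode(lum_stretched)           # back to linear light
    total = rgb_linear.sum(axis=-1, keepdims=True)
    total = np.where(total == 0, 1.0, total)
    ratio = rgb_linear / total * 3.0
    colored = np.clip(lin_lum[..., None] * ratio, 0.0, 1.0)
    return srgb_encode(colored)                    # per the sRGB standard
```

The key difference from the second approach is that the per-pixel color ratios multiply *linear* light, and the gamma encoding happens only afterwards.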

The only drawback to these three images is the single-star calibration, which is inadequate for a mono camera + RGB filters, since RGB filters have very different QE curves than the RGB matching functions. To get a correct transform matrix for color calibration, multiple stars are needed and nine variables have to be determined (a 3x3 transform matrix).

That will be the next step in our quest for the true color of this image :D
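Solving for those nine matrix entries is a standard linear least-squares problem. A sketch with synthetic data standing in for real star measurements (the star colors and the "true" matrix below are fabricated purely to demonstrate the fit):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for 18 measured stars: rows are stars, columns linear R, G, B.
measured = rng.uniform(0.1, 1.0, size=(18, 3))

# A made-up transform, used here only to fabricate reference colors;
# in practice the references come from catalog temperatures.
M_true = np.array([[1.20, -0.10, 0.00],
                   [0.05,  0.90, 0.05],
                   [0.00, -0.20, 1.30]])
reference = measured @ M_true.T

# Fit the nine unknowns of M so that measured @ M.T ~= reference:
M_fit_T, *_ = np.linalg.lstsq(measured, reference, rcond=None)
M = M_fit_T.T

calibrated = measured @ M.T   # matrix-calibrated star colors
```

With real, noisy photometry the fit is only approximate, which is exactly why the spread of star temperatures in the calibration set matters.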

Btw, I'm having so much fun with this data - this is seriously good acquisition work by Rodd.

 


1 hour ago, vlaiv said:

Here is first set of comparison renditions:

I greatly appreciate the time you have allocated to this.  Here's what I think: if these are the images my data yields---28.5 hours of data---it was a waste of 28.5 hours and I am going to retire.  No offense intended... I am sure you have eked out all the data can provide, so your processing is perfect.  As for the data... I have never been as disappointed as I am now with this hobby.  If what you say is true... that M101 is most appropriately depicted as a gray, colorless image... then all the APODs and other amazing images one sees from the masters on the various forums are incorrect, and my lackluster, droll, noise-ridden, color-light image is the goal?  I do not know what to say, or think for that matter.  You have short-circuited my brain.  It needs to reboot.

Meanwhile--following as best I can the tenor of your processing goals, here is my final HaLRGB image.  Appropriately for this topsy-turvy day, about the only aspects of this image I like are the stars and background--which is most definitely not typically the case for my images.  To make matters worse--I actually gave the galaxy a saturation boost!--not a terribly large one, but without it no color at all was visible.  No doubt I overdid it and should not have done it.  A mono camera indeed!  I think the STF screen stretch on my luminance stack looks better than my image (except for the background).

Image05b.thumb.jpg.618c948a59651ca83e060a70e95ff90c.jpg

 


3 hours ago, HunterHarling said:

Rodd, this is my attempt at your image. Nice data btw!

Thanks Hunter--the stars look great.  Not sure about the galaxy saturation now... it looks good.  You got color in areas I could not.


4 hours ago, Knight of Clear Skies said:

Impressive as that image is, to me it's too blue. Especially the faint outer arms and structure which look very artificial to my eye.

I agree--but what I like about it is that it appears to be glowing--which is what galaxies do.  A light shines from within.  Also--it is very resolved at full resolution... not blurry like mine.


3 hours ago, vlaiv said:

This is how most images are processed. People seem to add even more saturation on top of this type of processing, regardless of the fact that the colors are already stronger than they should be (people like vibrant images, it seems).

Personally, I think it's worth presenting really good data in more than one way. A bold image can complement a more naturalistic one; there is nothing inherently wrong with exaggerating features to show them more clearly. But there are some things I'm not keen on:

  • Bright blue spiral arms, particularly in non-starburst galaxies. Especially in tenuous outer regions, I'd prefer to see them as more ephemeral.
  • Galaxy halos and outer arms that have been over-stretched to jump out of the background. In reality they are tenuous and amorphous. These structures don't typically have sharply defined edges, the density just drops off with distance.
  • Entirely wrong colours such as greenish Milky Way landscape shots.

Sometimes narrowband data lends itself to bolder images. The colours are artificial so why not pump up the saturation?

I've also seen some quite 'wrong' images which have brought new life to the subject, such as a super-sharp Andromeda that could be used as a scouring pad. There are no hard rules about any of this but it's good to have some idea of how 'natural' an image is. Some low-fidelity and grainy images can also really hit the mark.

I don't have the processing skill to present data as I'd like but that's fine as I get decent results, and I slowly improve.

Edited by Knight of Clear Skies

11 minutes ago, Rodd said:

I greatly appreciate the time you have allocated to this.  Here's what I think: if these are the images my data yields---28.5 hours of data---it was a waste of 28.5 hours and I am going to retire.  No offense intended... I am sure you have eked out all the data can provide, so your processing is perfect.  As for the data... I have never been as disappointed as I am now with this hobby.  If what you say is true... that M101 is most appropriately depicted as a gray, colorless image... then all the APODs and other amazing images one sees from the masters on the various forums are incorrect, and my lackluster, droll, noise-ridden, color-light image is the goal?  I do not know what to say, or think for that matter.  You have short-circuited my brain.  It needs to reboot.

Don't be disappointed. I'm after something else - the "documentary" value of the image: the real color of the thing, the one you would likely see with your own eyes if you amplified the light from that galaxy enough to see color.

This is not necessarily what people value as a good image. Your data is good, and it is about what you make of it. I like to make an image accurate and nicely rendered / sharp, but without obvious signs of forcing the data to be sharp or other obvious signs of processing - I like that natural feel and I strive for it. This of course does not mean that someone else, with a different processing style and the aspiration to present the image in a certain way, won't make your data look different.

I do have one additional goal in all of this - to educate people who share my aim of making documentary-type images: how to process data so it is "scientifically" consistent, or in this case renders proper true color.

It is also important to understand that someone's rendition of a target might not be what that target actually looks like. In fact, as many people now take up astrophotography, there is a sort of copycat culture. We search previous work online to get a sense of what our work should look like. That is only natural - when we learn, we look to others to see how it's done. The issue is that this behavior can propagate and cement beliefs that are wrong - like that strongly saturated colors are proper colors in the sense that "this is what the target looks like", or that "there is no true color" in astrophotography.

In the end - I'm after left side of this image:

What-is-Color-Correction-Color-Grading-C

but I'm certain that it is the right side of this image that provides viewing pleasure for spectators around the world.


I'm seriously reluctant to post this image because I fear for Rodd's health and his interest in astronomy.

Rodd, please don't take this image seriously, as it is an example of partial / failed calibration. I'll explain why it failed and why it gives the result it gives.

First, a bit of background: this calibration was done with 18 stars. The image was plate solved by Astrometry.net, 18 stars were randomly picked (well, many more, but many were not found in Simbad), their effective temperatures were extracted from VizieR / GAIA DR2, and the photometric measurements were done in AstroImageJ.

This provided 18 color points from which to calculate the transform matrix. Here is the transform matrix:

image.png.dd6961e71a7b2125a4f12fed2eda04f5.png

And here is "Delta E" (calculated in linear RGB), for different temperatures:

image.png.0916634c927f935ea69a617007a04828.png

As you can see, the error starts to get quite large at a temperature of 6550K (~0.07), and you'll also notice the very limited temperature range used for calibration: ~3990K - 6790K. The issue is that, as we have seen previously, A, O and B class stars tend to be rare in the main-sequence population - and as luck would have it, I did not manage to get a single one of them into my calibration set (luck, or simple probability? :D ). In any case, I did not have one blue star in my calibration set. For that reason the error starts to increase at the blue end of the scale, and since I had only a few samples below 5200K, the error starts to increase there as well. Anyway, here is the image:

srgb_trans.thumb.png.befa4710329262bd2470926cd8c6cafd.png

You'll notice that it is completely devoid of blue :D - but that is because of the poor calibration: I did not have a single blue star in my calibration set, nor a deep orange one for that matter.
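The post doesn't spell out its "Delta E" formula; one plausible reading, a chromaticity-only Euclidean distance in linear RGB, looks like this:

```python
import numpy as np

def delta_e_linear(c1, c2):
    """Euclidean distance between two linear RGB colors after each is
    normalized to unit sum, so only chromaticity (not brightness)
    contributes to the error."""
    c1 = np.asarray(c1, dtype=float)
    c2 = np.asarray(c2, dtype=float)
    return float(np.linalg.norm(c1 / c1.sum() - c2 / c2.sum()))

# Same chromaticity at different brightness -> zero error:
err = delta_e_linear((1.0, 1.0, 1.0), (2.0, 2.0, 2.0))
```

Comparing the matrix-calibrated color of each star against the color expected from its catalog temperature with a measure like this would reproduce a per-temperature error table of the kind shown above.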


58 minutes ago, vlaiv said:

Don't be disappointed. I'm after something else - the "documentary" value of the image: the real color of the thing, the one you would likely see with your own eyes if you amplified the light from that galaxy enough to see color.

This is not necessarily what people value as a good image. Your data is good, and it is about what you make of it. I like to make an image accurate and nicely rendered / sharp, but without obvious signs of forcing the data to be sharp or other obvious signs of processing - I like that natural feel and I strive for it. This of course does not mean that someone else, with a different processing style and the aspiration to present the image in a certain way, won't make your data look different.

I do have one additional goal in all of this - to educate people who share my aim of making documentary-type images: how to process data so it is "scientifically" consistent, or in this case renders proper true color.

It is also important to understand that someone's rendition of a target might not be what that target actually looks like. In fact, as many people now take up astrophotography, there is a sort of copycat culture. We search previous work online to get a sense of what our work should look like. That is only natural - when we learn, we look to others to see how it's done. The issue is that this behavior can propagate and cement beliefs that are wrong - like that strongly saturated colors are proper colors in the sense that "this is what the target looks like", or that "there is no true color" in astrophotography.

In the end - I'm after left side of this image:

What-is-Color-Correction-Color-Grading-C

but I'm certain that it is the right side of this image that provides viewing pleasure for spectators around the world.

You need an optician! Or a sunny day!! 😁 The only way a face looks like the left hand side to my eye is in a thick fog. The right hand side is a little saturated but not by that much.

1 hour ago, Rodd said:

I greatly appreciate the time you have allocated to this.  Here's what I think: if these are the images my data yields---28.5 hours of data---it was a waste of 28.5 hours and I am going to retire.  No offense intended... I am sure you have eked out all the data can provide, so your processing is perfect.  As for the data... I have never been as disappointed as I am now with this hobby.  If what you say is true... that M101 is most appropriately depicted as a gray, colorless image... then all the APODs and other amazing images one sees from the masters on the various forums are incorrect, and my lackluster, droll, noise-ridden, color-light image is the goal?  I do not know what to say, or think for that matter.  You have short-circuited my brain.  It needs to reboot.

 

We are making pictures, Rodd. They are not, and never will be, reality. What about writers? Take a writer who is trying to evoke a scene so that it comes to life for the reader. A writer who plods through all the objective details will never succeed in doing this. Good writing involves embracing the subjective experience, emphasizing the features which define that experience over those which are to be found everywhere. Exaggeration? Of course, it is necessary. Selection of features of interest? Obviously - or you'll drown in details about the character's bus ticket that day. (It was a slightly yellowish grey, just a tad less yellow and more grey than the lady next door's poodle's collar, except when it was wet. The collar's yellow became slightly more intense in the rain... zzzzzzzz!!!)

1 hour ago, Knight of Clear Skies said:

Personally, I think it's worth presenting really good data more than one way. A bold image can complement a more naturalistic one, there is nothing inherently wrong with exaggerating features to show them more clearly. But there are some things I'm not keen on:

  • Galaxy halos and outer arms that have been over-stretched to jump out of the background. In reality they are tenuous and amorphous. These structures don't typically have sharply defined edges, the density just drops off with distance.

 

I agree, though I think that exaggerating galaxy halos and outer arms, if they are genuinely buried in the data, is the whole point of a certain kind of imaging. If you don't want to see them, do visual observing and you won't. If you're interested in what's really there, as I am, you'll need to exaggerate them to reveal them. I don't do astrophotography to confirm what I can't see at the eyepiece, I do it to find what I can't see at the eyepiece.

The discussion on colour is very interesting and confirms something I've long suspected - that galaxies are not, in fact, all that colourful and are probably a lot greener than we usually see them.

Olly


5 hours ago, ollypenrice said:

This set me thinking. How close can I get to this with my data and processing? Here I'm using my own M101, not Rodd's. I'm unsure how to de-blue any harder than this, and I can't really get that silvery-green, silvery-blue look. I think they might have me on resolution as well... 🤣

Thoughts, anyone?

415068583_M101Hubbleteamcolour.thumb.jpg.4b07107172c33f5476e13cb6c4064147.jpg

Olly

 

 

 

 

You sure got a lot of faint stuff. Vlaiv has shocked my system and I am seeing pink elephants and spotted zebras, so the color I will leave alone... I think it looks good.


Here's the RGB image, processed in PixInsight:

  • ChannelCombination
  • Dynamic Crop, removing the stacking edges
  • DBE
  • Background neutralization
  • Photometric colour calibration
  • Arcsinh stretch (which is supposed to preserve colour balance)

No additional colour saturation was used, and since PI is quite rigorous in its mathematics, I think this is as close as you can come to natural colours à la PixInsight.

m101_RGB.thumb.jpg.70107352a9d5027ff5dbc100099303b6.jpg

For me, this is the starting point of my image processing. I will push the colours, and process the luminance data to keep as much colour in the galaxy as possible, while at the same time lifting the faint arms and suppressing the core.

Btw, if natural colours are the aim, then how does narrowband Ha fit in? Doesn't this destroy the (natural) colour balance?
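The colour-preserving property of the arcsinh stretch mentioned in the list above comes from scaling all three channels of a pixel by one common factor. A sketch of the idea (not PixInsight's exact implementation):

```python
import numpy as np

def arcsinh_stretch(rgb, stretch=50.0):
    """Arcsinh stretch: every channel of a pixel is multiplied by the
    same factor, so linear R:G:B ratios (hence hue) are preserved."""
    rgb = np.asarray(rgb, dtype=float)
    ref = rgb.mean(axis=-1, keepdims=True)         # per-pixel intensity
    ref_safe = np.where(ref == 0, 1.0, ref)
    factor = np.arcsinh(stretch * ref) / (np.arcsinh(stretch) * ref_safe)
    return np.clip(rgb * factor, 0.0, 1.0)

# A faint pixel keeps its 2:1 red-to-green ratio after stretching:
out = arcsinh_stretch(np.array([[[0.02, 0.01, 0.005]]]))
```

Contrast this with stretching each channel independently (e.g. a per-channel histogram transformation), which changes the ratios and desaturates bright regions.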

Edited by wimvb

2 hours ago, ollypenrice said:

We are making pictures, Rodd. They are not, and never will be, reality. What about writers? Take a writer who is trying to evoke a scene so that it comes to life for the reader. A writer who plods through all the objective details will never succeed in doing this. Good writing involves embracing the subjective experience, emphasizing the features which define that experience over those which are to be found everywhere. Exaggeration? Of course, it is necessary. Selection of features of interest? Obviously - or you'll drown in details about the character's bus ticket that day. (It was a slightly yellowish grey, just a tad less yellow and more grey than the lady next door's poodle's collar, except when it was wet. The collar's yellow became slightly more intense in the rain... zzzzzzzz!!!)

I know... and my picture is tosh.  Your image is not.  What to exaggerate?  How to exaggerate?  What are the limits?  Therein lies the key.  It's not all personal taste.  There is an ingredient (or several) missing from the recipe... I am trying to identify them.

1 hour ago, wimvb said:

Here's the RGB image, processed in PixInsight:

That looks exactly like mine at the RGB point.  Finished product?


15 minutes ago, Rodd said:

Finished product?

Definitely not. But when I process the lum and combine it with the colour, I lose the very faintest arms of the galaxy. The data is in the colour image, but not in the L. Even when I super-stretch the L, I can't separate the faint arms from the noise floor. But they're very evident in the blue channel of the colour image. You can just barely see them in the RGB image I posted here. I will try with a synthetic luminance.
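A synthetic luminance is just a (possibly noise-weighted) combination of the colour channels; a minimal sketch, with the weighting scheme as my assumption:

```python
import numpy as np

def synthetic_luminance(rgb, weights=(1.0, 1.0, 1.0)):
    """Combine the colour channels into a synthetic L.

    rgb     -- H x W x 3 stack of registered colour data
    weights -- per-channel weights, e.g. inverse noise variances
    """
    w = np.asarray(weights, dtype=float)
    return (rgb * w).sum(axis=-1) / w.sum()

# Weighting blue more heavily favors structure (like faint arms)
# that is mostly present in the B channel:
lum = synthetic_luminance(np.random.rand(4, 4, 3), weights=(1.0, 1.0, 3.0))
```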


3 minutes ago, wimvb said:

Definitely not. But when I process the lum, and combine it with the colour, I lose the very faintest arms in the galaxy. The data is in the colour image, but not in the L. Even when I superstretch the L, I can't separate the faint arms from the noise floor. But it's very evident in the blue channel of the colour image. You can just barely see it in the RGB image I posted here. I will try with the synthetic luminance.

My lum is brighter than the blue. Something sounds off.  I see the extensions more in lum than in any other channel.


4 hours ago, ollypenrice said:

You need an optician! Or a sunny day!! 😁 The only way a face looks like the left hand side to my eye is in a thick fog. The right hand side is a little saturated but not by that much.

We are making pictures, Rodd. They are not, and never will be, reality. What about writers? Take a writer who is trying to evoke a scene so that it comes to life for the reader. A writer who plods through all the objective details will never succeed in doing this. Good writing involves embracing the subjective experience, emphasizing the features which define that experience over those which are to be found everywhere. Exaggeration? Of course, it is necessary. Selection of features of interest? Obviously - or you'll drown in details about the character's bus ticket that day. (It was a slightly yellowish grey, just a tad less yellow and more grey than the lady next door's poodle's collar, except when it was wet. The collar's yellow became slightly more intense in the rain... zzzzzzzz!!!)

I agree, though I think that exaggerating galaxy halos and outer arms, if they are genuinely buried in the data, is the whole point of a certain kind of imaging. If you don't want to see them, do visual observing and you won't. If you're interested in what's really there, as I am, you'll need to exaggerate them to reveal them. I don't do astrophotography to confirm what I can't see at the eyepiece, I do it to find what I can't see at the eyepiece.

The discussion on colour is very interesting and confirms something I've long suspected - that galaxies are not, in fact, all that colourful and are probably a lot greener than we usually see them.

Olly

I agree with Olly - my aim with processing is to exaggerate both the faint stuff and the colours. Visual astronomy is essentially a B/W experience, and imaging allows us to bring out colours. Earlier today I entertained myself by putting together some HaRGB images of Sharpless objects that I imaged in Feb-March. There were almost no traces of these nebulae in the RGB data - they were only visible in the Ha data. Ha is red, so I used blend mode lighten in PS to add it to the red channel. So red is red, but much more red than our eyes could ever see - probably not even if we put an eyepiece on the biggest scopes on Earth (if there were a way to fit an eyepiece to them):

SH2-170,173, 183, 207 CompositeNew 173smallSign.jpg
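The "lighten" blend of Ha into red can be sketched as a per-pixel maximum, which is a plausible reading of what Photoshop's lighten mode does on a single channel:

```python
import numpy as np

def add_ha_lighten(rgb, ha):
    """Blend a narrowband Ha frame into the red channel using a
    'lighten' rule: red becomes the max of itself and the Ha signal."""
    out = np.array(rgb, dtype=float, copy=True)
    out[..., 0] = np.maximum(out[..., 0], ha)
    return out
```

Because the blend only ever raises the red channel where Ha is brighter, it adds nebulosity without darkening the broadband signal already there.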

Edited by gorann

22 minutes ago, Rodd said:

My lum is brighter than the blue. Something sounds off.  I see the extensions more in lum than in any other channel.

My processing, very likely. It's getting late, so I'll have to continue this with fresh eyes tomorrow.


25 minutes ago, wimvb said:

My processing, very likely. It's getting late, so I'll have to continue this with fresh eyes tomorrow.

No processing needed. Just hit both with STF and see. It's not what you would call ideal, but it is based on the data and gives an idea.


7 minutes ago, Rodd said:

Let me ask you all. Do you think adding 5 or 6 more hours of lum would help?

To be honest, I dropped 34 hours of exposure on M101 with my Edge scope a few years ago and it looked worse than your image. I blame the target; if I were you I'd spend that time on some other galaxy, since M101 has very low surface brightness. Markarian's Chain might be a good target for your scope, btw :)


1 hour ago, HunterHarling said:

To be honest, I dropped 34 hours of exposure on M101 with my Edge scope a few years ago and it looked worse than your image. I blame the target; if I were you I'd spend that time on some other galaxy, since M101 has very low surface brightness. Markarian's Chain might be a good target for your scope, btw :)

Too small a FOV, even with the reducer.

