
gorann

What is the true colour of M104 Sombrero galaxy?


When I saw the excellent image of the Sombrero that tomato just posted, I had a look at my short attempt at it (3 x 10 min, Canon 60Da on an Esprit 150), taken close to the horizon on 22 April last year (the final image of that season). I was struck by the difference in colour: my version was more yellowish. When I then had a look at other posted versions I saw everything from bluish to white and yellowish. I have now had a go at my image with curves in Lab colour mode in PS and made it bluish. I kind of like it, but what is the true colour, really? Could it have to do with how close it is to the horizon?

Here are my old and new versions:

IMG6305-7PS15smallSign.jpg

IMG6305-7PS19smallSign.jpg


The final result without colour calibration will indeed be subject to position in the sky. Lower down, light travels through more atmosphere, so more of the blue part of the spectrum gets scattered away (think of the setting sun just above the horizon - it is distinctly red rather than yellow).

To get close to true colour, one should do colour calibration. Maybe the simplest method would be to take a few stars of known spectral class in the image and adjust the colour balance until you get their colour right.
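A minimal sketch of that idea, assuming we already have the star's measured RGB from the linear stack and a target colour for its spectral class - all numbers here are made up for illustration:

```python
import numpy as np

# Hypothetical measured mean RGB of a reference star in the linear stack
measured = np.array([0.80, 1.00, 0.65])   # R, G, B (green-normalised ADU)

# Illustrative target colour for that star's spectral class
target = np.array([1.00, 1.00, 0.95])

# Per-channel scale factors that map the measured star onto its target
scale = target / measured

def calibrate(rgb_image):
    """Apply the per-channel factors to a linear H x W x 3 image."""
    return rgb_image * scale

# A toy 4 x 4 patch filled with the star's colour
star_patch = np.full((4, 4, 3), measured)
print(calibrate(star_patch)[0, 0])  # now matches the target colour
```

In practice you would average several such stars to reduce measurement noise, and do all of this before any stretching.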


There is a free program, eXcalibrator (http://bf-astro.com/eXcalibrator/excalibrator.htm), that can assist you in calibrating the image. It analyses star colours in your image, compares them to star colour indices in databases, and then calculates the colour correction factors.

2 hours ago, vlaiv said:

The final result without colour calibration will indeed be subject to position in the sky. Lower down, light travels through more atmosphere, so more of the blue part of the spectrum gets scattered away (think of the setting sun just above the horizon - it is distinctly red rather than yellow).

To get close to true colour, one should do colour calibration. Maybe the simplest method would be to take a few stars of known spectral class in the image and adjust the colour balance until you get their colour right.

I have to admit I am not sure what procedure to use in Photoshop for calibrating colours.

1 hour ago, drjolo said:

There is a free program, eXcalibrator (http://bf-astro.com/eXcalibrator/excalibrator.htm), that can assist you in calibrating the image. It analyses star colours in your image, compares them to star colour indices in databases, and then calculates the colour correction factors.

Thanks - it seems useful, but I see that it is a Windows program and I do my processing on a Mac :(

3 minutes ago, gorann said:

I have to admit I am not sure what procedure to use in Photoshop for calibrating colours.

Maybe the easiest, though not very accurate, way to do it would be:

Look up a few stars in the image and find their spectral class - you can even use Stellarium for this and match a couple of stars "by eye". Pick stars that are not so bright that they saturate when you stretch them.

Use this page as a reference for the RGB values of a given spectral class:

http://www.vendian.org/mncharity/dir3/starcolor/

Use the Channel Mixer to tweak the colour balance while the image is still linear - after channel composition. Stretch the RGB image to bring out the colour, and use the inspector tool to read the R, G and B values of the test stars. Repeat until you get a close match.

Afterwards, do the colour → luminance transfer (in PS you can do that by converting the RGB image to Lab and then recomposing back to RGB, but using the stretched luminance as the L component instead of the original one).
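The colour → luminance transfer can be sketched numerically as well. This sketch substitutes a simple Rec. 709 luminance weighting for the true Lab L channel, so it is only an approximation of the PS procedure, not the same maths:

```python
import numpy as np

# Rec. 709 luminance weights - a stand-in for the Lab L channel
W = np.array([0.2126, 0.7152, 0.0722])

def transfer_luminance(rgb, new_lum, eps=1e-12):
    """Rescale each pixel's RGB so its luminance matches new_lum,
    keeping the original colour ratios (the chrominance) intact."""
    old_lum = rgb @ W                     # per-pixel luminance
    gain = new_lum / np.maximum(old_lum, eps)
    return rgb * gain[..., None]

rgb = np.array([[[0.2, 0.4, 0.1]]])       # one pixel, linear RGB
stretched = np.array([[0.9]])             # luminance after stretching
out = transfer_luminance(rgb, stretched)
print(out @ W)                            # luminance is now 0.9
```

The per-pixel gain preserves the R:G:B ratios, which is the point of the transfer: colour comes from the calibrated linear data, brightness from the stretched luminance.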

3 hours ago, vlaiv said:

To get close to true color, one should do color calibration. Maybe simplest method would be to take a few stars in the image of known spectral class and adjust color balance until you get their color right.

That's just what Photometric Colour Calibration in PixInsight does.

@gorann: you have this already. No need for PS.

Gorann, I have spent some time working on G2V star colours and I am very impressed with the results - some of my images have needed no colour calibration whatsoever. You need to do it for each of your imaging trains, and it doesn't take long to do the calculations.

Find a G2V star that is in the region you are imaging.

1. Capture images through each of the R, G and B filters at 1, 2 and 3 secs.

2. For each filter, choose the exposure duration that is not saturated.

3. Zoom in and check the ADU or intensity of each image.

4. Calculate each channel as a ratio, with green as 1.00.

So for example my 10" Truss with my Moravian G2-8300 and Chroma RGB filters came back with:

Red 2.41 → ratio 1.16
Green 2.81 → ratio 1.00
Blue 2.10 → ratio 1.33

I will do the same for my VX10, Esprit 80 and Esprit 100 before I start using them again for RGB.

I just googled G2V stars and one of the results I got was this http://www.solstation.com/stars3/100-gs.htm

Hope that helps?

NB: Just to add, when I did M51 this translated to:

600sec L

696sec R

600sec G

798sec B
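The ratios and the M51 exposure times above can be reproduced with a few lines. The ratios appear to be truncated, not rounded, to two decimals - truncation is what yields exactly 696 s and 798 s:

```python
import math

# Measured G2V star intensities per filter (values from the post above)
measured = {"R": 2.41, "G": 2.81, "B": 2.10}

# Each channel's ratio relative to green, truncated to two decimals
# (truncation, rather than rounding, reproduces the 1.16 / 1.33 above)
ratios = {f: math.floor(measured["G"] / v * 100) / 100
          for f, v in measured.items()}

# Scale a 600 s base exposure by each ratio
exposures = {f: round(600 * r) for f, r in ratios.items()}
print(ratios)     # {'R': 1.16, 'G': 1.0, 'B': 1.33}
print(exposures)  # {'R': 696, 'G': 600, 'B': 798}
```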

1 hour ago, Jkulin said:

Gorann, I have spent some time working on G2V star colours and I am very impressed with the results as some of my images have needed no colour calibration whatsoever…

That's a sound procedure. But I see one practical problem with the use of mono CMOS cameras: since you generally can't scale darks, you'd need a master dark for each filter's exposure length. Of course, as long as you can reuse the darks, this is more of a nuisance than a problem.


I see another problem with that - it won't always work as expected; there will be small variations.

Depending on LP levels, each sub - both the ones used to derive the ratios and the ones used for the image - will have a different level of offset, which will skew the results a bit.

It's better to "wipe" each sub prior to the calculations - that is, set the background mean value to 0 for each colour. If you "scale" your subs by using different exposure lengths, you are not accounting for the LP level on a particular night.

Another concern is the use of flats. Flats tend to "colour" the image, depending on the colour of the light source used for them. It will generally be "white", but it will have a certain "warmth" to it - maybe cool white from LEDs or warm white from other sources. One way to deal with this is to "normalise" the flats prior to calibrating with them - scale each colour's flat to the 0-1 range.
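Both ideas - background "wiping" and flat normalisation - can be sketched in a few lines. The median here is a crude background estimate that assumes most pixels are sky; real tools fit more robust background models:

```python
import numpy as np

def wipe(sub):
    """Set the background mean to zero by subtracting a background
    estimate (the median - assuming most pixels are empty sky)."""
    return sub - np.median(sub)

def normalise_flat(flat):
    """Scale a flat to the 0-1 range so the flat light source's own
    colour cast does not leak into the calibrated image."""
    return flat / flat.max()

# Toy sub: uniform sky offset of 100 ADU plus one bright "star" pixel
sub = np.full((8, 8), 100.0)
sub[4, 4] += 6000.0
print(wipe(sub)[4, 4])   # star signal with the offset removed: 6000.0
```

With the offset removed from every channel, the per-channel ratios measured on a star no longer depend on that night's light pollution.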


Hi Lads,

Just a couple of things to mention:

1. If you are shooting from the same place then the LP levels are unlikely to vary by much, and I totally dismiss anything to do with LP as I shoot from my back yard.

2. With regard to flats, I calibrate with flats every time using an LED flat panel, so my flats are always consistent.

3. Not sure what you mean by wiping each sub, Vlaiv - just capture the values from the G2V star and use them; there is no need for rocket science. All you are doing is trying to calibrate your captures to represent the light emitted from our own star. Who knows what the actual colour is - it might be totally different from what we think we are seeing. All the G2V method does is try to balance the images the way our eyes would see them in our solar system.

4. Darks are not necessarily needed with CCD and I'm not sure if they are as vital with CMOS (I don't use CMOS for captures). I do use darks, but not a fresh set - I do them about once a year. One of the biggest advocates of G2V is my mate Peter Shah, and he doesn't take any darks whatsoever because his full-frame sensor doesn't have any set-point cooling. There aren't many people able to challenge Peter's processing.

You either believe in G2V principles or you don't; for every person that advocates it there are others that dismiss it. All I can say is give it a try, and if it doesn't work, what have you lost?

BTW, when you look on Astrobin at images whose capture details indicate varied capture times in RGB, you can guarantee that they have used G2V.

11 minutes ago, Jkulin said:

Darks are not necessarily needed with CCD and I'm not sure if they are as vital with CMOS (I don't use CMOS for captures). I do use darks, but not a fresh set - I do them about once a year. One of the biggest advocates of G2V is my mate Peter Shah, and he doesn't take any darks whatsoever because his full-frame sensor doesn't have any set-point cooling. There aren't many people able to challenge Peter's processing.

I agree, darks aren't always needed, and they can be reused. But many CMOS cameras suffer from amp glow to some degree; in that situation you do need darks, and they can't be scaled.

Peter Shah's results with G2V calibration certainly are stunning.


The version I posted was after running the star colour calibration tool in APP; loading the calibrated LRGB data directly into StarTools gave a distinctly yellow galaxy. The subject was only about 20 degrees above the horizon from @Tomatobro's location.

combine-RGB-image-fy-0degCW-1.jpg

39 minutes ago, tomato said:

The version I posted was after running the star colour calibration tool in APP; loading the calibrated LRGB data directly into StarTools gave a distinctly yellow galaxy. The subject was only about 20 degrees above the horizon from @Tomatobro's location.

 

Interesting, tomato - in your new version everything seems to have been turned yellowish, so that procedure seems not to have worked too well.

Meanwhile, I followed @wimvb's suggestion and used Photometric Color Calibration in PI on both my versions (yellowish and bluish). It did a lot of calculations and altered my images (I used the default settings), but the end result was not the same for the two, which I would have expected it to be. This is what the yellowish and bluish ones look like now after the PI procedure. I have to say that it worked better on the bluish one, as the other one now looks quite greenish.

IMG6305-7PS19b(Photomet ColCal).jpg

IMG6305-7PS15b(Photomet ColCal).jpg

48 minutes ago, Jkulin said:

Hi Lads,

Just a couple of things to mention:

3. Not sure what you mean by wiping each sub Vlaiv, just capture the values from the G2V star and use them, there is no need for rocket science…

I was not implying that G2V calibration does not produce good results - my argument was more theoretical than anything. There are a couple of issues with G2V calibration itself, and also with the way you are using it.

Again - these are theoretical issues, and G2V calibration - both the "regular" kind and the one you are using - can give very credible results, so I'm not against it.

The way you are doing it leaves room for slight error in comparison to "regular" G2V calibration, in which the colours are scaled so as to give a 1:1:1 ratio on a G2V star.

By "wiping" frames, I mean setting the background to 0 mean value across the frames used to read off the respective ratios (R, G and B). I also mean that before applying these ratios, one should again set the background to 0 mean value.

Here is why:

Let's suppose that a G2V star and a given sensor/imaging system give ratios of 0.6 : 1 for B and G respectively, so the camera is "less sensitive" in blue by a factor of 0.6 - or rather, you need to divide the blue channel by 0.6. Now, we image such a G2V star and end up with 6000e in blue and 10000e in green from the star. However, there is LP, no matter how small - let's make it 100e in both channels for the sake of argument (it will actually differ depending on the type of LP). This means that we will read off 6100 and 11000 electrons in the respective channels.

If we take the ratio of those two, 6100/11000 : 11000/11000 = ~0.554 : 1, we end up with a wrong ratio that is no longer 0.6 : 1. We have introduced a small colour shift.

So the proper/regular way to do G2V calibration does not rely on exposure duration, but rather on "wiped" subs and measured ADU values. You take an exposure of a G2V star, calibrate and "wipe" the sub, do the measurement, and then apply the derived coefficients to the (again calibrated and "wiped") stack of lights.

Just for clarity, I'll include a bit about why G2V calibration (regular or otherwise) is not the optimal method of colour calibration - there are better / more accurate ways to do it.

First, let's examine direct RGB calibration. Here are the RGB colour matching functions:

[Image: CIE RGB colour matching functions]

The interesting thing about the RGB matching functions is that there is a part of the interval where the r matching function is negative. This means that some colours the human eye can see can't be produced by an RGB-type device, no matter what. This is called the gamut of the device - how much of the observable colour space it is able to record/reproduce. All pure spectral colours except the three primaries lie outside the RGB gamut. This is a general limitation for any type of colour calibration - there will always be some error in reproduction.

But now, specific to G2V calibration: G2V calibration produces 3 coefficients - one for each of red, green and blue. Now look at the response curves of typical RGB filters:

[Image: typical RGB filter response curves]

Compare that to the rgb matching functions above. The RGB matching functions overlap, so a particular wavelength can produce a response in both green and red, for example. RGB filters, on the other hand, separate red and green distinctly.

This means that simple per-channel coefficients can't produce a good calibration - you need a mix of the colours from the RGB filters. For example, let's say we have a light source that produces wavelengths from 560 to 700 nm. The R filter will collect only the part in the 600-700 nm range, while the G filter will capture only the part in the 560-580 nm range. The colour matching functions, on the other hand, will pick up the full 560-700 nm range for red and 560-640 nm for green. In order to reproduce this with filters we need an expression like this:

red_result = R_filter * A1 + G_filter * A2

green_result = R_filter * A3 + G_filter * A4

or, in the general case, we need at least a 3x3 matrix of coefficients to do the conversion between the filter recording and RGB space. You can't do that with a single G2V star. You need a bunch of stars of different colours / stellar classes to build your transform matrix and get more accurate colour reproduction.

This would be considered a "step above" single-star G2V calibration. There is a problem with this approach, however, and it lies in the fact that the r matching function is negative over one interval. We can't record negative light, nor can RGB values be negative - you can't emit negative light. This imposes the RGB gamut limit onto our camera, further reducing the already smaller gamut of the camera/filter combination.

The best way to do colour calibration is in the CIE XYZ absolute colour space - or rather, build the above 3x3 transform not to RGB but to the XYZ colour matching functions, and convert to XYZ colour space first. This keeps all the available colour information regardless of the type of display used to reproduce it. Later you can convert to sRGB in processing for display on the internet / regular monitors, or convert to Adobe RGB / wide-gamut RGB / ProPhoto RGB and such if you have display devices capable of showing that sort of colour gamut.
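The final XYZ → sRGB step uses the standard matrix from the sRGB specification. The camera → XYZ matrix itself has to be fitted per camera/filter combination from many reference stars, so an identity placeholder stands in for it here:

```python
import numpy as np

# Standard linear XYZ -> sRGB matrix (IEC 61966-2-1, D65 white point)
XYZ_TO_SRGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

# Camera -> XYZ matrix: must be fitted from stars of many spectral
# classes; the identity here is a placeholder, not a real calibration
CAM_TO_XYZ = np.eye(3)

def camera_to_srgb(cam_rgb):
    """Map linear camera values through XYZ into linear sRGB."""
    xyz = CAM_TO_XYZ @ cam_rgb
    return XYZ_TO_SRGB @ xyz

# Sanity check: the D65 white point should map to roughly (1, 1, 1)
d65 = np.array([0.9505, 1.0000, 1.0890])
print(camera_to_srgb(d65))
```

Out-of-gamut colours come out of this matrix as negative sRGB components and get clipped - which is exactly the gamut limitation described above.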

The XYZ matching functions are modelled on human eye sensitivity (for example, Y closely matches the eye's luminance response) and can capture the whole gamut - every colour the human eye can see (that is, every physical colour; there are also "imaginary colours" that are not a product of light but rather of our mind, a bit like an optical illusion):

[Image: CIE XYZ colour matching functions]

7 minutes ago, gorann said:

Interesting, tomato - in your new version everything seems to have been turned yellowish, so that procedure seems not to have worked too well.

Meanwhile, I followed @wimvb's suggestion and used Photometric Color Calibration in PI on both my versions (yellowish and bluish). It did a lot of calculations and altered my images (I used the default settings), but the end result was not the same for the two, which I would have expected it to be. This is what the yellowish and bluish ones look like now after the PI procedure. I have to say that it worked better on the bluish one, as the other one now looks quite greenish.

IMG6305-7PS19b(Photomet ColCal).jpg

IMG6305-7PS15b(Photomet ColCal).jpg

I don't quite follow what you did there.

Are those two images produced from the same data, or different data? If the same data: colour calibration can't be done on a stretched and processed image. Do it on the raw, still-linear stacks of R, G and B prior to any processing.

8 minutes ago, gorann said:

I followed  @wimvb suggestion and used Photometric Color Calibration in PI on both my versions (yellowish and blueish)

I don't get it. Isn't that the same raw data?


I need to correct myself in the post above - I did not use the proper numbers. The recorded values will be 6100 and 10100, not 11000, so the actual ratio we get will be 6100/10100 : 10100/10100 = ~0.604 : 1. Different from above, but still different from the expected 0.6 : 1 (by a small amount, admittedly - this is why G2V works in principle when done like that: the error will be at most 1 out of 255).
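The corrected arithmetic can be checked directly:

```python
true_blue, true_green = 6000, 10000     # star photons (0.6 : 1 by design)
lp = 100                                # light-pollution offset per channel

b, g = true_blue + lp, true_green + lp  # what we actually record
print(b / g)                            # ~0.604 - skewed away from 0.6

# "Wiping" (subtracting the background) restores the true ratio
print((b - lp) / (g - lp))              # 0.6 exactly
```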


I used it on my final stretched images. Obviously that does not work - but why not? It would be more convenient.

5 minutes ago, gorann said:

I used it on my final stretched images. Obviously that does not work - but why not? It would be more convenient.

You need to do colour calibration on linear data.

It's always the third step in my workflow:

  • Crop
  • DBE
  • Colour calibration

16 minutes ago, wimvb said:

You need to do colour calibration on linear data.

It's always the third step in my workflow:

  • Crop
  • DBE
  • Colour calibration

That is a PI workflow. I am a simple PS processor....


Interestingly enough, G2V is not the "whitest" stellar class out there - F5I is closer to pure white: #fff2e8 for G2V vs #fffafe for F5I, according to the black-body spectrum for the respective temperatures.
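That trend can be sanity-checked with Planck's law. The effective temperatures here (~5780 K for G2V, ~6500 K for an F5 star) are assumed round figures, not precise catalogue values:

```python
import math

H, C, K = 6.626e-34, 2.998e8, 1.381e-23  # Planck, speed of light, Boltzmann

def planck(wavelength_m, temp_k):
    """Spectral radiance of a black body (Planck's law)."""
    a = 2 * H * C**2 / wavelength_m**5
    return a / (math.exp(H * C / (wavelength_m * K * temp_k)) - 1)

def blue_red_ratio(temp_k, blue=450e-9, red=650e-9):
    """Relative blue-to-red emission of a black body at temp_k."""
    return planck(blue, temp_k) / planck(red, temp_k)

# Assumed effective temperatures: ~5780 K for G2V, ~6500 K for F5
print(blue_red_ratio(5780))  # G2V: relatively less blue - a warmer white
print(blue_red_ratio(6500))  # hotter F star: relatively more blue
```

The hotter star's higher blue-to-red ratio is why the F5 hex code sits closer to pure white than the G2V one.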

2 hours ago, vlaiv said:

I don't quite follow what you did there?

Are those two images produced from the same data or different? In case of the same data - color calibration can't be done on stretched and processed image. Do it on raw still linear stacks of R, G and B prior to any processing.

Yes, the data is the same (not much data, by the way - just 30 min with a DSLR). In my original stretch it came out with a yellowish galaxy. Then I turned it towards blue using Lab colour curves in PS, which is the second version. Then I ran both versions through Photometric Color Calibration in PI; afterwards they were both slightly changed but still very different, so obviously that PI procedure did not "standardise" them to a known colour of the galaxy. Maybe it standardised the star colours, and it clearly did something to the galaxy too, but it did not make the galaxy look the same in both versions.

