
Achieving natural colours when post-processing.


Danjc


Like everyone else I enjoy looking at other people’s images of DSOs, and I have wondered how best to achieve the most natural colour representation. Take M42 for example - an image we have all seen many times, but one that can look very different colour-wise from image to image.

I appreciate that various factors play a part in image capture, and what one person deems great another will see as, for want of a better word, garish - but how do I know what a true representation of the chosen subject should look like?

I have read a little about the Hubble palette, but I’m using a DSLR and post-processing in StarTools.

 


That topic is something I've been thinking about quite a lot, and you will often find that there is a certain freedom in choosing colors, because there is no "true" representation of color out there.

This is not strictly true. Let's concentrate on true color (RGB/LRGB) imaging for the moment. NB imaging (such as the Hubble palette) is false color - one assigns a color channel to each emission line - so it's out of scope here.

There is a well-established theory of color and color reproduction - colorimetry - and we can use it to get true color reproduction (with a few caveats that are discussed below).

First we need to understand the XYZ color space - a linear color space established for exactly this purpose (CIE 1931 XYZ, to be precise). One can also use the CIE 1931 RGB color space (not to be confused with the RGB/sRGB color spaces).

This is, simply put, a "virtual sensor" with 3 values - X, Y and Z - defined by the following matching functions:

[Image: CIE 1931 XYZ color matching functions]

There are corresponding CIE RGB matching functions:

[Image: CIE 1931 RGB color matching functions]

But you will notice that the R component matching function is partly negative. These are abstract trichromatic color spaces, and it's allowed for one component to be negative. Anyway, because of this, it's better to use the XYZ color space as a base.

More on CIE color space:

https://en.wikipedia.org/wiki/CIE_1931_color_space

(I think there is a later revision of the CIE color space, but the argument is the same for both - it's just the transforms and matching curves that differ slightly.)

Our imaging sensors also have 3 components (R, G and B) with responses that look something like this:

[Image: RGB sensor response curves (ASI290)]

The curves above are for the ASI290, for example, but each sensor has its own specific curves. Mono sensors have a QE curve that can be combined (multiplied) with the filter curves (RGB filters) to get the appropriate response.
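That "combine (multiply)" step is just an element-wise product of the QE curve and the filter transmission curve sampled on the same wavelength grid; a minimal sketch with made-up numbers:

```python
import numpy as np

# All values below are made up for illustration - real curves come from the
# sensor and filter data sheets, resampled onto a common wavelength grid.
wavelengths = np.arange(400, 701, 10)                  # nm
mono_qe = np.linspace(0.80, 0.50, len(wavelengths))    # hypothetical mono sensor QE
red_filter = np.where(wavelengths >= 600, 0.95, 0.02)  # idealized R filter transmission

# Effective response of mono sensor + R filter is simply the product.
effective_r_response = mono_qe * red_filter
```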

The first step in obtaining true colors is to transform a sensor response like the one above into the XYZ color space. This can be done in a couple of ways, but it always "boils down to" creating a transform matrix (something like the "channel mixer" in most image processing software): we have the sensor RGB values, which we treat as a 3-component vector, we need the XYZ values - also a 3-component vector - and there is a 3x3 matrix that represents the transform. Creating the transform matrix can be done in multiple ways:

- Use subs already taken for the image, identify the stars in the image (plate solving) and their spectral class. This gives a set of samples (photometry of each star in each channel) from our sensor to match against XYZ values for those stars - each spectral class has a distinct spectrum (even Planck's black-body spectrum can be used if we know the temperature of a particular star), so we can calculate the expected XYZ values. After that it is a matter of solving for the matrix via the least-squares method (see the sketch below).
- Purposely shoot a set of known stars to derive the transform matrix for a particular sensor (something like creating darks and flats - one would create a color profile from a set of star images; the same color profile can be extracted from a single image with plate solving as well).
- Use a graph like the one above and run an algorithm that calculates the transform matrix (not that useful if one uses any sort of additional filter, like a UV/IR cut or an LPS filter).
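As a rough illustration of the least-squares step, here is a minimal sketch in Python/NumPy. The star samples (measured sensor RGB and expected XYZ) are made-up placeholders - in practice they come from photometry and plate solving:

```python
import numpy as np

# Measured sensor response for a set of reference stars (one row per star).
# Placeholder numbers - real values come from aperture photometry of the subs.
sensor_rgb = np.array([
    [0.82, 0.61, 0.35],
    [0.40, 0.55, 0.70],
    [0.65, 0.64, 0.60],
    [0.90, 0.50, 0.20],
])

# Expected XYZ values for the same stars, computed from their spectral class
# (or a Planck black body of known temperature) and the CIE 1931 matching
# functions. Again, placeholders.
expected_xyz = np.array([
    [0.75, 0.70, 0.30],
    [0.45, 0.50, 0.80],
    [0.62, 0.65, 0.58],
    [0.80, 0.55, 0.15],
])

# Solve sensor_rgb @ M ~= expected_xyz in the least-squares sense.
# M.T is the 3x3 matrix you would type into a "channel mixer".
M, residuals, rank, _ = np.linalg.lstsq(sensor_rgb, expected_xyz, rcond=None)

def rgb_to_xyz(rgb_pixels, matrix=M):
    """Apply the fitted transform to an (N, 3) array of linear sensor RGB."""
    return rgb_pixels @ matrix
```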

Once we have the transform matrix we can convert our "RGB" information (the sensor response) to the XYZ color space. Next we convert XYZ to the CIE LAB color space - in two different ways. The first is a "normal" conversion of our image to LAB (this is for color sensors) - the transform from XYZ to LAB is readily available: https://en.wikipedia.org/wiki/CIELAB_color_space#CIELAB–CIEXYZ_conversions. We then separate the L component from LAB and do a stretch on it (in the case of LRGB we use the L channel acquired via the luminance filter).
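For reference, the standard XYZ -> LAB conversion from that page looks roughly like this in code (a D65 white point is assumed here - use whatever white point suits your data):

```python
import numpy as np

# D65 reference white (an assumption for this sketch).
XN, YN, ZN = 0.95047, 1.0, 1.08883

def _f(t):
    """Piecewise cube-root function used by the CIELAB definition."""
    delta = 6.0 / 29.0
    return np.where(t > delta**3, np.cbrt(t), t / (3 * delta**2) + 4.0 / 29.0)

def xyz_to_lab(x, y, z):
    """Convert arrays of linear XYZ values to CIE LAB."""
    fx, fy, fz = _f(x / XN), _f(y / YN), _f(z / ZN)
    L = 116.0 * fy - 16.0          # lightness, 0..100
    a = 500.0 * (fx - fy)          # green-red axis
    b = 200.0 * (fy - fz)          # blue-yellow axis
    return L, a, b
```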

The second is a separate XYZ -> CIE LAB conversion in which we first "equalize" each pixel in "intensity" - the Y component of XYZ is scaled to a base value and X and Z follow it. Because XYZ is linear in intensity, this is akin to asking what the XYZ values of that pixel would be if the light source in it were equally close to us as in every other pixel of the image. We want only the color information, as if every pixel were at a "constant" distance. This is because things that are far away give off a fainter light signal, and our eye perceives fainter light as a darker shade of a color. At this point we want color information that has not been influenced by distance.
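A minimal sketch of that "equalization" step - scale each pixel's XYZ so that Y becomes a constant before converting to LAB, leaving only the color information (the target value of 1.0 is arbitrary, and xyz_to_lab is the function from the previous sketch):

```python
import numpy as np

def equalize_intensity(x, y, z, target_y=1.0, eps=1e-12):
    """Scale XYZ so every pixel has the same Y: intensity removed, color kept."""
    scale = target_y / np.maximum(y, eps)
    return x * scale, y * scale, z * scale

# The "second LAB" (color-only) described above would then be:
# color_L, color_a, color_b = xyz_to_lab(*equalize_intensity(x, y, z))
```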

These LAB values represent color (not only the AB part but the full LAB triplet - in this case L does not represent the luminosity of the object, but the "lightness" of the color).

We go back to the first LAB, or just the L part of it (or the luminance channel of LRGB data), and stretch it as we normally would to bring out faint detail. Once we have done that we "apply" the color from the second LAB in a particular way: we multiply the stretched L by the second LAB's L and take the AB from the second LAB to create the "final" LAB.
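In code, that recombination might look like this (assuming the stretched L has been normalized to 0-1 and the color L is on the usual 0-100 LAB scale):

```python
def combine(stretched_L, color_L, color_a, color_b):
    """Multiply the stretched luminance into the color lightness;
    keep the color a/b channels unchanged."""
    final_L = stretched_L * color_L   # stretched_L in 0..1, color_L in 0..100
    return final_L, color_a, color_b
```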

Let me explain - suppose you have two galaxies that, after stretching, appear equally bright in your luminance. But the color data says one is red and the other is blue. At this point you don't know this yet; the purpose of the stretched L is to make their brightness appear as if they were equally close (or distant - however you want to look at it). But since one is red and the other is blue, these two colors have different intrinsic luminosity - have a look here:

[Image: chart of relative luminance for different colors]

So, although we made both galaxies 0.7 in brightness (on a scale of 0-1, or about 179 on 0-255), this is light intensity, or "closeness". We now need to mix the color luminosity into this. So the first galaxy, the red one, will have L = 0.7 * 0.54, while the other, blue one (let's go with the dark blue in the above diagram) will have L = 0.7 * 0.44.

So in effect the dark blue one will be less bright - not because it is "further away", but because blue is a less luminous color.

OK, now that I've explained the basic principle of how to get to the stretched LAB version, the only thing left to do is transform it to the standard computer color representation - sRGB. Again, the transformations are readily available, and image processing software should be able to handle this as well (Gimp, for example, can transform from LAB to RGB easily).
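If you prefer to do the final LAB -> sRGB step programmatically rather than in Gimp, libraries such as scikit-image provide the conversion; a minimal sketch, assuming an H x W x 3 floating-point LAB image:

```python
import numpy as np
from skimage.color import lab2rgb

def lab_image_to_srgb(lab_image):
    """lab_image: H x W x 3 array with L in 0..100 and a/b roughly -128..127."""
    srgb = lab2rgb(lab_image)                # floats in 0..1, sRGB-encoded
    return np.clip(srgb * 255, 0, 255).astype(np.uint8)
```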

If one uses this approach one should be able to get close to a "true" color image. Also, two different sensors (or sensor/filter combinations) that would otherwise give different color renditions should get the same result (within the transform error - depending on the sensor, one can't expect 100% of colors to be mappable to XYZ; in simpler terms, such a sensor can't record the full gamut).

Note that this approach is, shall we say, "scientific" - it only allows for a luminance stretch. No color saturation boost, and no "temperature" or hue changes.


I just realized that this topic has been posted in Getting started with imaging, so sorry about my technical rambling above :D

There is a fairly simple way to get decent results with a regular RGB image - "decent" color balancing.

Stretch your image and make sure you set your black point properly (neutral, very dark grey - don't go to full black, it does not look good).

Identify a star of medium brightness (one that does not clip when stretched) in your image and try to find it in Stellarium. In Stellarium, examine the spectral class of that star like this:

[Image: Stellarium star information window showing spectral class]

Then use this table to find what the RGB values for that star's color should be:

http://www.isthe.com/chongo/tech/astro/HR-temp-mass-table-byhrclass.html

Once you have the expected color of that star, scale R, G and B (channel mixer) until the pixels in your star match that RGB color (or get close enough).
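As a rough sketch of that scaling step (the measured and expected star RGB values below are made up):

```python
import numpy as np

# Average RGB measured in the chosen star (after stretching), and the RGB that
# the table says a star of its spectral class should have - both made up here.
measured = np.array([180.0, 200.0, 150.0])
expected = np.array([202.0, 198.0, 182.0])

# Per-channel scale factors - apply these in your channel mixer / levels tool.
scale = expected / measured

def balance(image_rgb):
    """image_rgb: H x W x 3 float array in 0..255; returns a color-balanced copy."""
    return np.clip(image_rgb * scale, 0, 255)
```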


10 hours ago, vlaiv said:

I just realized that this topic has been posted in Getting started with imaging, so sorry about my technical rambling above :D

There is a fairly simple way to get decent results with a regular RGB image - "decent" color balancing.

Stretch your image and make sure you set your black point properly (neutral, very dark grey - don't go to full black, it does not look good).

Identify a star of medium brightness (one that does not clip when stretched) in your image and try to find it in Stellarium. In Stellarium, examine the spectral class of that star like this:

[Image: Stellarium star information window showing spectral class]

Then use this table to find what the RGB values for that star's color should be:

http://www.isthe.com/chongo/tech/astro/HR-temp-mass-table-byhrclass.html

Once you have the expected color of that star, scale R, G and B (channel mixer) until the pixels in your star match that RGB color (or get close enough).

I've never come across numerical colour values for each spectral class, but I'll see what it gives me. Thanks for the link. I just use this colour chart to see if my stellar colours are about right:

https://www.google.com/search?q=B-V+colour+chart&rlz=1C1CHBF_enFR821FR821&source=lnms&tbm=isch&sa=X&ved=0ahUKEwjG-cj7243gAhVDKBoKHagLDVIQ_AUIDigB&biw=1536&bih=775#imgrc=V2F1r5rd9-lO-M:

This link needs you to look up the B-V colour index of a few stars in your image. These are available for nearly every star on the planetarium programme I use, SkyMap Pro.

In practice I simply shoot equal amounts of each colour and then run Dynamic Background Extraction in PixInsight. That's it: the stellar colours are in good agreement with the chart above.

Olly

 

 


15 hours ago, Danjc said:

Like everyone else I enjoy looking at other people’s images of DSOs, and I have wondered how best to achieve the most natural colour representation. Take M42 for example - an image we have all seen many times, but one that can look very different colour-wise from image to image.

I appreciate that various factors play a part in image capture, and what one person deems great another will see as, for want of a better word, garish - but how do I know what a true representation of the chosen subject should look like?

I have read a little about the Hubble palette, but I’m using a DSLR and post-processing in StarTools.

 

The best way to achieve natural colour with a non-modified DSLR is to use daylight white balance.
If the DSLR is modified there are several ways to achieve this - check out Jerry Lodriguss at
http://www.astropix.com/html/i_astrop/customwb.html


51 minutes ago, jjosefsen said:

I use the photometric color calibration of PixInsight, and I guess it does what vlaiv showed in his second post, just on a number of stars?

I believe it does.

This technique is fairly similar to the one described in my first post - creating a transform matrix - but it skips the steps related to the XYZ and LAB color spaces and goes directly to creating a transform matrix for linear RGB (I don't know if it converts to sRGB afterwards, which is just gamma-corrected for human vision). The "problem" with this approach is that it does not separate distance-luminosity from color-luminosity, so I suspect that my outlined approach would give different results - but I need to implement the workflow and do some tests.

By the way, all these color spaces are related by appropriate transforms; they just represent different aspects (RGB is for monitors and trichromatic sources, LAB is meant to be a space where distance corresponds to perceptual change in either brightness or color, XYZ is a generic color space modeled on human vision, etc.).

