
Keep close to the authentic look of the object, or create a stunning, award-winning, over-processed photo?



9 hours ago, vlaiv said:

After stacking it is rather easy to do color correction for the atmosphere - you select a couple of stars and calculate a correction based on their stellar class / color index (similar to differential photometry). The only thing you need to be careful about is interstellar reddening.

However, people don't do the above - partly because it is not available as a simple "click here" option in processing software, and doing it by hand requires both knowledge and tools.

The most popular processing package, PixInsight, has a tool that I expected to do the above - but it turned out that it does not. It is named Photometric Color Calibration, and that name suggests the above process; however, it does not do actual color calibration of the image - it does some sort of "color index" calibration for photometry.

You can read details here:

https://pixinsight.com/tutorials/PCC/

 

Hi Vlaiv. I use PhotometricColorCalibration for all my RGB images (taken with the D5300). Could you explain to me why it doesn't work as expected? Are there any other methods I could use in PixInsight to get a more correct white balance?


1 hour ago, endless-sky said:

Hi Vlaiv. I use PhotometricColorCalibration for all my RGB images (taken with the D5300). Could you explain to me why it doesn't work as expected? Are there any other methods I could use in PixInsight to get a more correct white balance?

White balance is not the proper term to use in the context of astrophotography.

I'll briefly explain why.

Our visual system (eyes + brain) has a very interesting feature - it adapts to the environment, and we perceive color differently depending on our viewing conditions.

This is particularly pronounced with white. When something looks white to our eyes, that neither means it is "the whitest white" nor that it is in fact white in absolute terms (although there is really no such thing, since white is a product of our perception and changes with conditions).

We may see an object as pure white - and this can instantly change when something "whiter" enters our field of view. A sheet of white paper is good for this, as it is generally really white. If you have a white wall and you see it as white - just put a sheet of white paper next to it and then ask yourself: is the wall still white?

Another interesting side of this is that we can take that sheet of paper into sunlight, under a cloudy sky, or next to incandescent or fluorescent light - and we will always see it as white (given that it is our reference white, it forces our brain to see it as white).

Read more about it here:

https://en.wikipedia.org/wiki/Chromatic_adaptation

https://en.wikipedia.org/wiki/Color_constancy

White balance is a technique used to compensate for the fact that our brain is flexible and the camera is not. The camera is very deterministic in what it measures - it will not measure things differently depending on conditions - it will always make the same measurement.

You take your camera, you shoot a scene in artificial light, you come back to a different environment - your computer - and you look at the image, and it is all wrong: you don't remember seeing those colors back when you were looking at the scene. You want the colors to look as you remember them. That is what white balance does - it matches what your brain saw under one set of lighting conditions (at the scene) to what your brain sees under different lighting conditions (at the computer).

In astrophotography there is simply no need for that, as there are no changing lighting conditions - no one turns on a tungsten light to shine on a nebula so that you then want to see how it looks in a different environment. In astrophotography there is no changing illumination - each object has its own unique light, either emitted by the object itself or reflected from nearby sources.

Hold on - but we are still viewing images on a computer screen, right? There must be some environment we view images in, and our brain will be "tuned" to that environment. What about that, you might ask?

There indeed is. It is defined in the sRGB standard. sRGB is used on computer screens for a reason, or rather sRGB was defined to accommodate working at computer screens. It has a precisely defined set of conditions under which sRGB colors are perceived correctly.

[Image: sRGB reference viewing environment parameters]

These are the parameters defined by the sRGB standard. You can find more here:

https://en.wikipedia.org/wiki/SRGB#Viewing_environment

In any case, if you use a D65 white point for your display, sit in a dimly lit room with slightly warmer ambient light (D50), look at a not-too-bright screen (80 cd/m2), etc., your brain will see colors as they are intended to be seen.

Here is the important thing - you don't need to calculate a white point - the sRGB standard defines it for you. The white point of your astrophoto should be D65, as long as you intend the image to be viewed on a computer screen or shared on the internet (posted on websites and similar). You don't need to do white balance - as long as you encode your color information in sRGB format, you are fine.

What should you do then? You should do color correction, not white balance.

The sensor you are using has a certain QE response for its R, G and B channels. These are not the R, G and B of the sRGB color space, nor are they the R, G and B of any particular RGB color space. They are just arbitrary color filters that usually resemble R, G and B in their function, hence the names (they are of some red, some green and some blue color - but each differs between cameras, and none is the same as the primaries of any particular RGB color space).

For example, here is the QE curve of the ASI294 - a color sensor:

[Image: ASI294 color sensor QE curves]

And these are the response curves of an "ideal sensor" - the response curves of the CIE XYZ color space (an absolute/reference color space):

[Image: CIE XYZ color matching functions]

(Note that these are called x, y and z - although they also roughly correspond to r, g and b - to signify the distinction: they are not red, green and blue.)

In general, there is a transform matrix (a 3x3 matrix of numbers) that will transform any raw RGB triplet from your camera to the CIE XYZ color space. CIE XYZ in turn has a well-defined matrix that transforms it to linear sRGB coordinates:

[Image: CIE XYZ to linear sRGB transformation matrix]
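For reference, the standard CIE XYZ (D65) to linear sRGB matrix from the sRGB specification is:

 3.2406  -1.5372  -0.4986
-0.9689   1.8758   0.0415
 0.0557  -0.2040   1.0570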

And if you multiply these two matrices, you get a direct transformation from a raw RGB triplet to a linear sRGB triplet.

That is color correction. It is what makes different cameras with different response curves produce the same color (well, the first part - raw to CIE XYZ - is responsible for that; the second part just lets you show the result on a computer screen. If you wanted to print it instead, you would transform to CMYK or some other color profile of your printer).

In principle you can do the above with photometry and perform photometric color calibration properly: take raw RGB measurements of known stars - or rather stars of known stellar class and temperature, and hence known XYZ values - and then solve for the transform matrix.
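As a rough sketch of that solve (the star measurements below are made-up numbers, purely for illustration; numpy's least squares does the actual work):

import numpy as np

# Hypothetical data: raw (R, G, B) fluxes measured for four stars,
# and the CIE XYZ values implied by each star's temperature / class.
raw_rgb = np.array([
    [1.00, 0.80, 0.35],   # cool, reddish star
    [0.70, 0.90, 0.80],   # sun-like star
    [0.45, 0.75, 1.00],   # hot, bluish star
    [0.60, 0.85, 0.90],
])
xyz = np.array([
    [0.95, 1.00, 0.45],
    [0.90, 1.00, 0.85],
    [0.85, 1.00, 1.30],
    [0.88, 1.00, 1.05],
])

# Solve raw_rgb @ M.T ~= xyz in the least-squares sense;
# M is then the 3x3 raw RGB -> XYZ transform matrix.
M_T, *_ = np.linalg.lstsq(raw_rgb, xyz, rcond=None)
M = M_T.T

# Color-correct any raw triplet with: xyz_estimate = M @ raw_triplet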

I was hoping that PI's Photometric Color Calibration does that - but it does not. For some strange reason it tries to "white balance" the image, and it uses the photometric color index (which is just an astronomy term and only loosely relates to actual color) instead of proper color spaces.

To understand the relationship between star temperature, CIE XYZ values and sRGB values, here is a handy conversion tool:

http://www.brucelindbloom.com/index.html?ColorCalculator.html

So if I want to know the values for a star with a temperature of 7800K, I enter the number and press the button (the CCT one):

[Image: color calculator output for a 7800K CCT]

In the end, I did not help you much, since I don't know PI that well, but the first step would be to find the CCM for the D5300 (for the D65 illuminant), or to derive it yourself for the CIE XYZ color space.

I did a quick search for a CCM for the D5300 but could not find much, except this discussion (with no actual data):

https://www.cloudynights.com/topic/591807-altering-color-correction-matrix-for-d5300a/

You can try to derive it yourself, but you would need either an sRGB-calibrated monitor, or a color checker chart and a nice sunny day in western/central Europe :D (see the D65 illuminant definition).

Another approach would be to have the exact QE chart of your sensor - then we could derive the CCM against the CIE XYZ curves.

A third would be to use your eye to judge the color calibration (probably the worst approach). Take your cell phone and make it display this image (or any similar one):

[Image: color checker chart]

Turn off all lights - let your mobile phone be the only light source - and take a RAW image with your camera. Now load that raw photo into your favorite app that lets you fiddle with illumination settings, and choose the color temperature that makes the colors on your phone match those on your computer screen (this requires that your computer screen is properly calibrated). Once you get it, save the image as a PNG.

Now you need to open that PNG in some application that lets you do pixel math, reverse the gamma, measure each of the three channels in linear light, and then compare that to the unadjusted raw values.
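For reference, "reversing the gamma" here means inverting the sRGB transfer function; a minimal sketch in Python:

import numpy as np

def srgb_to_linear(v):
    # Invert the sRGB transfer function; v is a pixel value in [0, 1].
    v = np.asarray(v, dtype=np.float64)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

# Example: mid-grey 0.5 in sRGB is about 0.214 in linear light.
print(srgb_to_linear(0.5))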

As I'm writing this, I'm realizing the work involved, and that no one will actually want to do this just to get proper color. Photographers do it with color checker charts and software already made to handle it. Our astro software does not have those features implemented yet, and I don't think people will bother until they have suitable tools.


32 minutes ago, vlaiv said:

 

In the end - I did not help you much since I don't know PI that much, but first step would be to find out CCM for D5300 (for D65 illuminant) or to derive it yourself for CieXYZ color space.

 

DXOMARK publishes CCMs for many consumer cameras. For instance, the CCM for the Nikon D5300 can be found at the following link, under the "Color Response" tab.

Nikon D5300 - DxOMark

It assumes that white balance has already been applied.

Mark


The basic steps I use to process a DSLR image as "true colour", starting with stacked linear data, are the following:

  • Apply white balance (I generally use the WB multipliers for "daylight" or use PixInsight's PhotometricColourCalibration or similar)
  • Apply the camera specific Colour Correction Matrix (CCM)
  • Apply the gamma for the working colour space (typically sRGB or AdobeRGB)

If you do this correctly, the result will be a colorimetric image, with colour tones limited only by the camera's ability to accurately reproduce colour and by the gamut of the colour space chosen. Typically the final appearance is fairly bland and you may wish to saturate the colours slightly to make them appear "right". For some reason the eye seems to need this.
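A minimal sketch of those three steps on linear stacked data (numpy; the white balance multipliers and CCM below are placeholder values for illustration, not real D5300 numbers):

import numpy as np

def process_true_colour(img, wb, ccm, gamma=2.2):
    # img: HxWx3 linear camera RGB, background-subtracted, scaled to [0, 1].
    # wb:  per-channel white balance multipliers (e.g. the "daylight" preset).
    # ccm: 3x3 colour correction matrix (camera RGB -> working colour space);
    #      DxOMark-style CCMs assume white balance has already been applied.
    img = img * wb                       # 1. white balance
    img = img @ np.asarray(ccm).T        # 2. colour correction matrix
    img = np.clip(img, 0.0, 1.0)
    return img ** (1.0 / gamma)          # 3. gamma for the working colour space

# Placeholder values, for illustration only:
wb  = np.array([2.1, 1.0, 1.4])
ccm = np.array([[ 1.6, -0.4, -0.2],
                [-0.3,  1.5, -0.2],
                [ 0.0, -0.5,  1.5]])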

Some additional quick points:

  • Subtraction of light pollution must take place before gamma is applied
  • I'm assuming the CCM is the DXOMARK CCM, which goes straight from the camera's raw colour space to sRGB, avoiding the unnecessary complication of the intermediate CIE XYZ colour space. If you are going to use AdobeRGB then you need a different CCM, or you apply the sRGB CCM followed by the sRGB-to-AdobeRGB matrix.
  • Colour-preserving stretches such as ArcsinhStretch (which preserve the R:G:B ratios of each pixel) need to be applied before the gamma, or you must apply them in a colour space such as AdobeRGB, which uses a constant gamma instead of the variable gamma used by sRGB. I therefore definitely recommend working in AdobeRGB, because the gamma is very easy to apply and because ArcsinhStretch still works correctly after the gamma has been applied.

Obviously I have just given the main steps here because a detailed tutorial would be pages long.

 

The sRGB to AdobeRGB matrix referred to earlier is this one:

0.7152  0.2848  0.0000
0.0000  1.0000  0.0000
0.0000  0.0412  0.9588

 

Mark


Thank you both! Lots to read and digest.

As for your workflow, Mark, I am lost after point 1: PhotometricColorCalibration, which I already do - and that's about my stopping point... 😅

You also mentioned light pollution subtraction. I am assuming DBE (or ABE) - I prefer the former. I have been doing this as step 1 (well, maybe not exactly 1, but after master integration and a dynamic crop to get rid of the edges). Then I do PCC and also check Background Neutralization in the same process (I used to do BN separately before, but then I saw it was already built in and started doing it along with PCC). Does this sequence sound "correct" so far?

Would absolutely love a detailed tutorial on the following steps, if you can spare the time, as I have no clue how to adjust for the CCM.

Assuming the D5300, would I pick the values given by Color Response --> CIE-D50 in the link you posted?

Also, another point: I astro-modified my camera and put a UV/IR cut filter in front of the sensor. I assume the same matrix doesn't apply anymore? Even more so if there's a light pollution filter (L-Pro) in the imaging train. Correct?


27 minutes ago, endless-sky said:

Thank you both! Lots to read and digest.

As for your workflow, Mark, I am lost after point 1: PhotometricColorCalibration, which I already do - and that's about my stopping point... 😅

You also mentioned light pollution subtraction. I am assuming DBE (or ABE) - I prefer the former. I have been doing this as step 1 (well, maybe not exactly 1, but after master integration and a dynamic crop to get rid of the edges). Then I do PCC and also check Background Neutralization in the same process (I used to do BN separately before, but then I saw it was already built in and started doing it along with PCC). Does this sequence sound "correct" so far?

Would absolutely love a detailed tutorial on the following steps, if you can spare the time, as I have no clue how to adjust for the CCM.

Assuming the D5300, would I pick the values given by Color Response --> CIE-D50 in the link you posted?

Also, another point: I astro-modified my camera and put a UV/IR cut filter in front of the sensor. I assume the same matrix doesn't apply anymore? Even more so if there's a light pollution filter (L-Pro) in the imaging train. Correct?

PCC does not work well on the Nikon D5300 because Nikon's raw data filtering causes stars to turn green. This makes it impossible to correctly calibrate star colours.

Yes, your sequence for light pollution subtraction is good.

Here's the PixelMath for applying the D5300 CCM (using the CIE-D50 version):

[Image: PixelMath expressions applying the D5300 CCM]

The matrix still works well for a modified camera, as long as you use an IR/UV blocking filter - I've tested this with a ColorChecker. But if you are using light pollution filters then the colours can go seriously wrong.

The big issue with a modified camera is that it's no longer possible to produce a "true colour" image. If you calibrate the white balance for correct star colours, then regions of hydrogen emission will appear reddish instead of the correct pink colour.

Mark

 

 

 


Thank you, Mark, for the thorough explanation!

I'll give those PixelMath expressions a try in my next post-processing. The L-Pro is a relatively "mild" filter, but yes, there are pictures of color checker cards taken with and without it, and there is indeed some noticeable difference.

All my images of emission nebulae are red instead of pink, so I guess "true colors" are out of the window...


10 hours ago, sharkmelley said:

DXOMARK publishes CCMs for many consumer cameras. For instance, the CCM for the Nikon D5300 can be found at the following link, under the "Color Response" tab.

Nikon D5300 - DxOMark

It assumes that white balance has already been applied.

Mark

I'm a bit lost here.

DxOMark gives D50 and illuminant A.

sRGB requires the D65 illuminant as its white point.

How do we convert the matrix from one illuminant to the other without knowing the spectral response?


This is even worse than I thought :D

https://ninedegreesbelow.com/photography/srgb-color-space-to-profile.html

Here is a good website with a bunch of articles on color spaces, color profiles (important when working in color-managed software like Photoshop and GIMP) and the conversions between them.

In any case, here is an acid test for whether your color correction workflow is good:

Take a raw image with your camera. Use your favorite app to make a regular image out of it. Try not to do white balance on it - let it record what it sees if possible - something like "faithful" or whatever.

Take the same raw file and use software like FitsWork to extract the raw FITS data. Debayer that raw FITS data, then use your workflow to compose an image out of it (as you would with an astro image). See how different it is from the image produced by the dedicated software above.
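For the debayering step, a minimal superpixel sketch (this assumes an RGGB Bayer pattern and a FITS file exported by FitsWork; astropy reads the data):

import numpy as np
from astropy.io import fits

# Load the raw Bayer mosaic (the filename and RGGB layout are assumptions).
raw = fits.getdata("raw_frame.fits").astype(np.float64)

# Superpixel debayer: every 2x2 Bayer cell becomes one RGB pixel,
# with the two green photosites averaged.
r  = raw[0::2, 0::2]
g1 = raw[0::2, 1::2]
g2 = raw[1::2, 0::2]
b  = raw[1::2, 1::2]
rgb = np.dstack([r, (g1 + g2) / 2.0, b])   # HxWx3 linear camera RGB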


1 hour ago, vlaiv said:

DxOMark gives D50 and illuminant A.

sRGB requires the D65 illuminant as its white point.

I'll try to explain this using a thought experiment. 

Let's assume you take a photo of a scene (including a grey step wedge) illuminated with a D50 light source. When the data are correctly colour balanced for D50, the white step might have an RGB value of (250,250,250). If we now apply the CCM for sRGB, the RGB value of this white will remain unchanged at (250,250,250) and it will appear white to you when displayed on the computer monitor. However, if you take a photo of the monitor displaying this "white", it will have a slightly blue tint (which you can measure in the raw data), because sRGB displays white with a D65 illuminant. Yet the image of the scene on your monitor appears quite normal, because your eye immediately adapts to seeing D65 as white, just like it immediately adapts when you walk into a room lit by incandescent lights.

So although the original scene was lit with a D50 illuminant, the image of that scene appearing on a D65 monitor looks quite normal because of the eye's adaptation. It is only when you put the monitor next to the original scene that you realise they actually look noticeably different, and that a visual "trick" is being played on you.

The CCM we need to apply to the image data is the CCM relevant to the original light source and not the reference white of the destination colour space.

Mark


15 minutes ago, sharkmelley said:

The CCM we need to apply to the image data is the CCM relevant to the original light source and not the reference white of the destination colour space.

This is the part I'm having an issue with - an astronomical object doesn't have an original light source shining on it that you need to create a CCM for. You need to create the CCM for D65 in order for the light emitted by the object to have the same CIE XYZ coordinates as the light the computer screen emits when displaying the image of that object. Right?

You want the light from the object and the light from the computer screen, when viewed "next to each other", to appear as the same color - regardless of what our brain perceives that color to be: pure white, a yellowish tint or whatever.

For us to see them as the same color, whatever that color is, we need to match not the spectra but the CIE XYZ coordinates. The coordinates are matched if the same white point is used in the camera and on the screen - in this case D65.

Or am I missing something?

I have an interesting experiment in mind - let me process the raw data...


26 minutes ago, vlaiv said:

This is the part I'm having an issue with - astronomical objects don't have an original light source shining on them that you need to create a CCM for. You need to create the CCM for D65 in order for the light emitted by the object to have the same XYZ coordinates as the light the computer screen emits when displaying the image of that object. Right?

You want the light from the object and the light from the computer screen, when viewed "next to each other", to appear as the same color - regardless of what our brain perceives that color to be: pure white, a yellowish tint or whatever.

If you want the colours in the image that appears on your monitor to be identical to the emitted light from an astronomical object, then you are right - you need to apply a D65 white balance and use a D65 CCM. If you are using PixInsight's PhotometricColourCalibration to perform the white balance, then you would choose an appropriate white reference - maybe an F-type star.

The only problem with that approach is that G2V stars like our Sun would not appear white in the displayed image, because of the way the eye adapts to the D65 reference white of the monitor.

Mark


OK, here is what I mean by "there is no need for white balance in astrophotos" - or rather, we need to find the CCM for "no white balance", or whatever you want to call it.

I took this image:

[Image: color checker test image]

I displayed it on screen (actually, I had this thread open on my computer screen with this image showing), took my DSLR camera and chose the "faithful" style with a custom color balance - where I did not change anything but left it all at zeros / center / default.

I took a raw image of the screen and a piece of white paper.

[Image: photo of the screen next to white paper, no white balance applied]

My computer screen shows just a tad over-saturated colors - or is that the camera producing "vibrant" colors although I specified faithful? But the colors match pretty well.

However, the white paper in the left part of the image is not white at all - it is rather murky brown. That is because I have an artificial light on next to me - a 2700K warm yellow light - mixing with the cloudy daylight coming from the windows.

To my eyes that piece of paper looks white, as my mind is doing its thing - but the colors on the screen look as they should, as in the image. If I try to color balance in any way, I get this:

[Image: same photo, white balanced for the paper]

Sure, now the paper looks OK, but the colors on the screen no longer look right.

My point being: we want a CCM that does the following.

When you take a picture of a light source and display it on screen, two things happen:

you see the color of the light source and of the image on screen as the same color (regardless of the actual color), and when you image the screen next to the light source with any camera and apply that camera's CCM - you get the same linear RGB values for both the monitor and the light source.

Hmm, that got me thinking - maybe we can derive a CCM like that. Take an image and display it on a mobile phone. Take an image of that. Display this image on the computer screen and take another image of both the computer screen and the phone - then refine the CCM iteratively until you get the same raw values for both the screen and the phone?


3 minutes ago, sharkmelley said:

If you want the colours in the image that appears on your monitor to be identical to the emitted light from an astronomical object, then you are right - you need to apply a D65 white balance and use a D65 CCM. If you are using PixInsight's PhotometricColourCalibration to perform the white balance, then you would choose an appropriate white reference - maybe an F-type star.

The only problem with that approach is that G2V stars like our Sun would not appear white in the displayed image, because of the way the eye adapts to the D65 reference white of the monitor.

Mark

I don't think you can actually take any star as the reference white, as the Planckian locus does not intersect the D65 white point.

[Image: CIE chromaticity diagram with the Planckian locus and standard illuminants]

None of the standard illuminants lies exactly on the Planckian locus.


46 minutes ago, vlaiv said:

My point being: we want a CCM that does the following.

When you take a picture of a light source and display it on screen, two things happen:

you see the color of the light source and of the image on screen as the same color (regardless of the actual color), and when you image the screen next to the light source with any camera and apply that camera's CCM - you get the same linear RGB values for both the monitor and the light source.

Yes, this should be possible. Essentially you need to take your image data and apply a white balance consistent with D65, plus a CCM consistent with D65. But don't attempt to generate a CCM from a photo taken of a screen. Instead you need something like a ColorChecker illuminated with a proper D65 broadband light source.

BTW you're right that we won't find a D65 star for our PixInsight PhotometricColourCalibration!

Mark


Just now, sharkmelley said:

Yes, this should be possible. Essentially you need to take your image data and apply a white balance consistent with D65, plus a CCM consistent with D65. But don't attempt to generate a CCM from a photo taken of a screen. Instead you need something like a ColorChecker illuminated with a proper D65 broadband light source.

BTW you're right that we won't find a D65 star for our PixInsight PhotometricColourCalibration!

Mark

I just realized that we don't need to do any of that.

Apparently, dcraw is capable of producing XYZ output, and then we can simply use the XYZ-to-sRGB transform matrix.

I'm trying to do it now, but having some trouble ...


And success!

[Image: color checker image processed via XYZ, with correct colors]

Here is the recipe I used for this:

Get dcraw and convert your raw image to a 16-bit XYZ image, without doing any special processing like applying white balance. I used the following command line:

dcraw -o 5 -4 -T IMG_0942.CR2

where the XYZ color space is listed as 5 in the list of supported color spaces: -o [0-6]  Output colorspace (raw, sRGB, Adobe, Wide, ProPhoto, XYZ, ACES).

This makes a TIFF image whose colors look completely wrong - because it is in the XYZ color space, not sRGB.

Then it is easy - you just apply the transform from XYZ to sRGB, given by this:

[Image: XYZ to sRGB transformation matrix with sRGB gamma]

I used an approximate gamma of 2.2 rather than the above 2.4 with the 0.0031308 cutoff, but that is close enough for this test. I also made the image brighter by not paying much attention to the black/white point - I just used 0 and 16384. That is of course of no consequence for astrophotography, as we set those in processing anyway.
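A sketch of that XYZ-to-sRGB step in Python (numpy plus imageio; the file name and the 16384 white point mirror the test above, and the simple 2.2 gamma is used instead of the full sRGB curve):

import numpy as np
import imageio.v3 as iio

# 16-bit XYZ TIFF as produced by: dcraw -o 5 -4 -T IMG_0942.CR2
xyz = iio.imread("IMG_0942.tiff").astype(np.float64) / 16384.0

# Standard CIE XYZ (D65) -> linear sRGB matrix.
M = np.array([[ 3.2406, -1.5372, -0.4986],
              [-0.9689,  1.8758,  0.0415],
              [ 0.0557, -0.2040,  1.0570]])

srgb_lin = np.clip(xyz @ M.T, 0.0, 1.0)
srgb = (srgb_lin ** (1.0 / 2.2) * 255).astype(np.uint8)  # approximate gamma
iio.imwrite("IMG_0942_srgb.png", srgb)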


9 minutes ago, vlaiv said:

And success!

Here is the recipe I used for this:

Get dcraw and convert your raw image to a 16-bit XYZ image, without doing any special processing like applying white balance. I used the following command line:

dcraw -o 5 -4 -T IMG_0942.CR2

where the XYZ color space is listed as 5 in the list of supported color spaces: -o [0-6]  Output colorspace (raw, sRGB, Adobe, Wide, ProPhoto, XYZ, ACES).

 

I have a question. What colour temperature (i.e. white balance) is DCRaw using when you execute that command? A white balance (e.g. Daylight, Cloudy, Tungsten etc.) is required to transform from camera RGB to CIE XYZ.

Mark


3 minutes ago, sharkmelley said:

I have a question. What colour temperature (i.e. white balance) is DCRaw using when you execute that command? A white balance (e.g. Daylight, Cloudy, Tungsten etc.) is required to transform from camera RGB to CIE XYZ.

Mark

None - camera raw RGB (not to be confused with RGB color models) is very similar to XYZ. Both represent absolute QE-type curves over the visible spectrum. Both produce their values the same way.

Here is how XYZ values are produced:

X = ∫ x̄(λ) · S(λ) dλ,   Y = ∫ ȳ(λ) · S(λ) dλ,   Z = ∫ z̄(λ) · S(λ) dλ   (S(λ) is the light spectrum, integrated over the visible range)

The X value is simply x̄ times the light spectrum, integrated over the visible range. x̄ plays the role of a "QE curve for X" (except that QE is a percentage, while the curve used here is a unitless number).

The same thing happens when we want to find out what our camera sensor's red channel will produce - it integrates, over the visible range, the spectrum of our source times the QE of the red filter.

Camera raw space (although we call it RGB) is an absolute space, not a relative one. It also does not have a white point. For this reason you don't need a white point when converting between raw RGB and XYZ.

In fact, if someone produced a sensor with QE curves identical to x̄, ȳ and z̄, there would be no need to transform between the two - or rather, the transformation would be the identity matrix.

It is also the reason why you can't directly use raw RGB values as color space RGB values and need to do color correction. When people do use them directly, their images turn all green or red and they need to "color balance", when in reality they need to do the matrix transforms raw -> XYZ -> sRGB (linear).

Since these are just two matrix multiplications, you can combine them into one by pre-multiplying the matrices and going straight from raw to sRGB. But since we don't have the actual raw -> XYZ matrix for DSLR cameras, we do the two-step thing (actually such a matrix must exist in the raw file, as dcraw uses it to produce the XYZ values).
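In matrix terms, the pre-multiplication is a one-liner (the camera matrix here is a placeholder; the XYZ -> sRGB matrix is the standard one):

import numpy as np

# Placeholder: the camera-specific raw RGB -> XYZ matrix from calibration.
M_raw_to_xyz = np.eye(3)

# Standard CIE XYZ (D65) -> linear sRGB matrix.
M_xyz_to_srgb = np.array([[ 3.2406, -1.5372, -0.4986],
                          [-0.9689,  1.8758,  0.0415],
                          [ 0.0557, -0.2040,  1.0570]])

# One combined matrix does raw RGB -> linear sRGB in a single step.
M_raw_to_srgb = M_xyz_to_srgb @ M_raw_to_xyz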


I may be missing something, so let me ask the question in a different way. If you look at your raw data, you will almost certainly find that the green channel has much higher pixel values than red or blue for any colour that is approximately white or grey. Why doesn't the bottom row of your colour chart appear green? So where does the white balance take place, and what white balance is used?

Mark


19 minutes ago, sharkmelley said:

I may be missing something, so let me ask the question in a different way. If you look at your raw data, you will almost certainly find that the green channel has much higher pixel values than red or blue for any colour that is approximately white or grey. Why doesn't the bottom row of your colour chart appear green? So where does the white balance take place, and what white balance is used?

Mark

The actual XYZ data, when directly mapped to sRGB, looks like this (notice the darker tone due to the missing gamma):

[Image: XYZ data mapped directly to sRGB - colors shifted, but white still near-white]

The colors are messed up, but white is still "white". Or close to white.

That is because most illuminants are in fact very close to what we perceive as white in the sRGB space.

[Image: color calculator showing RGB (1,1,1) against the D65 white point]

Here is RGB of (1,1,1) with D65 - the XYZ is not mostly "green"; it is very close, coming out as a somewhat bluish white if we assign XYZ to RGB directly.

Here we are using the D65 white point implicitly, via the conversion matrix. XYZ is an absolute color space - meaning it actually measures the light in absolute terms, not relative to some white point or triplet of colors. sRGB represents colors as fractions of three primaries with specific xy coordinates. The only thing missing is the "max" value - which color sits at the white corner of the RGB cube, again on the xy chromaticity diagram. That is the white point of the color space, and for sRGB it is D65.

To answer your question, consider the following:

You have the same computer screen and it shows RGB = (1,1,1) as some color. Most of the time, that color will look like pure white. But if you are in a well-lit room with mostly bluish lights, then RGB(1,1,1) on your computer screen will look yellowish - it will no longer be pure white.

Similarly, in a well-lit room illuminated by warm light, the computer screen will start looking too cold - a bluish white.

Nothing changed in the light coming from the computer screen - it has the same spectrum as before. The white point is used to compensate for that.

The white point is also used to compensate for something else - our perception. Many people mistakenly think that just because RGB(1,1,1) is white, you need to mix equal parts of red, green and blue - but that is not correct.

[Image: color wheel with relative luminance values]

Here we can see that green contributes 80%, red 54% and blue only 44% of "brightness". The fact that we have a 0-1 range for R, G and B is just a convention that white is the brightest thing to be displayed; how much blue, how much red and how much green go into a particular white dictates the actual physical brightness range of each component.

You need four sets of numbers to define the transform matrix - the primary R, G and B in XYZ color space, plus the coordinates of the white point.

In that sense, "white balance" is encoded in the XYZ -> sRGB transform matrix.
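A sketch of that construction for sRGB - the xy chromaticities of the three primaries plus the D65 white point fully determine the matrix (inverting it reproduces the XYZ -> sRGB matrix quoted earlier):

import numpy as np

def xy_to_xyz(x, y):
    # Chromaticity (x, y) to an XYZ triplet with Y = 1.
    return np.array([x / y, 1.0, (1.0 - x - y) / y])

# sRGB primaries and D65 white point as xy chromaticities.
R = xy_to_xyz(0.64, 0.33)
G = xy_to_xyz(0.30, 0.60)
B = xy_to_xyz(0.15, 0.06)
W = xy_to_xyz(0.3127, 0.3290)

# Scale the primaries so that RGB = (1,1,1) maps exactly to the white point.
P = np.column_stack([R, G, B])
scale = np.linalg.solve(P, W)
M_rgb_to_xyz = P * scale                  # scales each column
M_xyz_to_rgb = np.linalg.inv(M_rgb_to_xyz)
print(np.round(M_xyz_to_rgb, 4))          # ~ [[3.2406, -1.5372, ...], ...]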

 


You've still missed my question. The raw data has a very strong green channel. In the TIFF that DCRaw produces, do the white and grey squares of the colour chart look green? If not, has it been white balanced, and where did the white balance come from?

I'm guessing it might be a D50 white balance, since XYZ-D50 is typically used as the Profile Connection Space.

Mark


5 hours ago, vlaiv said:

This is the part I'm having an issue with - an astronomical object doesn't have an original light source shining on it that you need to create a CCM for. You need to create the CCM for D65 in order for the light emitted by the object to have the same CIE XYZ coordinates as the light the computer screen emits when displaying the image of that object. Right?

You want the light from the object and the light from the computer screen, when viewed "next to each other", to appear as the same color - regardless of what our brain perceives that color to be: pure white, a yellowish tint or whatever.

For us to see them as the same color, whatever that color is, we need to match not the spectra but the CIE XYZ coordinates. The coordinates are matched if the same white point is used in the camera and on the screen - in this case D65.

Or am I missing something?

I have an interesting experiment in mind - let me process the raw data...

vlaiv, I love reading all your posts, but I have to admit that if I were able to understand all of what you are saying - I mean really understand, to the point of being able to raise an interesting question - I would consider myself very smart, and as such would be churning out jaw-dropping images of the Horsehead Nebula instead of pathetic images of M42.

The following will serve as an example of my level of understanding of colours. My wife asked me to go to B&Q online and pick some paint for our bathroom; she wants a shade of grey (no, this is not leading into a naughty joke), so I found one I thought suitable. To my surprise she actually liked it, but made the valid point that what we see on the screen is only a representation of the real thing and may be far different from the actual paint.
I then thought of how NASA solved the problem of getting the colours correct in the images returned by a particular Mars rover, although I forget which. The solution was surprisingly simple. A printed colour strip was made with a range of colours; one half was kept here while the other half was put on the rover. Once on Mars, they only had to turn the rover camera onto the colour strip and then adjust the home-based computer until the colours they saw on the Mars colour strip matched the home card standing next to it. Easy peasy - the colours as sent from Mars were the actual correct colours.

Surely it shouldn't be too much trouble to have a nationally or internationally agreed colour chart that could be purchased for a small fee and used to set your monitor so that it would reveal the true colour of the chart, and hence the colour of the product you are buying. Would it?

Hurts my brain, all this thinking stuff - time for a dram of Scotland's finest!

Cheers, and a happy and healthy 2021 with never-ending perfect clear skies!

 


55 minutes ago, sharkmelley said:

You've still missed my question. The raw data has a very strong green channel. In the TIFF that DCRaw produces, do the white and grey squares of the colour chart look green? If not, has it been white balanced, and where did the white balance come from?

I'm guessing it might be a D50 white balance, since XYZ-D50 is typically used as the Profile Connection Space.

Mark

It has not been color balanced - it has been converted to the XYZ color space.

I specified that on the command line: I asked for the XYZ color space as opposed to the raw color space. This, however, is not white balance. White balance means something else.

It simply means this: if you take one DSLR, like a Canon 750D, and another, like a Nikon D5300, and you shoot the same light source with each and measure the raw values, you will get different triplets - each according to that camera's QE in each channel.

However, if you take those raw images and ask dcraw to export XYZ images, you will get the same triplets of numbers from both cameras.

If we had a camera whose QE curves were exactly the same as the XYZ color matching functions, you could take such a camera, record raw, and get the same numbers without any special conversion. Its raw data would already be in the XYZ color space.

Think of the XYZ color space as a "standard / universal" sensor with a specific QE per channel, where the channels are called X, Y and Z.

It just happens that the D65 illuminant is close to (1, 1, 1) in the XYZ color space - normalized to Y = 1, D65 sits at roughly XYZ = (0.9505, 1.0000, 1.0890) - so if I don't do the conversion to sRGB but simply assign XYZ to the RGB channels of the sRGB color space, what is white stays relatively white (there is actually a bit of a difference).

 

 


37 minutes ago, Moonshed said:

Surely it shouldn't be too much trouble to have a nationally or internationally agreed colour chart that could be purchased for a small fee and used to set your monitor so that it would reveal the true colour of the chart, and hence the colour of the product you are buying. Would it?

There is such a chart, as a matter of fact. I used an image of it (or rather an image with very similar colors).

It's not cheap. There are "copies" of such charts - but all those charts are pigments. Pigments work differently than light (I'll explain a bit more later).

In any case, these are used for exactly what you encountered - look at an image on screen, purchase paint, and get what you asked for. That is called color matching.

There are many different things that go into that process. Paint is pigment, and the color of a pigment depends on two things - the pigment itself and the light used to illuminate it. These two act together to produce the spectrum of light that you see as color.

In order for you to actually see the same color on your computer screen when an image of the pigment is taken, one needs to control the illumination (this is the illuminant we have been talking about). What sort of light should one use to represent the "natural" color of the pigment? And what is the natural color of a pigment anyway? A pigment will have different hues depending on the light used to illuminate it.

If the image was properly color managed and your computer screen is properly calibrated, the screen shows the color that the pigment would have if it were illuminated with the D65 illuminant (daylight in northern / central Europe is the best match for this illuminant).

In astronomy we don't have this problem. We don't have pigments - we have light. And the only thing we need to agree on is the same viewing conditions - something natural - and luckily we don't need to worry about that: computer standards do it for us. The sRGB color space is well defined, and when we want to represent the color of light in it, we know the math and there are just two things left to do:

- calibrate camera

- calibrate computer screen

The above, with CIE XYZ and the color correction matrix, is the "calibrate camera" part. We want to take two cameras, shoot the same light source, and get the same numbers. Well-defined numbers.

Calibrating the computer screen is usually done at the factory - and not really well, mind you, as most computer screens have brightness / contrast and other controls, and people like to "adjust" things as they see fit. However, if one wants to calibrate a monitor, there are solutions out there, and I think anyone wanting to produce proper color (especially professionals working with art / print / photography) needs to calibrate their monitor.

Check out the X-Rite and SpyderX tools, among others.

These are little USB devices that measure the spectra of the colors on your computer screen (or even a phone screen) and let you set it up properly and produce color profiles.

