
Keep close to the authentic look of the object, or create a stunning, award-winning, over-processed photo?



8 minutes ago, sharkmelley said:

On the contrary, the images in the second row are white balanced.  DCRAW is performing a white balance using the default assumption that the scene is lit with the D65 illuminant because that is the only information the code has in the absence of reading the camera's white balance setting from the EXIF.

Mark

It is not assuming the images are D65 white balanced - it simply does not perform white balance at all, and D65 only enters when converting from XYZ to sRGB.

It leaves the chromatic adaptation (CA) matrix as identity, and when converting XYZ coordinates it uses the standard XYZ -> sRGB matrix, which has D65 built in.
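To make this concrete, here is a minimal sketch of that conversion in Python. The matrix below is the standard XYZ -> linear sRGB matrix (which assumes a D65 white point); no chromatic adaptation is applied anywhere, matching the behaviour described above.

```python
import numpy as np

# Standard XYZ -> linear sRGB matrix (sRGB primaries, D65 white point).
# No chromatic adaptation is performed - the CA matrix stays identity;
# D65 enters only because it is baked into this fixed matrix.
XYZ_TO_SRGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

def xyz_to_srgb(xyz):
    """Convert XYZ values (shape (..., 3)) to gamma-encoded sRGB."""
    rgb = np.asarray(xyz) @ XYZ_TO_SRGB.T   # linear sRGB
    rgb = np.clip(rgb, 0.0, 1.0)
    # sRGB transfer function (gamma encoding)
    return np.where(rgb <= 0.0031308,
                    12.92 * rgb,
                    1.055 * rgb ** (1 / 2.4) - 0.055)

# The D65 white point (X, Y, Z ~ 0.9505, 1.0, 1.0890) maps to equal RGB,
# which is why untouched data appears "as if" balanced to D65:
print(xyz_to_srgb([0.9505, 1.0, 1.0890]))   # ~ [1, 1, 1]
```

Only sources whose XYZ happens to equal the D65 white point come out neutral; everything else passes through unbalanced.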


22 minutes ago, sharkmelley said:

On the contrary, the images in the second row are white balanced.  DCRAW is performing a white balance using the default assumption that the scene is lit with the D65 illuminant because that is the only information the code has in the absence of reading the camera's white balance setting from the EXIF.

Mark

I think that most of the confusion comes from the fact that cameras always think in terms of illuminants and a reflectance model, while here we are discussing an emission model.

In Adobe DNG, XYZ is specified with a D50 illuminant. Does that mean the data was somehow white balanced to D50? No, it does not.

Please look carefully at this section on wiki:

https://en.wikipedia.org/wiki/CIE_1931_color_space#Computing_XYZ_from_spectral_data

In the emissive case there is no mention of an illuminant - there is no need for one. We are dealing only with the color of the light, not the color of objects.

In the reflective case, a particular illuminant is used when we want to record the color of an object. This is because objects don't have color on their own - color is not in the object, it is in the light spectrum. We need to illuminate the object with a particular illuminant, and only then do we get a light spectrum.

Camera profiles are not made with emission light sources (for some reason) but rather by shooting calibration charts. The standard is to use D50 to illuminate those charts. I guess it is easier for the camera industry to do it that way than to use reference light sources and spectroscopy to derive the transform matrices.
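The emissive-case computation from the linked wiki section can be sketched directly: XYZ is just the spectrum integrated against the CIE 1931 colour matching functions, with no illuminant anywhere. The CMFs below use the analytic multi-Gaussian approximation of Wyman, Sloan & Shirley (2013); the black-body example spectrum is illustrative.

```python
import numpy as np

def cmf_1931(lam):
    """Approximate CIE 1931 colour matching functions (Wyman et al. 2013).
    lam in nm; returns (xbar, ybar, zbar)."""
    def g(x, mu, s1, s2):
        s = np.where(x < mu, s1, s2)   # piecewise-Gaussian lobe
        return np.exp(-0.5 * ((x - mu) / s) ** 2)
    xbar = (1.056 * g(lam, 599.8, 37.9, 31.0)
            + 0.362 * g(lam, 442.0, 16.0, 26.7)
            - 0.065 * g(lam, 501.1, 20.4, 26.2))
    ybar = 0.821 * g(lam, 568.8, 46.9, 40.5) + 0.286 * g(lam, 530.9, 16.3, 31.1)
    zbar = 1.217 * g(lam, 437.0, 11.8, 36.0) + 0.681 * g(lam, 459.0, 26.0, 13.8)
    return xbar, ybar, zbar

def emissive_xy(lam, spectrum):
    """xy chromaticity of an emissive source, from its spectrum alone -
    note that no illuminant appears anywhere in this calculation."""
    xbar, ybar, zbar = cmf_1931(lam)
    X = np.sum(spectrum * xbar)
    Y = np.sum(spectrum * ybar)
    Z = np.sum(spectrum * zbar)
    return X / (X + Y + Z), Y / (X + Y + Z)

# Example: a 5800 K black body (roughly solar) over the visible range.
lam = np.linspace(400.0, 700.0, 301)
lam_m = lam * 1e-9
h, c, kB = 6.626e-34, 2.998e8, 1.381e-23
planck = (2 * h * c**2 / lam_m**5) / np.expm1(h * c / (lam_m * kB * 5800.0))
x, y = emissive_xy(lam, planck)   # lands near the Planckian locus
```

The same function applied to a reflective scene would first need the object's reflectance multiplied by a chosen illuminant spectrum, which is exactly where the ambiguity discussed here enters.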


22 minutes ago, vlaiv said:

I think that most of the confusion comes from the fact that cameras always think in terms of illuminants and a reflectance model, while here we are discussing an emission model.

In Adobe DNG, XYZ is specified with a D50 illuminant. Does that mean the data was somehow white balanced to D50? No, it does not.

On the contrary, I believe it does mean that the data has been white balanced to D50 and I'll explain why by means of an example.   Take two photos of a colour chart - one with tungsten light and one with daylight.  If you process your data with the correct settings then the resulting two images will look virtually identical.  A high level description of the Adobe Camera Raw and DCRAW processing sequence is the following:

CameraRGB -> XYZ(D50) -> sRGB     where XYZ(D50) is the all important profile connection space

We already know that the XYZ(D50)->sRGB matrix is a standard matrix, so if the two images look identical in sRGB they must also look identical in XYZ(D50).  So it's clear that both images received a different white balance during the CameraRGB -> XYZ(D50) operation.
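The sequence described above can be sketched as follows. The forward matrix and raw triplets are invented for illustration (real values live in the camera profile or DNG tags); the point is that dividing by the raw response of a neutral patch IS the white balance step, and it forces that patch onto the profile's white point.

```python
import numpy as np

# Hypothetical CameraRGB -> XYZ(D50) forward matrix - values are made up;
# a real one comes from the camera profile / DNG ForwardMatrix tag.
FORWARD = np.array([
    [0.64,  0.19,  0.16],
    [0.29,  0.73, -0.02],
    [0.03, -0.10,  0.89],
])

def camera_to_xyz_d50(raw_rgb, neutral_raw):
    """Sketch of the CameraRGB -> XYZ(D50) step: per-channel division by
    a neutral patch's raw values is the white balance - it maps that patch
    to FORWARD @ [1,1,1] regardless of the scene illuminant."""
    balanced = np.asarray(raw_rgb) / np.asarray(neutral_raw)
    return FORWARD @ balanced

# The same grey card shot under tungsten and under daylight gives very
# different raw triplets (made-up numbers), yet both land on the same
# point in the profile connection space:
tungsten_grey = np.array([0.80, 0.45, 0.20])
daylight_grey = np.array([0.50, 0.52, 0.48])
a = camera_to_xyz_d50(tungsten_grey, tungsten_grey)
b = camera_to_xyz_d50(daylight_grey, daylight_grey)
# a == b: both equal FORWARD @ [1, 1, 1]
```

This is why the two chart images end up virtually identical in XYZ(D50): each received a different white balance on the way in.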

Mark

Edited by sharkmelley

6 minutes ago, sharkmelley said:

On the contrary, I believe it does mean that the data has been white balanced to D50 and I'll explain why by means of an example.   Take two photos of a colour chart - one with tungsten light and one with daylight.  If you process your data with the correct settings then the resulting two images will look virtually identical.  A high level description of the Adobe Camera Raw and DCRAW processing sequence is the following:

CameraRGB -> XYZ(D50) -> sRGB     where XYZ(D50) is the all important profile connection space

We already know that the XYZ(D50)->sRGB matrix is a standard matrix, so if the two images look identical in sRGB they must also look identical in XYZ(D50).  So it's clear that both images received a different white balance during the CameraRGB -> XYZ(D50) operation.

Mark

Do the same with stars, or torches that produce differently colored light. What illuminant should you use for those?

Will the color of the light change if you change the scene illumination? Look at two colored lights and change the lighting in your room - will those colors change?


37 minutes ago, vlaiv said:

Do the same with stars, or torches that produce differently colored light. What illuminant should you use for those?

Will the color of the light change if you change the scene illumination? Look at two colored lights and change the lighting in your room - will those colors change?

For scenes with purely emissive light sources (e.g. night-time city-scapes, fireworks, astrophotography) photographers typically use the daylight setting.  Experienced astrophotographers are sometimes a bit more picky and will choose a certain star class or average galaxy as their white point.  

However, if you want your monitor with a D65 white point to faithfully display your emissive colours (rather than relying on the eye/brain to adapt to D65) then you would choose D65 as your white point. 

Mark

Edited by sharkmelley

6 minutes ago, sharkmelley said:

I really don't see what's difficult about any of this.

Except there is no white balancing involved when you want to faithfully show an emission-type image.

White balancing is ambiguous. I have an image of a white piece of paper. Which white-balanced image is correct:

[image: white-balanced version 1]

[image: white-balanced version 2]

Or the original unbalanced image:

[image: original unbalanced image]

Yes, it is a white piece of paper - illuminated with reddish light on one side and bluish light on the other. What is it that we want to show in the image? That the piece of paper is white? How do we choose which illuminant is the correct one for this scene?

If we want to show what really happened - we don't balance the image.


11 minutes ago, vlaiv said:

Except there is no white balancing involved when you want to faithfully show an emission-type image.

White balancing is ambiguous. I have an image of a white piece of paper. Which white-balanced image is correct:

Yes, it is a white piece of paper - illuminated with reddish light on one side and bluish light on the other. What is it that we want to show in the image? That the piece of paper is white? How do we choose which illuminant is the correct one for this scene?

If we want to show what really happened - we don't balance the image.

Standard raw converters such as Photoshop/CameraRaw, RawTherapee, Capture One etc.  require a colour temperature (and tint) to be chosen (implicitly or explicitly) as the white point.  Using those tools you can't "opt out" of white balancing and the artist is therefore forced to choose the effect they desire.  If you want to faithfully reproduce emissive colours then you must choose a colour temperature appropriate to your output colour space and display e.g. use D65 for sRGB and AdobeRGB on a properly calibrated display.

On the other hand if you want to build your own processing engine from scratch, you can design it any way you wish.  You don't even need to use a standard colour space - for instance I once designed my own "MarkRGB" colour space with extreme wide gamut primaries, linear gamma and an appropriate ICC profile. But if you want other people to take your images and see them the same way you saw them then you will need to hope that the ICC profile is correctly interpreted by the application that opens the image and that colours beyond the gamut of the display are treated sympathetically. 

Mark


1 minute ago, sharkmelley said:

Standard raw converters such as Photoshop/CameraRaw, RawTherapee, Capture One etc.  require a colour temperature (and tint) to be chosen (implicitly or explicitly) as the white point.  Using those tools you can't "opt out" of white balancing and the artist is therefore forced to choose the effect they desire.  If you want to faithfully reproduce emissive colours then you must choose a colour temperature appropriate to your output colour space and display e.g. use D65 for sRGB and AdobeRGB on a properly calibrated display.

On the other hand if you want to build your own processing engine from scratch, you can design it any way you wish.  You don't even need to use a standard colour space - for instance I once designed my own "MarkRGB" colour space with extreme wide gamut primaries, linear gamma and an appropriate ICC profile. But if you want other people to take your images and see them the same way you saw them then you will need to hope that the ICC profile is correctly interpreted by the application that opens the image and that colours beyond the gamut of the display are treated sympathetically. 

Mark

Let me ask you the following.

If I take my camera, select custom white balance, leave everything at 0, and export to the XYZ color space:

- Will any white point be applied?

- Will any white balancing be done?


Forgetting the camera / eyeball for a moment, I had always thought that for emissive objects the Sun, G2V, was by definition white. I'm not sure whether the definition applies outside the atmosphere or at unit air mass. Hence the use of G2V stars in image processing (with due regard to interstellar reddening).

Regards Andrew 


8 minutes ago, andrew s said:

Forgetting the camera / eyeball for a moment, I had always thought that for emissive objects the Sun, G2V, was by definition white. I'm not sure whether the definition applies outside the atmosphere or at unit air mass. Hence the use of G2V stars in image processing (with due regard to interstellar reddening).

Regards Andrew 

That is a step in the right direction, but again, if we want to be correct we must not think in terms of white and yellow - but rather in terms of spectra.

If you think a G2V star should have 1:1:1 coordinates in the image - that again depends on which color space you select. With sRGB, neither a G2V star viewed directly nor one viewed through the atmosphere will be exactly 1:1:1.

This is because the spectrum of D65 - which is 1:1:1 in sRGB - looks like this:

[image: D65 spectral power distribution]

The Sun has the following spectrum:

[image: solar spectrum, at sea level (red) vs top of atmosphere (yellow)]

I cut it roughly at 400-700nm; red is at sea level and yellow is at the top of the atmosphere.

Here is what the wiki says about D65:

Quote

D65 corresponds roughly to the average midday light in Western Europe / Northern Europe (comprising both direct sunlight and the light diffused by a clear sky), hence it is also called a daylight illuminant.

It is averaged from spectrometric data.

In a color space with the D50 illuminant as its white point, it will have these coordinates:

[image: coordinates of D65 in a color space with a D50 white point]

It does not mean that this color suddenly becomes different - it is still the same spectral distribution. It just means that the color space has its 1:1:1 placed at a different point / different spectral distribution - by convention.

In any case - taking a single star and trying to white balance on it will not give you proper colors - only relatively close colors.

Taking a bunch of stars with known spectra, calculating their XYZ values from those spectra, deriving a transform matrix from your raw measured values to those XYZ values by the method of least squares, transforming your image with that matrix, then applying the XYZ to sRGB conversion matrix and the sRGB gamma - that will produce proper colors :D
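The least-squares step above is a one-liner. Here is a sketch with synthetic demo data (the "true" mixing matrix and the star measurements are made up); in practice `raw` would hold the sensor RGB of measured stars and `xyz` the XYZ values computed from their reference spectra.

```python
import numpy as np

def fit_raw_to_xyz(raw, xyz):
    """Least-squares fit of a 3x3 matrix M such that M @ raw_i ~ xyz_i.
    raw: (N, 3) raw sensor RGB of N stars.
    xyz: (N, 3) XYZ values computed from their spectra."""
    M_T, *_ = np.linalg.lstsq(raw, xyz, rcond=None)   # solves raw @ M_T ~ xyz
    return M_T.T

# Made-up demo: a known mixing matrix plus a little measurement noise.
rng = np.random.default_rng(0)
true_M = np.array([[ 0.90, 0.20, -0.10],
                   [ 0.10, 1.10,  0.05],
                   [-0.05, 0.10,  1.20]])
raw = rng.uniform(0.1, 1.0, size=(30, 3))            # 30 "stars"
xyz = raw @ true_M.T + rng.normal(0, 1e-3, size=(30, 3))

M = fit_raw_to_xyz(raw, xyz)   # recovers true_M to within the noise
```

With the matrix in hand, `image_xyz = image_raw @ M.T`, after which the standard XYZ -> sRGB matrix and gamma from earlier in the thread finish the job.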

The problem is that we don't have exact spectra for the stars in our image; the best we have is spectral class, or effective temperature with Gaia DR2 data.

 

 


Beware effective temperature. It is the temperature of a black body that has the same luminosity per unit area as the star. For many classes of star, especially the cooler ones, it does not match the spectrum of the star, as the spectrum is far from that of a black body.

Regards Andrew 


Just now, andrew s said:

Beware effective temperature. It is the temperature of a black body that has the same luminosity per unit area as the star. For many classes of star, especially the cooler ones, it does not match the spectrum of the star, as the spectrum is far from that of a black body.

Regards Andrew 

Out of interest, how similar are the spectra within a particular stellar class?

Say you take a couple of A0V stars (or some other stellar class) and compare their spectra. How much difference will there be - especially if you integrate the signal over some range, for example every 10nm (so you have ~30 samples over the visible range 400-700nm)?

We don't have to use effective temperature - we can take a reference spectrum for each class and calculate XYZ coordinates or xy chromaticity for each class (provided that different stars of the same stellar class give the same integrated values).
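The 10 nm binning proposed above is easy to sketch. The two "same class" spectra below are synthetic (a power-law continuum with a narrow absorption line of slightly different depth, standing in for real class-to-class variation); the point is that coarse binning largely washes out narrow-line differences.

```python
import numpy as np

def bin_spectrum(lam, flux, lo=400.0, hi=700.0, width=10.0):
    """Average a spectrum into fixed-width bins (default: 30 bins of
    10 nm over 400-700 nm) and normalise to the peak bin."""
    edges = np.arange(lo, hi + width, width)
    idx = np.digitize(lam, edges) - 1
    binned = np.array([flux[idx == i].mean() for i in range(len(edges) - 1)])
    return binned / binned.max()

# Two synthetic spectra of "the same class": identical continuum, one
# narrow line (sigma ~ 2 nm) with slightly different depth - made-up data.
lam = np.linspace(400.0, 700.0, 3001)
cont = lam ** -2.0
line = np.exp(-0.5 * ((lam - 486.0) / 2.0) ** 2)
star1 = cont * (1 - 0.30 * line)
star2 = cont * (1 - 0.35 * line)

b1, b2 = bin_spectrum(lam, star1), bin_spectrum(lam, star2)
diff = np.abs(b1 - b2).max()   # small: the binned shapes nearly agree
```

If real reference spectra per class pass the same test, a single set of per-class XYZ coordinates should be usable.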


Quick go at this - I need to check I did the sums correctly for the Planck curves. The plots are relative, so the A0V spectrum can be translated vertically with respect to the Planck curves. Teff for A0V is 9727K.

Regards Andrew

[image: Planck curves vs A0V spectrum]


12 minutes ago, andrew s said:

Quick go at this - I need to check I did the sums correctly for the Planck curves. The plots are relative, so the A0V spectrum can be translated vertically with respect to the Planck curves. Teff for A0V is 9727K.

What is that sudden drop in intensity below about 380nm? Is that the spectrum of an A0V star as seen from Earth, with atmospheric attenuation of the UV, or is it the actual spectrum of the star as would be recorded outside the atmosphere - and if so, why does it differ so significantly from a black body?


I think you are not far off with that effective temperature thing; here is a comparison between the Sun and Vega:

[image: comparison of Sun and Vega spectra with black-body fits]

The Sun is pretty close but Vega is quite far off...

Then I guess we have to figure out how similar spectra of the same stellar class are, for each of the stellar classes.


17 minutes ago, vlaiv said:

What is that sudden drop in intensity below about 380nm? Is that spectrum of A0V star from earth and that is atmospheric attenuation of UV or is that actual spectrum of star as would be recorded outside of atmosphere - and why does it differ significantly from black body?

No, the drop is real. It's called the Balmer jump: it's due to electrons being ionised directly from the second level of the hydrogen atom, causing strong absorption.

Regards Andrew 


23 minutes ago, vlaiv said:

I think you are not far off with that effective temperature thing; here is a comparison between the Sun and Vega:

[image: comparison of Sun and Vega spectra with black-body fits]

The Sun is pretty close but Vega is quite far off...

Then I guess we have to figure out how similar spectra of the same stellar class are, for each of the stellar classes.

As the image shows, the black body that fits the continuum will have a much higher bolometric output than the star, due to the Balmer jump.

Stars of the same class (both spectral type and luminosity class) will be much the same apart from interstellar reddening, though they can be de-reddened using B-V etc. There can be small differences due to metallicity.

Regards Andrew 

Edited by andrew s

4 minutes ago, andrew s said:

As the image shows, the black body that fits the continuum will have a much higher bolometric output than the star, due to the Balmer jump.

Stars of the same class (both spectral type and luminosity class) will be much the same apart from interstellar reddening, though they can be de-reddened using B-V etc. There can be small differences due to metallicity.

Regards Andrew 

That helps quite a bit. We just need this library:

https://www.eso.org/sci/facilities/paranal/decommissioned/isaac/tools/lib.html

and some code to calculate the XYZ / xy chromaticity of the different stellar classes.

Now, on to the final piece of the puzzle - where to find accurate sensor QE curves, or at least an easy method to measure them (like using a Star Analyser and an affordable light source of known spectrum - is there such a thing?).


You might want to look at C. Buil's ISIS software. It has several catalogues built in and the tools to manipulate them. Getting the true QE of the whole imaging train will be difficult, as it's not just the sensor but also the absorption and reflectivity of the lenses and mirrors.

You could take images of spectroscopic standard stars through filters and then compare the values with those derived from published spectra and the filter profiles (assuming these are accurate). You will still have the issue of airmass!

It's a very hard problem to do accurately. Clearly PixInsight put a lot of effort in, even if, as you feel, it's not right to use a white point.

Regards Andrew


3 minutes ago, andrew s said:

You might want to look at C. Buil's ISIS software. It has several catalogues built in and the tools to manipulate them. Getting the true QE of the whole imaging train will be difficult, as it's not just the sensor but also the absorption and reflectivity of the lenses and mirrors.

You could take images of spectroscopic standard stars through filters and then compare the values with those derived from published spectra and the filter profiles (assuming these are accurate). You will still have the issue of airmass!

It's a very hard problem to do accurately. Clearly PixInsight put a lot of effort in, even if, as you feel, it's not right to use a white point.

Regards Andrew

Indeed it is.

One could fit a sensor-raw to XYZ transform solely on stars, for example; however, that is a very narrow sample from which to derive an accurate transform, as most stars are concentrated around the Planckian locus, which covers only a very small section of the chromaticity diagram.

For that reason, I would like to have a way to derive the transform independently, and then just use stars to remove the atmospheric reddening of the image.

If I had more confidence in the XYZ produced by a DSLR, that would be a simple way to calibrate things. Maybe I could use a computer screen as a reference, once I calibrate it properly for sRGB with a calibration tool?

I did try to calibrate my screen with my DSLR. The white point part was not hard - I managed to set a proper white point using the RGB controls (at least according to the DSLR's readings, if they are to be believed). However, I have issues with gamma. There is no simple way to set it, and when I measure the gamma it is not the sRGB standard one.

I create, say, 5-10 grey squares and assign what should be linear light intensities of 0, 20, 40, 60, 80 and 100%. When I take an image of my screen, measure and normalize the values, I get deviations - 20% comes out at for example 18%, and 60% is closer to 66% - so it is not just a simple wrong power law: the gamma curve is "wavy" rather than a wrong power, sometimes over and sometimes under the expected value.
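The "wavy, not a wrong power" diagnosis can be checked numerically: fit a single power law in log space and look at the residuals. The measured values below are invented to mimic the deviations described; a genuinely wavy response leaves residuals that change sign, which no single gamma exponent can remove.

```python
import numpy as np

# Nominal linear intensities of the grey squares and (made-up) measured,
# normalised luminances mimicking the deviations described above.
nominal  = np.array([0.20, 0.40, 0.60, 0.80, 1.00])
measured = np.array([0.18, 0.41, 0.66, 0.78, 1.00])

# Fit a single power law measured ~ nominal**gamma in log space.
gamma = np.polyfit(np.log(nominal), np.log(measured), 1)[0]

# Residuals against the pure power law: sign changes indicate a wavy
# response rather than simply a wrong exponent.
residuals = measured - nominal ** gamma
```

If the residuals were all one sign, adjusting gamma would fix the screen; alternating signs mean a per-channel lookup-table correction (what calibration tools load into the video card) is needed instead.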


6 hours ago, vlaiv said:

Let me ask you the following.

If I take my camera, select custom white balance, leave everything at 0, and export to the XYZ color space:

- Will any white point be applied?

- Will any white balancing be done?

I'm not quite sure what you are asking, so I hope the following helps.

If you look in a DNG header you will typically find a couple of XYZ(D50) to CameraRGB matrices - one for illuminant A and one for illuminant D65. If your custom white balance is near illuminant A it will use the first matrix, and if it is near D65 it will use the second. So the same raw camera data will produce very different results in XYZ(D50) space depending on which custom white balance you set.
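For white balance settings between the two calibration illuminants, the DNG specification interpolates between the two matrices linearly in inverse colour temperature (mired). A sketch, with invented matrix values standing in for the real ColorMatrix tags:

```python
import numpy as np

# Hypothetical XYZ(D50) -> CameraRGB calibration matrices, as would be
# read from a DNG header for illuminant A (~2856 K) and D65 (~6504 K).
# Values are made up for illustration.
M_A   = np.array([[0.8, 0.1,  0.1], [0.2, 0.9, -0.1], [0.0, 0.1, 0.7]])
M_D65 = np.array([[0.7, 0.2,  0.1], [0.1, 1.0, -0.1], [0.0, 0.0, 0.8]])

def interpolate_matrix(cct, cct_a=2856.0, cct_d65=6504.0):
    """DNG-style interpolation between the two calibration matrices,
    linear in inverse colour temperature, clamped at the endpoints."""
    w = (1.0 / cct - 1.0 / cct_d65) / (1.0 / cct_a - 1.0 / cct_d65)
    w = min(max(w, 0.0), 1.0)
    return w * M_A + (1.0 - w) * M_D65

# A custom white balance near 4800 K uses a blend of the two calibrations:
M = interpolate_matrix(4800.0)
```

At or below 2856 K the illuminant-A matrix is used unchanged; at or above 6504 K, the D65 matrix; anything in between gets a weighted mix, which is why the chosen white balance changes the camera-to-XYZ mapping itself.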

Mark


1 minute ago, sharkmelley said:

So the same raw camera data will produce very different results in XYZ(D50) space depending on which custom white balance you set.

What if I set a 0 custom balance?

[image: camera white balance adjustment axes (blue-amber / green-magenta)]

So no shift to either side - just 0/0?


13 minutes ago, vlaiv said:

What if I set a 0 custom balance?

[image: camera white balance adjustment axes (blue-amber / green-magenta)]

So no shift to either side - just 0/0?

My guess is that 0/0 corresponds to the camera's default "Daylight" setting. Unfortunately, the colour temperature corresponding to "Daylight" varies from camera to camera. For instance, "Daylight" on my Sony A7S corresponds to 4800K when opening the image "As Shot" in Adobe Camera Raw, while on my Canon 600D it corresponds to 5200K. In both cases a matrix interpolated between illuminant A and illuminant D65 will be used.

Mark

