
Problem getting colours right with ZWO533mc



Last year I started to use a ZWO533mc colour camera, but I do not get the colours right at all.
Until now I used (ZWO183) b/w cameras with LRGB/Ha filters, and also (modified) Canon colour cameras. Until the Corona closure of my observing site I used a C14/Hyperstar telescope, but for the time being I now use smaller backyard refractors. Images are stored in FITS format. Stacking was done with DSS, APP, and Nebulosity.
The images are horribly greenish, beyond repair. There must be a reason for this, and I have tried everything I could think of. Debayering? I am more an observer than an image processor, but still I feel stupid that I cannot trace a cure. Can someone show me a way out?
The attached picture is a simple stack, without any calibration, but shows my problem.
Johan.

 

M42 stack jpg.jpg


The Bayer matrix is RGGB, so that might be a problem, but in general OSC cameras need color correction performed on their data to get good color.

A DSLR has a color correction matrix embedded, depending on which lighting you select (when you choose a white balance, you choose a particular color transform).

Very rudimentary color correction is performed on OSC sensors when you do a flat exposure, if your flat source is white (and in principle it should be).

If you want to do such color correction without taking flats (and I strongly advocate the use of proper calibration with flats), you can do a photometric measurement on a star of your choice and then scale the R, G and B channels accordingly. Mind you, you need to perform these measurements on a star that is not clipped, and while the data is still linear, prior to stretching.

Any star that is close to 6500 K is a good calibration source (F2, for example). If you can't find an F2 star, you can try a G star and adjust the color temperature later in processing.
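
A minimal sketch of that star-based scaling in Python, assuming the stack is already debayered into separate linear R, G and B arrays and that `star_slice` is a hypothetical selection around an unclipped, roughly 6500 K star:

```python
import numpy as np

def scale_on_star(r, g, b, star_slice):
    """Scale the linear R and B channels so a chosen ~6500 K star comes out neutral.

    r, g, b    -- 2D arrays of linear, background-subtracted channel data
    star_slice -- e.g. (slice(y0, y1), slice(x0, x1)) around an unclipped star
    """
    # Integrated star flux per channel
    r_flux = r[star_slice].sum()
    g_flux = g[star_slice].sum()
    b_flux = b[star_slice].sum()

    # Use green as the reference; a ~6500 K star should end up with R = G = B
    return r * (g_flux / r_flux), g, b * (g_flux / b_flux)
```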

The ideal way to get good color would be to derive a color transform matrix. If you have a DSLR, you can use any color chart - even your smartphone showing a color chart like this:

[attached image: colour chart]

You shoot your color chart target with both the DSLR and the ASI533. Leave the white balance on the DSLR on "authentic" (or however the setting is called - one that does not apply a particular white balance). Use DCRAW to extract an XYZ image from the DSLR. Measure the values of the colors while the data is still linear and then derive the color transform matrix between the two sets of colors using the least squares method. Multiply the resulting matrix with the XYZ -> sRGB color matrix to get the RAW -> sRGB matrix for the ASI533.
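
A rough sketch of that least-squares step, assuming `raw_patches` holds the linear ASI533 RGB measurements of the chart patches and `xyz_patches` the matching linear XYZ values from the DSLR/DCRAW image (one row per patch; both are hypothetical inputs you would measure yourself):

```python
import numpy as np

# Standard linear XYZ (D65) -> linear sRGB matrix
XYZ_TO_SRGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

def derive_raw_to_srgb(raw_patches, xyz_patches):
    """Fit a 3x3 RAW -> XYZ matrix in the least-squares sense, then
    compose it with XYZ -> sRGB to get the RAW -> sRGB matrix."""
    # Solve raw_patches @ M.T ~= xyz_patches for M
    M_t, *_ = np.linalg.lstsq(raw_patches, xyz_patches, rcond=None)
    raw_to_xyz = M_t.T
    return XYZ_TO_SRGB @ raw_to_xyz
```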

 


On an image like that, the RGB histogram peaks will be the 'sky' background. As a basic starting point, just try setting the peak of each channel's histogram to the same value using your processing software (by offsetting each channel's data, not by changing the RGB ratios, as that may cause more issues).
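
A small sketch of that idea, assuming `r`, `g` and `b` are the linear stacked channels; the per-channel histogram peak (the sky background) is located with a coarse histogram and each channel is offset so the peaks coincide:

```python
import numpy as np

def equalise_background_peaks(r, g, b, bins=2048):
    """Offset each channel so its histogram peak (the sky background) lands
    at the same level; channel gains/ratios are left untouched."""
    def peak(ch):
        hist, edges = np.histogram(ch, bins=bins)
        i = np.argmax(hist)
        return 0.5 * (edges[i] + edges[i + 1])

    peaks = [peak(c) for c in (r, g, b)]
    target = min(peaks)  # shift everything down to the lowest background level
    return tuple(c - (p - target) for c, p in zip((r, g, b), peaks))
```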


I had this with my ASI294MC Pro but I could process it out with PixInsight using the DynamicBackgroundExtraction tool. Then, when playing with the settings in DSS I found a setting which removed most of the green cast.

In DSS, go to the Settings Menu > Stacking Settings > Light tab > click where it says "RGB Channels Background Calibration" and you get a drop-down menu, click on "RGB Channels Background Calibration" if there isn't a tick next to it and then click on "Options". In the Background Options window I have the Calibration Method set to "Rational" and the RGB Channels Background Calibration set to "Minimum".

Click OK to save the setting changes and run the stack again and most of the green cast should have been removed. ;)


A green-dominant OSC/DSLR dataset tends to be a good sign. All you need to do is establish and subtract - in the linear domain - the bias for each channel (indeed, ABE/DBE or Wipe will do this). Then you colour balance - again in the linear domain - by multiplying each individual channel by a single factor. You will then get the expected colouring (in the linear domain).
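
Purely as an illustration of that two-step linear workflow (a flat median stands in for what ABE/DBE or Wipe would model much more carefully), one possible sketch:

```python
import numpy as np

def linear_balance(r, g, b, factors=(1.0, 1.0, 1.0)):
    """1) subtract a per-channel background estimate in the linear domain,
       2) multiply each channel by a single factor (e.g. derived from a white star)."""
    out = []
    for ch, k in zip((r, g, b), factors):
        background = np.median(ch)  # crude stand-in for ABE/DBE/Wipe background modelling
        out.append(np.clip(ch - background, 0.0, None) * k)
    return out
```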

Macbeth charts and/or calibration against them are irrelevant to astrophotography where objects are mostly emissive and cover a huge (virtually infinite) dynamic range. Nothing is lit by a uniform light source at a single brightness, nothing neatly reflects light in the same way, while there is no common agreed upon white balance either. There is a reason why this functionality is not offered in AP software (StarTools excepted, but I implemented this more as a curiosity for terrestrial purposes and debugging, rather than as something that is a must/recommended) and why instrument manufacturers don't bother with providing such a matrix; its assumptions and tightly controlled conditions don't apply.

Hope this helps!

Edited by jager945

7 hours ago, jager945 said:

Macbeth charts and/or calibration against them are irrelevant to astrophotography where objects are mostly emissive and cover a huge (virtually infinite) dynamic range. Nothing is lit by a uniform light source at a single brightness, nothing neatly reflects light in the same way, while there is no common agreed upon white balance either. There is a reason why this functionality is not offered in AP software (StarTools excepted, but I implemented this more as a curiosity for terrestrial purposes and debugging, rather than as something that is a must/recommended) and why instrument manufacturers don't bother with providing such a matrix; its assumptions and tightly controlled conditions don't apply.

Light has an exactly defined color - in fact, color is light of a certain spectrum.

It is a big fallacy that there is no proper color balance / accurate color in astrophotography.

Every manufacturer could produce a RAW to XYZ color transform matrix, as that absolutely does not depend on any conditions - it can be mathematically derived from the sensor's characteristics, its spectral sensitivity curves. However, manufacturers don't do that.

A RAW to XYZ matrix is the only thing one needs to accurately match recorded color within the camera gamut and the gamut of the display device - for example sRGB.
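
As a sketch of how such a matrix could in principle be computed, assuming `cam_qe` holds the sensor's R/G/B spectral response curves and `cie_xyz` the CIE 1931 standard-observer curves, both sampled on the same wavelength grid, with `test_spectra` a representative set of test spectra (all hypothetical inputs):

```python
import numpy as np

def raw_to_xyz_matrix(cam_qe, cie_xyz, test_spectra):
    """Derive a 3x3 RAW -> XYZ matrix from the sensor's spectral response.

    cam_qe       -- (3, W) sensor R/G/B response sampled over wavelength
    cie_xyz      -- (3, W) CIE 1931 standard-observer curves on the same grid
    test_spectra -- (N, W) representative test spectra
    """
    # What the camera and the standard observer would each record for every spectrum
    cam_responses = test_spectra @ cam_qe.T    # (N, 3) raw tristimulus values
    xyz_responses = test_spectra @ cie_xyz.T   # (N, 3) reference XYZ values

    # Least-squares 3x3 matrix mapping camera responses to XYZ
    M_t, *_ = np.linalg.lstsq(cam_responses, xyz_responses, rcond=None)
    return M_t.T
```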


1 hour ago, vlaiv said:

Light has an exactly defined color - in fact, color is light of a certain spectrum.

It is a big fallacy that there is no proper color balance / accurate color in astrophotography.

Every manufacturer could produce a RAW to XYZ color transform matrix, as that absolutely does not depend on any conditions - it can be mathematically derived from the sensor's characteristics, its spectral sensitivity curves. However, manufacturers don't do that.

A RAW to XYZ matrix is the only thing one needs to accurately match recorded color within the camera gamut and the gamut of the display device - for example sRGB.

I'm afraid none of that is true. I recall a similar exchange with you some time ago in which I urged you to study up on this and gave you the sources to do so. It saddens me you have not taken that opportunity.


6 minutes ago, jager945 said:

I'm afraid none of that is true. I recall a similar exchange with you some time ago in which I urged you to study up on this and gave you the sources to do so. It saddens me you have not taken that opportunity.

That is quite condescending.

In any case, I felt obligated to write the above so that people reading this thread would not be misled by false information; further, everyone is free to research it for themselves.

For those interested, here are relevant links:

https://en.wikipedia.org/wiki/CIE_1931_color_space

https://en.wikipedia.org/wiki/SRGB

 


9 hours ago, vlaiv said:

That is quite condescending.

In any case, I felt obligated to write the above so that people reading this thread would not be misled by false information; further, everyone is free to research it for themselves.

For those interested, here are relevant links:

https://en.wikipedia.org/wiki/CIE_1931_color_space

https://en.wikipedia.org/wiki/SRGB

 

Then let's add some slightly more relevant links as well;

https://jakubmarian.com/difference-between-violet-and-purple/ (contradicting the incorrect assertion that "in fact color is light of a certain spectrum")

http://www.cvc.uab.es/color_calibration/SpaceConv.htm

(contradicting the incorrect assertion that "every manufacturer could produce RAW to XYZ color transform matrix as that absolutely does not depend on any conditions")

If anyone is interested, in good faith, in more information or further explanations, do let me know.


6 minutes ago, jager945 said:

https://jakubmarian.com/difference-between-violet-and-purple/ (contradicting the incorrect assertion that "in fact color is light of a certain spectrum")

It absolutely does not contradict that statement.

It states that violet is a spectral color - meaning it is produced by a single wavelength - while purple is not, which means it consists of multiple wavelengths.

However, both have a certain spectrum: the violet spectrum consists of a single wavelength, while the purple spectrum consists of multiple wavelengths.

Here is a quote from the article:

Quote

In other words purple is not a spectral colour. You can have a source of monochromatic violet light (i.e. a source producing just a single wavelength), but everything that looks purple must emit both red and blue light.

You are confusing the terms "spectral color" and "color of a certain spectrum". The first means that the color exists as a single wavelength (it appears in the rainbow spectrum); the second means that a certain spectrum - in this case an energy distribution over a range of wavelengths - defines the color that we see.

11 minutes ago, jager945 said:

http://www.cvc.uab.es/color_calibration/SpaceConv.htm

(contradicting the incorrect assertion that "every manufacturer could produce RAW to XYZ color transform matrix as that absolutely does not depend on any conditions")

Again, it does not - I did not say that such a color transform matrix will be 100% accurate. In fact, I mentioned that both the target color space and the camera have a gamut and that colors are matched between those. If a camera does not record wavelengths above, say, 600 nm, it can't record spectral red, for example - that lies outside the camera's gamut.

12 hours ago, vlaiv said:

A RAW to XYZ matrix is the only thing one needs to accurately match recorded color within the camera gamut and the gamut of the display device - for example sRGB.

 


Mate, the fact that you commented within minutes on a massive dissertation about color space conversions means you are either an amazing speed-reader of scientific literature, or more interested in "being right" than in actually furthering your understanding of the subject matter.

The comprehensive color space conversion research page is literally titled "Camera to CIE XYZ conversions: an ill-posed problem". It literally goes into all the different subjective decisions that need to be made when constructing a solution to the matrix. I cannot fathom how you can maintain that construction of the transform matrix "absolutely does not depend on any conditions" when (some of) those conditions and constraints are spelled out for you. AP puts another heap of subjective variables into the equation (such as picking an illuminant), but we haven't even made it past you acknowledging the basics yet. 😕

Colours don't "have spectrum" (unless you're simply asserting that visible light falls on the electromagnetic spectrum, which is pretty self-evident?). Colours are a human construct (e.g. to a human, "parts of the electromagnetic spectrum appear to have colors under certain conditions", if you really must put it in those terms), which purple vs violet neatly demonstrates (brown is another fun one). Case in point: purple and violet appear - to humans, but no other species - to be shades of the same colour, yet are made by mixing completely different parts of the spectrum.

Like last time, it is hard not to conclude that you are not here to learn or ask questions in good faith, and I don't think furthering this discussion is going to be productive. You are free to "believe" about colour what you want, but please don't confuse or misinform others. 


On 02/04/2021 at 14:51, Johanvk said:

still I feel stupid I cannot trace a cure. Can someone show me a way out?

Don't lose sight of the fact that it could actually be a fault with the camera.

My ZWO2600MC started showing a similar effect intermittently which was a problem with the camera, image attached.

There is a thread on Cloudy Nights discussing a similar problem with ZWO MC cameras (sorry, I didn't keep a link) - something to do with a different colour balance between the native drivers and the ASCOM drivers.

Dave

Just an out of focus image taken indoors in daylight

[attached image: out-of-focus daylight test frame showing the green cast]

Edited by Davey-T

2 minutes ago, jager945 said:

The comprehensive color space conversion research page is literally titled "Camera to CIE XYZ conversions: an ill-posed problem". It literally goes into all the different subjective decisions that need to be made when constructing a solution to the matrix. I cannot fathom how you can maintain that construction of the transform matrix "absolutely does not depend on any conditions" when (some of) those conditions and constraints are spelled out for you. AP puts another heap of subjective variables into the equation (such as picking an illuminant), but we haven't even made it past you acknowledging the basics yet.

Again a condescending tone, but let me address your comment.

I acknowledge that Camera to CIE XYZ conversion is an ill-posed problem and that you can't map the complete space with 100% accuracy. However, what you can do is take the response curves of the actual sensor and the CIE XYZ standard observer, take a set of spectra that is a good representative of your working data, and create a conversion matrix that minimizes, for example, deltaE - the perceptual color error. In astronomy, the Planckian locus together with prominent emission lines would be a good candidate for deriving the transform matrix. The problem is that the sRGB color space most widely used on the internet (any non colour-managed image is assumed to be encoded in sRGB) does not render any of the spectral colors, so the alternative would be to use the sRGB colors that are closest (by the deltaE metric) to the spectral colors of the prominent emission lines.

On the other hand, every camera manufacturer could do the same using a standard set of spectra and produce a basic RAW to XYZ transform matrix that covers most of the raw gamut equally.

In fact, they do, and you can extract XYZ color information from every DSLR. Here is part of the DCRAW command line utility:

[attached image: excerpt from the DCRAW command line utility]

It knows how to extract XYZ color information from camera RAW files because it has the required transform matrices.

If color matching theory did not exist, we would not be able to take our DSLR cameras, shoot scenes, and capture faithful color (not color-balanced, but faithful - meaning not the color of the object, which does not exist on its own, but the color of the light).

Further - the illuminant is irrelevant for astrophotography, as astrophotography deals with the emissive case rather than the reflective case. We are not interested in the color of an object under a certain illuminant - we are interested in the color of the light reaching us.

18 minutes ago, jager945 said:

Colours don't "have spectrum" (unless you're simply asserting that visible light falls on the electromagnetic spectrum, which is pretty self-evident?). Colours are a human construct (e.g. to a human, "parts of the electromagnetic spectrum appear to have colors under certain conditions", if you really must put it in those terms), which purple vs violet neatly demonstrates (brown is another fun one). Case in point: purple and violet appear - to humans, but no other species - to be shades of the same colour, yet are made by mixing completely different parts of the spectrum.

No, but a spectrum has a color. In fact, there are infinitely many spectra that will produce the same non-spectral color to our vision system (spectral colors are the only colors that have a single spectrum defining them). From that it follows that any color you pick has an associated spectrum - any of those spectra.

This is the basis of color matching - we don't need to record the actual spectrum, we only need to record the information that allows us to generate a spectrum that triggers the same color response in our visual system (the eye/brain system).

I don't see any problem in mixing wavelengths arbitrarily in a spectrum - again, such a spectrum is completely defined, and again, there will be a color behind that stimulus. Even brown - what is problematic with brown? The fact that it does not appear in the rainbow? Well, many colors don't appear in the rainbow - white is not in the rainbow.

Can you see brown? Can your computer screen produce brown? Well then, there is a spectrum that produces that stimulus in your brain. A camera can record that spectrum and a monitor can reproduce it.

35 minutes ago, jager945 said:

Like last time, it is hard not to conclude that you are not here to learn or ask questions in good faith, and I don't think furthering this discussion is going to be productive. You are free to "believe" about colour what you want, but please don't confuse or misinform others. 

In this particular case I don't feel the need to ask any questions, as I have a solid understanding of the topic. I gave an answer to the question at the beginning of the thread. It is you who started misleading others in your response.


In any case, for all interested, here is an example that I did with the ASI178MC-Cool camera.

I took this image:

reference.png

(the same one that I showed in my first response in this topic) and displayed it on my phone screen. Then I took the ASI178MC-Cool with a lens and made a recording of that screen.

Here is raw data from that recording:

linear_unbalanced.png

The colors have a distinct green cast and don't have proper gamma. Any camera that has high sensitivity in green (and most do, since our vision is most sensitive in green, so there is no need to make different sensors) will show such a green cast if we take the raw data. Exceptions are cameras that have high sensitivity in red, like the ASI224.

I then applied the derived transform and properly encoded the color in the sRGB color space, and this is the result:

calibration.png

The image is a bit brighter than the original due to the exposure used, and the colors are not 100% the same - because this is a recording of my phone's screen, and I'm not sure how calibrated and accurate its color reproduction is. In any case, this shows that you can reproduce colors fairly accurately - including those from an emission source - since colors coming from screens are pure light, not light reflected off objects under a certain illuminant.
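
Those last two steps (apply the derived 3x3 matrix, then encode with the sRGB transfer curve) might look roughly like this, assuming `raw_rgb` is the linear image as an (H, W, 3) array normalised to 0-1 and `raw_to_srgb` is the matrix derived earlier:

```python
import numpy as np

def encode_srgb(linear):
    """Standard sRGB transfer function (piecewise, roughly gamma 2.2)."""
    linear = np.clip(linear, 0.0, 1.0)
    return np.where(linear <= 0.0031308,
                    12.92 * linear,
                    1.055 * np.power(linear, 1.0 / 2.4) - 0.055)

def to_display(raw_rgb, raw_to_srgb):
    # Apply the 3x3 colour matrix to every pixel, then gamma-encode for display
    linear_srgb = np.einsum('ij,hwj->hwi', raw_to_srgb, raw_rgb)
    return encode_srgb(linear_srgb)
```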


Oof... my head hurts from the Gish gallop and contradictions, again without actually perusing the information shared with you. :(

Quote

Further - the illuminant is irrelevant for astrophotography, as astrophotography deals with the emissive case rather than the reflective case. We are not interested in the color of an object under a certain illuminant - we are interested in the color of the light reaching us.

If we are exclusively dealing with emissive cases (we are not), then why on earth are you calibrating against an exclusively reflective target like a Macbeth chart, purely relying on subtractive colour, as lit by a G2V type star at noon under an overcast sky (e.g. the D65 standard illuminant as used by sRGB)? None of your calibration targets here are emissive!

Also, you are flat out wrong that the illuminant is irrelevant. Not only is its influence explained in the color space conversion article, but, more importantly, astrophotography deals with more than just emissions (most of which are monochromatic, mind you). Take for example reflection nebulosity (hint: it's in the name!). It's just that the illuminants are not located within 150,000,000 km of the camera but much further away, and may vary per location in the image (both in brightness and in power spectrum). E.g. we are dealing with multiple, vastly different illuminants, and thus an "average" reference illuminant needs to be chosen. Popular techniques include choosing a G2V star in the field, the average of all foreground stars, or using a nearby spiral galaxy. All are equally valid choices. All will yield different white references and power spectrum assumptions.

Quote

Can you see brown? Can your computer screen produce brown? Well then, there is a spectrum that produces that stimulus in your brain. A camera can record that spectrum and a monitor can reproduce it.

If you had bothered to watch the video rather than instinctively responding straight away, you would have learned that perceiving brown (or orange) is entirely dependent on its context within the image. You can't "measure" orange, you can't measure "brown". What looks like brown in one image, looks orange (or even red) in another, even though the RGB or XYZ values are exactly the same. It has nothing to do with "spectrum". It is not a signal that is recordable. These colours only manifest themselves once they are viewed in the context of a scene. Even with more "conventional" colours, it's not just wavelength that defines colour;

Quote

It is you who started misleading others in your response.

Mate, I implement this stuff for a living. StarTools incorporates all DSLR matrices found in DCRAW and more (the matrices are hardcoded in the executable/libraries - with only a few DSLR model exceptions, they are rarely encoded in the actual RAW file). As I mentioned, StarTools is the only software for astrophotography (that I know of) that bothers to incorporate matrix correction at all as part of its colour calibration module (e.g. not at the debayering stage where light pollution and uneven lighting is still part of the signal). I implemented actual color calibration routines that do all this on stretched data (so luminance signal contamination can be avoided). And even I think it's an inconsequential feature. The matrices are irrelevant for use in AP. This is not terrestrial photography, and none of its assumptions with regards to colour apply. That's not even mentioning that no one stretches their image with exactly the ~2.2 sRGB gamma curve either.

Quote

I have a solid understanding of the topic.

I will let others (and history) be the judge of that. I give up.


8 hours ago, jager945 said:

If we are exclusively dealing with emissive cases (we are not), then why on earth are you calibrating against an exclusively reflective target like a Macbeth chart, purely relying on subtractive colour, as lit by a G2V type star at noon under an overcast sky (e.g. the D65 standard illuminant as used by sRGB)? None of your calibration targets here are emissive!

Because the split into emissive and reflective cases is artificial - you always work with the "emissive" case - you always work with emitted light. The reflective case is distinguished just because people think that objects have an "inherent" color - they don't.

When recording a scene with a camera you are always working with light, regardless of how that light reached you - directly from the source or reflected off something.

Selecting distinct calibration spectra will only make your color calibration more accurate in one region of the human vision gamut and less accurate in another region.

Using a Macbeth chart - either in reflective mode (using any of the standard illuminants) or in emissive mode (such as displayed on a computer/phone screen) - is just a selection of calibration spectra that will work well for the sRGB gamut. You can equally use something like this:

[attached image: alternative calibration colour target]

It is just a matter of knowing what you are doing and having a good reference (a calibrated computer/phone screen, a standard illuminant, or a DSLR to calibrate against).

8 hours ago, jager945 said:

If you had bothered to watch the video rather than instinctively responding straight away, you would have learned that perceiving brown (or orange) is entirely dependent on its context within the image. You can't "measure" orange, you can't measure "brown".

I did watch the video. Of course you can measure brown/orange. You can measure the physical quantity - the spectrum of the light. You can't measure the human response to it (maybe you can if you scan the brain or stick electrodes in it) - but you can reproduce the human response.

Here we are talking about measuring physical quantities and then replaying those to people so that their visual system can do the rest. That is all.

Don't confuse the psychological response to a stimulus with the measurement and reproduction of the physical quantities that represent that stimulus.

In any case, look at my previous post. The top left corner is brown. My phone showed it as brown, my camera recorded it, I color calibrated the data, and now it shows on our screens as brown. The same goes for orange - one square below.

8 hours ago, jager945 said:

Mate, I implement this stuff for a living. StarTools incorporates all DSLR matrices found in DCRAW and more (the matrices are hardcoded in the executable/libraries - with only a few DSLR model exceptions, they are rarely encoded in the actual RAW file). As I mentioned, StarTools is the only software for astrophotography (that I know of) that bothers to incorporate matrix correction at all as part of its colour calibration module (e.g. not at the debayering stage where light pollution and uneven lighting is still part of the signal). I implemented actual color calibration routines that do all this on stretched data (so luminance signal contamination can be avoided). And even I think it's an inconsequential feature. The matrices are irrelevant for use in AP. This is not terrestrial photography, and none of its assumptions with regards to colour apply. That's not even mentioning that no one stretches their image with exactly the ~2.2 sRGB gamma curve either.

I find it interesting that you implemented something in your software that you believe is wrong. Why did you do that?

By the way - from what I've seen, it looks like your implementation is wrong. Most if not all images that I've seen processed with StarTools don't produce accurate star colors.

Star colors fall in this range when viewed on an sRGB screen (the Planckian locus):

[attached image: table of star colours along the Planckian locus]

http://www.vendian.org/mncharity/dir3/starcolor/

Or from the Wikipedia article on stellar classification:

[attached image: stellar classification colour table]

https://en.wikipedia.org/wiki/Stellar_classification

Although you sport "Scientific" color calibration in StarTools - it never seems to produce such results. I wonder why?

By the way, if you treat luminance as the intensity of the light, you can still maintain the proper RGB ratios and apply the 2.2 sRGB gamma regardless of the initial stretch on the luminance. That way you get proper stellar color.
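
A minimal sketch of that ratio-preserving idea, assuming `r`, `g` and `b` are linear, colour-calibrated channels in the 0-1 range, with a simple asinh curve used purely as a placeholder for whatever luminance stretch you prefer:

```python
import numpy as np

def ratio_preserving_stretch(r, g, b,
                             stretch=lambda x: np.arcsinh(50 * x) / np.arcsinh(50)):
    """Stretch luminance only, keep the R:G:B ratios, then sRGB-encode."""
    lum = (r + g + b) / 3.0                    # simple luminance proxy
    eps = 1e-12
    gain = stretch(lum) / (lum + eps)          # how much each pixel's luminance grew
    out = np.stack([r, g, b], axis=-1) * gain[..., None]
    out = np.clip(out, 0.0, 1.0)
    # standard sRGB gamma encoding (~2.2)
    return np.where(out <= 0.0031308, 12.92 * out,
                    1.055 * np.power(out, 1.0 / 2.4) - 0.055)
```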

 


11 hours ago, Davey-T said:

Don't lose sight of the fact that it could actually be a fault with the camera.

My ZWO2600MC started showing a similar effect intermittently which was a problem with the camera, image attached.

There is a thread on Cloudy Nights discussing a similar problem with ZWO MC cameras (sorry, I didn't keep a link) - something to do with a different colour balance between the native drivers and the ASCOM drivers.

Dave

Just an out of focus image taken indoors in daylight

[attached image: out-of-focus daylight test frame showing the green cast]

Like Dave, I suspect a problem beyond simple colour calibration here. I  wonder why the area around the Trapezium is a totally different colour from the rest of the nebula, when they should be similar.

If the OP posted a link to a linear TIFF we could run it through Pixinsight's DBE. Although my own dark-site data is never far out when R, G and B are weighted at parity, I have processed data sent from heavily light polluted sites and never seen any of them defeat DBE. My hunch (no more than that) is that DBE will not rectify this image.

Olly

 

Edited by ollypenrice

15 minutes ago, ollypenrice said:

Like Dave, I suspect a problem beyond simple colour calibration here. I  wonder why the area around the Trapezium is a totally different colour from the rest of the nebula, when they should be similar.

If the OP posted a link to a linear TIFF we could run it through Pixinsight's DBE. Although my own dark-site data is never far out when R, G and B are weighted at parity, I have processed data sent from heavily light polluted sites and never seen any of them defeat DBE. My hunch (no more than that) is that DBE will not rectify this image.

Olly

One of the problems is that blue and red are completely compressed while green seems stretched:

[attached screenshots: per-channel histograms of the posted image]

Trying to correct anything on the attached image is not going to work, as red and blue are effectively destroyed by the 8-bit data being squeezed into the lower part of the histogram (effectively posterized to a handful of values).

Indeed - we would need linear data in order to properly diagnose things.


When I use my 533C in Maxim I find that it can be very green-biased if I use a Tri-band or L-eNhance filter, but if I use my CLS filter it is more neutral.

I really think colour casts should be dealt with at the debayer stage and not allowed to get into the stack. In Maxim this is easy, as the colour convert tool has colour scaling, so depending on what filter I'm using I just lower the green value by trial and error until the image looks roughly neutral. You only have to do it for one sub, then close the window; when stacking, the stack window uses whatever value was last set in the colour convert window. For the Tri-band I find 65-70% for green is about right. It doesn't have to be perfect - as long as it's roughly neutral, it can be tweaked in post.

I've noticed a lot of tutorials skip this step, especially ones for PixInsight; they then try to correct it later with other tools, but the results always look horrible to me. I think you get a much better result if you start with a neutral stack.

Here is a screen shot showing what I mean.

Screenshot 2021-04-04 at 15.03.24.png

Edited by Magnum

30 minutes ago, Magnum said:

When I use my 533C in Maxim I find that it can be very green-biased if I use a Tri-band or L-eNhance filter, but if I use my CLS filter it is more neutral.

I really think colour casts should be dealt with at the debayer stage and not allowed to get into the stack. In Maxim this is easy, as the colour convert tool has colour scaling, so depending on what filter I'm using I just lower the green value by trial and error until the image looks roughly neutral. You only have to do it for one sub, then close the window; when stacking, the stack window uses whatever value was last set in the colour convert window. For the Tri-band I find 65-70% for green is about right. It doesn't have to be perfect - as long as it's roughly neutral, it can be tweaked in post.

I've noticed a lot of tutorials skip this step, especially ones for PixInsight; they then try to correct it later with other tools, but the results always look horrible to me. I think you get a much better result if you start with a neutral stack.

Here is a screen shot showing what I mean.

If you shoot with broadband filters you can actually derive the scaling factors from an F2-class star in the image - it needs to come out roughly white - so measure the pixel values in such a star (it's best to measure the average value over a star selection in each channel) and derive the scaling factors from that.

Alternatively, what you are doing is also fine - fiddle around with the values until you find what looks natural. The problem is that what we think is natural is often influenced by other images of the object that we've seen so far - and those might not have been properly color calibrated - so our sense of what is right might be wrong :D

With this technique you can get pretty good results - I did it as a step in the above example to demonstrate it - and it was rather easy as I had gray patches in the image:

reference.png

reference again, and just simple RGB balance:

linear_simple_balance.png

The gray patches are grey, so that is fine, but the colors don't match as well as they could: they are overly saturated, some are darker, and some have a noticeably different hue as well. The gamma is also not good, which can be seen in the gray-row gradient - it is not uniform as it is in the reference image.

 


48 minutes ago, vlaiv said:

If you shoot with broadband filters you can actually derive the scaling factors from an F2-class star in the image - it needs to come out roughly white - so measure the pixel values in such a star (it's best to measure the average value over a star selection in each channel) and derive the scaling factors from that.

Alternatively, what you are doing is also fine - fiddle around with the values until you find what looks natural. The problem is that what we think is natural is often influenced by other images of the object that we've seen so far - and those might not have been properly color calibrated - so our sense of what is right might be wrong :D

With this technique you can get pretty good results - I did it as a step in the above example to demonstrate it - and it was rather easy as I had gray patches in the image:

reference.png

reference again, and just simple RGB balance:

linear_simple_balance.png

The gray patches are grey, so that is fine, but the colors don't match as well as they could: they are overly saturated, some are darker, and some have a noticeably different hue as well. The gamma is also not good, which can be seen in the gray-row gradient - it is not uniform as it is in the reference image.

 

Yes, star calibration is the most accurate for broadband. Maxim is very good for dealing with scaling factors; I like the way you can set it in one tool, then the other tools can make use of it, and it also remembers the setting until you change it. So if I'm working on a project for weeks I know that each time I add more data the colour stays the same, and I don't have to think about it again until I change to another filter or project.

Lee


Please post a raw sub file (TIFF or FITS) so we can have a look at it. It will be easy enough to find out whether it can be calibrated or whether your camera needs to be fixed.

Edited by gorann

20 hours ago, vlaiv said:

Because the split into emissive and reflective cases is artificial - you always work with the "emissive" case - you always work with emitted light.

I'm not sure if you are joking here by playing word games? :( The alternative is that you really don't understand (or even acknowledge) the massive difference between the partially reflected light from an illuminant, its power spectrum qualities, and, say, a direct monochromatic light source.

Let's try a thought experiment: I go into a large empty hall, somewhat dimly lit by 100+ light fixtures with old incandescent bulbs powered at unknown, different voltages. Some are brighter (and thus whiter/yellower), some are dimmer (and thus redder). In the room, a rare rainbow-coloured bird-of-paradise is flying around, dodging the fixtures as it goes. It has never been seen outside the room. I then proceed to take 10 pictures of it in flight. I come out of the room and give my 10 RAW images to you.

How is your daylight Macbeth-calibrated matrix going to help you colour-correct the 10 pictures to reveal the "true" colours of that bird accurately in all 10 shots? Are you just going to use your D65-acquired matrix and tell me the bird I saw was a drab yellow/brown/orange? Would you maintain the colors are correct? Would you attempt to colour balance? If so, what would your white point be when doing so? Would you colour balance the 10 images differently, depending on what type of light lit the bird the most during its flight? Would you just average out all light temperatures for all shots and use that as your white balance? If you decided to colour balance with a custom white point, would you maintain that the Macbeth D65-illuminant-derived color matrix is still applicable?

I'd really like to know where the disconnect is here.

20 hours ago, vlaiv said:

I find it interesting that you implemented something in your software that you believe is wrong. Why did you do that?

Where did I say I implemented a feature I believe is "wrong"? I used the word "inconsequential". That's because, per the actual title of one of the articles I gave to you, colour matrix construction for a tri-stimulus camera is an ill-posed problem (particularly for AP purposes).

If you don't know what "ill-posed" means, it is a term used in mathematics to describe a problem that - in its mildest form - does not have one unique solution (deconvolution is another example of an ill-posed problem in astrophotography).

In other words, a DSLR-manufacturer matrix-corrected rendition in ST, is just one more of an infinite set of plausible colour renditions (even within the more stringent terrestrial parameter confines such as a single light source of a known power spectrum) that no-one can claim is the only, right solution. Ergo, the inclusion of this functionality in ST is neat, but ultimately inconsequential (but not "wrong!"); it is not "better" than any other rendition. Being able to use the manufacturer's terrestrial D65-based color matrix is particularly inconsequential outside the confines of terrestrial parameters. But if you really want to, now you can.

20 hours ago, vlaiv said:

By the way - from what I've seen, it looks like your implementation is wrong. Most if not all images that I've seen processed with StarTools don't produce accurate star colors.

From what you've demonstrated here, it strongly appears to me you are currently ill-equipped to make such a judgement. Worse, making that statement ("don't produce accurate star colors") betrays a lack of understanding of what star colour accuracy really means or how it comes about. Which, suitably, brings us to this:

20 hours ago, vlaiv said:

Star colors fall in this range when viewed on an sRGB screen (the Planckian locus):

There is no range. There are no prescribed values. There is no one accurate star colour. These tables don't take into account relative brightnesses, alternative saturation or - crucially - alternative white reference, because - obviously - the table would be infinite. Only a click away (http://www.vendian.org/mncharity/dir3/starcolor/details.html) you can read this massive disclaimer by the author;

Quote

All these numbers are approximate!

These are not magic values for pixel color. Each time I change the details of how things are handled, and then regenerate these web pages, different colors fall out. Quite similar, but different.

The choice of white point (D65 vs D50) makes a big difference. I.e., is the Sun bluish or pinkish. sRGB and Rec.709 both use D65.

The choice of gamma correction (sRGB vs Rec.709) makes some difference. And the chromaticities I derive from the various sources are not very tightly clustered. And I may well be making mistakes.

For each type/class, I am blindly averaging (in XYZ space) together the chromaticities of the spectra I have available ( Kurucz, Silva, Pickles ), or of the blackbody chromaticities ( Handbook, Tokunaga ) if I have none. This approach unfortunately creates color discontinuities as one jumps among data sets. I am also missing data on assorted class/types.

What are possible improvements? There are few enough class/types that one could hand craft a set of colors. I am unlikely to do this. One might interpolate missing class/type colors to get a more complete set. Look more carefully at the spectra to determine the sources of variance. Get more complete spectra sets (Kurucz comes to mind). Provide ranges of colors rather than single points (solves a different problem). Broaden the set of class/types covered.

Unless I receive feedback to the contrary, I am likely to leave the current color selection approach, and worry about other things. Like adding D50 colors.

I.e. it sums up all the stuff I keep trying to explain to you - picking an illuminant/white reference matters to colour, picking your stretch matters to colour, picking your colourspace/tri-stimulus conversion method matters to colour, picking your luminance/chrominance compositing matters to colour. And none of these choices in AP are prescribed, nor standardised, nor fixed to one magic value or number or method or technique.

And to drive this all home, on yet another page (http://www.vendian.org/mncharity/dir3/blackbody/) you can read;

Quote

Update (2016-Mar-30): For most applications where a human is looking at a screen, and thus judging colors against screen white, a nonstandard but white D58 whitepoint is preferable to a standard but bluish D65 whitepoint. A modified blackbody color tool hack was used to create a colorized datafile for sRGB primaries and gamma, but substituting a nonstandard D58 whitepoint for sRGB's D65. Using some CMF files carelessly grabbed from the web. This datafile was just a quick kludge, with no quality control. End of update.

E.g. the creator of the table you posted as an "accurate star color range", now as of 2016, in hindsight, prefers a whopping 700K yellower D58 rendition of his table for use on his sRGB D65 screen. And that is his absolute right, as it is just as valid of a choice.

21 hours ago, vlaiv said:

Although you sport "Scientific" color calibration in StarTools - it never seems to produce such results. I wonder why?

I have no idea what "scientifically color calibrated" result would even mean or what that would look like. Where did you read/hear that?

Perhaps you are referring to the optional/default "scientific colour constancy" mode? It helps viewers and researchers more easily see areas of identical hue, regardless of brightness, allowing for easy object comparison (chemical makeup, etc.). It was created to address the sub-optimal practice of desaturating highlights due to stretching chrominance data along with the luminance data. A bright O-III dominant area should ideally - psychovisually - show the same shade of teal green as a dim O-III emission dominant area. It should not be whiter or change hue. The object out there doesn't magically change perceived colour or saturation depending on whether an earthling chose a different stretch or exposure time. An older tool you may be more familiar with, that aimed to do the same thing, is an ArcSinH stretch. However, that still stretches chrominance along with luminance, rather than cleanly separating the two.

I implore you to please read and watch the materials provided thoroughly before commenting - particularly the sources you cited yourself. Not doing so just makes for strange, credibility-damaging posts for no good reason. I have seen you give good advice and make positive contributions here on SGL over the years, but this colour stuff is such a strange hill to die on. This is simply not an intuitive subject, and things are much more subtle and non-deterministic than you seem to think.

Anyhow, getting on topic;

On 03/04/2021 at 00:51, Johanvk said:

Can someone show me a way out?

If you could share with us the linear stack (Dropbox, Google Drive, WeTransfer, OneDrive etc.), fresh from the stacker without any stretches applied, we can have a look!

As said, to me this just looks like the expected green bias (and increased green signal) from this particular OSC, though the image you posted seems to have been processed, so it is hard to draw conclusions from that.

