
Why are my colour images coming out in black and white?


Recommended Posts

1 hour ago, rojach78 said:

a new stack

Hi

Had a quick coffee-break go. I'm not sure it's much different, but do bear in mind my total colour blindness!

Oh, don't forget to dither between frames and stack using a clipping algorithm... satellites;)

47 minutes ago, vlaiv said:

StarTools processing

I like StarTools. What's wrong with its colour?

Cheers

n22.thumb.jpg.d4c59866654ad30011b0dcbd6d90f39d.jpg

 

 

n2_01.thumb.jpg.894d662ddd70d3621e54b7ea75abd0f1.jpg

Edited by alacant

5 minutes ago, alacant said:

I like StarTools. What's wrong with its colour?

I personally think it gets color processing wrong. I had quite an extensive discussion with the author of StarTools here on SGL and we did not come to a conclusion. He is under the impression that I've got it wrong and, similarly, I believe he is wrong.

I don't want to get into that sort of discussion again, so I'm just going to say what I believe is wrong: there are no teal or greenish looking stars.

Here is what main sequence stars look like in color:

image.png.088d21254e7ac5bc26417d98702c01d8.png

Or perhaps this image:

TernaryColorTmap.PNG

This is the so-called Planckian locus - the range of colors you can get from a black body at different temperatures. Stars are very similar in their spectra to black bodies and hence will have colors very close to the Planckian locus (in most cases indistinguishable to the human eye).

Good color correction for an astro image should produce a majority of stars with the above colors. Having a teal or greenish tint in stars is unrealistic.

Mind you - we are talking here about accurate color representation, not the artistic side of things. If someone wants their stars to be pinkish and feels that is their artistic expression - then by all means; but I'm talking about color matching and accuracy.
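The Planckian-locus colours described above can be approximated numerically. The sketch below is not from this thread - it is a rough Python/numpy illustration using the well-known Kim et al. cubic approximation of the locus and the standard XYZ-to-linear-sRGB matrix; the function names are my own:

```python
import numpy as np

def planckian_xy(T):
    """Approximate CIE 1931 xy chromaticity of a black body at T kelvin.
    Kim et al. cubic approximation, valid roughly 1667 K to 25000 K."""
    if T < 4000:
        x = -0.2661239e9/T**3 - 0.2343589e6/T**2 + 0.8776956e3/T + 0.179910
    else:
        x = -3.0258469e9/T**3 + 2.1070379e6/T**2 + 0.2226347e3/T + 0.240390
    if T < 2222:
        y = -1.1063814*x**3 - 1.34811020*x**2 + 2.18555832*x - 0.20219683
    elif T < 4000:
        y = -0.9549476*x**3 - 1.37418593*x**2 + 2.09137015*x - 0.16748867
    else:
        y = 3.0817580*x**3 - 5.87338670*x**2 + 3.75112997*x - 0.37001483
    return x, y

def blackbody_linear_srgb(T):
    """Linear (pre-gamma) sRGB triplet of a black body, normalised so max = 1."""
    x, y = planckian_xy(T)
    X, Y, Z = x/y, 1.0, (1 - x - y)/y            # xyY -> XYZ with Y = 1
    M = np.array([[ 3.2406, -1.5372, -0.4986],    # XYZ -> linear sRGB (D65)
                  [-0.9689,  1.8758,  0.0415],
                  [ 0.0557, -0.2040,  1.0570]])
    rgb = M @ np.array([X, Y, Z])
    rgb = np.clip(rgb, 0, None)                   # clip out-of-gamut negatives
    return rgb / rgb.max()
```

A 3000 K star comes out orange-dominant, ~6500 K near white, and 10000 K blue-dominant - no temperature on the locus produces a teal or green triplet.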

 


2 hours ago, rojach78 said:

I have added flats and dark flats but there is a huge dust spot that you are refering too, i will be cleaning before i image again

Looking forward to seeing your cleaned-up image. It's looking lovely despite the dust motes.

If you view the image histogram, the dust mote becomes more apparent. I suspect there is another, larger one next to it as well, but much lighter. Here is a picture I took using Siril.

728616695_Screenshotfrom2021-04-2110-46-20.png.e0a72c403afb8c5cb2153768cb52e4ed.png


@rojach78

Regarding those dust particles - your flat correction seems to be working fine despite the fact that we can see them.

They just mean that the dust moved between the lights and the flats - or that you changed something in your setup between lights and flats. You should take flats with exactly the same setup as the lights - even the focus position. Don't rotate anything, don't adjust clamping, don't touch the focus - everything should remain exactly the same for the flats as it was for the lights.

If you lose focus during the night and refocus while taking the lights (this can happen due to cool-down if the temperature drops overnight) - use the last good focus position, the one your lights session finished with, and take the flats straight away at the same ambient temperature.


3 hours ago, vlaiv said:

there are no teal or greenish looking stars

The OP said there was no colour in his image. There is. Whether it's the correct colour, I don't know. 

Please show us the image correctly processed:)

Cheers

 


1 minute ago, alacant said:

Please show us the image correctly processed:)

I'll show you what I believe is "correct color processing", rather than correct processing of the image (I don't want to get involved in a discussion of the "correct processing" of astronomical images :D ) - at least a basic version of it: one-star color calibration.

Multi-star color calibration is much more involved and I would rather use dedicated software for that than do it "by hand". I would also rather have a pre-measured RAW-to-XYZ matrix for it - which I don't, since I don't have that particular camera.

Will post my processing in a few hours.


@vlaiv please could you use the OP's image? I'm sure that would be far more useful. Siril has a good photometric star colour calibration if you need a reference. It only takes a minute or so.

Cheers 

Edited by alacant

5 minutes ago, alacant said:

@vlaiv please could you use the OP's image? I'm sure that would be far more useful. Siril has a good photometric star colour calibration if you need a reference. It only takes a minute or so.

Cheers 

Of course I'll use the OP's image - the latest one.

I don't know how photometric star colour calibration works in Siril - I'll have to examine that. I've looked at the PixInsight feature of the same name - and unfortunately it has nothing to do with actual color. It uses the B-V index (also called the stellar color index) as color and calibrates against that. That is not useful if you want to make accurate images.

By the way, by color accuracy I mean the following: imagine a daytime (or nighttime, it does not matter) scene that is illuminated by some sort of light, and you take a picture of it. I want to reproduce the actual colors as they were, and not a "color balanced" version of the image. For example, say you have a camp fire illuminating the scene and a white piece of paper in that scene. I don't want that paper to end up white in the final image - but rather orange, because it is illuminated by the camp fire.

Many people confuse color balance / white balance with what we need here. White balance is a way of showing objects as they would appear under illumination that is common to us. I maintain that objects don't actually have color - they have color under a certain illumination. It is light that has color - and as such the light's color does not depend on illumination. But this is a completely different topic.

I'll post steps of my processing with emphasis on color handling.


5 minutes ago, vlaiv said:

Many people get confused by color balance / white balance in the whole story and that is not what we need here. White balance is way of showing objects as they would appear under what is common illumination to us. I maintain that object don't actually have color - they have color under certain illumination. It is light that has color - and as such it does not depend on illumination. But this is completely different topic.

 

I realise I am likely opening a can of worms but if an object doesn't have colour and two objects are illuminated by the same source, what phenomenon makes one object appear a different "colour" to another?


12 minutes ago, scotty38 said:

I realise I am likely opening a can of worms but if an object doesn't have colour and two objects are illuminated by the same source, what phenomenon makes one object appear a different "colour" to another?

Not a can of worms - it's a regular question, and the answer is simple: reflectance.

https://en.wikipedia.org/wiki/Reflectance

Quote

The reflectance of the surface of a material is its effectiveness in reflecting radiant energy. It is the fraction of incident electromagnetic power that is reflected at the boundary. Reflectance is a component of the response of the electronic structure of the material to the electromagnetic field of light, and is in general a function of the frequency, or wavelength, of the light, its polarization, and the angle of incidence. The dependence of reflectance on the wavelength is called a reflectance spectrum or spectral reflectance curve.

Most materials' reflectance depends only on frequency (that is what we call matte color, or diffuse reflection), but for some it depends on the other factors as well; that is why we get specular highlights, for example - a dependence on the angle of incidence. There are special types of material whose highlights even depend on the direction of the light. That is why we have the special "candy colors" popular on cars:

image.png.97705f541c85c45de50500b09aa9d10b.png

This is actually the same paint - its color changes depending on the viewing angle. What color is this object, then? :D

(check out this short youtube video as well: https://www.youtube.com/watch?v=oqNNdCWs0c4)

In any case - a reflectance curve is similar to a filter response curve. A filter response curve specifies how much of each wavelength is passed through, while a reflectance curve specifies how much (what percentage) of each wavelength is reflected.

 

If you take a single light source and apply different filters to it, you'll get a different color of light coming out at the other end. In the same way, use the same light source to illuminate two different materials and the reflected light will have a different spectrum.

The formula is the same: light spectrum * reflectance curve (or in the filter case: light spectrum * transmission curve).

It is this reflected light that we see as the color of the object. This is why an object appears to have a different color under different lighting - the base spectrum is different (but modulated by the same reflectance curve). It is in effect the light that we see as color, and not the object itself.
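The "light spectrum * reflectance curve" formula above can be shown with a toy example. The numbers below are invented for demonstration - coarse five-band "spectra" rather than real measurements:

```python
import numpy as np

# Coarse 5-band "spectra" over 400-700 nm; values are illustrative only
wavelengths = np.array([400, 475, 550, 625, 700])

daylight = np.array([0.9, 1.0, 1.0, 0.95, 0.9])   # roughly flat illuminant
campfire = np.array([0.1, 0.3, 0.6, 0.9, 1.0])    # red-heavy illuminant

# Reflectance of a "red" matte surface: fraction reflected in each band
red_paint = np.array([0.05, 0.05, 0.2, 0.8, 0.9])

# The light we actually see = illuminant spectrum * reflectance curve
seen_daylight = daylight * red_paint
seen_campfire = campfire * red_paint
```

The same red_paint curve produces two different reflected spectra - which is exactly why the "colour of an object" depends on the illuminant.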


1 hour ago, alacant said:

@vlaiv please could you use the OP's image? I'm sure that would be far more useful. Siril has a good photometric star colour calibration if you need a reference. It only takes a minute or so.

Cheers 

Here is step by step guide and result:

1. As a first step, I opened the TIF in Gimp and separated the channels into mono images with Colors / Components / Decompose (use the RGB model and turn off decompose-to-layers)
2. I then exported these as FITS, since I like working with FITS files :D
3. Fire up ImageJ (or rather Fiji - a distribution of ImageJ with preloaded plugins) and load the FITS files
4. The first step here is to crop off the edges and remove stacking artifacts (Images to Stack, rectangle select, right click and make a copy of the whole stack. Close the old one)
5. The next step is to bin x2, as the image comes from an OSC sensor and has been interpolated to this resolution anyway. I like working with smaller but sharper stars. (Image / Transform / Bin - X and Y set to 2, method Average)
6. The next step is to remove the background. I use my own plugin for this, which removes gradients as well, but you can just make a rectangular selection on a piece of background and run stats on it. Then use Process / Math / Subtract and subtract the median value of the background. Do this for each image (each channel)

Here are two rounds of my plugin on Red channel - left is gradient and right is what plugin considers background / foreground:

Screenshot_1.jpg.aaf4e292e80091de11c0c7a5c5ca7af0.jpg

Screenshot_2.jpg.47148b2a67e7aa2ce43edc2821e27eff.jpg

7. Fire up Stellarium and find a star with a B-V index of 0.32 (or 0.35) to be your reference star. These stars are roughly white in color and will let us do a simple RGB weighting

Screenshot_3.jpg.d6c0ffb29133ab03e45c17f3549b986e.jpg

8. Select said star using the ellipse tool in ImageJ and do Image / Stacks / Measure Stack

Screenshot_4.jpg.fd58f41d158b0ca54158a253fff9720f.jpg

9. Now we have the relative weights for our channels - or rather their inverses. You need to divide each channel by the corresponding mean value (use Process / Math / Divide on each sub - remember to remove the selection first)
10. If we now measure that star again, we should get roughly equal values. Don't worry if you don't get exactly 1:1:1, as we have a different selection and noise will skew the values somewhat

Screenshot_5.jpg.ccdef67fe0b534ca620c8ae70f13ee1c.jpg

11. Now we have to "equalize" the subs. This is due to how Gimp reads the subs. We need to make sure that all three color subs have the same max and min values so that Gimp matches them when it loads them (applies the same 0-1 stretch to each). Undo the selection in all three files and measure them. Set the largest of the three minima as the minimum of each image - here -0.1096 - and the lowest of the three maxima as the maximum of each image - here 57.4

Screenshot_6.jpg.d1047059583a071142e88ab82ad31040.jpg

Screenshot_7.jpg.2c34ab2945e5d5c8c686da1a6011ea10.jpg

12. Save each fits file

13. Now we open each of them in Gimp again. Note that Gimp says it will scale the FITS values - and since we are opening each channel independently, it would apply a different scaling had we not made sure each of them has the same min and max value

Screenshot_8.jpg.5b2fc98efc5e30090af66c7e382fbc9e.jpg

14. Do channel compose again (the opposite of the channel decompose we did in the first step)
15. Extract the luminance information by now decomposing that RGB image in the LAB model. Keep the L component and discard a and b

Screenshot_9.jpg.ad4fbc25d7e732c88aa733371b0ef150.jpg

16. Stretch and process the luminance to your liking. I'll do a basic three-step stretch:
- step 1: use levels and bring down top slider so that galaxy core starts to show (apply levels)

Screenshot_10.jpg.1138386cb39bfa22c5c5751e3b334a13.jpg
- step 2: move middle slider so that galaxy is nicely visible (again apply levels)

Screenshot_11.jpg.54329f68875cdc667d1261da2c335555.jpg
- step 3: move bottom slider up to the foot of histogram (again apply levels)

Screenshot_12.jpg.5abfc436f550877c8f29f4c2b20f95ea.jpg

17. You can also apply denoising at this stage, or whatever else you want - but I won't, as this is a tutorial on color processing.
18. Switch back to the original image and apply one round of levels, making sure you enter a value of 2.2 in the middle box. This simulates the 2.2 gamma of the sRGB color space

Screenshot_13.jpg.2a0ab131a027bb209a8b79b55a94ac1f.jpg
19. Copy stretched image and paste it as layer over original. Set layer mode to luminance.

Here is the final result of this operation:

result.thumb.jpg.859f40658e57e927362312b524d7a737.jpg

Now, this is an approximate method, as it uses only a single star (and the B-V index in Stellarium is not very reliable). For best results, the data should first be corrected with a derived color matrix and then calibrated against multiple stars of known stellar class - but that is a much more involved procedure.
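The core of steps 7-11 (measure a roughly-white reference star, then divide each channel by its measured mean) amounts to a one-star calibration that could be sketched in a few lines of numpy. This is my own illustrative sketch, not code from ImageJ or from the post; the function and argument names are hypothetical:

```python
import numpy as np

def one_star_calibrate(r, g, b, star_mask):
    """Scale each channel so the reference star (B-V ~ 0.3) comes out 1:1:1.

    r, g, b: background-subtracted linear channel images (2-D arrays).
    star_mask: boolean array selecting the reference star's pixels."""
    # Mean flux of the reference star in each channel = inverse channel weight
    weights = [chan[star_mask].mean() for chan in (r, g, b)]
    # Divide each channel by its star mean, like Process / Math / Divide
    return tuple(chan / w for chan, w in zip((r, g, b), weights))
```

After this, measuring the reference star in the calibrated channels gives roughly equal values, as in step 10.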

 


Your latest dataset is a massive improvement! There are some small remaining issues, but they mostly have to do with the stacker. These issues are channel misalignment (causing coloured fringing) and aberrant chromatic information in the highlights (over-exposed areas).

You can see these issues here;

Selection_583.thumb.jpg.47eb46d26cca6dbe36f31d85f564a1f9.jpg

There is much I could say about some of the grave misconceptions above, but hopefully the thing most obvious to anyone is that space is not brown - not at any chosen white reference - and that stars are not colourless, nor are light-reflecting objects.

As mentioned (but not demonstrated) above, ideally your stars should show colouring that roughly follows the black body radiation curve. Where an individual star sits on the curve (redder or bluer) depends on the chosen white reference, of course. Regardless, all star temperatures should be roughly represented in your image, from red -> orange -> yellow -> white -> blue. E.g. something like this would be reasonable (but by no means the only "correct" answer);

Selection_585.thumb.jpg.7ac0b3f2ec83dfd185176f40c308c3f3.jpg

Incidentally, it is not too far off the image in Stellarium (not that that is canonical in any way, or that it was intentional). A blow-up of that, showing the star colours a bit better;

Selection_586.thumb.jpg.5907b973b8a8c2aa36c79de5beabdb37.jpg

Colour balancing for DSOs is not an exact science, and there is no one white reference that is canonical. However, there are various well published and - most importantly - substantiated techniques for choosing a suitable white reference in your image (none of which, to my knowledge, involve targeting stars with a B-V value of 0.3-0.35, which is very blue - well over 7000K!). These techniques include;

Finally, you can also achieve a good ballpark colour balance with a little knowledge of the astrophysics going on in an area. If you know you have multiple distinct features going on, you can simply make sure all the features are showing up well. Examples include HII areas (pink/purple), O-III emissions (teal/green), dust lanes (red/brown), galaxy cores (yellower due to the older stars remaining), galaxy rims (bluer due to younger stars), OB-class stars reflecting their blue light in nearby dust, dust-obscured areas tending redder, and so on.

As rightfully mentioned above, with the exception of O-III-dominant areas (such as M42's core), green channel dominance is very rare in outer space. So if you measure (for example with a dropper) the green channel being dominant across a large area of your image, you know that the green multiplier is probably set too high.

Lastly, it should be emphasised that colour balancing - if this was not clear to someone - needs to be done in the linear domain.
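A quick numerical illustration of why the linear domain matters: applying a white-balance multiplier before gamma encoding is not the same as applying it after. The values below are arbitrary, chosen only to show the mismatch:

```python
# Scaling a channel before vs after gamma encoding gives different results
gamma = 2.2
lin = 0.2   # a linear pixel value
k = 1.5     # a white-balance multiplier

# Correct: balance in the linear domain, then gamma-encode
balanced_then_gamma = (k * lin) ** (1 / gamma)

# Wrong: gamma-encode first, then apply the multiplier
gamma_then_balanced = k * lin ** (1 / gamma)
```

The two results differ noticeably, and the error varies with pixel brightness - so balancing after a stretch shifts hues non-uniformly across the image.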

In case anyone is interested in how the full imaging pipeline in consumer DSLR cameras works, this work by Dr. Michael S. Brown is one of the most comprehensive and informative presentations I know of on the topic. It goes through all the considerations and transformations relevant to converting incoming photons into pixels on your screen, via the various colour spaces involved. Note that this is just for terrestrial scenes, for starters! Things get even more subjective when dealing with many different illuminants in one scene (e.g. stars with various power spectra, narrow-line ionization emissions, etc.), noting that comparatively few objects we image in outer space are reflective (let alone reflecting the single power spectrum of a G2V star's daylight filtered by an earth-like atmosphere at noon). The latter is why mono/OSC/CCD imagers (and AP software at large) dispense with camera matrix corrections for DSO scenes, and why NASA keeps re-calibrating its shots depending on lighting conditions on Mars; the lighting conditions in the scene at the time of recording matter!

Of course, now that you have read all this, I have to regretfully inform you that white balancing and its intricacies are stupid, and that neither I, nor the PI team, nor NASA, nor people like Mr. Charity know what we're doing, and that all of the above is incorrect and a waste of time. 🤐

Edited by jager945

1 hour ago, alacant said:

Hi

Thank you for posting your version.

It seems however that your method colours only the galaxy, leaving the stars white:(

Cheers 

The above was just an approximation of the method that I propose - one that people can actually use. The actual method is much more involved and ideally should be implemented in software rather than done manually (as it is tedious work).

Having said that - take a look at this:

image.png.950b5b3e04868e391baa1e1c02d474ae.png

Stars that are bluish in nature account for only ~0.73% of all the stars out there - with the majority of those being blue-white. Of course, it depends on the region you are looking at - in a young cluster of hot stars, most will be bluish - but in a general direction, odds are you won't find many, if any, truly blue stars.

On the other hand, most of the stars you will see have a yellowish / orange tone to them - about 96% of them. However, they tend to be much less bright than other stars and often sit at the faint end of the magnitude scale.

If you look at the image that I presented, you'll see that this is indeed the case. Faint stars tend to have a yellowish-orange tint.

I think your expectation of what stars should look like in an image is biased by all the images you've seen over the years; in many of those, color management was not followed, and people often boost saturation to get rich colors / colorful stars. The fact is that most stars out there are yellowish-white without much saturation, and there are many more orange ones - but they are fainter and often won't be captured in images (or will appear brownish because of how faint they are).

 


2 minutes ago, alacant said:

Which software?

Any software. I did not mean that it should be added to any specific software - rather that such an operation is much more easily done in software than manually.

I can describe what needs to be done and how I do it manually, so you can see the number of steps involved.

You need to select a number of stars in the image. For each of those stars you need to find its effective temperature in the Gaia DR2 database. I use Simbad / Aladin to do that:

image.png.e95d4a9acb8ecaa99ed83032bb48c646.png

For example, the image above is Aladin Lite showing the star that we used to do the simple RGB balance; it has an effective temperature of 6250K

The next step is to derive the linear RGB components of sRGB space for a black body of that temperature. This is done via the XYZ color matching functions and the black body curve, then transforming the XYZ color space into the linear part of sRGB. It is a well known method, but you can use this online calculator (it probably uses an approximation - but good enough):

http://www.brucelindbloom.com/index.html?ColorCalculator.html

image.png.4dd5f3f9a6b46268ce1a17dfcbec514d.png

Now we have the linear sRGB triplet for that color temperature. Record it. Take the actual image - your raw_r, raw_g and raw_b images - and do a photometric measurement (with, say, AstroImageJ) of the values for that star.

Record both of these values in a spreadsheet. Repeat for as many stars as you can find in the image that have a temperature recorded in the Gaia DR2 dataset.

Do a least squares fit between the two sets of vectors (the triplets measured from the image and the triplets derived from effective temperature) to find the transform matrix. Use that transform matrix to transform raw_r, raw_g and raw_b into linear_r, linear_g and linear_b - color information in linear sRGB space.
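The least-squares step described above can be sketched with numpy's lstsq. This is an illustrative sketch under the assumptions in the post (N stars, raw triplets vs reference triplets), not the author's actual code; fit_color_matrix is a hypothetical name:

```python
import numpy as np

def fit_color_matrix(measured, reference):
    """Fit a 3x3 matrix M such that M @ raw_triplet ~ reference_triplet.

    measured:  N x 3 array of (raw_r, raw_g, raw_b) star photometry values
    reference: N x 3 array of linear-sRGB triplets derived from Gaia
               effective temperatures (via the black body curve)"""
    # Solve measured @ X ~ reference in the least-squares sense
    X, *_ = np.linalg.lstsq(measured, reference, rcond=None)
    return X.T  # transpose so that calibrated = M @ raw_triplet

# To apply M to whole channel images, one could do e.g.:
#   calibrated = np.einsum('ij,jhw->ihw', M, np.stack([raw_r, raw_g, raw_b]))
```

With three or more well-measured stars of distinct temperatures the fit is determined; more stars average down photometric noise.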

 


41 minutes ago, alacant said:

Is your method distinct to that which the photometric colour calibration in e.g. Siril (example in this thread) does? 

I don't know. Can you do a photometric color calibration in Siril of the above image so we can compare the star colors?

All you need to do is perform the color calibration, then take the color of a star of known temperature, compare it to the actual color of a black body at that temperature, and see how well they match.

 


1 hour ago, alacant said:

Is your method distinct to that which the photometric colour calibration in e.g. Siril (example in this thread) does? 

I tried to do a photometric color calibration in Siril - but it is failing. Can you give me some tips?

image.png.db4c643c9ed335ed3023e5da4d06ef4e.png

Similar thing happens with other catalog:

image.png.643d142a20029ba939380093670892ba.png


22 minutes ago, alacant said:

As Ivo explains, colour calibration must be performed upon linear data. Your workflow alters the data before you calibrate it.

Not sure who Ivo is, but that is beside the point. I tried to do the calibration on linear data, not altered data - the FITS files before I started any sort of non-linear processing.

I just wiped the background (removing an additive constant keeps the data linear).

I was using version 0.9.12 of Siril (the one I have installed - I don't use it otherwise, I just installed it at some point to give it a go).

25 minutes ago, alacant said:

Here is a successful photometric calibration, including the resultant .tif

Is that bottom image the result of the photometric calibration? It looks remarkably like the result I got.

 


@alacant

Given what I've seen Siril do in your example, I would say we are doing the same thing. However, this is just basic color calibration.

Better color correction is achieved if, instead of only 3 coefficients (K0, K1 and K2 in Siril), we derive a full 3x3 transform matrix. The method is the same, except we do a least squares fit to get the matrix.


17 minutes ago, vlaiv said:

Not sure who Ivo is

Sorry: @jager945

17 minutes ago, vlaiv said:

I just wiped the background

This is almost certainly why your colour calibration does not work.

17 minutes ago, vlaiv said:

Is that bottom image result of photometric calibration?

Yes.

 

17 minutes ago, vlaiv said:

It looks remarkably like result I got.

Process it. You'll see the difference;)

Cheers and thanks for your time.

 

Edited by alacant

7 minutes ago, alacant said:

This is almost certainly why your colour calibration does not work.

There is no reason why removing an offset should mess up color calibration.

13 minutes ago, alacant said:

Process it. You'll see the difference;)

image.png.12cd8b7ccb9f80199e0aec1b4326de11.png

Again, I get the same result, even with your calibrated data - only this time the background is not wiped.

Could you post your processing, if it is significantly different?

