
Colour considerations


neil phillips

Recommended Posts

34 minutes ago, neil phillips said:

Hi Harvey, good to see you around. Yes, it is a real minefield, isn't it. It would be interesting if astronauts ever went there to confirm its true colour.

Yes, moved to Suffolk 3 years ago now. Don't go out much, mind, with poor health and covid. But still love astronomy.

Sorry to hear about your health Neil. I can see you've got right back into astro with the excellent stuff you've been posting. Keep up the great work!

All the best

Harvey


2 hours ago, ONIKKINEN said:

Is there an "explain like I'm 5" type of explanation for the colour issue? This topic is always difficult to follow.

What exactly is it about the color issue that is confusing you?

I'm going to have a go at explaining things in the simplest of terms.

First, I'm going to define the problem. It would be best stated as: "Why do people imaging Jupiter end up with different-looking images, while people imaging a Coca-Cola can don't?" Would that be the best description of the color issue?

The explanation can be split into two parts:

1. Measurement

2. Interpretation

We all measure the light that reaches us, but we use different sensors to do so. It's like trying to measure the length of London Bridge, with each of us using a different stick. We would all get a different numerical value for the length: I'd get 300 sticks, someone else would get 250 sticks, and so on. No one is wrong, although our results differ.

If we want to get the same numerical value, we must calibrate our measuring devices. We must determine the length of each stick - that is, create a mapping between each stick and a predefined unit of length.

This is the sensor-to-XYZ mapping. XYZ is the standard of measurement, and each sensor needs to be mapped to it. This gives us the same numerical values when recording.
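As a rough illustration of what such a calibration amounts to, here is a minimal sketch in Python: fit a 3x3 matrix that maps one sensor's raw RGB responses to the known XYZ values of a set of reference patches. All the numbers below are made up purely for illustration - a real calibration would use measured data from something like a colour checker chart.

```python
import numpy as np

# Hypothetical raw RGB responses of *this* sensor for six reference
# patches, and the known XYZ values of those same patches.
# (All values are invented for illustration.)
raw_rgb = np.array([
    [0.80, 0.15, 0.05],
    [0.20, 0.70, 0.10],
    [0.05, 0.10, 0.85],
    [0.50, 0.50, 0.40],
    [0.30, 0.60, 0.55],
    [0.90, 0.80, 0.70],
])
patch_xyz = np.array([
    [0.41, 0.21, 0.02],
    [0.36, 0.72, 0.12],
    [0.18, 0.07, 0.95],
    [0.50, 0.55, 0.35],
    [0.40, 0.58, 0.60],
    [0.88, 0.92, 0.75],
])

# Solve for the 3x3 matrix M that best maps raw RGB -> XYZ in a
# least-squares sense: raw_rgb @ M ~ patch_xyz.
M, *_ = np.linalg.lstsq(raw_rgb, patch_xyz, rcond=None)

# Any future raw measurement from this sensor can now be expressed
# in the shared XYZ "units" - the common measuring stick.
xyz = np.array([0.6, 0.3, 0.2]) @ M
```

Once M is found, every raw triplet from that sensor converts to the shared XYZ units, so two different cameras calibrated this way report the same numbers for the same light.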

The second part is interpretation (and it can be applied or not, depending on the effect we want to achieve).

To best describe this, I'm going to ask you to imagine what 20°C feels like - but in two different circumstances.

Imagine you went ice swimming, and you immediately enter a room at 20°C. How will that temperature feel to you? Warm? Cold?

Now imagine you were sitting in a sauna, and you enter a room at 20°C. How will that temperature feel to you? Warm? Chilly?

The same numerical value, from a calibrated device, measured the same physical quantity - yet we can experience it quite differently depending on our condition.

The same happens with color. Our visual system adapts to the environment, and we perceive the same light as having a different color depending on the conditions (just like with temperature above).

These two things combine to make it difficult to capture an image and reproduce it faithfully.

What we often call "color balance" can actually be split into two components.

The first is color calibration, and the second is the perceptual transform.

When we shoot a simple image with our phone, software does all of this for us automatically. The sensor is color calibrated at the factory for that particular smartphone (or camera) model, and we do the perceptual transform in part by choosing a "white balance" to reflect the shooting conditions. We just need to remember that "white balance" is only half of the perceptual transform - the other half is implied by the sRGB standard - though we can choose to perform the whole transform ourselves.

What does "perceptual transform" mean? It can be explained with the temperature analogy.

Imagine the following scenario: there is a device that records both the 20°C temperature of that room and the fact that you were in a sauna just a minute ago.

Some time in the future, you want to "feel" the same temperature sensation from that recording, but this time you are sitting in a comfortable 25°C environment. 20°C is not much colder than 25°C, but we have the additional information that lets us calculate that you must have felt much colder going from the sauna into 20°C. The transform gives us something like 12°C - so we cool a glass of water to 12°C and say: dip your hand in it, and you will feel the same sensation you did when going from the sauna into 20°C.

So this part is not about the actual numbers, but about what we feel.

Back to color: we see colors differently under different ambient conditions. Given two sets of conditions, the perceptual transform tries to keep what we see constant, rather than the numerical values that are recorded (the physical quantity related to the light spectrum).

It works like this: given a set of conditions and an XYZ measurement, find the XYZ triplet of values that will produce the same mental response under different conditions.

White balance is a very simplified version of this, and it does the following: given the source light temperature and an XYZ triplet, find the XYZ triplet for viewing the image on a computer screen in a normally lit room (office ambient).
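To make this concrete, here is a minimal sketch of white balancing treated as a chromatic adaptation, in Python. The Bradford matrix and the D50/D65 white points are standard published values; treating white balance as exactly this transform is a simplification, as noted above.

```python
import numpy as np

# Bradford chromatic adaptation: map an XYZ triplet recorded under one
# illuminant so that it should *look* the same under another illuminant.
BRADFORD = np.array([
    [ 0.8951,  0.2664, -0.1614],
    [-0.7502,  1.7135,  0.0367],
    [ 0.0389, -0.0685,  1.0296],
])

def adapt(xyz, src_white, dst_white):
    """Scale cone-like (LMS) responses by the ratio of the two white points."""
    src_lms = BRADFORD @ src_white
    dst_lms = BRADFORD @ dst_white
    scale = np.diag(dst_lms / src_lms)
    M = np.linalg.inv(BRADFORD) @ scale @ BRADFORD
    return M @ np.asarray(xyz)

# White points: D50 (warm "horizon-ish" daylight) and D65 (sRGB viewing white)
D50 = np.array([0.9642, 1.0000, 0.8249])
D65 = np.array([0.9505, 1.0000, 1.0890])

# A neutral grey recorded under D50 maps to a neutral grey under D65:
grey_d65 = adapt(0.5 * D50, D50, D65)
```

The transform maps the source white point exactly onto the destination white point, which is precisely what "make the whites look white" means in white-balancing terms.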

In fact, the sRGB standard defines the reference ambient conditions:

[image: sRGB reference viewing conditions]

More complex perception models exist that let you map between different environments, but I propose not using those - specifically because no one has floated in orbit around Jupiter to compare the "feel" of its colors against those on a computer screen.

This does not mean, however, that we can't record and reproduce the color in terms of physical values. The appearance of color does not change with the distance at which we take the image (excluding the atmosphere and the effects of atmospheric extinction), so we can be confident that what we record from here will be the same as a recording from a spacecraft in Jupiter's orbit.

Furthermore, by following a set of standards, we can also ensure that the light coming from the computer screen creates the same stimulus as the light coming from Jupiter. That is color matching without perceptual matching (in the analogy: we made the object 20°C so that the room and the object are at the same measured temperature, regardless of whether that temperature feels cold or hot to the touch).
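The "same stimulus from the screen" part is exactly what the sRGB standard pins down: a fixed matrix from XYZ to linear RGB, plus a fixed gamma curve. A minimal sketch (the matrix and gamma curve are the published sRGB values; the sample XYZ input is just an illustration):

```python
import numpy as np

# XYZ -> linear sRGB matrix (for the D65 white point, from the sRGB
# standard), followed by the standard's piecewise gamma encoding.
XYZ_TO_SRGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

def encode(xyz):
    lin = XYZ_TO_SRGB @ np.asarray(xyz)
    lin = np.clip(lin, 0.0, 1.0)
    # Piecewise gamma from the sRGB specification
    srgb = np.where(lin <= 0.0031308,
                    12.92 * lin,
                    1.055 * lin ** (1 / 2.4) - 0.055)
    return np.round(srgb * 255).astype(int)

# The D65 white point (XYZ ~ 0.9505, 1.0, 1.089) encodes to pure white:
white = encode([0.9505, 1.0, 1.0890])  # -> [255, 255, 255]
```

Send those 8-bit values to a display calibrated to sRGB and the screen emits (to the accuracy of the calibration) the stimulus the XYZ triplet describes - which is why calibrated measurement plus a standard pipeline gives everyone the same image.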

We can even rely on a perception model and attempt to recreate the "feel" of the color - but to do that we must define the environmental factors under which the observing of Jupiter takes place. Are you floating around in a space suit, or are you in a spaceship with dim lights on, looking at Jupiter through a window? Or is the inside of the spacecraft brightly lit - and what is the color temperature of that ambient illumination? And so on...

 


3 hours ago, vlaiv said:

What is exactly about color issue that is confusing you?

[...]

Ok, cheers for the explanation. But as I've said, it would be great to have the procedures on video, so that, for example, one can go back to it when attempting what you're suggesting. It's getting to be more of a lesson than a solution at this stage - not that I mind a bit of background - but the procedures on video would probably be the most helpful thing for a lot of people, including me. It's up to you. Not a problem if you would rather not.


4 hours ago, vlaiv said:

What is exactly about color issue that is confusing you?

[...]

Thank you for taking the time to write an explanation. I think I understand the "why" part - just scratching my head over the "how" part now.

I carry my kit outdoors every time I use it anyway, so in theory I could haul it somewhere I can hang a phone off a tree far away, or something like that. I just need to understand the actual process: what I'm doing, in which software, and what exactly the phone screen should be showing - then I'll do it with my scope and the 678MC. Not sure I will bother doing that in processing, but then again I said that the first three times you mentioned split debayering, and here I am splitting a thousand subs regularly as part of the process, so maybe this will be one of those extra "hoops" to jump through in the end.


Nice image(s) Neil.

The colour question always interests me wrt Jupiter. Without a scope, I see J as a blue-white object, similar in colour to Vega perhaps. If I process images without messing around with the colour at all, I get a slightly blue background for the disk, with the main bands a dark chocolate colour. NASA images, and for example the simulation in SkySafari, always look more yellow, with the main disk set perhaps to the colour balance of sunlight. However (with my spectroscopist hat on), the spectrum of J is that of sunlight minus absorption bands mainly in the red end of the spectrum, indicating that the true colour is indeed blueish, not yellow-white like the Sun.

Maybe it comes down to personal preferences; I like it looking as close to reality as possible, but I can fully understand folk who want it to look more like the NASA images and more yellow for aesthetic reasons.

Chris


12 hours ago, neil phillips said:

Not a problem if you would rather not ?

It's not a matter of wanting to or not - I do want to. It's more a matter of finding the time and sticking with it to get it done. To be honest, there is also the somewhat scary part of exposing myself to the public eye through videos (I'm not a very extroverted person), but I think that won't be much of a problem once I get going.

I'll get started as soon as I find some spare time today - at least to check out all the ways I can record and edit video (I must switch between the phone camera and a DSLR for recording, since I'll be using the phone in the procedure and can't shoot video with it at that point).


1 hour ago, vlaiv said:

It's not the matter of wanting or not - I do want. [...]

Ok Vlaiv. There's no rush - when you have time, of course. None of us are in desperate need, so whatever is best for you is how it should be. If you would rather stay off camera - as in, just hands or whatnot - that's also no problem. Again, whatever is comfortable for you. It doesn't matter how long it takes; we can just check back here once in a while. And if you change your mind for whatever reason and would rather not, that's your business and nobody else's.


4 hours ago, chiltonstar said:

Nice image(s) Neil.

The colour question always interests me wrt Jupiter. [...]

Yeah, I totally get that Chris. Of course, if it's a preference, and not just a bad day's processing (we all have them), it's all good by me. A lot of beginners are probably helped by the discussion. If it is a true preference, then that's their business and their work and nobody else's. It's all good.

