
Astrophotography pop art - critique of the process


vlaiv


9 minutes ago, vlaiv said:

However, I will not forfeit my right to criticize what I believe is the wrong approach.

I'm not suggesting that you should. 🙂 In fact, quite the contrary. You have started a very thought-provoking thread.

I don't necessarily agree with you, that's all.

I think that your remark about mastering technique first is difficult to apply for most people. Many of us are learning as we go along and are simply striving to get a 'better' image than last time. For those advanced enough in their processing skill to enter competitions, I don't know how it would work, except by comparing the submitted image against a 'correct' example of that image and seeing who got closest to it - pointless!


13 minutes ago, Astro Noodles said:

For those advanced enough in their processing skill to enter competitions, I don't know how it would work, except by comparing the submitted image against a 'correct' example of that image and seeing who got closest to it - pointless!

You raise a very valid point - how can we tell if our color rendition is "good" or "bad"?

I think there are two approaches. One, probably foreign to most people, would be to rely completely on the theory of color and light.

Luckily there is another approach - the way most people learn the skill of processing: by trial and error.

We can all do the following:

Shoot a daytime scene with a device that gives us good color rendition (DSLR / compact camera / smartphone). We can see the colors with our own eyes in real time and compare them with those captured by the device, verifying that they agree.

The next step would be to take a bunch of images of the same scene with our astrophotography camera. We can stop down the aperture to limit the amount of light and use very short exposures - to get signal levels compatible with AP.

Then we take those raw exposures and stack them (no need for alignment, as the scene is stationary and there are no stars), apply all the calibration frames we would use in our regular AP workflow, and in the end process that image.

Will we be able to reproduce the colors captured by the reference device (and seen by our eyes)?

For all those who worry about color balance (actually the chromatic adaptation transform) - don't shoot an illuminated scene; shoot your computer monitor in a dark room instead. That simulates an emission scenario, which is closer to a real astronomical scene.

In fact, I can see a competition here: if someone shoots relevant data, they can share it for all the others to attempt to process the image so that the colors closely match the reference image (the point being not to rely on the reference during the processing stage - only for verification once you are done).
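For anyone who wants to try this at home, here is a minimal sketch of the processing side in Python. The file names, the calibration masters and the camera-to-sRGB matrix are all placeholders to substitute with your own, and a real workflow would also debayer the stack:

```python
# Sketch of the color verification test: average-stack short raw exposures of
# a static scene, calibrate them as in a normal AP workflow, apply a chosen
# camera-to-sRGB matrix and save the result for side-by-side comparison with
# the reference (DSLR / phone) shot. Everything named here is a placeholder.
import numpy as np
from astropy.io import fits

def load(path):
    return fits.getdata(path).astype(np.float64)

# Average-stack raw frames - no alignment needed, the scene is stationary
frames = [load(f"scene_{i:03d}.fits") for i in range(32)]   # placeholder names
img = np.mean(frames, axis=0)

# Usual calibration (placeholder master frames)
img = (img - load("master_dark.fits")) / load("master_flat.fits")

# A real workflow would debayer here; assume `rgb` is an (H, W, 3) linear image
rgb = np.stack([img, img, img], axis=-1)                    # stand-in only

# Placeholder camera-RGB -> linear sRGB matrix; derive your own for your sensor
M = np.array([[ 1.6, -0.4, -0.2],
              [-0.3,  1.5, -0.2],
              [ 0.0, -0.5,  1.5]])
linear = np.clip(rgb @ M.T, 0.0, None)
linear /= linear.max()

# Standard sRGB transfer curve, then write out for visual comparison
srgb = np.where(linear <= 0.0031308,
                12.92 * linear,
                1.055 * linear ** (1.0 / 2.4) - 0.055)
fits.writeto("test_srgb.fits", srgb.astype(np.float32), overwrite=True)
```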


This thread serves little purpose? Color rendering (and also perception) is highly subjective.

This excellent presentation by Michael Brown will teach you everything you ever wanted to know about the entire imaging pipeline.

This, among other things, includes the influence of color space conversions, white references and tone curves, all of which directly and dramatically influence color rendition.

All of these are fair game and are even more arbitrary when dealing with astronomical images (there is no one canonical white reference or illuminant for astrophotographical scenes, nor is there one 'ballpark' prescribed tone curve/stretch).

That's not to say anything goes, but the continuum of solutions is vast.


38 minutes ago, vlaiv said:

In fact, I can see a competition here: if someone shoots relevant data, they can share it for all the others to attempt to process the image so that the colors closely match the reference image (the point being not to rely on the reference during the processing stage - only for verification once you are done).

Doesn't sound like much fun. 😄 I mean, there might be some people who want to devote time and effort to it. And someone has to decide on the correct reference image. It might produce some interesting results.

One might ask whether there is a case to be made for removing the human element entirely, to stop people interfering in the process and ruining the colour etc.

There surely has to be some level of interpretation. We are human and individual and perceive things differently. The most sophisticated equipment we use is inside our skulls, and no-one really understands how it works.

 


2 hours ago, vlaiv said:

[image attachment]

Luckily it is a bit easier than that. It turns out that our eyes work a bit like very low resolution spectroscopes - producing only 3 numbers instead of hundreds of measurements along the 400-700 nm range. The same is true for cameras - they also produce three values, and it turns out that these three values are enough to reproduce (more or less accurately) the original color.

In fact, we have the very well defined XYZ color space that does exactly that - it pretends to be a very low resolution spectroscope compatible with our vision system. If you take any two light sources that produce the same XYZ value, people will see them as the same color, even though they might differ on a finer scale of the spectrum.

With the exception that colour vision changes with age. Our lens naturally yellows as we age, but there can also be degenerative diseases that affect our perception of colour (e.g. Changes in Color Perception | MacularDegeneration.net). As such, you may well find that some people (unknowingly) 'boost' the contrast of their images to compensate. I would expect that in some cases colour contrast is boosted less by a younger person than by an older person for some colours. As such, you can never really use a person's eye as the low resolution spectroscope, because each one is likely to perceive colour slightly differently - in fact, arguably we have no idea how anybody other than ourselves perceives colour, because you can't plug yourself into someone else's brain. To get a precise colour position you'd need to use specific energy levels.

Really, though, it doesn't matter what the colours are. They are more a representation of the data and of what is going on at that location. As long as the data itself is not being altered, colour in the form we know it is just a perception - an alien species may not see it the same way (let's say they perceive colour as changes in brightness gradients). You can make a star-forming region red, blue, or luminous yellow; as long as it is consistent, it is still providing the same information. In the same way, narrowband colour representation is usually 'wrong' but still provides real information about the object.


13 minutes ago, Whirlwind said:

With the exception that colour vision changes with age. Our lens naturally yellows as we age, but there can also be degenerative diseases that affect our perception of colour (e.g. Changes in Color Perception | MacularDegeneration.net). As such, you may well find that some people (unknowingly) 'boost' the contrast of their images to compensate. I would expect that in some cases colour contrast is boosted less by a younger person than by an older person for some colours. As such, you can never really use a person's eye as the low resolution spectroscope, because each one is likely to perceive colour slightly differently - in fact, arguably we have no idea how anybody other than ourselves perceives colour, because you can't plug yourself into someone else's brain. To get a precise colour position you'd need to use specific energy levels.

Really, though, it doesn't matter what the colours are. They are more a representation of the data and of what is going on at that location. As long as the data itself is not being altered, colour in the form we know it is just a perception - an alien species may not see it the same way (let's say they perceive colour as changes in brightness gradients). You can make a star-forming region red, blue, or luminous yellow; as long as it is consistent, it is still providing the same information. In the same way, narrowband colour representation is usually 'wrong' but still provides real information about the object.

I think you are somewhat missing the point here.

I'm not seeking a way for all of us to get the same perception - that is an individual thing. I'm just suggesting that we should be able to do color matching - much like in the example that I gave above with:

[image attachment]

Although we might perceive colors differently as individuals, all of us are quite capable of saying that two colors are the same or very close - or that two colors are rather different.

One of the aims of photography in general is to match color. If something is blue, you would like it to be blue in the photo as well; you don't feel comfortable if it turns orange.

46 minutes ago, jager945 said:

This thread serves little purpose?

Possibly.

In some way, you contributed greatly by offering a resource for anyone interested in color management to follow up on.

By raising awareness that color should be managed in the amateur AP workflow, we can bring the field forward?


35 minutes ago, Astro Noodles said:

And someone has to decide on the correct reference image.

Here is something that really bothers me :D

Can we do the same for, say, temperature or length? Can we say: let us each measure the length of some object and then someone will decide what the correct length of the reference object is?

We have standards. We all know how long one meter is, and we can each measure an object; given that there is a defined standard, we can compare our measurements and reproduce a given length.

Now, I can feel that 1 meter is too long and someone else can feel that 1 meter is rather short - but that won't change our measurements or our ability to reproduce a correct 1 meter.

The same holds for color. We have a measurement standard for it, and we can reproduce it after we measure it. Someone will perceive it as orange, others as brown (dark or lit-up room) - but it will be the same color.

Perception aside, we have a standard and a way to measure and reproduce things, yet we still feel there is a need for an arbiter to decide what is right.


Years ago I was trying to explain DSO astrophotography colour to a co-worker. I went through the use of filters, the colour attached to them, and why the images look like they do. He said, "Oh, you add the colour? The images are black and white originally?"

Then he said, "Obviously the colour is fake".

People can do what they want, but a realistic representation of image colour would be excellent, and if it's realistic they should all be similar IMHO.


2 minutes ago, jetstream said:

Years ago I was trying to explain DSO astrophotography colour to a co-worker. I went through the use of filters, the colour attached to them, and why the images look like they do. He said, "Oh, you add the colour? The images are black and white originally?"

Then he said, "Obviously the colour is fake".

People can do what they want, but a realistic representation of image colour would be excellent, and if it's realistic they should all be similar IMHO.

Indeed - for some reason we give up on repeatable measurement when it comes to color.

Take any two images of a galaxy - any galaxy, or any other object - and if I give you the distance in arcseconds between two stars in the image, you'll be able to provide me with the size of the object in arcseconds.

No matter what people do in their processing, the measured size of the object will not change between images.

I can do the same thing with color - given the color/temperature of these stars in the image, give me the XYZ ratios of a particular part of the object. The answer should be the same across images.

We are not talking about perception - we are talking about the measurement of a physical quantity.

Almost no images are calibrated well enough to do the above color exercise. We simply don't care about color to that extent - even from the same set of data we end up with vastly different colors.
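To make the size-measurement analogy concrete, here is a tiny sketch - the star coordinates and the separation are invented numbers, only the method matters:

```python
# Sketch of the size measurement described above: two stars with a known
# angular separation calibrate the plate scale, after which any object's
# pixel extent converts to arcseconds - regardless of how the image was
# stretched or color-processed. All numbers are made up for illustration.
import math

star_a = (512.0, 300.0)          # pixel coordinates (x, y)
star_b = (900.0, 850.0)
known_sep_arcsec = 1350.0        # known angular separation of the two stars

plate_scale = known_sep_arcsec / math.dist(star_a, star_b)   # arcsec / pixel

object_extent_px = 410.0         # measured major axis of the object in pixels
print(f"plate scale: {plate_scale:.3f} arcsec/px")
print(f"object size: {object_extent_px * plate_scale:.0f} arcsec")
```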

 


 

19 minutes ago, vlaiv said:

By raising awareness that color should be managed in the amateur AP workflow, we can bring the field forward?

That's a great motivation, as color is a super important but often overlooked aspect of AP. 👍

12 minutes ago, vlaiv said:

Here is something that really bothers me :D

Can we do the same for, say, temperature or length? Can we say: let us each measure the length of some object and then someone will decide what the correct length of the reference object is?

We have standards. We all know how long one meter is, and we can each measure an object; given that there is a defined standard, we can compare our measurements and reproduce a given length.

Now, I can feel that 1 meter is too long and someone else can feel that 1 meter is rather short - but that won't change our measurements or our ability to reproduce a correct 1 meter.

The same holds for color. We have a measurement standard for it, and we can reproduce it after we measure it. Someone will perceive it as orange, others as brown (dark or lit-up room) - but it will be the same color.

Perception aside, we have a standard and a way to measure and reproduce things, yet we still feel there is a need for an arbiter to decide what is right.

Color cannot be measured like that, simply because there is no one canonical reference. What looks white to you in daylight over there in Europe looks yellow to me right now here in dark Australia.

Color cannot be measured. Some of the things that cause color can be measured; however, their interpretation is not set (see the presentation).


3 minutes ago, Dean Hale said:

Most are APOD images - collected and presented by various people - and again, hence the variance.

If I had to choose, based on my (limited) knowledge of astrophysics and color theory, this version:

[image attachment]


2 minutes ago, jager945 said:

Color cannot be measured like that, simply because there is no one canonical reference. What looks white to you in daylight over there in Europe looks yellow to me right now here in dark Australia.

Color cannot be measured. Some of the things that cause color can be measured; however, their interpretation is not set (see the presentation).

Color can be measured as an XYZ tristimulus value.

Perceptual response will differ - but if I show you two light spectra that have the same XYZ tristimulus value next to each other, under any one set of conditions, you'll conclude that those colors match.

Under different conditions you'll perceive a different color - and that is fine; perception depends on a lot of things. Sometimes a particular temperature feels cold and sometimes hot, depending on our surroundings. Sometimes the same sound level seems loud and sometimes quiet. Perception depends on conditions - but that does not mean that the physical phenomena related to our perception can't be measured, reproduced and matched.
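To show that "measuring color" is a well-defined operation, here is a sketch that integrates an example spectrum against an analytic approximation of the CIE 1931 color matching functions. The piecewise-Gaussian coefficients follow the published Wyman, Sloan and Shirley fit; treat the whole thing as illustrative:

```python
# Sketch of color as measurement: integrate a spectral power distribution
# against (an analytic fit of) the CIE 1931 color matching functions to get
# an XYZ tristimulus value - the "three numbers" of the low-res spectroscope.
import numpy as np

def g(wl, mu, s1, s2):
    # Piecewise Gaussian used by the analytic fit of the matching functions
    s = np.where(wl < mu, s1, s2)
    return np.exp(-0.5 * ((wl - mu) / s) ** 2)

def cie_1931_bar(wl):
    # Approximate x-bar, y-bar, z-bar of the CIE 1931 2-degree observer
    xb = (1.056 * g(wl, 599.8, 37.9, 31.0) + 0.362 * g(wl, 442.0, 16.0, 26.7)
          - 0.065 * g(wl, 501.1, 20.4, 26.2))
    yb = 0.821 * g(wl, 568.8, 46.9, 40.5) + 0.286 * g(wl, 530.9, 16.3, 31.1)
    zb = 1.217 * g(wl, 437.0, 11.8, 36.0) + 0.681 * g(wl, 459.0, 26.0, 13.8)
    return xb, yb, zb

wl = np.arange(400.0, 701.0)                          # 400-700 nm, 1 nm steps
spd = np.exp(-0.5 * ((wl - 550.0) / 40.0) ** 2)       # an example spectrum

xb, yb, zb = cie_1931_bar(wl)
X, Y, Z = (np.trapz(spd * b, wl) for b in (xb, yb, zb))
print(f"XYZ = ({X:.2f}, {Y:.2f}, {Z:.2f})")
```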


7 minutes ago, jager945 said:

Color cannot be measured like that, simply because there is no one canonical reference. What looks white to you in daylight over there in Europe looks yellow to me right now here in dark Australia.

Color cannot be measured. Some of the things that cause color can be measured; however, their interpretation is not set (see the presentation).

Just realized that we are saying the same thing - we only use the word color to mean different things (both valid).

You use it to denote the perception of a light stimulus, while I use it to denote the physical quality of the light stimulus.

The first can't be measured, while the latter can.


3 minutes ago, vlaiv said:

Just realized that we are saying the same thing - we only use the word color to mean different things (both valid).

You use it to denote the perception of a light stimulus, while I use it to denote the physical quality of the light stimulus.

The first can't be measured, while the latter can.

Indeed, color cannot be measured in XYZ tristimulus values, but it can be converted into XYZ tristimulus values.

The problem is the measuring domain and the (arbitrary) assumptions that the measuring necessarily entails.

E.g. this is the reason why monochromatic emission lines (H-alpha, O-III) cannot be expressed in XYZ values.


3 minutes ago, jager945 said:

E.g. this is the reason why monochromatic emission lines (H-alpha, O-III) cannot be expressed in XYZ values.

Yes they can - by the same process as with any other spectrum.

We integrate the line spectrum of the Ha or OIII emission and we get X, Y and Z. In fact, given that we have only one line, the X, Y and Z values will simply be the values of the X, Y and Z matching curves at the respective wavelength, times the source intensity.

The issue with monochromatic light sources is that they lie outside the gamut of most display devices. We can't represent them as RGB tristimulus values, but we can as XYZ.
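A tiny sketch of that single-line case - the matching-function values below are approximate readings from the CIE 1931 tables near H-alpha and OIII, so treat them as illustrative rather than authoritative:

```python
# For a single emission line the XYZ integral collapses to the matching
# function values at that wavelength times the source intensity. Values are
# approximate; use the real CIE tables for anything serious.
cmf_approx = {
    656.3: (0.2187, 0.0816, 0.0000),   # ~H-alpha: x-bar, y-bar, z-bar
    500.7: (0.0049, 0.3230, 0.2720),   # ~OIII (table value at 500 nm)
}

def line_xyz(wavelength_nm, intensity):
    xb, yb, zb = cmf_approx[wavelength_nm]
    return intensity * xb, intensity * yb, intensity * zb

print("H-alpha XYZ:", line_xyz(656.3, 1.0))
print("OIII    XYZ:", line_xyz(500.7, 1.0))
```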


46 minutes ago, jetstream said:

Years ago I was trying to explain DSO astrophotography colour to a co-worker. I went through the use of filters, the colour attached to them, and why the images look like they do. He said, "Oh, you add the colour? The images are black and white originally?"

Then he said, "Obviously the colour is fake".

People can do what they want, but a realistic representation of image colour would be excellent, and if it's realistic they should all be similar IMHO.

I have this conversation endlessly since I'm an astrophotography provider. Firstly, many people misunderstand filters, so I begin there. Many assume that a red filter 'turns things red,' so I explain that, no, the red filter blocks the other colours so that the red component arriving at each part of the picture can be measured.  When we have, similarly, measured the green and the blue we can weight each one correctly in a very good approximation of the human eye-brain's perception of the scene. I also explain that this is exactly how every digital image they have ever taken with camera or phone was created. They've been using the technique for years without questioning it and without knowing it. I'll then take a detour into the Bayer Matrix.

I find that this explanation generally carries the day - or they just say 'yes' to shut me up. 🤣

Olly


1 minute ago, vlaiv said:

Yes they can - by the same process as with any other spectrum.

We integrate the line spectrum of the Ha or OIII emission and we get X, Y and Z. In fact, given that we have only one line, the X, Y and Z values will simply be the values of the X, Y and Z matching curves at the respective wavelength, times the source intensity.

The issue with monochromatic light sources is that they lie outside the gamut of most display devices. We can't represent them as RGB tristimulus values, but we can as XYZ.

You are absolutely right of course that they can be expressed as XYZ values (apologies for not being precise enough).

It's super easy in fact - the coordinates fall on the spectral locus itself (i.e. they fall on the circumference of the CIE xy chromaticity diagram), which, as you point out, is indeed outside real-world gamuts to begin with.

What I meant is that the monochromatic light we record (in raw, camera-specific RGB) cannot be expressed as one pre-determined XYZ value, due to color space conversion (which necessitates picking a white point, applying response correction matrices - if you so choose - etc.).

I.e. you cannot say, "this XYZ triplet is wrong for the camera-space RGB values/data you recorded".
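To illustrate how the white-reference choice moves the numbers, here is a sketch using the standard Bradford chromatic adaptation transform; the starting XYZ triplet is arbitrary:

```python
# Sketch: the same XYZ triplet, re-expressed under a different white
# reference via the Bradford chromatic adaptation transform - the numbers
# shift even though the recorded data never changed.
import numpy as np

BRADFORD = np.array([[ 0.8951,  0.2664, -0.1614],
                     [-0.7502,  1.7135,  0.0367],
                     [ 0.0389, -0.0685,  1.0296]])

D65 = np.array([0.9505, 1.0, 1.0890])   # daylight white
D50 = np.array([0.9642, 1.0, 0.8249])   # warmer print/proofing white

def adapt(xyz, src_white, dst_white):
    # Scale the cone-like responses by the ratio of the two white points
    scale = np.diag((BRADFORD @ dst_white) / (BRADFORD @ src_white))
    return np.linalg.inv(BRADFORD) @ scale @ BRADFORD @ xyz

xyz = np.array([0.35, 0.40, 0.30])      # an arbitrary measured triplet
print("under D65:", xyz)
print("adapted to D50:", adapt(xyz, D65, D50))
```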


52 minutes ago, jager945 said:

What I meant is that the monochromatic light we record (in raw, camera-specific RGB) cannot be expressed as one pre-determined XYZ value, due to color space conversion (which necessitates picking a white point, applying response correction matrices - if you so choose - etc.).

I.e. you cannot say, "this XYZ triplet is wrong for the camera-space RGB values/data you recorded".

You are of course right, but I wonder how much error there is depending on the selected transform.

If you think about it, even scientific photometric measurements are done in one system or another - for example UBVRI or ugriz. These have specific spectral responses (much like different camera sensors). No sensor has exactly the same response as these filters, yet photometry is readily done in different photometric systems, and conversions between systems are performed.

Scientists must correct for their imaging sensor to get compatible results within a system, and cross-system conversions also need to be performed.

This is for scientific data, not color reproduction. If that field can tolerate measurement errors, I don't see why we in astrophotography can't.

Indeed, if you take two sensors and record an XYZ value with each (take the raw data and then convert to XYZ using a selected transform matrix), how different will the XYZ values be? We can certainly call that measurement error rather than an error in procedure, and the difference between the XYZ values may well be within one deltaE for most spectra.
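One way to put a number on that: map the same patch through each camera's own matrix and compute a CIE76 deltaE between the results. The matrices and raw triplets below are invented for illustration; only the comparison method matters:

```python
# Sketch of the two-sensor experiment: the same patch yields different raw
# RGB on two cameras; each camera's (placeholder) 3x3 matrix maps its raw
# values to XYZ, and deltaE in Lab quantifies how far apart the two
# "measurements" land.
import numpy as np

M_cam_a = np.array([[0.41, 0.36, 0.18],
                    [0.21, 0.72, 0.07],
                    [0.02, 0.12, 0.95]])    # placeholder camera A -> XYZ
M_cam_b = np.array([[0.44, 0.33, 0.18],
                    [0.24, 0.69, 0.07],
                    [0.01, 0.10, 0.98]])    # placeholder camera B -> XYZ

raw_a = np.array([0.62, 0.40, 0.21])        # same patch, camera A raw RGB
raw_b = np.array([0.58, 0.43, 0.22])        # same patch, camera B raw RGB

def xyz_to_lab(xyz, white=np.array([0.9505, 1.0, 1.089])):   # D65 white
    t = np.asarray(xyz) / white
    f = np.where(t > (6 / 29) ** 3, np.cbrt(t), t / (3 * (6 / 29) ** 2) + 4 / 29)
    return np.array([116 * f[1] - 16,        # L*
                     500 * (f[0] - f[1]),    # a*
                     200 * (f[1] - f[2])])   # b*

delta_e = np.linalg.norm(xyz_to_lab(M_cam_a @ raw_a) - xyz_to_lab(M_cam_b @ raw_b))
print(f"CIE76 deltaE between the two measurements: {delta_e:.2f}")
```

A deltaE of roughly 2 or below is commonly taken as barely perceptible, which would make the transform-dependent difference an acceptable measurement error.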


2 hours ago, tomato said:

This was my latest attempt at my M81 data, a 100% qualitative approach.
Now where is the brown slider for the midtones…

[M81 image attachment]

 

Interestingly, of course, there is no such colour as brown - objectively. It is an entirely subjective colour perceived through context.

Weird, eh?

There's a good explanation here:

 

