
LDN 1228 - more data made a big difference


gorann


1 minute ago, gorann said:

So then, what is the correct rendition, Vlaiv? According to your suggestion LDN 1228 will be brown when seen from Earth, and that is where we are looking at it from and taking pictures of it....

The way I see it, there are at least three "correct" renditions - depending on what you want to show.

1. Image from the surface of the Earth. I think this one is probably the least interesting as it does not leave much room for comparison. Take an image of an object from an elevation of 100 m with the target at an altitude of about 45 degrees and compare it with someone else's image taken at 2500 m above sea level with the target at the zenith - there will be a difference in colors. Probably a significant difference.

In general we want to eliminate the atmosphere from our images, I would say, although it is a correct rendition in the sense that the object really does have that color when observed from the surface of the Earth (if color correction has been applied and all that ...)

2. What the object would look like imaged from a spacecraft orbiting the Earth or the Sun, or somewhere in our stellar neighborhood.

Here you need to compensate for the Earth's atmosphere and remove its influence.

3. What the object would look like if imaged from relative proximity - meaning from its stellar neighborhood.

Here you need to compensate for the Earth's atmosphere as well as for interstellar reddening, if there is any in the direction of the object.

I think that all of the above is theoretical rather than "mandatory" or even recommended for AP, since most people don't know how to properly process color from their camera to start with, let alone do the above corrections.

A camera sensor does not see color the way we see it, and the first thing to be done is a color transform - usually split into white balancing and a color correction matrix (I learned that recently). The next thing is to apply the proper sRGB gamma transform to the colors if we want to encode images for display on computers etc ...

Most people don't bother with the above two steps and adjust curves "by hand" until they are happy with what the image looks like - and afterwards most will say that "there is no proper color in AP anyway" - for some reason.
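If anyone wants to see what those two steps look like in practice, here is a minimal sketch in Python. The white balance gains and the colour correction matrix are made-up illustrative numbers (real ones come from characterising the particular sensor and filters), and the function names are just for this example:

```python
import numpy as np

# Hypothetical white balance gains and colour correction matrix - real values
# must come from measuring the specific camera / filter combination.
wb_gains = np.array([1.9, 1.0, 1.6])            # R, G, B gains
ccm = np.array([[ 1.6, -0.4, -0.2],             # sensor RGB -> sRGB primaries
                [-0.3,  1.5, -0.2],
                [-0.1, -0.5,  1.6]])

def srgb_encode(linear):
    """Standard sRGB transfer ("gamma") function for display encoding."""
    return np.where(linear <= 0.0031308,
                    12.92 * linear,
                    1.055 * np.clip(linear, 0, None) ** (1 / 2.4) - 0.055)

def process(raw_rgb):
    """raw_rgb: linear camera RGB, float array of shape (H, W, 3), range 0..1."""
    balanced = raw_rgb * wb_gains                 # step 1: white balance
    corrected = balanced @ ccm.T                  # step 2: colour correction matrix
    return srgb_encode(np.clip(corrected, 0, 1))  # step 3: sRGB gamma for display
```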


2 minutes ago, andrew s said:

I think it could be Rayleigh scattering in the nebula rather than in our atmosphere. If it were our atmosphere then the field G2V stars would be yellow. This could be a check on which it is.

Regards Andrew 

It happens both in the nebula and in our atmosphere, I would say - with a difference.

Scattering in the nebula is what produces the glow of the nebula - so one could expect shorter wavelengths to be dominant there, as they are scattered more. Most stars around the nebula will be on the longer part of the spectrum, since bluish stars are very rare and the majority of stars in the universe have a reddish color.

[attached image: relative frequency of star colors]

97% of stars are in fact on the yellowish / orange side. The nebula scatters mostly the blue part of that, so it sort of balances out (or maybe not - I'm just guessing here) - producing a gray color. Bright reflection nebulae that have a bluish color have it because they are illuminated by a rather luminous but rare bluish star.

Then this gray color hits our atmosphere and another round of scattering occurs. This time the blue light is scattered away - leaving a yellow/orange cast on the color in question (if it was gray to start with).

You know - this thing happens:

[attached image: reddish-brown result of atmospheric scattering]
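For a rough sense of how strong the effect is: for Rayleigh scattering off air molecules the efficiency goes as 1/λ^4, so blue is scattered a few times more strongly than red (dust grains in a nebula are larger, so their wavelength dependence is weaker). A quick back-of-the-envelope check:

```python
# Rayleigh scattering efficiency ~ 1/lambda^4: compare blue (~450 nm) and red (~650 nm).
blue_nm, red_nm = 450.0, 650.0
ratio = (red_nm / blue_nm) ** 4
print(f"blue is scattered ~{ratio:.1f}x more strongly than red")  # roughly 4x
```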


6 minutes ago, vlaiv said:

The way I see it, there are at least three "correct" renditions - depending on what you want to show.

1. Image from the surface of the Earth. I think this one is probably the least interesting as it does not leave much room for comparison. Take an image of an object from an elevation of 100 m with the target at an altitude of about 45 degrees and compare it with someone else's image taken at 2500 m above sea level with the target at the zenith - there will be a difference in colors. Probably a significant difference.

In general we want to eliminate the atmosphere from our images, I would say, although it is a correct rendition in the sense that the object really does have that color when observed from the surface of the Earth (if color correction has been applied and all that ...)

2. What the object would look like imaged from a spacecraft orbiting the Earth or the Sun, or somewhere in our stellar neighborhood.

Here you need to compensate for the Earth's atmosphere and remove its influence.

3. What the object would look like if imaged from relative proximity - meaning from its stellar neighborhood.

Here you need to compensate for the Earth's atmosphere as well as for interstellar reddening, if there is any in the direction of the object.

I think that all of the above is theoretical rather than "mandatory" or even recommended for AP, since most people don't know how to properly process color from their camera to start with, let alone do the above corrections.

A camera sensor does not see color the way we see it, and the first thing to be done is a color transform - usually split into white balancing and a color correction matrix (I learned that recently). The next thing is to apply the proper sRGB gamma transform to the colors if we want to encode images for display on computers etc ...

Most people don't bother with the above two steps and adjust curves "by hand" until they are happy with what the image looks like - and afterwards most will say that "there is no proper color in AP anyway" - for some reason.

And then there is NB imaging, where everything is allowed 😉. I have to admit I belong to the group (possibly the majority) that goes for a colour balance that looks pleasing and believable. After all, we completely distort the relative brightness of the image when we start stretching it, so maybe we can also defend a rather relaxed relationship to the colours, unless accurate colour is the goal of the imager.


21 minutes ago, vlaiv said:

It happens both in the nebula and in our atmosphere, I would say - with a difference.

Scattering in the nebula is what produces the glow of the nebula - so one could expect shorter wavelengths to be dominant there, as they are scattered more. Most stars around the nebula will be on the longer part of the spectrum, since bluish stars are very rare and the majority of stars in the universe have a reddish color.

[attached image: relative frequency of star colors]

97% of stars are in fact on the yellowish / orange side. The nebula scatters mostly the blue part of that, so it sort of balances out (or maybe not - I'm just guessing here) - producing a gray color. Bright reflection nebulae that have a bluish color have it because they are illuminated by a rather luminous but rare bluish star.

Then this gray color hits our atmosphere and another round of scattering occurs. This time the blue light is scattered away - leaving a yellow/orange cast on the color in question (if it was gray to start with).

You know - this thing happens:

 

Not sure of your point. Scattering will leave a reddish-brown colour, as in the image. Regards Andrew

Rereading your post, I don't think the scattering is as you propose. In an optically thick nebula the scattering will be roughly into 4π steradians; at short wavelengths blue is scattered out of the line of sight, the red less so, leaving the reddish-brown colour.

 

Edited by andrew s

8 minutes ago, andrew s said:

Not sure of your point. Scattering will leave a reddish-brown colour, as in the image. Regards Andrew

Atmospheric scattering will.

And my point is - what do we want to show in the astro image: the object as viewed from the Earth in the given imaging conditions, or the object as it is?

If we want to show the object as it is - without the filter imposed by our atmosphere - we need to do the inverse of what atmospheric scattering does to the color: we need to boost the blue part of the spectrum, i.e. apply a color correction to our image.

Once we do that we get the actual color of the nebulosity - which I believe is much more gray than depicted in the image above.

Why do we have white balance on our cameras? Because we want to see the "actual color of the object" (in this particular case, the color of the object under the most common lighting conditions - daytime lighting).

If you look at this image:

[attached image: white balance comparison - uncorrected with an orange cast vs corrected]

and you look at the left side, you would not say "oh, I'm happy with this image - look how nice this person's orange skin tone is!" You know that this is not the actual skin color, and you would correct the image with white balance.

In the above case, we might not know what the actual nebulosity color is, but we do know the filters that are at play, and if we want to find out and depict the "natural" color of the nebulosity, we need to color correct for what we know is happening - atmospheric scattering and possibly interstellar reddening.
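For what it's worth, here is a rough sketch of what "doing the inverse of the atmosphere" could look like on linear data, using broadband extinction coefficients. The k values are assumed typical figures for a decent site, not measured ones, and this ignores any interstellar reddening:

```python
import numpy as np

# Assumed, typical broadband extinction coefficients in magnitudes per airmass -
# real values should be measured for the site and the night in question.
k = {"R": 0.10, "G": 0.15, "B": 0.25}
airmass = 1.5                        # e.g. target at roughly 42 degrees altitude

def undo_extinction(linear_rgb, k, airmass):
    """Boost each channel by the flux the atmosphere removed: 10^(0.4 * k * X)."""
    gains = np.array([10 ** (0.4 * k[band] * airmass) for band in ("R", "G", "B")])
    return linear_rgb * gains        # blue gets boosted the most

# usage: corrected = undo_extinction(linear_stack, k, airmass) on wiped, linear data
```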

 


2 minutes ago, vlaiv said:

Atmospheric scattering will.

And my point is - what do we want to show in the astro image: the object as viewed from the Earth in the given imaging conditions, or the object as it is?

If we want to show the object as it is - without the filter imposed by our atmosphere - we need to do the inverse of what atmospheric scattering does to the color: we need to boost the blue part of the spectrum, i.e. apply a color correction to our image.

Once we do that we get the actual color of the nebulosity - which I believe is much more gray than depicted in the image above.

Why do we have white balance on our cameras? Because we want to see the "actual color of the object" (in this particular case, the color of the object under the most common lighting conditions - daytime lighting).

If you look at this image:

[attached image: white balance comparison - uncorrected with an orange cast vs corrected]

and you look at the left side, you would not say "oh, I'm happy with this image - look how nice this person's orange skin tone is!" You know that this is not the actual skin color, and you would correct the image with white balance.

In the above case, we might not know what the actual nebulosity color is, but we do know the filters that are at play, and if we want to find out and depict the "natural" color of the nebulosity, we need to color correct for what we know is happening - atmospheric scattering and possibly interstellar reddening.

 

Sorry, I updated my post while you were posting this.

My issue is what the "colour" of the light leaving the nebula is, not what colour I like. You propose it is due to our atmosphere; I think it could just be the nebula.

The way to tell would be to look at foreground G2V stars and see if they are changed by our atmosphere as you propose.

Regards Andrew 


1 minute ago, andrew s said:

Sorry, I updated my post while you were posting this.

My issue is what the "colour" of the light leaving the nebula is, not what colour I like. You propose it is due to our atmosphere; I think it could just be the nebula.

The way to tell would be to look at foreground G2V stars and see if they are changed by our atmosphere as you propose.

Regards Andrew 

Good point - and just what I would expect from a "color corrected" workflow: while the data is still in the linear stage, measure star RGB ratios and compare them with the expected linear RGB ratios.

I did that on several occasions, and I even did color calibration against a "ground-bound" color reference. In both cases I got a significant color shift towards the red part of the spectrum due to the atmosphere.

For example, here is a ground reference on Jupiter (which turned out very yellow):

 


2 hours ago, andrew s said:

Sorry, I updated my post while you were posting this.

My issue is what the "colour" of the light leaving the nebula is, not what colour I like. You propose it is due to our atmosphere; I think it could just be the nebula.

The way to tell would be to look at foreground G2V stars and see if they are changed by our atmosphere as you propose.

Regards Andrew 

Yes, is it not possible that the nebula is scattering light just like our atmosphere, so that it scatters the blue light in the direction of the stars behind it and mainly red light gets through in our direction? @vlaiv Looking closely at the nebula to the right in my LDN 1228 image (the most reddish one), there is a clear red glowing spot in it. Maybe there is some strong Ha object in there that lights it up with red light.

[attached image: crop of the LDN 1228 image showing the red glowing spot]

I also see it in other images of LDN1228, like this one:

 

 

Edited by gorann

10 minutes ago, gorann said:

Yes, is it not possible that the nebula is scattering light just like our atmosphere, so that it scatters the blue light in the direction of the stars behind it and mainly red light gets through in our direction?

I don't see why it would scatter blue light in the direction of the stars but not in our direction.

12 minutes ago, gorann said:

Maybe there is some strong Ha object in there that lights it up with red light.

This is a possibility.


1 minute ago, vlaiv said:

I don't see why it would scatter blue light in the direction of the stars but not in our direction.

 

Well, between us and the Sun and the universe is our atmosphere, and it scatters blue light away from us so we get more of the red. Is it then not analogous that the nebula dust is between us and the stars behind it, and that it scatters away the blue from us and lets through the red?


4 minutes ago, gorann said:

Well, between us and the Sun and the universe is our atmosphere, and it scatters blue light away from us so we get more of the red. Is it then not analogous that the nebula dust is between us and the stars behind it, and that it scatters away the blue from us and lets through the red?

A nebula will make stars behind it look redder - in the same way that our atmosphere makes our Sun look yellow/orange/red (depending on how much atmosphere the light passes through).

The color of the nebula itself, on the other hand, will behave like our atmosphere - blue - or rather, light from stars (not only those directly behind it, but in every direction) will be scattered off it, and the blue components will be scattered more than the red.

We have a G2V star illuminating our atmosphere - it is white, and the star turns yellow while the sky turns blue.

I linked above how frequent stars of each color are - about 96-97% of all stars have a yellow / orange / red tone to them, which means their light is not white. Once you take such light and scatter it, it will turn "whiter" - the same process that makes our sky blue by turning white light into blue.

This process of scattering moves light "to the right" in the following diagram:

[attached image: color temperature scale, red/orange on the left through white to blue on the right]

So if the original light is white - which is the middle of this diagram - our atmosphere will scatter that light and its color will move to the right - thus we end up with a blue sky.

If the original light is orange and the nebula scatters it, it will also move right - it can end up being pale yellow, white or very light blue. This is what I would expect from a nebula scattering the mostly orange light of the surrounding stars - to be on average "gray" (or white - same thing, different intensity).


I don't think a nebula will look blue in general. Clearly some do, as in the blue reflection nebulae, where they are illuminated by a hot blue star.

What you get depends on the specific type and geometry of illumination and the details of the size of the scattering particles.

Dark nebulae don't transmit in the visible and don't seem blue, as there are no hot stars close enough to light them up.

Similarly with the "brown" nebulae - there don't seem to be any hot stars to make them blue.

As you pointed out, most stars are red.

Regards Andrew


11 minutes ago, andrew s said:

Similarly with the "brown" nebulae - there don't seem to be any hot stars to make them blue.

But are those really "brown" nebulae - or simply gray nebulae, for the reasons I described above, that turn brown once their light passes through our atmosphere?


7 minutes ago, vlaiv said:

But are those really "brown" nebulae - or simply gray nebulae, for the reasons I described above, that turn brown once their light passes through our atmosphere?

Why would they be gray? Most stars are far from gray, and scattering cross sections similarly tend to be strongly wavelength dependent, so I find it hard to see how they would be gray. I might well be wrong though.

Only careful measurement could tell.

Regards Andrew 


17 minutes ago, andrew s said:

Why would they be gray? Most stars are far from gray, and scattering cross sections similarly tend to be strongly wavelength dependent, so I find it hard to see how they would be gray. I might well be wrong though.

Only careful measurement could tell.

Regards Andrew 

 

28 minutes ago, vlaiv said:

But are those really "brown" nebulae - or simply gray nebulae, for the reasons I described above, that turn brown once their light passes through our atmosphere?

If most stars, as you say Vlaiv, are yellow and very few blue, would we not then expect a lot of the nebulosity lit up by them to be brownish? You also say that very few are blue, and that fits with the fact that there are very few blue reflection nebulae out there, and the few there are often end up in our astrophotos.


16 minutes ago, andrew s said:

Why would they be gray? Most stars are far from gray, and scattering cross sections similarly tend to be strongly wavelength dependent, so I find it hard to see how they would be gray. I might well be wrong though.

Only careful measurement could tell.

Regards Andrew 

I agree that the best place to start would be measurement.

There is however reason to believe it could be gray(ish). Look at these:

[attached image: blackbody spectra for several temperatures]

6000 K is white. This is a power / energy curve - we need to convert it to a photon count curve, but it is clear that the "red" side of the spectrum is going to dominate (and it dominates for 4000 K and less anyway - even in this graph).

Now we take such light that is dominant in red - that has a higher photon count there - and do this with it:

[attached image: Rayleigh scattering diagram - blue light scattered more strongly than red]

We scatter more blue and less red.

We have more red and less blue to start with, and we scatter more blue and less red - my reasoning is that it will sort of balance out and leave us with about the same amount of blue and red in the scattered light - a sort of more × less = less × more situation.

Equal amounts of blue and red (and about the same amount of green, as it lies between these two extremes) will create a rather balanced RGB composition - a gray color, or at least grayish.
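The "more × less = less × more" intuition can be checked numerically, at least under a crude pure-Rayleigh assumption (real dust grains scatter with a weaker wavelength dependence). Take a 4000 K blackbody, convert to photon counts, weight by 1/λ^4, and compare the blue and red ends - they come out within a factor of about two of each other:

```python
import numpy as np

# 4000 K (orange-ish) blackbody photon spectrum, weighted by a Rayleigh-like
# 1/lambda^4 scattering efficiency. Pure Rayleigh is an assumption here.
h, c, kB = 6.626e-34, 3.0e8, 1.381e-23
T = 4000.0
lam = np.linspace(400e-9, 700e-9, 1000)              # visible range in metres

planck = (2 * h * c**2 / lam**5) / (np.exp(h * c / (lam * kB * T)) - 1)
photons = planck * lam / (h * c)                     # energy flux -> photon flux
scattered = photons * lam**-4                        # Rayleigh weighting

blue = scattered[lam < 500e-9].sum()                 # 400-500 nm band
red = scattered[lam >= 600e-9].sum()                 # 600-700 nm band
print(f"scattered blue / red photon ratio: {blue / red:.2f}")
```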


@vlaiv, I accept that it could be the case, but it would require a special set of circumstances.

Most stars are red dwarfs with ~3000 K photospheres and very little blue to start with; however, we could trade ideas forever without more data! 😉

Regards Andrew 

 


6 minutes ago, andrew s said:

@vlaiv, I accept that it could be the case, but it would require a special set of circumstances.

Most stars are red dwarfs with ~3000 K photospheres and very little blue to start with; however, we could trade ideas forever without more data! 😉

Regards Andrew 

 

Well - we have a data set to work with, right?

One just needs to measure a reference star in the image (or a few) and measure the light intensity in each channel in the nebula - on linear, wiped data.

This will tell us what sort of color we can expect for that nebula.

Here is a handy website for star temperature to color conversion:

http://www.brucelindbloom.com/index.html?ColorCalculator.html

One just needs to enter the star color temperature and these settings:

[attached image: Bruce Lindbloom color calculator settings]

to get the linear RGB ratio (make sure you select D65 for sRGB and gamma 1.0 to keep it linear).
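If you prefer to script it, the same temperature-to-linear-sRGB conversion can be approximated in Python, assuming the colour-science package is available and approximating the star as a blackbody; the hard-coded matrix is the standard XYZ → linear sRGB (D65) matrix, so no gamma is applied:

```python
import numpy as np
import colour  # the colour-science package - assumed available

T = 5570  # K, e.g. the reference star's effective temperature

sd = colour.sd_blackbody(T)        # blackbody spectral distribution for T
XYZ = colour.sd_to_XYZ(sd)
XYZ = XYZ / XYZ[1]                 # normalise luminance to Y = 1

# Standard XYZ -> linear sRGB (D65) matrix; skipping the transfer function
# keeps the result linear, equivalent to "gamma 1.0" in the calculator.
M = np.array([[ 3.2406, -1.5372, -0.4986],
              [-0.9689,  1.8758,  0.0415],
              [ 0.0557, -0.2040,  1.0570]])
rgb_linear = M @ XYZ
print(rgb_linear / rgb_linear[1])  # R : G : B ratio with G normalised to 1
```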


1 hour ago, vlaiv said:

Well - we have a data set to work with, right?

One just needs to measure a reference star in the image (or a few) and measure the light intensity in each channel in the nebula - on linear, wiped data.

This will tell us what sort of color we can expect for that nebula.

Here is a handy website for star temperature to color conversion:

http://www.brucelindbloom.com/index.html?ColorCalculator.html

One just needs to enter the star color temperature and these settings:

[attached image: Bruce Lindbloom color calculator settings]

to get the linear RGB ratio (make sure you select D65 for sRGB and gamma 1.0 to keep it linear).

Vlaiv, here is the stacked file of my data. I would be pleased if you could use it to determine the true colour of the nebula.

20200826-27 LDN1228 3days.tif


Well, it's not gray :D

[attached image: computed nebulosity color swatches]

Supposedly it's the shades above, if my measurements and math are correct.

The reference star used was:

TYC 4590-1433-1

at RA 20:54:22.679, Dec +78:07:05.94

Gaia DR2 quotes this star as having an effective temperature of 5570 K. The corresponding linear sRGB triplet is:

1.135763
0.972170
0.875711

Photometric measurement of the star gives:

68281.670806
84433.395310
96729.032408

and the measured nebulosity linear RGB values are:

18.910516739
14.435421944
6.550098896

The resulting nebulosity RGB values are:

3.14548
1.6621
0.593
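For anyone following along, the arithmetic appears to be a per-channel rescaling against the reference star (the factor of 10^4 is just an arbitrary overall brightness normalisation):

```python
# Per channel: nebula_true ~ (nebula_measured / star_measured) * star_expected
star_expected = [1.135763, 0.972170, 0.875711]          # 5570 K, linear sRGB
star_measured = [68281.670806, 84433.395310, 96729.032408]
neb_measured  = [18.910516739, 14.435421944, 6.550098896]

neb_rgb = [n / s * e * 1e4                              # 1e4 = arbitrary scaling
           for n, s, e in zip(neb_measured, star_measured, star_expected)]
print(neb_rgb)   # ~ [3.1455, 1.6621, 0.5930]
```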

However, I'm rather skeptical about these results, for two reasons:

1. The data is not calibrated (at least no flat field has been applied) - this makes background removal very difficult and probably not precise.

2. The data is 16-bit. There are 6 hours of data in 4-minute exposures - that is 90 subs stacked, and stacking N subs adds about log2(N) bits of precision, so log2(90) ≈ 6.5 bits on top of the camera's bit depth of around 14 at gain 100 - roughly 20 bits in total. About 4 bits of precision are therefore lost to the 16-bit format.

 

 

 


On 27/08/2020 at 17:05, gorann said:

 

With regard to capturing photons, 6 hours at f/2 should equal 70 hours at f/7, if I got it right. Someone may correct me like @ollypenrice or @vlaiv......

Well, let's say you use this scope at F2...

[attached image: a tiny 1.5 cm Newtonian being used by Al Nagler]

It's probably the smallest Newtonian in the world, at 1.5 cm aperture, made by my friend Ralf Ottow and being used here by Al Nagler. Are you suggesting that with this instrument and a reducer taking it to F2 you would catch your present image in 6 hours, when it would take me 70 hours in my TEC140?

I really don't think you are asking that question!!

Super image. Red, less red? Still a super image. If in doubt, trust Maurice Toet.

😁lly

 

 

 

Edited by ollypenrice
False click.

1 hour ago, ollypenrice said:

Well, let's say you use this scope at F2...

[attached image: a tiny 1.5 cm Newtonian being used by Al Nagler]

It's probably the smallest Newtonian in the world, at 1.5 cm aperture, made by my friend Ralf Ottow and being used here by Al Nagler. Are you suggesting that with this instrument and a reducer taking it to F2 you would catch your present image in 6 hours, when it would take me 70 hours in my TEC140?

I really don't think you are asking that question!!

Super image. Red, less red? Still a super image. If in doubt, trust Maurice Toet.

😁lly

 

 

 

Thanks Olly, but what are you looking at, in daytime on a bar stool?

Edited by gorann

22 hours ago, gorann said:

With regard to capturing photons, 6 hours at f/2 should equal 70 hours at f/7, if I got it right. Someone may correct me like @ollypenrice or @vlaiv......

This one was too tempting to let go unanswered. I think I have gotten myself into the mode of "the f-ratio has no meaning for photon collection". I have an 11" RASA f/2 and a 12" RC f/8, 620 mm and 2400 mm. The number of photons is purely a matter of aperture. The number of photons per pixel is a matter of the camera and the aperture, and is ultimately the only factor that makes sense. A 1 m mirror with a camera that has a resolution of 0.52"/pixel collects more photons per pixel than my 12" at 0.52"/pixel. Notice how they will have the same resolution, and if the cameras have the same number of pixels, they will frame the same image.

Conversely, if your RASA 8 has a camera at 0.3"/pixel versus a 71 mm refractor rig with a 2.5"/pixel camera, you're not saving all that time. You will have much higher resolution, but the SNR won't be improved 12 times. However, make both 2.5"/pixel and yours will have a 12 times higher SNR.
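To put a rough number on the aperture argument above (ignoring central obstruction, optical losses and QE differences): with the angular pixel scale matched, each pixel sees the same patch of sky, so photons per pixel simply scale with collecting area:

```python
# 1 m mirror vs ~12-inch (305 mm), both sampled at the same arcsec/pixel.
D1_mm, D2_mm = 1000.0, 305.0
photon_ratio = (D1_mm / D2_mm) ** 2   # photons per pixel scale with aperture area
print(f"~{photon_ratio:.1f}x more photons per pixel")   # roughly 11x
```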

