
A Galaxy and a green blob


alan4908


The galaxy NGC3938 is located in the Ursa Major Cluster, approximately 60 million light years from Earth. It has a large number of H II regions, showing as pinkish patches in the LRGB image below. I've also shown an annotated image that identifies some of the background galaxies in the frame. The galaxy doesn't seem to be imaged very often, perhaps because it is relatively small, with an apparent size of 5.4 x 4.9 arcminutes. The image below represents just over 13 hours of integration time and was taken with my Esprit 150.

What is somewhat intriguing is the green(ish) blob I've highlighted below. Since you don't often encounter green objects in deep space, at first I thought it might be a processing artifact, but I checked a higher-resolution image from Adam Block and sure enough it is there, looking even greener. The only thing I could find at this location is the NED object SSTSL2 J115300.08+440700.8; however, I'm not entirely sure what this object is - can anyone assist?

Alan

NGC3938

[image]

 

NGC3938 (annotated)

[image]

 

Mysterious green blob

[image]

 

LIGHTS: L 32, R 11, G 21, B 15 x 600s; DARKS: 30; BIAS: 100; FLATS: 40; all at -20°C.


Could it be that the green thing is an artifact of some sort? I mean, the object is real, but the color...

Here is an image that I found:

[image]

Something similar happened to me when I imaged M101 - I also had a bluish-green blob:

[image]

But in some other images it is not green :D

[image]

[image]


4 hours ago, vlaiv said:

Could it be that the green thing is an artifact of some sort? I mean, the object is real, but the color...

Hi Vlaiv

Yes, the colour is definitely interesting. 

Here's the image from Adam Block, taken through a 20-inch RC at the top of a mountain. I think he won an APOD for this, so I'm assuming that the colours are accurate.

I've used RegiStar to align it with my own attempt and marked the location of the green blob. As you can see, it's very green. :hello:

[image]

3 hours ago, ollypenrice said:

Check it out for supernovae. I had a weird blue-green star in an M101 image which turned out to be a supernova remnant!

Yes - that is a thought. 


14 hours ago, alan4908 said:

Hi Vlaiv

Yes, the colour is definitely interesting. 

Here's the image from Adam Block, taken through a 20-inch RC at the top of a mountain. I think he won an APOD for this, so I'm assuming that the colours are accurate.

I've used RegiStar to align it with my own attempt and marked the location of the green blob. As you can see, it's very green. :hello:

 

 

I would not assume that because an image is published as a NASA APOD the colors are "accurate." The astronomers (Nemiroff and Bonnell) do not check for this, though they are as knowledgeable as anyone, and our active community would spot the weird stuff. I do not believe this image was published as an APOD... but I could be wrong. A *better* assumption is that I took great care over the fidelity of the details and color in the images I publish. 😁

Indeed, small green blobs that look like HII regions are, though uncommon, not really rare. It is all too easy to blindly "remove green" (or nowadays SCNR it out) and miss out on some interesting astrophysics! This is where examining the data closely and letting it be is a good skill (one you have shown with your image!).
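To see why blind green removal is dangerous here, consider a minimal sketch of a green-reduction step in the spirit of SCNR's average-neutral mode (a paraphrase in Python, not PixInsight's actual implementation): green is clamped towards the mean of red and blue, so a genuinely green emission region simply vanishes.

```python
import numpy as np

def scnr_average_neutral(rgb, amount=1.0):
    """Green reduction in the spirit of SCNR 'average neutral':
    clamp G towards the mean of R and B."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    g_clamped = np.minimum(g, (r + b) / 2.0)
    out = rgb.copy()
    out[..., 1] = (1.0 - amount) * g + amount * g_clamped
    return out

# A pixel with a real green excess, like the blob here, loses it entirely:
blob = np.array([[[0.0066, 0.0071, 0.0068]]])
print(scnr_average_neutral(blob))   # green drops to (R + B) / 2 = 0.0067
```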

 

Another good example of these green blob things is this image:

http://www.caelumobservatory.com/gallery/n6240.shtml

It happens to be in this galaxy that I discovered my own supernova... but it was not related to these blobs!

 

-Adam Block


1 hour ago, ngc1535 said:

I would not assume that because an image is published as a NASA APOD the colors are "accurate." The astronomers (Nemiroff and Bonnell) do not check for this, though they are as knowledgeable as anyone, and our active community would spot the weird stuff. I do not believe this image was published as an APOD... but I could be wrong. A *better* assumption is that I took great care over the fidelity of the details and color in the images I publish. 😁

Indeed, small green blobs that look like HII regions are, though uncommon, not really rare. It is all too easy to blindly "remove green" (or nowadays SCNR it out) and miss out on some interesting astrophysics! This is where examining the data closely and letting it be is a good skill (one you have shown with your image!).

 

Another good example of these green blob things is this image:

http://www.caelumobservatory.com/gallery/n6240.shtml

It happens to be in this galaxy that I discovered my own supernova... but it was not related to these blobs!

Thanks for the response, Adam, and for highlighting more green blobs... However, I'm still somewhat puzzled about what these green blobs actually are.

Alan


I have an idea.

Looking at the image can be misleading: we don't know what sort of color processing has been done to it, whether color management was involved, or whether the saturation was boosted.

A simple intensity measurement in the R, G and B channels of the recorded raw data, together with the camera specification and filter types, would be a good starting point for seeing what sort of color the object actually has, and for trying to determine something about the spectrum of the light it gives off.

That would give us some idea of what the object might be, or at least a clue about the nature of the light it is emitting.
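As a sketch of the kind of measurement I mean (the file names and pixel coordinates below are placeholders, and any FITS-capable tool would do), one could take a background-subtracted mean around the blob in each registered channel master:

```python
import numpy as np
from astropy.io import fits

def box_mean(path, x, y, r=5):
    """Mean pixel value in a (2r+1) x (2r+1) box centred on (x, y)."""
    data = fits.getdata(path).astype(float)
    return data[y - r:y + r + 1, x - r:x + r + 1].mean()

x, y = 512, 384      # blob position (placeholder coordinates)
bx, by = 560, 384    # nearby empty sky, for the background estimate
for name in ("master_R.fits", "master_G.fits", "master_B.fits"):
    signal = box_mean(name, x, y) - box_mean(name, bx, by)
    print(name, signal)
```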


In my early days of imaging I captured a set of frames for M33 and asked fellow astronomers at my society what was causing the two green areas in the image. In the end someone decided it was probably the ZWO filters used with my ASI1600MM camera. I still don't understand why it appears in two localised areas while the rest of the image colour seemed OK, by my standards.

It was processed in Nebulosity 4 with a minor stretch and a curves (contrast) adjustment, then saved as a JPEG.

 

[image]


8 hours ago, vlaiv said:

I have an idea.

Looking at the image can be misleading: we don't know what sort of color processing has been done to it, whether color management was involved, or whether the saturation was boosted.

A simple intensity measurement in the R, G and B channels of the recorded raw data, together with the camera specification and filter types, would be a good starting point for seeing what sort of color the object actually has, and for trying to determine something about the spectrum of the light it gives off.

That would give us some idea of what the object might be, or at least a clue about the nature of the light it is emitting.

If I look at the data at the stages after PixInsight's DBE (image linear), Photometric Colour Calibration (image linear) and the final LRGB (non-linear), I get these RGB values:

      DBE       PCC       Final
R   0.00664   0.00679   0.40200
G   0.00707   0.00727   0.64300
B   0.00683   0.00679   0.62000

RGB ratios with R normalized to unity:

      DBE    PCC    Final
R    1.00   1.00    1.00
G    1.06   1.07    1.60
B    1.03   1.00    1.54

So this particular green blob has quite a peaky spectrum relative to the red.

1 hour ago, Xsubmariner said:

In my early days of imaging I captured a set of frames for M33 and asked fellow astronomers at my society what was causing the two green areas in the image. In the end someone decided it was probably the ZWO filters used with my ASI1600MM camera. I still don't understand why it appears in two localised areas while the rest of the image colour seemed OK, by my standards.

It was processed in Nebulosity 4 with a minor stretch and a curves (contrast) adjustment, then saved as a JPEG.

Hmm, interesting - I don't recall seeing anything green in my M33 images, but I shall go back and check!

Alan


I would be interested to understand what caused the green in my image. As I have just started learning PixInsight, it would be great to see what an experienced user could do with the data.

The data consists of raw FITS files (L 180s x9; R 300s x6; G 300s x6; B 300s x6) together with merged FITS masters (master bias; master darks for 180s and 300s; master L/R/G/B flats), coming to a large 1.03 GB folder. How do people manage the exchange of this much data? Any advice?


7 hours ago, alan4908 said:
RGB ratios with R normalized to unity:

      DBE    PCC    Final
R    1.00   1.00    1.00
G    1.06   1.07    1.60
B    1.03   1.00    1.54

So this particular green blob has quite a peaky spectrum relative to the red.

Do you have any idea what "units" PCC is in? Or rather, what color space it is in?

I wonder how almost equal values in red, green and blue (green being only 7% higher, and red and blue equal) can suddenly change to red being roughly a third less than both G and B?


The data was captured soon after I started imaging, so it's likely the spectrum was affected by something I did in Nebulosity 4. I will go back to the data and reprocess with basic functions to see what happens. I have just started to learn PixInsight and will now use this as a test piece. Thanks for your comments - they are greatly appreciated.


7 hours ago, vlaiv said:

Do you have any idea what "units" PCC is in? Or rather, what color space it is in?

Here's some documentation for PixInsight's Photometric Color Calibration tool which might help answer your question: https://pixinsight.com/tutorials/PCC/index.html

7 hours ago, vlaiv said:

I wonder how almost equal values in red, green and blue (green being only 7% higher, and red and blue equal) can suddenly change to red being roughly a third less than both G and B?

I presumed this was a consequence of the various non-linear operations performed between the linear state and the final processed state.

Alan


2 hours ago, alan4908 said:

Here's some documentation for PixInsight's Photometric Color Calibration tool which might help answer your question: https://pixinsight.com/tutorials/PCC/index.html

I'm having difficulty understanding their explanation; some of the things they've written don't make much sense to me. I'll outline what confuses me - maybe someone will understand and can explain what they meant, so we can get to the bottom of this green color in the image.

Quote

PCC is a very special tool for several reasons. Besides the quality of our implementation, what really makes PCC unique is the fact that it materializes our philosophy of color in deep-sky astrophotography, and primarily, our philosophy of image processing: astrophotography is documentary photography, where there is no room for arbitrary manipulations without the basic documentary criterion of preserving the nature of the objects represented.

So far, so good - I agree completely. If you want to document the true color of the object, you can, and like them I object to the notion that "color is arbitrary" in astrophotography. It can be arbitrary - but only by choice.

Quote

Following documentary criteria, such representation must be justified by properties of the objects photographed. This excludes, in our opinion, the classical concept of "natural color" based on the characteristics of the human vision, as applied to daylight scenes.

The emphasis on the last sentence is mine, as it is the root of my misunderstanding of what they are saying. They say they want to exclude the human-vision component - that is OK: the light reaching the sensor is a physical thing, and as with anything in nature that we measure, we should exclude our subjective sense of it.

Quote

The goal of PCC is to apply an absolute white balance to the image.

This is where things go south... The introduction talks about measurement and about excluding human vision, and yet what the tool itself does is described as white balance, and most of the document is about choosing a reference white-balance value.

Here is the thing - white balance is directly tied to human vision and perception. An absolute color space like CIE XYZ does not have a white balance. It does not need one. White balance defines how the color of a particular object would be perceived by an observer under a certain illuminant. Our brain is a funny thing: in an environment with no pure white color, we pick the closest color and that becomes the white reference for our brain. All other colors in the scene are perceived "shifted" in hue to match that white point. The brain does a bit of color balancing of its own - although the spectrum is the same, we perceive the color as being different.

Astronomical images don't need white balancing in this sense. White balance belongs to daytime photography: we balance colors to convert our perception from the environment the image was taken in to the environment the image is viewed in. This is why cameras have presets like sunny, cloudy, incandescent, fluorescent and so on - they tell the camera what the illumination was like, so it can convert to "standard" viewing conditions.

In astronomy we don't have an illuminant - we have sources of light, and those don't depend on whether it is a sunny or a cloudy day or whether we are using artificial illumination. No white balance is necessary or wanted.

What we actually want, in order to produce what we colloquially call a color-balanced image, is a color space transformation: from the raw tristimulus values produced by the camera sensor to the tristimulus values of some standard color space. One can choose to convert either to CIE XYZ or to linear sRGB, since there is a well-known linear transform matrix between the two. For the final color displayed on a computer screen we then apply the standard sRGB gamma correction - and voilà, we get true color, or rather the closest representation of that particular color the screen can show.
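As a small sketch of the display end of that chain - the standard D65 XYZ-to-linear-sRGB matrix followed by the sRGB transfer function (the calibrated XYZ input is assumed to come from the color space transformation described above):

```python
import numpy as np

# Standard CIE XYZ (D65) -> linear sRGB matrix.
M = np.array([[ 3.2406, -1.5372, -0.4986],
              [-0.9689,  1.8758,  0.0415],
              [ 0.0557, -0.2040,  1.0570]])

def xyz_to_srgb(xyz):
    rgb = M @ np.asarray(xyz, dtype=float)   # linear sRGB
    rgb = np.clip(rgb, 0.0, 1.0)             # out-of-gamut values get clipped
    # Standard sRGB gamma (transfer) function:
    return np.where(rgb <= 0.0031308,
                    12.92 * rgb,
                    1.055 * rgb ** (1.0 / 2.4) - 0.055)

print(xyz_to_srgb([0.9505, 1.0000, 1.0890]))   # D65 white -> ~(1, 1, 1)
```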

If we don't want to go as far as display, we can stop at the CIE XYZ value - that describes the color well enough and is standardized - or we can represent the color in some other "color space", such as BVR from UBVRI, where we would use the BVR filter responses as matching functions instead of the XYZ matching functions of the CIE XYZ color space.

What we should not do is take the tristimulus values we have, arbitrarily assign them to RGB, and then wonder why such an RGB triplet is green when displayed on the screen.

Back to the actual color of that thing:

We have (R_raw, G_raw, B_raw) = (1, 1.06, 1.03), and this is our starting point. We have a camera - I suppose it is a Starlight Xpress Trius SX-814 - and Astrodon RGB filters that produced these values?


On 22/02/2020 at 11:32, vlaiv said:

I'm having difficulty understanding their explanation; some of the things they've written don't make much sense to me. I'll outline what confuses me - maybe someone will understand and can explain what they meant, so we can get to the bottom of this green color in the image.

So far, so good - I agree completely. If you want to document the true color of the object, you can, and like them I object to the notion that "color is arbitrary" in astrophotography. It can be arbitrary - but only by choice.

The emphasis on the last sentence is mine, as it is the root of my misunderstanding of what they are saying. They say they want to exclude the human-vision component - that is OK: the light reaching the sensor is a physical thing, and as with anything in nature that we measure, we should exclude our subjective sense of it.

This is where things go south... The introduction talks about measurement and about excluding human vision, and yet what the tool itself does is described as white balance, and most of the document is about choosing a reference white-balance value.

Here is the thing - white balance is directly tied to human vision and perception. An absolute color space like CIE XYZ does not have a white balance. It does not need one. White balance defines how the color of a particular object would be perceived by an observer under a certain illuminant. Our brain is a funny thing: in an environment with no pure white color, we pick the closest color and that becomes the white reference for our brain. All other colors in the scene are perceived "shifted" in hue to match that white point. The brain does a bit of color balancing of its own - although the spectrum is the same, we perceive the color as being different.

Astronomical images don't need white balancing in this sense. White balance belongs to daytime photography: we balance colors to convert our perception from the environment the image was taken in to the environment the image is viewed in. This is why cameras have presets like sunny, cloudy, incandescent, fluorescent and so on - they tell the camera what the illumination was like, so it can convert to "standard" viewing conditions.

In astronomy we don't have an illuminant - we have sources of light, and those don't depend on whether it is a sunny or a cloudy day or whether we are using artificial illumination. No white balance is necessary or wanted.

What we actually want, in order to produce what we colloquially call a color-balanced image, is a color space transformation: from the raw tristimulus values produced by the camera sensor to the tristimulus values of some standard color space. One can choose to convert either to CIE XYZ or to linear sRGB, since there is a well-known linear transform matrix between the two. For the final color displayed on a computer screen we then apply the standard sRGB gamma correction - and voilà, we get true color, or rather the closest representation of that particular color the screen can show.

If we don't want to go as far as display, we can stop at the CIE XYZ value - that describes the color well enough and is standardized - or we can represent the color in some other "color space", such as BVR from UBVRI, where we would use the BVR filter responses as matching functions instead of the XYZ matching functions of the CIE XYZ color space.

What we should not do is take the tristimulus values we have, arbitrarily assign them to RGB, and then wonder why such an RGB triplet is green when displayed on the screen.

Back to the actual color of that thing:

We have (R_raw, G_raw, B_raw) = (1, 1.06, 1.03), and this is our starting point. We have a camera - I suppose it is a Starlight Xpress Trius SX-814 - and Astrodon RGB filters that produced these values?

Hi Vlad

I thought you might be interested in seeing the PixInsight screenshot (below), which I took with the cursor hovering over part of the green blob, located at x = 51, y = 40. The image is after I've performed DBE but before any PCC.

This shows what happens to the linear RGB values as they are put through the non-linear stretching function (called Histogram Transformation in PixInsight). So, for example, the red value is 0.0066 before the stretch and 0.48 after it.

If you make a table of the values you get:

      DBE (linear)   DBE (non-linear)
R        0.0066           0.4800
G        0.0071           0.5678
B        0.0068           0.5221

With R normalized to unity:

      DBE (linear)   DBE (non-linear)
R         1.00             1.00
G         1.08             1.18
B         1.03             1.09

So at the linear stage the green is 8% above the red, while the non-linear stretch takes the green component to 18% above the red. At this point it looks green. This is without any increase in saturation, application of PCC, etc.

In subsequent processing I would boost the saturation, perform noise reduction and so on to create the final image; these steps further increase the differences between the values (in my case the final image has a green component that is 60% higher than the red).

My point is that even very small percentage differences at the linear stage can lead to very large percentage differences at the non-linear stage.
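To see the mechanism at work, here is a toy version of such a stretch - a shadows clip followed by the standard midtones transfer function (the clip point and midtones balance below are invented for illustration, not my actual settings):

```python
def stretch(x, c=0.003, m=0.01):
    """Shadows clip at c, then the midtones transfer function with balance m."""
    x = max((x - c) / (1.0 - c), 0.0)
    return ((m - 1.0) * x) / ((2.0 * m - 1.0) * x - m)

lin = {"R": 0.0066, "G": 0.0071, "B": 0.0068}   # linear values after DBE
out = {k: stretch(v) for k, v in lin.items()}
for k in "RGB":
    print(k, round(lin[k] / lin["R"], 2), round(out[k] / out["R"], 2))
# R 1.0 1.0 / G 1.08 1.1 / B 1.03 1.04: the shadows clip expands the channel
# ratios, the midtones curve compresses them, and the net effect depends on
# the parameters chosen.
```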

Alan

Screenshot:

[image: PixInsight screenshot of the green blob pixel readout]


55 minutes ago, alan4908 said:

My point is that even very small percentage differences at the linear stage can lead to very large percentage differences at the non-linear stage.

That's quite right, though it can go the other way as well. I think it depends on the gradient of the curve at particular points: a steep section of the curve will make a difference larger (in percentage terms) than it really is, while a shallow gradient will make it smaller, desaturating the color - which is what happens to very bright things when the curve has the common shape used for stretching.

In my post above I wanted to say that the green color in the final image might:

1. not be green at all, or

2. be a quite mundane occurrence in the universe.

Consider the following. I'm not sure exactly which filters and camera you used for the image above, so I'll go by your signature: Astrodon RGB filters and an SX Trius 814 camera.

Let's consider a very simple nebula with dominant Ha/OIII emission - an M57-type object, or a much larger molecular cloud complex like M42. The point is that such a source emits Ha, Hb and OIII lines as the dominant light in its spectrum, because it contains lots of hydrogen and oxygen gas.

Now, the first obstacle is finding the exact QE curve for the Sony ICX814 sensor - an internet search gives different results, but let's go with this one:

[image: Sony ICX814 QE curve]

and these filters (let's go with Gen2):

[image: Astrodon Gen2 filter transmission curves]

If you look at the relevant lines - 656nm, 495/500nm and 486nm - Ha will go into red, OIII will be split between green and blue, and Hb (plus H-gamma and all the other significant hydrogen lines) will be blue.

The QE of the sensor is roughly the same at these lines, with small variations, so it is easy to see that an object consisting of only two gases and three spectral lines could produce:

R: 1.0, G: 1.08, B: 1.03
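To put toy numbers on that (the relative line fluxes and the OIII green/blue split below are invented purely to show that ordinary Ha/Hb/OIII proportions can land on these ratios):

```python
# Invented relative line fluxes for an Ha/OIII/Hb source, with the sensor QE
# taken as equal at all three wavelengths and lines assigned to filters as
# above: Ha -> R, OIII split between G and B, Hb -> B.
f_ha, f_oiii, f_hb = 1.00, 1.70, 0.41
oiii_green_share = 0.635   # OIII sits near the G/B filter crossover

R = f_ha
G = f_oiii * oiii_green_share
B = f_oiii * (1.0 - oiii_green_share) + f_hb
print(R, round(G, 2), round(B, 2))   # -> 1.0 1.08 1.03
```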

But what color should such light have? In the image it appears green - is that to be expected?

[image: CIE 1931 xy chromaticity diagram]

This is the CIE xy chromaticity diagram. On its outer edge, every point represents a single-wavelength source, so our three lines lie on that edge. Planckian (thermal) sources lie on the so-called Planckian locus, also shown on the diagram as a line running through the red/orange/yellow region and ending in blue, with temperatures in kelvin marked along it.

In this diagram, if you have three light sources with known coordinates, all the colors we can see that are mixtures of those three sources lie in the triangle defined by the three points (this holds for any two points - all colors made from a combination of two sources lie on the line connecting them - and extends to three points as well).

From this we can see that a simple Ha (plus Hb, H-gamma and so on - all the hydrogen lines in the visible part of the spectrum) / OIII mix can produce red, orange, white, bluish and greenish colors to the human eye :D
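That claim is easy to check with a barycentric point-in-triangle test. The chromaticity coordinates below are approximate spectral-locus values for the three lines (rounded from standard CIE 1931 tables, so treat them as assumptions), and D65 white does indeed fall inside the triangle:

```python
def barycentric(p, a, b, c):
    """Barycentric coordinates of point p in triangle (a, b, c)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    den = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    u = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / den
    v = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / den
    return u, v, 1.0 - u - v

# Approximate CIE 1931 xy coordinates (rounded, assumed values):
ha   = (0.73, 0.27)       # H-alpha, 656 nm
oiii = (0.01, 0.54)       # OIII, ~500 nm
hb   = (0.07, 0.20)       # H-beta, 486 nm
d65  = (0.3127, 0.3290)   # D65 white point

u, v, w = barycentric(d65, ha, oiii, hb)
print(all(t >= 0 for t in (u, v, w)))   # True: white lies inside the triangle
```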

We can't get pure grass green, nor deep blues or deep yellows, from this combination. By the way, the real colors in this diagram are more saturated towards the outer edge than a monitor can display - a rainbow has more saturated colors than this. In fact, a computer monitor will most likely only display this region:

[image: sRGB gamut triangle on the chromaticity diagram]

All other colors are beyond the gamut of an sRGB display and can't be reproduced. The red, green and blue pixels that monitors use have the coordinates of that triangle's corners - hence all the colors a monitor can produce lie inside it (like the region I've marked above).

So if we want to show which colors can be produced by the Ha/OIII combination and also displayed on our monitor as distinct colors, we need the intersection of the two triangles, like this:

[image: intersection of the Ha/OIII mixing triangle with the sRGB gamut]

So in principle an Ha/OIII region can even be turquoise / dirty aquamarine in our RGB images if we process the data correctly (by the way, this is where my color knowledge breaks down - once I'm expected to assign a common name to a particular hue :D ).

 

     
     
     
