
How do I get the right colours of nebulae in imaging, editing etc.?



I've been processing my images from January and managed to capture a lot: California, Orion, Horsehead, Monkey Head, Bubble, etc. Some of them come out great and pretty close to the colour I see in Stellarium, but some don't. For example the Monkey Head: all the pictures I've seen of it are blue with orange/yellow around the edge, which is odd, because for me it's just red and a slightly lighter red. Meanwhile I captured Orion the same as everyone else does, and I even tried the Jellyfish, which came out brown as it should. Do I need special filters for those, or is it all in the editing? I'm using an Optolong L-Pro, I think; I don't use filter wheels etc., as I don't find them worth the money living in a Bortle 8 zone. I've heard that certain nebulae produce certain gases or something - I don't quite understand it all. Oh, and I did capture the blue in the Christmas Tree Cluster and the Pleiades.


Two things are happening here.

First, people use what is known as the HST palette (or SHO) for narrowband images of some targets - or perhaps a bi-color palette.

When you shoot narrowband images you get a monochromatic image per element (such as Ha, OIII and SII). Both Ha and SII are deep red and would be indistinguishable in a true-color image, so they are assigned different base colors, and what you see is a combination of those colors.

[attached image]

Here is a classic example of the HST/SHO palette, where hydrogen is mapped to green, oxygen to blue and sulfur to red (SHO = SII, Ha, OIII assigned to R, G, B); there is a rough sketch of this mapping at the end of this post.

The other thing that might confuse you is the "artistic license" people give themselves in processing, which often leads to generally inaccurate colors of celestial objects.

The Monkey Head image that you saw is probably either a bi-color composition (Ha + OIII) or an inaccurate use of the SHO/HST type palette, where people try to do SHO but end up "killing green" for some reason - probably "because it looks better" that way.
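To make that mapping concrete, here is a minimal Python/numpy sketch of an SHO combine. The small random arrays only stand in for real stacked SII, Ha and OIII masters, and the normalisation is purely illustrative:

import numpy as np

# Stand-ins for calibrated, stacked narrowband masters (values 0-1);
# in practice these would be loaded from your stacking software.
sii = np.random.rand(100, 100).astype(np.float32)
ha = np.random.rand(100, 100).astype(np.float32)
oiii = np.random.rand(100, 100).astype(np.float32)

# Classic HST/SHO mapping: SII -> red, Ha -> green, OIII -> blue.
sho = np.dstack([sii, ha, oiii])

# Crude per-channel normalisation so the false-colour result spans 0-1;
# real processing would apply a proper non-linear stretch instead.
sho /= sho.max(axis=(0, 1), keepdims=True)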


I think what you mean is that you're seeing images captured in narrowband (sulphur II, hydrogen alpha, oxygen III - SHO, the Hubble palette), which come out blue/green/yellow like my profile pic.

In colour, most targets are typically red, as hydrogen is the most prevalent gas in emission nebulae; sulphur is also close to red, and oxygen is blue.

You need those specific filters to capture those band passes and then assign them in software to represent R, G and B. Most of the time such captures are done with mono cameras, as every pixel is utilised; it's possible with colour cameras, but the quality won't be quite as good (though very close). With colour cameras you have filters like the Optolong L-eNhance, L-eXtreme and L-Ultimate, which cut out most light other than that of hydrogen and oxygen (so they're useful in light-polluted zones), but the image will still be in RGB colour. Software trickery can also get you the SHO palette from colour data (see the sketch at the end of this post).

Note the L-Pro filter is a general light pollution filter and will cut out a lot of signal on broadband targets like the Pleiades or galaxies.

Also, in light-polluted zones, imaging narrowband is far better than broadband, as you're blocking out everything you don't want (light pollution) and keeping the narrowband signal you do want.
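As a rough illustration of that "software trickery": with an OSC camera behind a dual-band (Ha + OIII) filter, the Ha signal lands almost entirely in the red channel and the OIII signal in green and blue, so an HOO-style composite can be pulled out along these lines. This is only a sketch, with a random array standing in for a real debayered stack:

import numpy as np

# Stand-in for a debayered, stacked OSC image taken through a dual-band
# (Ha + OIII) filter: shape (height, width, 3) in RGB order, values 0-1.
rgb = np.random.rand(100, 100, 3).astype(np.float32)

# Ha (656 nm) is captured by the red pixels; OIII (~500 nm) falls on
# the green and blue pixels, so average those two channels.
ha = rgb[..., 0]
oiii = 0.5 * (rgb[..., 1] + rgb[..., 2])

# HOO palette: Ha -> red, OIII -> both green and blue.
hoo = np.dstack([ha, oiii, oiii])
hoo /= hoo.max()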


8 minutes ago, Quetzalcoatl72 said:

Some of them come out great and pretty close to the colour I see in Stellarium, but some don't. For example the Monkey Head: all the pictures I've seen of it are blue with orange/yellow around the edge, which is odd, because for me it's just red and a slightly lighter red.

You don't say which camera you use. I assume it's a stock DSLR or a dedicated astro colour camera. They present the images as-is, meaning they show you what you would see if your own eyes were super-sensitive, or as close to that as possible. Trust that colour rendition and use it as a baseline. Make subtle changes in hue and chroma, but don't be tempted to make them anything other than what they are. Use the background as a dipstick when you fine-tune the colours. Different things give different colour deviations: light pollution often gives an overall white/yellow tint, while moonlight most often gives a blue tint. If you have astromodded your DSLR, you will have a red tint. Learn to deal with it in post-processing before you start adding filters to your light train.

Many of the images you see around are shot in mono and blended and composed with false colours in post-processing. Some are true to the original colours and try to get the composed image as near the original colours as possible, while others process their images in a more "artistic" way, to get dramatic effects or to conform to a specific palette - e.g. the Hubble palette, which is almost a standard today and not even close to the real colours.
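One simple way to use the background as that "dipstick" is to measure the median of each channel in a star-free patch of sky and scale the channels so the sky comes out neutral. Below is a minimal sketch of the idea, not any particular program's routine; the image and patch coordinates are placeholders:

import numpy as np

# Stand-in for a stretched RGB image, values 0-1.
img = np.random.rand(200, 300, 3).astype(np.float32)

# Hypothetical star-free background patch (coordinates are made up).
patch = img[10:60, 10:60, :]
bg = np.median(patch, axis=(0, 1))  # per-channel background level

# Scale each channel so the background is the same in R, G and B,
# i.e. the sky becomes neutral grey instead of tinted.
balanced = np.clip(img * (bg.mean() / bg), 0.0, 1.0)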


9 minutes ago, vlaiv said:

killing green

Isn't it to do with green not being present (or only very little of it) "out there"? Tbh the Hubble palette isn't even correct to "real" colour; HOO or HSO gives a more realistic colour than SHO. Sometimes it's interesting to mix it up a little, as well as being able to gather fine detail from the specific gases - otherwise everything would just be red/blue/orange/yellow all the while.


6 minutes ago, Elp said:

Isn't it to do with green not being present (or only very little of it) "out there"? Tbh the Hubble palette isn't even correct to "real" colour; HOO or HSO gives a more realistic colour than SHO. Sometimes it's interesting to mix it up a little, as well as being able to gather fine detail from the specific gases - otherwise everything would just be red/blue/orange/yellow all the while.

I think it has more to do with the lack of color calibration of astronomy cameras than anything else. If you don't color calibrate the instrument, it will have a bias toward the most sensitive part of the spectrum - and if you look at the QE curves of sensors, they peak around 500-550nm, in the green region.

For this reason the raw R, G and B data - which is not true RGB of any particular RGB color space - will have that bias. People then "white balance", which is the wrong name and the wrong procedure for correcting the problem, and if they don't do it there is a strong green cast, which they then want to remove because "there is no green in space" (even that is not completely true).

In any case, doing that on false color images makes no sense at all (not that it makes much sense to begin with, as it is the wrong way to go about it).
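To picture that bias: if raw R, G and B are weighted by typical relative sensitivities (the numbers below are invented for illustration), the green channel dominates until some per-channel correction is applied. Proper color calibration would use a full color correction matrix derived from the measured sensor response rather than three scalars, but the sketch shows the idea:

import numpy as np

# Invented relative sensitivities of the R, G and B filter bands,
# peaking in green as typical QE curves do.
response = np.array([0.65, 0.85, 0.70], dtype=np.float32)

# A flat, equal-energy "white" scene as recorded by the sensor:
# the green cast appears simply because green is the most sensitive band.
raw = np.ones((50, 50, 3), dtype=np.float32) * response

# Naive per-channel "white balance": divide out the response so a white
# source records as white. A real calibration would apply a 3x3 color
# matrix instead of three independent scale factors.
balanced = raw / response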


1 hour ago, Elp said:

You need those specific filters to capture those band passes and then assign them in software to represent R, G and B. Most of the time such captures are done with mono cameras, as every pixel is utilised; it's possible with colour cameras, but the quality won't be quite as good (though very close). With colour cameras you have filters like the Optolong L-eNhance, L-eXtreme and L-Ultimate, which cut out most light other than that of hydrogen and oxygen (so they're useful in light-polluted zones), but the image will still be in RGB colour. Software trickery can also get you the SHO palette from colour data.

Note the L-Pro filter is a general light pollution filter and will cut out a lot of signal on broadband targets like the Pleiades or galaxies.

[attached image: Christmas Tree Cluster]

Attached is my first real attempt, from January this year, at the Christmas Tree Cluster.
I was impressed with the blue and red contrast. Being an amateur I only do short exposures - this was about 2 hours-ish. I used to do only 1 hour because I wanted as many objects on my capture list as possible, but seeing what an extra hour can do, I had to change. I have wasted precious time trying to image other, more obscure targets as a test, but they either don't show up or are really poor in single lights; for example the Iris, Cat's Eye, Owl, Spider & the Fly and Elephant are a struggle for me. I can make them out, but I'd probably need 3 hours+.

I'd like to avoid buying a mono camera with filter wheels, but do you think it is worth investing in one of those Optolong filters you mentioned? They are probably more expensive than my current one, which was over £100. It would be a different story if I lived in Bortle 4-5. I also don't have much access to the sky due to a north-facing garden and internet cables overhead (for reference, the Seagull Nebula is never in reach). The blue in this image only became apparent during the star colour calibration in APP. You also mentioned the reduction in galaxy detail - that would make sense, since my M31 images are very underwhelming compared to my M42 ones, which I never understood, it being the largest galaxy. Thanks guys.


2 hours ago, Rallemikken said:

You don't say which camera you use. I assume it's a stock DSLR or a dedicated astro colour camera. They present the images as-is, meaning they show you what you would see if your own eyes were super-sensitive, or as close to that as possible. Trust that colour rendition and use it as a baseline. Make subtle changes in hue and chroma, but don't be tempted to make them anything other than what they are. Use the background as a dipstick when you fine-tune the colours. Different things give different colour deviations: light pollution often gives an overall white/yellow tint, while moonlight most often gives a blue tint. If you have astromodded your DSLR, you will have a red tint. Learn to deal with it in post-processing before you start adding filters to your light train.

Many of the images you see around are shot in mono and blended and composed with false colours in post-processing. Some are true to the original colours and try to get the composed image as near the original colours as possible, while others process their images in a more "artistic" way, to get dramatic effects or to conform to a specific palette - e.g. the Hubble palette, which is almost a standard today and not even close to the real colours.

I've used both camera types; the DSLR was modded, but I don't use that now. The ASI533MC is my end-game camera, so I won't be changing unless I move to a much darker area to make the investment worthwhile. I prefer natural colour as best I can, so no 'artistic' edits for me; I only want to increase my experience, speed and efficiency now :). I can't see investing much more money, besides maybe a new filter, some ease-of-life items, and probably collimation for my RC8 - the nebulae through that 8" are pretty bad, blurry etc.


The L-eNhance or L-eXtreme are completely different from the L-Pro, and you'll see a massive difference on emission nebulae rich in hydrogen and oxygen. Note: only on certain emission nebulae - you can't use it on everything. It won't work on the Iris Nebula, for example, which is a reflection nebula.


I started my imaging journey in west London, Bortle 8 on a good night. I didn't even consider OSC and went straight for NB mono. For fighting severe LP, NB is the way to go - as narrow as you can afford. Nowadays you don't even have to mortgage your soul for Astrodon or Chroma; Antlia do 3nm filters at a more reasonable price.


On 08/03/2023 at 09:44, Elp said:

The L-eNhance or L-eXtreme are completely different from the L-Pro, and you'll see a massive difference on emission nebulae rich in hydrogen and oxygen. Note: only on certain emission nebulae - you can't use it on everything. It won't work on the Iris Nebula, for example, which is a reflection nebula.

I've gone through all my sessions and found that they all exhibit the same colours, which became a little tedious. However, these are my first attempts at these objects, and the detail is nice for what I can get from my backyard and limited exposures. Here are some of them. It feels like I'm having issues with flats and the light pollution removal tool, but I'm not sure. Looks like I will invest in a new filter though.

Flaming_Star-lpc-cbg-csc-St.jpg

Heart-crop-lpc-cbg-csc-crop-St.jpg

Monkey_s_Head-crop-lpc-cbg-csc--90degCCW-1.0x-LZ3-NS-St.jpg

NGC_1893-crop-lpc-cbg-csc-St.jpg


  • 2 weeks later...

I have a 533MC Pro and image from Bortle 7.

I cannot praise the dual narrowband filters enough.

This was less than 2 hours with the Antlia ALP-T (a friend did the processing to bring out the OIII).

There are two reasons why one generally cannot get good images. One is issues in capture (incorrect exposure, light pollution, focus, trailing, seeing etc.). The second is processing, which is a dark art.

 

HaOiii.jpg


On 08/03/2023 at 00:50, Quetzalcoatl72 said:

I was impressed with the blue and red contrast. Being an amateur I only do short exposures - this was about 2 hours-ish. I used to do only 1 hour because I wanted as many objects on my capture list as possible, but seeing what an extra hour can do, I had to change.

I think you need to be more realistic about the integration times you are using. One or two hours will give you very little to work with in terms of signal, especially on dimmer targets. There are a few exceptions where minimal exposure is enough, but for most targets this is not the case. As a rule, I would give a minimum of 6 hours to most targets I image, and often much more. As you say, going from 1 to 2 hours made a large difference - take that to 8 hours and it will be even greater. Obviously, it depends on what you are aiming for in terms of image quality; if quantity is more important, then that is fine.

I would suggest you look at some of the images taken from Bortle 8 skies and you will see what is possible. You might need to go 'un-natural' with NB (dual band), but I think it would be worth it.
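The rule of thumb behind those numbers is that the signal-to-noise ratio of a stack grows roughly with the square root of total integration time (assuming the same sub length and conditions), so 8 hours buys about 2.8 times the SNR of 1 hour. A tiny worked example under that assumption:

import math

# Stack SNR scales roughly as sqrt(total integration time), assuming the
# same sub length, sky brightness and conditions throughout.
def relative_snr(hours, reference_hours=1.0):
    return math.sqrt(hours / reference_hours)

for h in (1, 2, 4, 6, 8):
    print(f"{h} h -> {relative_snr(h):.1f}x the SNR of a 1 h stack")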


+1 for increasing your integration times. When I started out I raced around the sky getting 45 minutes to an hour on six different targets in one night, but they were nothing special after processing. Unless you have a RASA or something similar, you need to collect the hours to have enough signal to process the channels and get a decent result.


  • 7 months later...
On 23/03/2023 at 11:21, tomato said:

+1 for increasing your integration times. When I started out I raced around the sky getting 45 minutes to an hour on six different targets in one night, but they were nothing special after processing. Unless you have a RASA or something similar, you need to collect the hours to have enough signal to process the channels and get a decent result.

That's exactly what I was doing. Since this post I only settle for at least 60 x 240-second subs. My sky is very limited being in the suburbs - pollution, obstructions and bad weather - so I'm missing a lot of objects. At the moment I cannot see Saturn from my garden, and any object below that is impossible; I predict another 2 years and I may have a chance. Orion is the limit in that direction, and even then I can only get maybe 3 hours on it before it's obscured by neighbouring houses.


On 23/03/2023 at 08:37, Clarkey said:

I think you need to be more realistic about the integration times you are using. One or two hours will give you very little to work with in terms of signal, especially on dimmer targets. There are a few exceptions where minimal exposure is enough, but for most targets this is not the case. As a rule, I would give a minimum of 6 hours to most targets I image, and often much more. As you say, going from 1 to 2 hours made a large difference - take that to 8 hours and it will be even greater. Obviously, it depends on what you are aiming for in terms of image quality; if quantity is more important, then that is fine.

I would suggest you look at some of the images taken from Bortle 8 skies and you will see what is possible. You might need to go 'un-natural' with NB (dual band), but I think it would be worth it.

I settle for 4 hours now since this post. I hardly see nights that last 6 hours or more, and while I could splice sessions together, I've only seen a couple of occasions where there were multiple clear nights in a week; I can't leave my scope outside, so it's not a permanent setup. I've noticed you have an RC8 - I did my longest exposure with that scope, almost 4 hours on M106. It's never been collimated, and my NEQ6 struggles with the load, as 3 minutes is the max it can go guided, and even then you get eggy stars. I don't feel like spending hundreds on a collimation kit for it, but I really want it to perform at its best, as I prefer using my other scopes that have more success.

M106_05_02_23.jpg


On 20/11/2023 at 21:09, Quetzalcoatl72 said:

It's never been collimated, and my NEQ6 struggles with the load, as 3 minutes is the max it can go guided, and even then you get eggy stars.

Collimation is tricky, but not too hard, and you certainly don't need vast amounts of expensive kit. For mine, I get the secondary aligned using a Reego (but you can do it with a Cheshire), then get the primary spot-on using a star test. I found this to be the best guide:

A Procedure for Collimating Ritchey-Chrétien and Other Cassegrain Telescopes (deepskyinstruments.com)

With regards to the NEQ6 struggling, it should easily manage the RC8. Maybe the mount needs a bit of fettling? I have imaged with my RC8 on an HEQ5 (albeit at the limit) and it was OK.

