
Mono for luminance, DSLR for colour?


BrendanC

Recommended Posts

15 minutes ago, tomato said:

Apologies Olly if I keep the Imaging Surgery theme going, but I wonder do you have any thoughts on my post 10 above on using channel extraction in Pixinsight to obtain RGB channels from a OSC image?

In truth I was making a rather arcane point about the fact that a colour channel extracted from a debayered OSC image is one which includes estimates for the pixel values of the channels other than those of that colour, for the pixels under the other Bayer filters. (That is a terrible sentence! I apologize for it.)

Let's try it another way and be less garbled: the blue channel extracted from an OSC image has real, unadulterated data for one pixel in four, those under the blue filters of the array. The debayering algorithm has estimated the blue values for the pixels under the green-green-red filters based on trends seen in the wider blue data. The algorithm 'sees' a larger shape formed by the separated blue pixels and 'fills it in' for the green and red pixels. It's not 'pure' blue data, it's blue-plus-estimate.

I think we would have to say that this estimation process is a source of noise in the sense that it is not pure signal. Not all debayering algorithms will give precisely the same estimation for the missing pixels.

Now, does it matter? Quite possibly not, since the debayering routines are very good, but in AP we take multiple exposures to improve the accuracy of each individual pixel value, so it goes against the grain to introduce a new potential source of error in the form of the debayering algorithm. The test would be to extract colour channels from three hours of OSC and compare them with one-hour channels from individual colour filters. I haven't done that test, but I think we can say that the OSC-derived channels, in equivalent cameras, can only hope to be as good as the mono channels. They cannot hope to be better.*

Olly

* Unless you think that the softer colour cut-off of the Bayer filters is preferable to the sharp colour distinction of mono RGB filters. My view is that, if this were true, OSC images would have better colour than mono images. Personally I think the reverse.
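Olly's one-pixel-in-four point can be sketched numerically. This is a toy numpy model using simple neighbour-averaging as a stand-in for a real debayering algorithm (actual algorithms such as VNG or AHD are far more sophisticated); it just shows that in the extracted blue channel only the pixels under blue filters are direct measurements, while the rest carry interpolation error:

```python
import numpy as np

rng = np.random.default_rng(0)

# A synthetic "true" blue signal: a smooth gradient plus a little noise.
true_blue = np.linspace(100, 200, 8 * 8).reshape(8, 8) + rng.normal(0, 2, (8, 8))

# In an RGGB Bayer mosaic, blue is sampled at one pixel in each 2x2 block:
# only those pixels carry real blue data; the rest must be estimated.
blue_mask = np.zeros((8, 8), dtype=bool)
blue_mask[1::2, 1::2] = True

sampled = np.where(blue_mask, true_blue, 0.0)

# Crude bilinear-style fill: average the available blue neighbours.
est = sampled.copy()
for y in range(8):
    for x in range(8):
        if not blue_mask[y, x]:
            ys = slice(max(y - 1, 0), min(y + 2, 8))
            xs = slice(max(x - 1, 0), min(x + 2, 8))
            neigh = sampled[ys, xs][blue_mask[ys, xs]]
            est[y, x] = neigh.mean()

# Interpolation error exists only where blue was never actually measured.
err = np.abs(est - true_blue)
print("error at real blue pixels:   ", err[blue_mask].max())    # 0: pure data
print("error at estimated pixels:   ", err[~blue_mask].mean())  # > 0: blue-plus-estimate
```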


Thanks Olly, 

I now understand the point you are making, regarding the signal estimation introduced into the colour channels by way of the de-Bayering process. As you say, it can only strive to be as good as the ‘pure’ RGB collected data, never better.
Will it still be acceptable with regards to the quality of the final image? Hopefully when Autumn comes around I can contribute some evidence to the debate.


2 hours ago, ollypenrice said:

It's a good question and, since I wouldn't start to construct an image with a significant inequality between colour channels, I have no experience on which to draw. (Sorry if that answer sounds a bit 'holier than thou' but it's the truth. I don't recall trying this. I'll work on an image with, say, one sub short or maybe even two in a stack of twelve per channel but I don't think I've tried to fix a greater imbalance than that. )

I think, technically, that you can stretch the short channel to the same point as the others and will find that it's all there but will have far more noise. You could then noise reduce it and hope for the best. I'd be interested to see if @vlaiv agreed and suspect that he would, but I don't want to speak for him.

There's also a bit more to it, sometimes, viz. 1) using your RGB-only as you would use short subs for controlling a very bright part of the image, overexposed in your luminance. Yes, you could shoot short luminance, but you may already have what you need in the RGB itself. Apart from something with the dynamic range of M42, I often find this works very sweetly, but it does require full-quality RGB and precludes binning the RGB in most cases. 2) Stars are often much better in RGB than LRGB. They are both smaller and more colourful but, again, they need to be from a full-quality RGB layer rather than a 'fixed' one.

Olly

Thanks Olly. It would certainly make for an interesting comparison I think. 

I don't get much opportunity to image these days (I completed all of 5 images in 2020, and none so far this year) so I guess I'll find out myself what the precise downsides to this are soon enough! 



6 hours ago, Xiga said:

@ollypenrice can I ask a technical question of you please? I'm about to get into LRGB imaging myself for the first time, and I was wondering: what happens if you have significantly less exposure in one of the colour channels? I know the answer is of course just to shoot more of it, lol, but for the sake of argument, let's say you wanted to make an image with what you had, but you only had 25% of Blue vs the other channels. When you do colour calibration (e.g. PCC in PI) will this account for things, to the extent that the image will have the 'correct' colour but will just need the Blues saturated more, or will the colour balance be way out of whack? Or alternatively, would the Blue data need boosting at all? I know APP has the option to add a multiplication factor to any channel when combining the RGB together.

PS - Like Adam, I too bought a 268M but I haven't had a chance to use it yet. I'm already starting to wonder if I should have gone for the Colour version instead (not for quality reasons, just mainly so I can get some sleep while imaging! I don't have a permanent setup, so have to resort to using a filter drawer).

There are several things I would like to answer here.

First - about amount of data in channel.

If you do proper color calibration - then although you spend, say, 1/4 of the time on blue - you will have the "proper" amount of blue. The only difference will be in noise. For any given SNR you can boost your signal to the needed level - but since you have a fixed SNR with a set number of subs stacked - you'll also amplify the noise to match (SNR does not change when multiplying by a constant).

However - above has nothing to do with saturation. Don't mix the two.

Color in astronomical images has been a topic of much controversy and I'll offer my view on it.

You want to preserve the RGB ratio in order to preserve the physical properties of the recorded light. This is the best we can do as far as color goes. It also means that we need to do color calibration properly - and then we will end up with an RGB ratio for each pixel that represents the color of the light in that pixel.

If you only shot 1/4 of the time for the blue channel - it will be multiplied by a certain number to get an accurate RGB ratio (color calibration) - and that will also multiply the noise. You will end up with the proper RGB ratio for your image after color calibration - but the channel that lacks exposure will be noisy.
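The claim that calibration rescales signal and noise together can be checked numerically. Here is a sketch with made-up numbers (40 vs 10 subs, arbitrary per-sub signal and noise levels), simulating many stacking outcomes:

```python
import numpy as np

rng = np.random.default_rng(1)
signal, sigma = 100.0, 20.0   # made-up per-sub signal and noise levels

# Simulate many stacking outcomes: 40 subs (full channel) vs 10 subs ("1/4 time" blue).
full   = rng.normal(signal, sigma, (10000, 40)).mean(axis=1)
short  = rng.normal(signal, sigma, (10000, 10)).mean(axis=1)
scaled = short * 4.0   # colour calibration multiplies the weak channel up

snr = lambda x: x.mean() / x.std()
print(snr(full) > snr(short))                # True: more subs, higher SNR (about 2x here)
print(abs(snr(scaled) - snr(short)) < 1e-9)  # True: scaling leaves SNR untouched
```

The weak channel ends up at the right brightness after the multiply, but its SNR is exactly what it was before - which is vlaiv's point.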

In fact - I would say that if you are going to sacrifice a channel for imaging time - let that be red channel.

The blue channel is attenuated the most by the atmosphere. Sensors also tend to be the least sensitive in this part of the spectrum. Even if you dedicate the same time to each channel - blue is going to get the worst SNR because of that. There is also photon energy. The blue part of the spectrum has the shortest wavelengths. For the same energy in R, G and B - given that the shortest-wavelength photons have the highest energy - there will be the fewest photons in the blue part of the spectrum. That means the least signal, as signal with photo detectors represents photon counts.
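The photon-count argument in back-of-envelope form (the 450 nm and 650 nm wavelengths are illustrative choices for blue and red, not actual filter passbands):

```python
# A photon's energy is E = h*c / wavelength, so for the same *energy* flux the
# photon *count* is proportional to wavelength. 450 nm (blue) and 650 nm (red)
# are illustrative wavelengths only.
h, c = 6.626e-34, 2.998e8          # Planck constant (J*s), speed of light (m/s)
energy = 1e-12                     # one picojoule collected in each band

photons = {lam: energy / (h * c / (lam * 1e-9)) for lam in (450, 650)}
print(photons[650] / photons[450])  # ~1.44: red delivers ~44% more photons
```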

If you can - expose the blue channel the longest.

By the way PCC in PI will not produce accurate color.

In the end - you need to think how you ended up with less data.

There are multiple ways to end up with less data.

Say you only shot 1/4 of the subs in blue in comparison to the red and green channels - but you used the average stacking method, and you used the same sub duration.

Then you "don't need to do anything". One sub will contain the same amount of signal as a stack of 20 or more subs using the average method. It is only the noise that will be different (the average of 20, 20, 20, 20 is the same as a single 20).

If you used 1/4 the exposure time for the blue subs compared to the others - then you need to multiply the value of the blue subs by 4 to "equalize" them (if you have 5, 5, 5, 5 and you average them you get 5, while the others will be the average of 20, 20, 20, 20 = 20 - so you need to multiply 5 by 4 to get 20).

If you used same-duration subs - but used additive stacking with 1/4 the number of blue subs - you again need to multiply the blue channel by 4 to "equalize" things (a single 20 sums to 20, while the others sum to 20 + 20 + 20 + 20 = 80 - so you multiply 20 by 4 to get 80).
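The three bookkeeping cases above can be checked with the same toy numbers:

```python
import numpy as np

# Noise-free toy values: a full-length sub records 20, a quarter-length sub records 5.
full_subs  = np.array([20.0, 20.0, 20.0, 20.0])
short_subs = np.array([5.0, 5.0, 5.0, 5.0])   # same sub count, 1/4 exposure each
one_sub    = np.array([20.0])                 # 1/4 the sub count, same exposure

# Case 1: average stacking, same sub length, fewer subs -> nothing to do.
print(one_sub.mean() == full_subs.mean())         # True: 20 == 20

# Case 2: average stacking, quarter-length subs -> multiply by 4.
print(short_subs.mean() * 4 == full_subs.mean())  # True: 5 * 4 == 20

# Case 3: additive stacking, 1/4 the sub count -> multiply by 4.
print(one_sub.sum() * 4 == full_subs.sum())       # True: 20 * 4 == 80
```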

 


23 minutes ago, vlaiv said:

There are several things I would like to answer here. [...]

Thanks Vlaiv 🤙 I only picked Blue at random really, not because I would intentionally choose to shoot less of it, but it's good to know that Red is the channel one could potentially get away with less of. My query was really just one born out of curiosity, because I can definitely see me encountering this situation at some point (one weak channel, and no opportunity to add to it for a long time, perhaps the following year), so I was just curious what would happen, and you've answered it perfectly, so thanks! It makes total sense too. I just wasn't sure if I would need to multiply the weak channel myself or whether the colour calibration routine would effectively do that for me, and it sounds like it would.


8 hours ago, Xiga said:

Thanks Vlaiv 🤙 I only picked Blue at random really, not because I would intentionally choose to shoot less of it, but it's good to know that Red is the channel one could potentially get away with less of. [...]

Regarding the consequences of having short channel data, I think Vlaiv and I are in agreement: you just end up with more noise in that channel. The problem with having less red is that many targets are dominated by Ha (though not all, of course), so these will have signal just above the background where the blue and green won't. In this highly stretched part of the image the noise will become an issue. So I'm not sure I agree with the 'less red is OK' notion on emission targets. If you are going to add Ha to red, though, it probably won't matter.

In reality I sometimes find it advantageous to shoot more blue, though the 'reality' in question is aesthetic rather than scientific and may arise from the fact that I usually add Ha to red.

Olly


The one time I got a decent image using mainly Photoshop was by processing a monochrome pseudo-Lum (RGB converted to grayscale), and one of the final processes for adding back the colour was to saturate and blur the RGB image and overlay the Lum on top. It worked well, so why couldn't this work with a true mono Lum and OSC RGB?
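For what it's worth, the 'overlay the Lum on top' idea can be sketched in a few lines of numpy. This rescales each RGB pixel so its luminance matches the mono L frame - chromaticity from the colour layer, brightness from L. It is only a rough stand-in for Photoshop's luminosity blend, and the Rec.709 luma weights are an assumption, not what Ps uses internally:

```python
import numpy as np

def apply_luminance(rgb, lum, eps=1e-6):
    """Scale each RGB pixel so its luminance matches the mono L frame.

    The R:G:B ratio of each pixel is preserved; only brightness changes.
    Rec.709 luma weights are assumed here.
    """
    w = np.array([0.2126, 0.7152, 0.0722])
    current = rgb @ w                       # per-pixel luminance of the colour layer
    gain = lum / np.maximum(current, eps)   # brighten/darken factor per pixel
    return rgb * gain[..., None]

# Toy 2x2 colour layer (values in 0..1) and a mono luminance layer.
rgb = np.array([[[0.4, 0.2, 0.1], [0.1, 0.3, 0.5]],
                [[0.2, 0.2, 0.2], [0.5, 0.1, 0.1]]])
lum = np.array([[0.8, 0.6],
                [0.3, 0.4]])

out = apply_luminance(rgb, lum)
# The result's luminance now equals the mono L layer everywhere.
print(np.allclose(out @ np.array([0.2126, 0.7152, 0.0722]), lum))  # True
```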


Back on the original topic - yes, this works, and it works well if you are careful in how you process your data. In fact, I think most people using the LRGB approach do the same if they stretch the L and then transfer colour to it later.

image.png.4d44132467ec4c7996b026c209067018.png

Above is an example of mixing data from two very different setups:

8" F/8 RC with ASI1600 mono for the luminance data and 80mm F/6 APO with ASI178mcc for the colour data.

The good thing about this approach is that you can optimize time use if you have two scopes in a side-by-side setup. The above image was taken on two successive nights - about 4h each of Lum and colour (but under SQM 18.5 skies, so lacking depth).


In fact, I have another attempt from my early 600D days. This is my first proper attempt at M31, with a 600D and WO GT71. I was never happy with it due to the short capture time and my processing skills.

Resizer_16204777327640.thumb.jpg.0af47da35dc4d306e6dcafc87ab8fa28.jpg

 

Fast forward a year or so to when I got an ASI1600. I gathered 3 hours of mono Lum data and used it to add to the earlier DSLR data. The core is a bit overdone, but the noise and colour look improved to my eye.

1361307114_M31LOSC_optimized.thumb.jpg.609dc8d2812e21246a60bf99868554c8.jpg


I have done quite a bit of mono for Lum; it has been the main idea behind my double Esprit rig. Here is my first light (half a year ago) with my new ASI6200MM (230 x 2 min at gain 100) sitting on my Esprit 150. The RGB data come from my Esprit 100 with an ASI071MC (58 x 5 min at gain 200) sitting next to it on a Mesu 200. That setup gives very similar FOVs for the two cameras. Works quite well, by my standards at least.

What is it? It is NGC 6914, aka The Running Man in Cygnus, a blue reflection nebula surrounded by a vast area of H-alpha nebulosity. About 6,000 light-years away.

20201013-14 NGC6914 LRGB PS21smallSign.jpg


4 hours ago, gorann said:

I have done quite a bit of mono for Lum, and it has been the main idea behind my double Esprit rig. [...]

Goran, is this LRGB? If so, I wonder why you went for L in the mono rather than Ha? I'd have thought you'd find more structure in the hydrogen with Ha.

This is HaLRGB from a widefield cropped and rotated to match yours, roughly.

1238227118_NGC6914HaRGBcrop.thumb.jpg.090baffb329c6c8d691aa1b8861e6a7a.jpg

Olly

 


53 minutes ago, ollypenrice said:

Goran, is this LRGB? If so, I wonder why you went for L in the mono rather than Ha? [...]

I think my thought was to also collect Ha, but clouds moved in and I probably got other priorities...


19 hours ago, david_taurus83 said:

The one time I got a decent image using mainly Photoshop was by processing a monochrome pseudo Lum (RGB converted to grayscale) [...]

You could, and your way of working in Ps with OSC data is rightly popular. (Actually I think it's best not to blur the RGB but to add the L in small iterations, increasing the saturation of the RGB and blurring the partial L before applying it, but that's a detail. You don't even need to convert to greyscale to use one layer as luminance: if you choose the luminosity blend mode, that's how it will be applied, I believe.)

My experience with CCD OSC was simply that, in a given time (say three hours), I would not get a colour layer capable of supporting three hours of Luminance. I found that an hour each of pure colour in a mono camera found more colour signal. It would be asking a lot of the simple absorption filters on an OSC to match the interference colour filters used in mono, surely? But my CCD experience may have no bearing on CMOS data.

You can only go so far in mutilating (I exaggerate!) your colour layer. I like to use RGB-only stars so I don't want them binned or hyper-saturated. I also want, on some projects, to use my RGB as I would use short subs for over-exposed parts. That means I do want it in high quality.

Regarding Ps and PI, neither will give a better result than the other in my view. I posted a Christmas dataset of the Cave Nebula a few years ago for members to play with. Barry Wilson produced a rendition in PI which was insignificantly different from my version in Ps. The programs offer different working environments to suit different personalities and approaches.

Olly

