
Color or Mono... The age old question.


69boss302


3 minutes ago, ollypenrice said:

Tell us more!!! 😁

Olly

Ok, here it goes.

We need three components to describe color precisely enough, right: R, G and B.

Now for a bit of math. We can see that triplet as a vector, so any linear transform of that vector that has an inverse will preserve the information, right? (It needs to be linear because light adds linearly.)

We might decide to record color as being (Q, W, E) with each of Q, W and E being:

Q = R

W = G

E = R+G+B

for example. It is easy to show that we can reconstruct R, G and B from Q, W and E using the following equations:

R = Q

G = W

B = E - Q - W
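This invertibility is easy to check numerically; a minimal sketch in numpy (the matrix layout and variable names are mine):

```python
import numpy as np

# Transform from the example above: Q = R, W = G, E = R + G + B
M = np.array([
    [1.0, 0.0, 0.0],   # Q = R
    [0.0, 1.0, 0.0],   # W = G
    [1.0, 1.0, 1.0],   # E = R + G + B
])

rgb = np.array([0.2, 0.5, 0.9])       # some arbitrary color
qwe = M @ rgb                         # encode as (Q, W, E)
recovered = np.linalg.inv(M) @ qwe    # the inverse exists, so nothing is lost

assert np.allclose(recovered, rgb)
```

The bottom row of the inverse works out to (-1, -1, 1), which is exactly the B = E - Q - W equation above.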

Now look at the following graph:

[Image: astrodonispectra.jpg (LRGB filter transmission spectra)]

You can see that, to a first approximation, luminance equals R + G + B.

To the first approximation, you can take L, R and G and form the RGB triplet by simply using R, G and L - R - G.

We can get a better approximation if we actually color calibrate our camera + filters (as we should do anyway, even if we use RGB filters). That means shooting known colors and deriving a transform matrix:

(raw_R, raw_G, raw_B) * matrix = r, g, b

In this case, we will simply do following:

(raw_R, raw_G, raw_L) * matrix = r, g, b

for an appropriate matrix (which will be different from the one in the raw_R, raw_G, raw_B case above, but is derived the same way).

That will give more correct color information than the simple subtraction above, but to a first approximation the subtraction will work.
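Deriving such a matrix from shots of known colors boils down to a least-squares fit; a toy sketch with purely synthetic numbers (nothing here comes from a real camera or filter set):

```python
import numpy as np

# Hypothetical calibration: N measured patches of (raw_R, raw_G, raw_L)
# against their known (r, g, b) values. All numbers are made up.
rng = np.random.default_rng(0)
true_matrix = np.array([[ 1.0,  0.1, 0.0],
                        [ 0.0,  0.9, 0.1],
                        [-1.0, -1.0, 1.0]])
raw = rng.uniform(0.1, 1.0, size=(24, 3))   # 24 "measured" patches
known = raw @ true_matrix                   # pretend these are the known colors

# Least-squares solve of raw @ M ≈ known, matching the
# (raw_R, raw_G, raw_L) * matrix = r, g, b convention above
M, residuals, rank, sv = np.linalg.lstsq(raw, known, rcond=None)
assert np.allclose(M, true_matrix)
```

In practice `known` would come from a color chart or stars of known spectral type, and the fit would not be exact, but the mechanics are the same.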

You can try it out: take any of your old data sets and simply discard the blue channel. Then use pixel math to derive the blue channel by subtracting G and R from the luminance. Just make sure the channels are properly scaled if you used different exposure lengths for each.
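That experiment can be sketched in a few lines of numpy, assuming registered, linear (unstretched) stacks; the function name and the per-second scaling convention are mine:

```python
import numpy as np

def synthesize_blue(L, R, G, t_L=1.0, t_R=1.0, t_G=1.0):
    """First-approximation blue channel: B ≈ L - R - G, after scaling
    each channel to signal per unit exposure time so they are comparable."""
    Ls, Rs, Gs = L / t_L, R / t_R, G / t_G
    return np.clip(Ls - Rs - Gs, 0, None)  # clip negatives caused by noise

# Toy example: a flat "scene" where the true B really is L - R - G
R = np.full((4, 4), 100.0)
G = np.full((4, 4), 150.0)
B_true = np.full((4, 4), 50.0)
L = R + G + B_true
print(synthesize_blue(L, R, G)[0, 0])  # 50.0
```

If L was shot at, say, twice the exposure of R and G, passing `t_L=2.0` applies the scaling the post warns about.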

Why ditch B and not G or R? Several reasons: cameras are usually less sensitive at short wavelengths (though this depends on the camera model), and blue is scattered more by the atmosphere; atmospheric extinction is higher at shorter wavelengths, so the blue signal is weaker.

Converted into photons: there are fewer "blue" photons than "red" photons for the same energy. Shorter wavelengths are more energetic, which means fewer photons are needed for the same total energy output. Fewer photons means a weaker signal (in photon counts), and therefore poorer SNR in the blue part of the spectrum for the same exposure time.
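That photon-count argument follows from E = hc/λ and is easy to verify; a quick check (the wavelength choices are mine):

```python
h = 6.626e-34  # Planck constant, J s
c = 2.998e8    # speed of light, m/s

def photons_per_joule(wavelength_m):
    # N = E_total / (hc/λ) = E_total * λ / (hc), here for E_total = 1 J
    return wavelength_m / (h * c)

blue = photons_per_joule(450e-9)
red = photons_per_joule(650e-9)
print(red / blue)  # ≈ 1.44: the same energy delivers ~44% more red photons
```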

Makes sense?

 


The quality of OSC imaging in 'the literature' speaks for itself: plenty of quality and beautiful examples. By quality, I refer to the overall capture + processing, but also to the direct capture element, where the ASI2600MC has very high QE, deep wells and 16-bit range to provide the best quality data an OSC can give in most scenarios.

For me, some reasons for OSC imaging:

1. Faster overall. I sometimes set up for just 1 hr of imaging; I'll show an example below. Without those sessions, I would often have little to show. Although the shorter integrations make me want more time, without the OSC option I would have no image at all, aside from a part-synthetic luminance.

2. The ASI2600MC Pro, for example, gathers high quality signal and allows a nearly on-the-fly choice of exposure length without absolutely requiring darks each time. Saving a bank of darks is time consuming, but useful if you stick to those exposures. With fast scopes as a good example, choosing exposure length for a given target on a given night's seeing gives you flexibility if you choose to use it. Bias-corrected monochrome flats also work excellently; a simple process in SharpCap, for example, when time is important. Of course, obtaining darks after the fact can also be done.

3. Overall optical train can be a little more straightforward.

4. Processing is similar (not identical) to LRGB combined, but you can split the channels, and split out duo/tri/quad-band channels too; the ASI2600MC data from the Ha channel of a decent filter is nothing to be scoffed at, not nearly as noisy as an Ha channel from a CCD OSC.

5. Lower overall cost; no filters needed.

6. Going from OSC to mono is another step that might be easier with more experience with OSC, especially if doing mono LRGB, possibly with a scope upgrade in your future.

I almost entirely run OSC in as-acquired RGB: no synthetic lum, no filters of any kind except the IR-cut on the ASI2600MC Pro. I'm happy to gather a 1-hour image of 25-30 short (120 to 150 s) exposures with an f/2.8 scope, but the 2600 will run beautifully on an f/5 to f/7 system with 6-10 min exposures, and it preserves star color better than most other OSCs. It also makes color balancing easier than filtered RGB imaging when you don't know from experience what exposure times or integration are needed to get each channel right for a given target (I found this at least). Comparison to any DSLR is chalk and cheese really. I am in Bortle 6, with three LED streetlamps shining into my back garden.

Here is an example of 25 x 150 s exposures from a few nights ago, of IC 63 and 59 around γ Cas. It needs a lot more subs for sure, but it was near zenith and the 100% moon was just to the right; no filters, direct RGB. About as bad a clear night as you can get for imaging a faint target in OSC with no filters. It's a work in progress really, and will get a full night eventually.

But it was relatively easy to obtain colors and clear data in 1 hr 40 min, with 20 min of setup and teardown, focusing and balancing included, and a shiny bright full-moon-lit sky to boot. However, family and work will limit my next clear-sky time, and there's always a big gap between sessions. To get this with mono would require several sessions, but the mono chip has its clear advantages; it's a time-availability choice for me to be honest, and maybe it is for you? Longer integration images from the ASI2600MC still surprise me in a good way.

If your scope is fast, running from 8.30 pm to 11.30 pm and then going to bed is viable; for a slower scope, a couple of short evenings might be needed, but it is the same process each time, and it's somewhat easier to 'gather' data than planning filters, moon, times etc. Those shorter periods are a fact of life for me, hence the OSC benefit.

Fair warning though: this 1 hr stack comprises subs taken with a variable extender with 0.1 mm changes to backfocus as I dial it in for this scope, so every image was slightly rotated, with changing star profiles!

High-end OSC cameras give options, especially if you are upgrading, time limited, subject to varying skies, or have other reasons to limit full multi-filter imaging over multiple nights and the associated processing. If not, you might find the 2600MM with SHO or LRGB quite amazing. I used to image mono with large RC scopes for astrometry and photometry (mainly in IR), where the concern was correct star profiles, but OSC imaging in the visual spectrum is something I started about 7 months ago and it's very enjoyable. The ASI2600MC was a big part of that.

 

[Image: IC63-redo-all-session_1_session_2 (work-in-progress IC 63/59 around γ Cas)]

Edited by GalaxyGael

5 hours ago, vlaiv said:

 

Why ditch B and not G or R? Several reasons: cameras are usually less sensitive at short wavelengths (though this depends on the camera model), and blue is scattered more by the atmosphere; atmospheric extinction is higher at shorter wavelengths, so the blue signal is weaker.

Converted into photons: there are fewer "blue" photons than "red" photons for the same energy. Shorter wavelengths are more energetic, which means fewer photons are needed for the same total energy output. Fewer photons means a weaker signal (in photon counts), and therefore poorer SNR in the blue part of the spectrum for the same exposure time.

Makes sense?

 

Good summary of the color matrix. The high-energy blue photons tend to be well scattered, with the UV portion up to 300-310 nm or so mostly absorbed by normal glass. Without quartz or calcium fluoride elements throughout the train, including the cover glass of cameras, the transmitted blues from roughly 350-450 nm tend to be noisier and have much lower areal flux density. Silicon-based CMOS is not ideal for UV and blue photons either, and QE tends to be lower, although the IMX571 chips in the ASI2600 are very good and the most 'astro' user friendly so far, I think. I remember the Foveon layered CMOS chips that tried to deal with this and other effects by layering blue, green and red with thicknesses tuned to relative absorption, while limiting aberrations compared to an in-plane Bayer matrix. They were odd sensors: it was difficult to define the pixel size, and balancing the color space was not straightforward.

 

Edited by GalaxyGael

9 hours ago, vlaiv said:

Interestingly enough, dual and tri-band filters are better suited for use with mono cameras, yet few people use them that way.

They are in fact the equivalent of a Lum filter for NB imaging.

People also don't use mono + filters in the most effective way to really show the biggest difference between mono and OSC. They shoot LRGB instead of LRG. The latter requires somewhat special processing, but not different from the processing that should also be done with LRGB and OSC data alike.

 

That is really interesting, Vlaiv! Shooting lum with a dual or tri-band filter is something I would like to test when the moon is gone and the skies have cleared, whenever that happens...

Edited by gorann

My take on this is from a completely different angle.

Usually the USED market is a good indicator of where the current state of astrophotography is at. 

You rarely, if ever, see a quality MONO camera come on the used market.

On the other hand, you will definitely see, every once in a while, a quality OSC come on the market. It is usually sold by an imager who wants to take the leap and up their imaging quality by a notch or two.

Please do not take my analysis in a negative way regarding OSC. I myself have just ONE camera, the 294MC Pro, which is obviously OSC. 

I have been pondering upgrading to the 2600MC Pro, but what is actually holding me back is the theory that if I want to go up a level in imaging, I should save a bit more and go all out for the much more expensive setup of the mono 2600MM with filters and wheel.


4 hours ago, oymd said:

My take on this is from a completely different angle.

Usually the USED market is a good indicator of where the current state of astrophotography is at. 

You rarely, if ever, see a quality MONO camera come on the used market.

On the other hand, you will definitely see, every once in a while, a quality OSC come on the market. It is usually sold by an imager who wants to take the leap and up their imaging quality by a notch or two.

Please do not take my analysis in a negative way regarding OSC. I myself have just ONE camera, the 294MC Pro, which is obviously OSC. 

I have been pondering upgrading to the 2600MC Pro, but what is actually holding me back is the theory that if I want to go up a level in imaging, I should save a bit more and go all out for the much more expensive setup of the mono 2600MM with filters and wheel.

I quite like that way of looking at things: using the used market as an indicator. So out of interest I just went to UK Astronomy Buy & Sell and filtered by the category "CCD equipment" (which seems to encompass CMOS too). I then quickly tallied the OSC and mono cameras for sale, not counting guide cameras, and only counting cameras over £500 (as an indicator of quality). I gave up after the first two pages, but the results... drumroll...

OSC: 11 for sale
Mono: 28 for sale

I was surprised by that. Obviously my method has a lot of holes in it, but it's still an interesting result. A lot of the mono cameras seemed to be older CCDs, so perhaps the owners are selling up and jumping to mono CMOS?

 


1 minute ago, Lee_P said:

A lot of the Mono cameras seemed to be older CCD, so I think that perhaps the owners are selling up and maybe jumping to Mono CMOS? 

That was my first thought, but that can be easily checked.

Can you filter by type - like CCD / CMOS? Or maybe check models being offered manually (I know it is a bit of work, but in the interest of science? :D ).


My feeling, looking at the market and having recently sold both a mono CCD (Atik 460) and a mono CMOS (ASI1600), was that interest in the CCD was very low (it sold for much less than half the new price) but the CMOS sold quickly at a much better price. If I were to put up one of my ASI2600MCs for sale (which I have no intention of doing), I expect it would sell immediately.

Edited by gorann

17 minutes ago, vlaiv said:

That was my first thought, but that can be easily checked.

Can you filter by type - like CCD / CMOS? Or maybe check models being offered manually (I know it is a bit of work, but in the interest of science? :D ).

I've just done this, but speedily -- so my data won't be completely accurate! It's just for fun though...

OSC: 11 for sale
Older Mono (CCD): 18 for sale
Modern Mono (CMOS): 10 for sale.

 


The other issue being that you get multiple listings of the same advert.

I suspect that compared to machine vision and security, astronomy sensors are a small market; CMOS is where those markets live, and that's the commercial driver sending us to CMOS. Then, obviously, the lower read noise allows shorter exposures to be stacked, meaning a less demanding and therefore lower-cost mount... and I guess the possibility of taking lucky imaging to deep space. I think the only people using CCD now are those who already have them, and perhaps people getting into a few niches like photometry or spectroscopy.

I've got a 183-based OSC and a 178 mono camera for guiding and some optics experiments I'm doing. It occurred to me that an additional experiment could be interesting, assuming I cannot just look up the answer... we tend to perceive resolution as luminance rather than colour. So how much detail would you lose using lower-resolution RGB data to colour a higher-resolution lum? Just doing both would be a fitting end to this debate :)

 


3 minutes ago, dmki said:

how much detail would you lose using lower-resolution RGB data to colour a higher-resolution lum? Just doing both would be a fitting end to this debate :)

 

Well, the old CCD fashion was to bin RGB 2x2 to speed up capture and shoot luminance at full resolution. At that RGB:lum resolution ratio I would assume there was no detrimental effect on image detail, or no one would have done it.
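That 2x2 software bin is a one-liner in numpy; a sketch assuming image dimensions divisible by 2 (the function name is mine):

```python
import numpy as np

def bin2x2(img):
    """Average 2x2 blocks: halves resolution in each axis, and roughly
    doubles per-pixel SNR for uncorrelated noise (noise / sqrt(4))."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

img = np.arange(16, dtype=float).reshape(4, 4)
print(bin2x2(img).shape)  # (2, 2)
```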


21 minutes ago, dmki said:

we tend to perceive resolution as luminance rather than colour. So how much detail would you lose using lower-resolution RGB data to colour a higher-resolution lum? Just doing both would be a fitting end to this debate

Much more information is carried by luminance than by color. We can do a simple simulation to demonstrate the effect. Here is the baseline image, with very fine detail:

[Image: base.JPG (baseline image with very fine detail)]

I'm now going to split it into luminance and chrominance and apply a gaussian blur to each part separately. We can then examine the results.

Here is the luminance component blurred by a 3.0 px gaussian blur:

[Image: lum_blur.jpg (luminance blurred by 3.0 px)]

And here are the chrominance components blurred by a 3.0 px gaussian blur:

[Image: chrom_blur.jpg (chrominance blurred by 3.0 px)]

I think the results are self-explanatory and demonstrate one very important aspect of mono + filters: you can dedicate much more time to luminance to get detail, and shoot color for a much shorter time, since noise in the color data will be far less obvious (and of course you can bin the color data to improve SNR without loss of detail in the final image, provided you respect a certain workflow and use luminance as actual luminance).
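The experiment above can be roughly recreated with numpy and scipy, using a crude (R+G+B)/3 luminance rather than a perceptual one (the function names and this luminance choice are mine):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def blur_luma_only(rgb, sigma=3.0):
    """Blur only the luminance, keep chrominance sharp."""
    luma = rgb.mean(axis=2, keepdims=True)            # crude luminance
    chroma = rgb - luma                               # per-channel deviation
    luma_blurred = gaussian_filter(luma[..., 0], sigma)[..., None]
    return luma_blurred + chroma

def blur_chroma_only(rgb, sigma=3.0):
    """Blur only the chrominance, keep luminance sharp."""
    luma = rgb.mean(axis=2, keepdims=True)
    chroma = rgb - luma
    chroma_blurred = np.stack(
        [gaussian_filter(chroma[..., i], sigma) for i in range(3)], axis=2)
    return luma + chroma_blurred
```

Because the gaussian filter is linear and the chroma channels sum to zero, blurring only chrominance leaves the crude luminance untouched, which is why the chroma-blurred version still looks sharp.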


Then there's the fact that in quite a few of the higher-end colour sensors, each colour pixel is effectively 2x2-binned under the mask, so the mono version can be run at four times the resolution at the cost of well depth and bit depth. And I guess for the right set of circumstances this could lead to a decent change in image quality.


I did stress QUALITY in the example I gave above. By quality cameras, I meant high-end, top-quality cameras like the 6200MM or 2600MM.

Those are rare as hen's teeth on the used market, as there probably isn't a BETTER step-up camera than those two in the amateur AP market. Those two are usually bought by imagers who have some years of experience in the hobby and would not go back to just OSC.
 

I would not count any of the 1600MMs or older CCD mono cameras in the search. I will take an educated guess that almost all those selling a 1600 or an older mono CCD camera are probably upgrading to a better mono setup, like the 294MM or 2600MM.
 

Very rarely will you see a quality high-end mono full setup for sale, including camera, filter wheel and filters. Those are usually put up for sale by a beginner who enthusiastically got into the hobby, spent a lot upfront, got overwhelmed by the whole process, and either chose to give up and go back to visual, or took the easier route of an OSC.

 


7 hours ago, oymd said:

I did stress on QUALITY in the example I gave above. By quality cameras, I meant a high end top quality camera, like a 6200MM or 2600MM. 

Then it's...
OSC: 1 for sale
Mono: 0 for sale

I'm not disagreeing with you -- rather, I actually think you're right -- but we have to be careful when using assumptions to draw conclusions as then we tend to reinforce our pre-existing notions. I don't think we can draw any solid conclusions just from the used market sales (especially not just the very brief research I did 🤣) without more information. If we guess that almost all those selling an older CCD or 1600MM are upgrading to a newer Mono, then of course we'll stack the deck full of Mono users.

For what it's worth, I sold my 1600MM and bought an OSC :)

 

 

 


Something else to consider: when I looked at GalaxyGael's lovely Gamma Cass rendition, I was reminded that you can get a certain 'look' from the modern CMOS OSC cameras. Very natural. It doesn't have to be about going insanely deep by adding NB data (though you can with Tri band, we know.) Tom O'Donoghue and I went depth-mad on this region and produced the picture below but, quite honestly, I do love the one posted above for different reasons. We don't all want to be doing the same things or it would be repetitive.

[Image: Breaking Wave (deep rendition of the Gamma Cas region by ollypenrice and Tom O'Donoghue)]

Edited by ollypenrice
correction

Is the KAF8300 CCD considered a "quality" mono sensor? There are a few of them up for sale now compared to 12 or 18 months ago.
If I were starting out now on a tight budget, I would get one of these; a massive step up from an uncooled DSLR, even if it is old tech.


On 25/10/2021 at 13:53, vlaiv said:

We need three components to describe color precisely enough, right: R, G and B.

[...]

To the first approximation, you can take L, R and G and form the RGB triplet by simply using R, G and L - R - G.

[...]

You can try it out: take any of your old data sets and simply discard the blue channel. Then use pixel math to derive the blue channel by subtracting G and R from the luminance. Just make sure the channels are properly scaled if you used different exposure lengths for each.

[...]

Makes sense?

 

Would this not require that the lum exposure length is identical to the RGB filter exposures?

If not, I guess you need to add some sort of scaling factor, as L will not equal R + G + B.

Not sure if I am the norm, but I tend to use shorter exposures for L than for RGB.

Adam 

Edited by Adam J

6 minutes ago, Adam J said:

Would this not require that the lum exposure length is identical to the RGB filter exposures?

If not, I guess you need to add some sort of scaling factor, as L will not equal R + G + B.

Not sure if I am the norm, but I tend to use shorter exposures for L than for RGB.

Adam 

Yes, it would. The exposures do not need to be the same length, but the signal does need to be "compatible", i.e. scaled properly, before the math is performed.

Also, flats must be scaled to 1 in order not to change ADU values.

PixInsight, for example, is a bit problematic as it tends to scale all values into the 0-1 range, both at import time (division by 65535) and after pixel math. The math above is best done in software that keeps ADU values as they are, like ImageJ or similar.


35 minutes ago, tomato said:

Is the KAF8300 CCD considered a “quality” mono sensor? There are a few of them up for sale now compared to 12 or 18 months ago.
If I was starting out now and on a tight budget , I would get one of these, a massive step up from an uncooled DSLR, even if it is old tech.

It is in the same ballpark as the ASI1600.

The ASI1600 has some advantages but also some drawbacks.

Advantages would be shorter subs required (much lower read noise), much faster readout speed, and small pixels plus the ability to bin in software (which gives flexibility in the sampling rate used).

Disadvantages of the ASI1600 are that it has software binning instead of hardware (though this is largely offset by the low read noise) and issues with micro-lens diffraction artifacts.

I'd say that for a beginner the KAF8300 is a very good sensor, provided they know how to utilize it best: proper calibration, and long exposures to offset the high read noise and slow readout, which means good guiding and a good mount.


2 hours ago, ollypenrice said:

Something else to consider: when I looked at GalaxyGael's lovely Gamma Cass rendition, I was reminded that you can get a certain 'look' from the modern CMOS OSC cameras. Very natural. It doesn't have to be about going insanely deep by adding NB data (though you can with Tri band, we know.) Tom O'Donoghue and I went depth-mad on this region and produced the picture below but, quite honestly, I do love the one posted above for different reasons. We don't all want to be doing the same things or it would be repetitive.

 

Lovely image BTW, although probably flammable with that much hydrogen :). I really like the natural color balance images, and admit that the ASI2600MC does that very nicely in a simple setup. Adding some extracted Ha from the quad-band filter is something I might do, but since that was 1 hr under a 100% moon, I promised myself to give it one full night at f/2.8 and see what it shows. It's interesting that the blues in IC 63 come out much more readily with no filters. There is something about natural color images, especially those with deeper integration, that has a look, dazzle or color range I find very pleasing personally. The Elephant's Trunk is a classic example (NB versus OSC RGB), as is the Cone/Christmas Tree region, and so many more. I do like the red that your Ha brought to an RGB image, compared to NB, but I guess that is a personal taste thing. It's the purples we find where Ha regions are lit by large blue and red stars, or the browns in the diffuse Ha regions from OSC, or the various colors of dust etc. that are unique (is that right?) to broadband images.


7 minutes ago, GalaxyGael said:

I really like the natural color balance images, and admit that the ASI2600MC does that very nicely in a simple setup.

That should not depend on the camera used. Most cameras should be able to achieve proper color in an image if the data is processed properly.


Just now, vlaiv said:

That should not depend on camera used. Most cameras should be able to achieve proper color in image if data is processed properly.

That's objectively true. But the ASI2600's full well depth, even at unity gain, is forgiving when it comes to e.g. star color, which is what I meant and should have stated.

