
Why no foveon astro cameras?


dph1nm


... is a question I ask myself. Having just borrowed an SBIG mono CCD and got 15 sec shots with the same depth as half an hour with my Canon 1000D (albeit with no colour information), it occurs to me that it is nuts using a Bayer-matrix chip for astro work and throwing away 75% of the R and B light before you even start. In principle, a Foveon chip should blow an OSC camera out of the water for colour imaging - so why don't we see cooled Foveon astro cameras on the market?

NigelM


In principle, a Foveon chip should blow an OSC camera out of the water for colour imaging

No - it still needs to absorb light in order to get the colour data.

The way to get high efficiency is to omit all the colour filtering i.e. a monochrome camera. If you really need colour data you can add your own filtering.


No - it still needs to absorb light in order to get the colour data.
Yes, but the whole chip is covered by photosites in each colour (instead of 25-50% of it), so a Foveon chip should work roughly 3x faster than a Bayer sensor (no different from a mono camera with filters, except that you get all three colours at once).

To answer my own question: it seems that the first generation of Foveon chips only had 50% coverage at each photosite, and they had high dark current (~3 e-/sec/pix at 22C) - but no one seems to have tried cooling one. There is also an issue with colour cross-talk between the bands, but I do not see that being much of a problem for amateur astro work.
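As a sanity check on the ~3x figure, here is a back-of-the-envelope Python comparison of photon throughput per colour band. The coverage fractions are the idealized Bayer layout (50% green, 25% red, 25% blue); this ignores QE curves, read noise and the 50% photosite coverage of early Foveon chips.

```python
# Idealized photon-throughput comparison: what fraction of each colour's
# photons lands on a pixel that can actually record that colour.
# Illustrative numbers only, not measured sensor data.
bayer_coverage = {"R": 0.25, "G": 0.50, "B": 0.25}   # fraction of sensor area per colour
foveon_coverage = {"R": 1.00, "G": 1.00, "B": 1.00}  # every pixel records all three layers

for band in "RGB":
    gain = foveon_coverage[band] / bayer_coverage[band]
    print(f"{band}: Foveon collects {gain:.0f}x the photons of an ideal Bayer sensor")
```

Averaged over the three bands that is (4 + 2 + 4) / 3 ≈ 3.3x, consistent with the ~3x estimate.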

Another interesting thing about a Foveon chip is that you get a true mono camera if you combine all three channels (although you may get three doses of read noise).
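The "three doses of read noise" point can be put into numbers. A minimal sketch, assuming Poisson shot noise plus independent Gaussian read noise per layer; the signal and noise figures are hypothetical:

```python
import math

# Summing the three Foveon layers into one "mono" image: the signal adds
# linearly, but each layer contributes its own independent read noise,
# which adds in quadrature. All numbers are hypothetical illustrations.
signal_total = 3000.0   # electrons collected across the three layers
read_noise = 10.0       # electrons RMS per layer readout

# Three doses of read noise (summed Foveon layers)
snr_summed = signal_total / math.sqrt(signal_total + 3 * read_noise**2)

# One dose of read noise (a true mono sensor collecting the same photons)
snr_mono = signal_total / math.sqrt(signal_total + read_noise**2)

print(f"summed-layer SNR: {snr_summed:.1f}")
print(f"true-mono SNR:    {snr_mono:.1f}")
```

With a bright signal the difference is small because shot noise dominates; for faint astro targets near the read-noise floor the penalty is larger.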

NigelM


Another interesting thing about a Foveon chip is that you get a true mono camera if you combine all three channels

Actually you get a true mono image if you isolate the signal from the top layer. The middle layer is "minus blue" and only records what doesn't get detected by the top layer. The bottom layer is "red" & only records what doesn't get detected by either the top or middle layers.

There's no free lunch.


Not heard of these before, but reading the wiki page it seems they use the different absorption path lengths as a proxy for a filter - is that right? The blue layer is the top layer of silicon, so the photons absorbed there are most likely to be blue (as they have the shortest absorption length). Red photons have a longer absorption length, so are more likely to be absorbed in the (deeper) red layer below. Effectively all the layers are the same, but have different "QE" profiles due to their depths.

Or is there some other filtering going on in there which is rejecting photons without detecting them?
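That mechanism can be sketched with the Beer-Lambert law: silicon's absorption length is strongly wavelength-dependent, so the depth at which a photon is absorbed is a statistical proxy for its colour. The absorption lengths and layer boundaries below are illustrative figures, not Foveon's actual design values:

```python
import math

# Illustrative absorption lengths in silicon (microns); real values vary
# with wavelength and temperature -- these are rough textbook-order figures.
abs_len = {"blue_450nm": 0.4, "green_550nm": 1.5, "red_650nm": 3.5}

# Hypothetical layer boundaries (microns), loosely modelled on a stacked design.
layers = [("top", 0.0, 0.2), ("middle", 0.2, 0.6), ("bottom", 0.6, 5.0)]

for colour, L in abs_len.items():
    for name, z0, z1 in layers:
        # Beer-Lambert: fraction of photons absorbed between depths z0 and z1
        frac = math.exp(-z0 / L) - math.exp(-z1 / L)
        print(f"{colour:12s} {name:6s} layer absorbs {frac:.0%}")
```

Blue is mostly absorbed near the top, red mostly deeper down - but the separation is soft, which is where the colour cross-talk between bands comes from.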


it seems they use the different absorption path lengths as a proxy for a filter? Is that right?
Yes.

If you look at the research grade astro CCDs you will find that they're manufactured by thinning the silicon substrate and illuminating them from the back - that maximizes the sensitive area as the circuitry doesn't interfere with the detector sites. It's an expensive method of production as a large percentage fracture whilst being thinned, but it's how you get the very highest quantum efficiency at the blue end of the spectrum.

The more silicon the light passes through before being detected, the more of the shorter-wavelength photons are absorbed before they get there. Foveon is a clever application of this principle, but whether or not it has advantages, it hasn't caught on - possibly because the pixel count of Bayer-filtered sensors is higher for the same actual resolution (and people think a 12 MP sensor must be better than a 6 MP sensor, even if they're not measuring in the same units), and possibly because colour balance has been an issue with Foveon sensors (varying between pixels in a random way, due to the absorption in the silicon not being absolutely uniform).


If you look at the research grade astro CCDs you will find that they're manufactured by thinning the silicon substrate and illuminating them from the back - that maximizes the sensitive area as the circuitry doesn't interfere with the detector sites. It's an expensive method of production as a large percentage fracture whilst being thinned, but it's how you get the very highest quantum efficiency at the blue end of the spectrum.

The modern fashion now is to move to thicker and thicker devices, to increase the red response. We have some 200-micron-thick devices which have good response (>40%) out at 1000 nm. Of course, the blue response is not exactly great, and you have to bias the hell out of them (50-100 V) to get decent image quality. Incredibly chunky compared with the thinned, back-illuminated blue-sensitive chips, which are down at 4-micron thickness.

But if that is how these Foveon chips work, it would seem that Nigel's right and they should, in theory, be ~3x faster than Bayer-matrix chips. Maybe it is just a limitation of the current implementation...


I think it is true (reading around) that the current generation of Sigma Foveon DSLR cameras are probably not particularly suited to astro (if you look hard there are some shots of M42 around, but that is about it). But I wonder if the likes of Atik or QHY have thought about trying to optimise these chips for astro?

There is a paper

http://www.eso.org/sci/meetings/dfa2009/Writeups/WR-Lesage.pdf

on using them for photometry of exo-planets!

It is rumoured that Sony are producing their own 'improved' version, but not for another year or so.

NigelM


  • 2 years later...

Hi, I am interested in the work on using Foveon sensor for astrophotography. I recently got an SD10 camera and started some tests - nothing profound yet.

I see that the link to the ESO paper on use of Foveon sensors in exoplanet work did not work. A link that works nowadays is: http://www.eso.org/sci/meetings/2009/dfa2009/Writeups/WR-Lesage.pdf

I would be interested in discussing the practicalities of using the SD10. Right now I am struggling to get sensible data from the (used) camera I bought. When taking pictures of a daylight scene I get good images; when I image the Moon I get a decent R image but G is missing and B has a streak in it. The G field seems to be mainly 0's with a few high pixels. As I said, the images in daylight are fine - is there anything particular to know about imaging near-point sources with this camera?

Best,

Peter Thejll

Denmark


