
Debayering a DSLR's Bayer matrix.


RAC


The focusing mechanism in a digital camera depends on there being a specific distance between the camera bayonet where you screw the lens on (the flange) and the sensor. This is partly defined by the original UV/IR cut filters, which have a specific refractive index; their effect is to bring the focus a millimetre or two further back into the camera so that perfect focus occurs on the sensor.

When you remove the filters the optical properties of the camera change and the focus plane shifts 1 or 2 mm forwards. On some cameras you can accommodate this by moving the sensor slightly via adjustable screws. However, there isn't always enough travel, so you can lose the ability to focus at infinity unless your lens/telescope focuses past infinity - not a problem with a telescope, but for earth-based photography it can be.

By replacing the original filter with a substitute, e.g. clear glass, the original focus plane is re-established and infinity focus is possible again. The problem here is that cheap glass absorbs UV and longer-wavelength infrared, which means longer exposure times for terrestrial infrared photography. Quartz-type glasses have less absorption and transmit 99%+ of the light from UV to infrared, with Spectrosil flat all the way up to 2200 nm.
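As a rough illustration (assumed numbers, not the actual Canon filter stack), the shift can be estimated with the standard plane-parallel plate formula: a flat filter of thickness t and refractive index n pushes the focal plane back by roughly t(1 - 1/n).

```python
# Rough sketch with assumed numbers: focal-plane shift caused by a flat glass
# filter of thickness t (mm) and refractive index n sitting in the converging beam.
def filter_focus_shift(thickness_mm, refractive_index):
    """Approximate shift of the focal plane (mm), further back into the camera."""
    return thickness_mm * (1.0 - 1.0 / refractive_index)

# e.g. a 2.5 mm stack of n = 1.52 glass (values assumed for illustration)
print(round(filter_focus_shift(2.5, 1.52), 2), "mm")  # ~0.86 mm
```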

I know that's beyond the camera's limits, but believe it or not, it actually worked out cheaper to have a high-quality custom-cut filter made than buying an off-the-peg Baader-type filter from a conversion company ...


Thanks Bill for the detailed info. I thought it might be to sort out focus, but I had not heard of quartz types.

Can you provide the details for the supplier of the quartz replacement glass?

I obtained a 1000D with a full-spectrum mod (no replacement glass) and plan to use it with a CLS-CCD filter for astro and IR filters for some creative daytime fun. I did some tests when I got the camera: some of my lenses focus OK to infinity and, strangely, the auto-focus seems to work (it might be that the modification included moving the sensor forward as you describe). One manual M42 50mm will not focus to infinity in visible light, but I think it might be close enough for infinity at IR. Unfiltered, the images appear to be slightly smudged by flare (I assume it's the IR/UV/visible focus spread). I did a test with a red filter on the flash, which made the images very sharp. I have ordered three IR filters (680/720/850 nm); my unmodified 300D will do for normal photography. Looking forward to reading more about your pioneering sensor modifications.


Sure, I just told them what glass I wanted and what size, and they cut it to size for me. I could have bought it uncut for £25 but got it cut for £40; packaging was excellent ;-)

http://www.uqgoptics.com

Here is the spec for Spectrosil: http://heraeus-quarzglas.com/media/webmedia_local/downloads/broschren_mo/Spectrosil_syntheticfusedsilica.pdf - perhaps overkill.

Other people have used cut ordinary glass, Baader filters, Astronomik filters, Schott WG280 glass, etc.

Yes, it's great fun but expensive and frustrating if you screw up :undecided:


Sorry guys, I've been super busy over the past couple of months so unfortunately haven't made much progress, but I'll get back to it as soon as I can.

Hi Tristan, I've attached a picture of the scraper tool. It's simply the end of a paintbrush sharpened with a knife. The plastic is hard enough that the sharp edge holds well and can easily scrape away the layer (unlike other plastic things I tried, or toothpicks), but not hard enough to scratch the underlying surface (as the metal scraper did).

Most of the time I was using a 4x objective, switching to 10x sometimes for a close look, with a 17mm Baader Hyperion eyepiece - they make great wide microscope eyepieces.

I'm still looking out for the perfect smaller tool to get to the edges; I don't want to ruin it, having made it this far, in the search for perfection.

Also debating whether to turn it into a dedicated astro cam and rehouse/cool/amp-off, or keep it as a normal camera - could be pretty cool for daytime IR photography.

Wouldn't it be easier to stick 4 lines of thin black tape on the edges, covering the untidy bits? You wouldn't lose much of the sensor area and the optical assembly would look nice and tidy.

Really impressed with what you have achieved. How do you intend to clean the sensor now?

Another thing that occurs to me (I might be saying something stupid): isn't the camera's software programmed to run colour interpolation algorithms? Shouldn't that have an effect on the quality of the pictures from the debayered sensor?


Another thing that occurs to me (I might be saying something stupid): isn't the camera's software programmed to run colour interpolation algorithms? Shouldn't that have an effect on the quality of the pictures from the debayered sensor?

If you use the "colour" image directly from the camera, that is the case. But if you save the raw output, you can skip the colour conversion process and have the mono data directly.
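A minimal sketch of that in Python, assuming the rawpy (libraw) wrapper and a hypothetical file name - not something used in this thread, just one way to get at the undemosaiced data:

```python
# Sketch (assumes the rawpy/libraw Python wrapper; file name is hypothetical):
# read a Canon raw file and take the sensor counts before any colour interpolation.
import rawpy

with rawpy.imread("IMG_0001.CR2") as raw:
    mono = raw.raw_image_visible      # one raw count per photosite, no demosaicing
    print(mono.shape, mono.dtype)
    # raw.postprocess() would run the usual colour interpolation instead
```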


Wouldn't it be easier to stick 4 lines of thin black tape on the edges, covering the untidy bits? You wouldn't lose much of the sensor area and the optical assembly would look nice and tidy.

Really impressed with what you have achieved. How do you intend to clean the sensor now?

Another thing that occurs to me (I might be saying something stupid): isn't the camera's software programmed to run colour interpolation algorithms? Shouldn't that have an effect on the quality of the pictures from the debayered sensor?

Yes, probably :grin: - I thought of 'masking' tape too, but I had a rush of blood to the head and didn't bother (regretfully)!

I knackered my last sensor by completely removing all the layers from the sensor's top surface and surround; it still works, but doesn't produce an image, just a RAW file containing what looks like noise.

However, PixInsight (which uses dcraw) has no problem 'debayering' it even though there is no Bayer CFA there, or reading it undebayered.

The question is how many layers to go down (coloured dye layer, gold layer, glass layer?) and how wide to go.

For me, at least, the way forward is one layer at a time, staying in the centre. Once I know how deep, then it's a case of how wide; fortunately there are two obvious boundary lines, the inner gold boundary and the outer black boundary (the black pixels are used to set the black level, I believe).
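For what it's worth, the black level the camera derives from those masked pixels can be checked in software; a small sketch, again assuming rawpy and a hypothetical file name:

```python
# Sketch (assumes rawpy/libraw; file name hypothetical): compare the reported black
# level with the size of the masked border around the active image area.
import rawpy

with rawpy.imread("IMG_0001.CR2") as raw:
    print("black level per channel:", raw.black_level_per_channel)
    full = raw.raw_image            # includes the optically-black border pixels
    active = raw.raw_image_visible  # active image area only
    print("masked border (rows, cols):",
          full.shape[0] - active.shape[0], full.shape[1] - active.shape[1])
```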

Just saving up for another sacrifice at the moment.


Great tenacity you have there :) I'm afraid I gave up when I realised that I had spent a couple of times the cost of a decent astro CCD camera on sacrificial DSLRs :D


Great tenacity you have there :) I'm afraid I gave up when I realised that I had spent a couple of times the cost of a decent astro CCD camera on sacrificial DSLRs :D

I see your point. However this could be a real breakthrough for the astro amateur community; if only we could find a safe procedure to do it! This thread shows that we are nearly there. Maybe the way to go is to try first with cheap second hand webcams? Then come up with a workflow plan for expensive DSLR sensors. Imagine the possibilities: H-alpha filters in light polluted areas or when the moon is in the way, then using a camera which hasn't been de-bayered to add colour. This has the potential to transform amateur astrophotography.


Hi 12dstring, do you think there is a chance that the microlenses and the Bayer array could be peeled off as entire layers, as if you were removing a sticker? Or are they stamped onto the sensor independently?


I think it's very unfortunate that Canon changed the way they attached the glass cover on the sensor between the 1000D and the 1100D. I was quite pleased that I was able to get usable exposure times of around an hour with low enough noise levels when cooled to around -10 to -15 degrees C. In parts of the world where there are more nights of clear sky it could be a winner, but collecting enough data in the UK would prove very frustrating, as indeed I found it did, and that's why I went over to an astro CCD (plus the fact that I had an offer of one at a very good price). The extra 2 bits of data depth with the 1100D is quite an advantage. Now if it were possible to debayer the 1100D sensor we would certainly have a very useful camera with something like 4x the sensitivity. That would make NB imaging practical, I think, even though the sensitivity would still be below that of a CCD.

There is still a possibility that I might finish off the rebuild of my Peltier TEC cooled 1100D (version 3) for use as an OSC camera for wider and brighter DSOs. It could be run beside the Atik mono 314L+ as a second imaging system. I thought of getting a second mono 314L+, but I would need a filter wheel and filters and that would be just too expensive (I think).


I see your point. However this could be a real breakthrough for the astro amateur community; if only we could find a safe procedure to do it! This thread shows that we are nearly there. Maybe the way to go is to try first with cheap second hand webcams? Then come up with a workflow plan for expensive DSLR sensors. Imagine the possibilities: H-alpha filters in light polluted areas or when the moon is in the way, then using a camera which hasn't been de-bayered to add colour. This has the potential to transform amateur astrophotography.

I believe investigations to date have shown that webcams appear to have a different set of layers that come away easier so they may not be appropriate for developing an approach for DSLR sensors. I sometimes wonder if Canon might be prepared to give out some information.


Wouldn't it be easier to stick 4 lines of thin black tape on the edges, covering the untidy bits? You wouldn't lose much of the sensor area and the optical assembly would look nice and tidy.

I think trying to put masking tape on the edges would be harder than finishing the filter removal! If you have a look at the picture in RAC's first post you can see the gold connecting wires that rise above the height of the sensor. They'll happily disconnect with little force and you really don't want that.

What would be easier is to just crop the image to remove the edges, but I'm stupid/brave and having made it this far I can't help but try and finish the job...

How do you intend to clean the sensor now?

The sensor's still covered in bits of the broken-away CFA, so that should blow off fairly easily with some compressed air. Then I'll give it a bit of a wipe-down with some IPA or similar to clean off anything else that's stuck on.

Another thing that occurs to me (I might be saying something stupid): isn't the camera's software programmed to run colour interpolation algorithms? Shouldn't that have an effect on the quality of the pictures from the debayered sensor?

The Canon RAW files contain the data as it comes off the sensor, before the colour interpolation; it's only lost if it's saved as a JPEG. Note that some software doesn't give you the option to turn off the demosaicing, but you can use dcraw to output TIFFs with no debayering. Any program that uses dcraw, such as DeepSkyStacker, could in principle turn off demosaicing, though it doesn't necessarily offer the option now. Hopefully, once we have proven success, the developers will have a reason to add the option to their software so we wouldn't have to convert to TIFFs in dcraw first.
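For anyone wanting to try the dcraw route, a minimal batch sketch (assumes dcraw is installed and on the PATH, and a hypothetical folder layout):

```python
# Sketch (assumes dcraw on the PATH; folder layout hypothetical): convert Canon raws
# to 16-bit linear TIFFs with no demosaicing.
#   -D  document mode, totally raw (no interpolation or scaling)
#   -4  linear 16-bit output
#   -T  write TIFF instead of PPM
import glob
import subprocess

for path in glob.glob("lights/*.CR2"):
    subprocess.run(["dcraw", "-D", "-4", "-T", path], check=True)
    # dcraw writes a .tiff next to each input file
```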

Hi 12dstring, do you think there is a chance that the microlenses and the Bayer array could be peeled off as entire layers, as if you were removing a sticker? Or are they stamped onto the sensor independently?

I don't think so, unfortunately. The microlens layer has a gel-like consistency under the microscope and comes off very easily, but never seems to come off in large chunks. The filter layer I would describe as almost crumbly - it doesn't seem like the coloured pixels in the layer itself are held together, only stuck to the layer beneath.

That said the idea of scraping it off with something very thin by getting underneath might work. But the trouble would be getting something to a sharp enough edge without it being made of metal or something that would scratch the glass. I'll happily experiment with my broken sensor.

My opinion is that the blind polishing technique is not the way to go; as others have found out, it's hard to get right and you don't really know how far you've gone into the layers you want to keep. I don't think it's ever going to be a popular mod as the risk is so high and it really isn't easy. I think the only way to do it consistently is to take it off layer by layer under a microscope so you can see exactly how far down you've gone, and I imagine this is how the guys offering this commercially are doing it. Not that I want to discourage anyone at all, I just suggest not trying to cut corners or attempting it with minimal tools/magnification.


Dave

How far down did you go, if I may ask? The dye layer, or the metal layer below?

It isn't actually too difficult to stop at the level below the CFA dye when polishing; it's more that I was too confident and, worried about damaging connectors through repeated disassembly, tried to do the whole job in one go.

I think I learned from my mistake, though.


Thanks for all that valuable information Dave. I'm really impressed with what you did.

At the risk of being cheeky here, may I kindly suggest you clean the sensor thoroughly and reassemble the camera even without finishing the edges, and maybe finish the job later? You see, it would be very nice to see what kind of pictures you can obtain with the de-bayered sensor even if you have to crop the edges. That will inspire others to try it (I've already been looking into cheap microscopes). I fear that if you break the sensor while debayering the difficult edges this may be the end of it all and put us all off.


How far down did you go, if I may ask? The dye layer, or the metal layer below?

Just down to the filter layer. The gold-coloured metal layer (which I assume is the photodiodes) is covered by a glassy layer, and with a microscope it's very easy to tell if you've made a mark in it.

I think it'd be easiest to explain if I took some video under the microscope scraping away the different layers, won't be til the end of the week though.

At the risk of being cheeky here, may I kindly suggest you clean the sensor thoroughly and reassemble the camera even without finishing the edges, and maybe finish the job later?

That's fair enough. I guess seeing how good the resultant flat field can be would be more useful to people at this stage. I'll clean it up at the end of the week so hopefully can show you some images at the weekend.


I think it'd be easiest to explain if I took some video under the microscope scraping away the different layers, won't be til the end of the week though.

That would be good to see!

I can't see anyone being able to remove the bayer filter (and cause no damage) without a microscope to be honest. I use a stereo microscope for tiny surface mount pcb work, it makes all the difference in the world, just as using a telescope does for viewing the far off stuff.


That would be good to see!

I can't see anyone being able to remove the bayer filter (and cause no damage) without a microscope to be honest. I use a stereo microscope for tiny surface mount pcb work, it makes all the difference in the world, just as using a telescope does for viewing the far off stuff.

Any suggestions about what microscope would be good for this?

Thanks


Just down to the filter layer. The gold-coloured metal layer (which I assume is the photodiodes) is covered by a glassy layer, and with a microscope it's very easy to tell if you've made a mark in it.

I think it'd be easiest to explain if I took some video under the microscope scraping away the different layers, won't be til the end of the week though.

The gold-coloured metal layer must be the wire connections linking all the photodiodes in the sensor. They are actually several layers separated by a sheet of glass-like material. The photodiodes must be underneath, according to this:

http://www.cameratechnica.com/2011/06/23/technology-demystified-backside-illuminated-sensors/

'The rest of the layers are used to wire all the pixels together and connect them to the associated read-out circuitry. Because the wiring is so dense and there are so many connections to be made, it takes several layers of hairlike metal wires to connect everything together. Each layer is encased in a sheet of glass-like material to separate it from the layers above and below'.


Taking off the microlenses has a noticeable effect on the sensitivity, but it's more than made up for by removing the filter array.

The net result isn't a huge increase in sensitivity for standard unfiltered luminance imaging. However the big advantages will be:

- A big increase in sensitivity and dynamic range for narrowband imaging. For example, with an H-alpha filter only the red pixels (one in four) capture the light, and their values are then interpolated with the surrounding green/blue pixels, which just adds noise. By removing the filter array you'll see over a 4x increase in sensitivity, since every pixel collects the H-alpha light and there is no longer a cover glass or red Bayer dye causing light loss (see the rough arithmetic after this list).

- Resolution increase by not having to interpolate pixel values.

- Spectral response increase into the UV and IR by removing the filters and cover glass, probably more useful with webcams.
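Rough arithmetic behind the narrowband claim above, with assumed round numbers rather than measured values:

```python
# Back-of-envelope sketch (assumed round numbers, not measurements): H-alpha gain
# from removing the CFA on an RGGB sensor.
red_fraction = 1 / 4        # only 1 pixel in 4 sits under a red filter
dye_transmission = 0.8      # assumed transmission of the red dye at 656 nm
with_cfa = red_fraction * dye_transmission
without_cfa = 1.0           # every pixel collects, no dye loss
print(f"approx. gain: {without_cfa / with_cfa:.1f}x")   # ~5x with these numbers
```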


RAC, I'm surprised by the colours of your partially debayered pictures. Since removing the CFA significantly increases the number of photons that reach the blue and red channels, the resulting colour of the image processed by the DSLR should not be a shade of grey but a shade of magenta/brown.

See here:

http://photo.net/digital-camera-forum/00CAB5

Your pictures are intriguing me...


I think the idea is that when the CFA is removed there will be no red, blue and green pixels; every pixel will be similar and monochrome.

Interestingly though, here is a close-up, tweaked version of a pic I posted earlier. What is intriguing is that there is no chessboard pattern, but rather what appear to be layers of green, blue and red?

[attached image: close-up of the partially debayered sensor]


Fred, I think this is just the white balance. You can balance it so that the debayered part of the image is grey, in which case the filtered part will be green/yellow, or balance it so that the filtered part is normal, in which case the debayered part will be pink/blue.
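A tiny numerical illustration of that (assumed multipliers, just to show the effect): once the CFA is gone, red, green and blue photosites all record the same value, so any unequal white-balance gains tint that part of the frame.

```python
# Sketch with assumed white-balance multipliers: why the debayered area changes
# colour depending on how the white balance is set.
import numpy as np

debayered_rgb = np.array([1000, 1000, 1000])      # all channels now see the same signal
daylight_gains = np.array([2.0, 1.0, 1.5])        # assumed per-channel multipliers
print(debayered_rgb * daylight_gains)             # [2000. 1000. 1500.] -> pink/magenta cast
print(debayered_rgb * np.array([1.0, 1.0, 1.0]))  # equal gains -> neutral grey
```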

