
Would this work for astrophotography? Dichroic prisms


pipnina


https://en.wikipedia.org/wiki/Dichroic_prism

I read about the Technicolor three-colour process and how it ran three black-and-white films through the camera at once: a prism split the light into green and magenta beams, with green captured on one film and the magenta beam exposed onto two films running sandwiched against each other, where a filter dyed into one film's backing separated the red and blue records.

I saw this and realised it could be compatible with my old obsessive thoughts about minimising telescope photon waste.

As we know, if we shoot RGB images we lose roughly 2/3 of the light entering our scope. Either to Bayer filters (which give red and blue only 1/4 of the pixel sites and green 1/2, with each site seeing only its own band), or to the RGB dichroic filters in mono camera systems... But what if we could use 3 sensors and a prism to get closer to 90%+ utilisation in this range? At the slower focal ratios (f/5 and above?) typical of our astro scopes, it seems this could work. Back in the early days of digital sensors, this setup (3CCD cameras) was used to maximise resolution and colour quality in professional camcorders and studio video cameras, with results that looked fine colour-wise but were seriously held back by the noisy, low-efficiency sensors of the era.
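To put rough numbers on the light-loss argument, here is a minimal sketch (Python) comparing broadband light utilisation per scheme. The 1/4 and 1/2 pixel fractions are the standard RGGB figures mentioned above; the ~95% prism transmission is my illustrative assumption, not a measured spec:

```python
# Rough comparison of broadband light utilisation for three capture schemes.
# Idealised geometric estimates only: sensor QE, coating losses and real
# filter transmission curves are all ignored.

# Bayer matrix: each pixel sees only one colour band, and each band passes
# roughly 1/3 of the visible spectrum. RGGB layout: 1/4 of pixels red,
# 1/4 blue, 1/2 green.
bayer_utilisation = 0.25 * (1 / 3) + 0.25 * (1 / 3) + 0.50 * (1 / 3)

# Mono camera + sequential RGB filters: the whole sensor works, but each
# filter rejects ~2/3 of the incoming light while it sits in the light path.
mono_rgb_utilisation = 1 / 3

# Dichroic prism + 3 sensors: nearly all light is steered to one of the
# three sensors; assume ~95% transmission through the prism assembly.
prism_utilisation = 0.95

for name, u in [("Bayer OSC", bayer_utilisation),
                ("Mono + RGB filters", mono_rgb_utilisation),
                ("3-sensor prism", prism_utilisation)]:
    print(f"{name:>20}: {u:.0%} of broadband light used")
```

So under these idealised assumptions the prism setup collects getting on for three times the light of either conventional scheme in the same clock time.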

But now our sensors are super efficient, our ability to make these prisms hasn't disappeared and astro seems like the perfect modern use for the technology! Instead of buying 3 mono cams and three telescopes and (potentially) three mounts, we could get the same result with three cameras, ONE telescope and ONE mount!

In theory you could even expand on this by having not just an RGB prism but a SHO prism too, and capture narrowband images at insane speed for your preferred telescope type.

Who knows, maybe with some design tweaking it would even be possible to create a very complex prism to allow the simultaneous capture of all 6 common astro bands at once haha.

Does anyone know more about these? I have become a little fascinated...


Interesting idea. It should work, but the cost will be prohibitive. You will need 3 cameras, and possibly some way to fine-tune the focus for each camera separately, ideally with an electronic focuser.

For a while I used an on-axis guider from Innovations Foresight that uses a dichroic mirror to split the light into visible and near-infrared.

With the main focuser I would focus the visible spectrum and with a helical manual focuser the NIR.

As you can imagine, the manual focuser is not so precise.

Also the weight would increase.

Also such a device would eat up a lot of backfocus. My ONAG if I recall correctly needed around 60 mm.

So overall, 3 cameras, 3 focusers, backfocus problems, more weight... 

 


9 hours ago, dan_adi said:

Interesting idea. It should work, but the cost will be prohibitive. You will need 3 cameras, and possibly some way to fine-tune the focus for each camera separately, ideally with an electronic focuser.

For a while I used an on-axis guider from Innovations Foresight that uses a dichroic mirror to split the light into visible and near-infrared.

With the main focuser I would focus the visible spectrum and with a helical manual focuser the NIR.

As you can imagine, the manual focuser is not so precise.

Also the weight would increase.

Also such a device would eat up a lot of backfocus. My ONAG if I recall correctly needed around 60 mm.

So overall, 3 cameras, 3 focusers, backfocus problems, more weight... 

 

The extra focusers are not necessarily a problem, as a helical one similar to those used for OAGs could serve two of the three cameras, and these can be accurate if used in conjunction with a Bahtinov mask for initial setup.

All focusing after that could be carried out with just the main focuser, as the three cameras would be appropriately spaced to the three focal planes, and thermal expansion would be minimal in that area relative to that of the main objective-to-focuser distance.

As for cost, this should be cheaper than dual- or triple-mounting telescopes on a single mount, or having three separate setups, and it was clearly cheap enough to use in prosumer equipment back in the 90s, so I feel like it could be quite viable in theory!

Backfocus could be an issue though, yes; it would probably have to be used either in place of an OAG or as part of a combined OAG-prism unit.


12 hours ago, dan_adi said:

Interesting idea. It should work but the cost will be prohibitive.

 

When did cost ever deter astrophotographers??? :grin:  If it did, there wouldn't be any...

This is, of course, a very cute idea. The theory is compelling. If the focus could be adjusted at each individual wavelength (as it would be) the objective would not need to be so well corrected either.

I wonder what the professionals make of it. They must have thought about it.

Olly


19 minutes ago, ollypenrice said:

When did cost ever deter astrophotographers??? :grin:  If it did, there wouldn't be any...

This is, of course, a very cute idea. The theory is compelling. If the focus could be adjusted at each individual wavelength (as it would be) the objective would not need to be so well corrected either.

I wonder what the professionals make of it. They must have thought about it.

Olly

Back in the 90s it was seemingly the only way to produce colour digital cameras and camcorders with any quality to them, albeit using 1/3-inch sensors, and the units still cost $3000 in the case of one Sony camcorder I looked at. A revised model from 2003 seems to have used the same 3CCD system, but likely with lower-noise, higher-efficiency sensors. It seems reasonably well corrected.

The limited information I have found suggests it can introduce reflections and limits the system to slower f-ratios because of the long glass path, but us astrophotographers often have telescopes of f/5 or slower, which most "normal" photographers would consider quite slow these days.
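A rough way to see why slow focal ratios suit a dichroic prism: the coatings hold their designed passband only over a limited range of incidence angles, and the marginal-ray half-angle of the converging light cone shrinks as the focal ratio gets slower. A quick geometric sketch (Python; the formula is standard marginal-ray geometry for an unobstructed aperture, and the angle-sensitivity point is general dichroic behaviour, not a specific coating spec):

```python
import math

# Half-angle of the converging light cone for a given focal ratio N.
# For the marginal ray of an unobstructed aperture: tan(theta) = 1 / (2N).
def cone_half_angle_deg(focal_ratio: float) -> float:
    return math.degrees(math.atan(1.0 / (2.0 * focal_ratio)))

for n in (2, 5, 10):
    print(f"f/{n}: marginal-ray half-angle = {cone_half_angle_deg(n):.1f} deg")
```

At f/5 the half-angle is under 6 degrees, a much gentler cone for the prism coatings than the roughly 14-degree cone an f/2 camcorder lens would present.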

Technicolor used a prism that split the light into two beams for their three-strip camera from the 1930s to the 1950s (the camera that shot the famous Wizard of Oz film), so the idea goes back a long way, although in that case it was not to maximise light use but to make colour cinematography possible at all.

I do note that the JWST NIRCam instrument seems to use a dichroic beamsplitter (so most likely a two-channel arrangement? exact details are a bit vague) to let the instrument observe through one filter at longer wavelengths and another filter at shorter wavelengths at the same time.


Why they didn't extend it to 3, 4, or even more simultaneously operating cameras is anyone's guess... But being a space telescope, I would have to bet it comes down to launch payload weight, and after that, cost.

But quite frankly, with 10.5 billion dollars you can probably make almost any idea work, so perhaps this isn't too indicative of the tech's feasibility haha.


I've been waiting for someone to build a sensor that does this *at the pixel level*. Just like we have microlenses and (for colour cameras) filters on each pixel now, imagine a camera where each pixel had a prism arrangement splitting the photons between 3 separate detection sites... One camera, one-shot colour, no 66% loss of photons.

It might seem far-fetched, but the sort of features built into camera pixels now were pretty far-fetched only 15 or 20 years back.

cheers,

Robin


8 minutes ago, rwg said:

I've been waiting for someone to build a sensor that does this *at the pixel level*. Just like we have microlenses and (for colour cameras) filters on each pixel now, imagine a camera where each pixel had a prism arrangement splitting the photons between 3 separate detection sites... One camera, one-shot colour, no 66% loss of photons.

It might seem far-fetched, but the sort of features built into camera pixels now were pretty far-fetched only 15 or 20 years back.

cheers,

Robin

Cameras now have microlenses, and now that you mention it, I recall a special kind of camera that uses a similar trick to bypass the need for perfect focus: the lens arrangement brings different sub-pixels beneath it to different focal planes, allowing software-controlled refocusing in post-production.

 

I recall as well a type of camera used on professional telescopes which might do EXACTLY what you describe: they act as per-pixel spectrometers, but I think they only work over a narrow range of wavelengths around the central wavelength observed, or have some other restriction. If they were perfect, the pros would use them instead of dichroics!

 


17 minutes ago, rwg said:

I've been waiting for someone to build a sensor that does this *at the pixel level*. Just like we have microlenses and (for colour cameras) filters on each pixel now, imagine a camera where each pixel had a prism arrangement splitting the photons between 3 separate detection sites.... Once camera, one shot colour, no 66% loss of photons.

It might seem far-fetched, but the sort of features built into camera pixels now were pretty far fetched only 15 or 20 years back.

Rather than a prism arrangement, the Foveon sensor (introduced 20+ years ago) had stacked photodiodes of differing spectral sensitivities at the wafer level. As you can imagine, the red channel, being at the bottom of the stack, had the lowest sensitivity and highest noise of the three, which is unfortunate for Ha imaging. However, it does get rid of the Bayer anti-aliasing filter, which reduces resolution. On the flip side, the article mentions colour noise at low light levels, which is exactly the regime of DSO imaging.

Perhaps it might be more promising for solar system imaging where the light levels are generally quite high, and enabling one-shot color without an anti-aliasing filter might be advantageous.


1 hour ago, pipnina said:

Cameras now have microlenses, and now that you mention it, I recall a special kind of camera that uses a similar trick to bypass the need for perfect focus: the lens arrangement brings different sub-pixels beneath it to different focal planes, allowing software-controlled refocusing in post-production.

Yes, Lytro's plenoptic cameras never found a market niche.  They were mostly useful for macro photography.

However, computer-controlled focusing of camera lenses allows rapid-fire capture of multiple images at different focus points. These can then be combined in post using dedicated focus-stacking software (just as astrophotography has its own dedicated post-processing software) into an image with much greater depth of field. This seems to be the direction macro photography has headed over the past decade or more.

