Amateur Astrophotography - The Future??


Peter_A


Hi All,

Apologies if this topic has already been discussed. I have to give a talk to my local astro soc next week (21st Nov 2018) called "Astrophotography Old & New". I don't know why they asked me, because I only go out occasionally with my HEQ5/200P/Canon 450D and there are far more accomplished imaging folk in the group than me.

For "Old" I am going to go right back to the nineteenth century and talk about the early photography/astrophotography pioneers, and for "New" I have some topics on the digital revolution that started for us amateurs in the early 1990s. To finish up, it has been suggested that I say a little about where amateur astrophotography is going, in terms of what kind of new technology we might be using in a few years' time. I am at a loss as to what to put in my last couple of slides, so maybe you could all come up with some suggestions. Are we going to see cheaper CMOS sensors overtaking CCDs? Will automatic polar alignment happen (if it hasn't already)? Will remote imaging become more popular?

Any suggestions gratefully received.

Thanks,

Pete.


58 minutes ago, Peter_A said:

Are we going to see cheaper CMOS sensors overtaking CCDs maybe? Will automatic polar alignment happen (if it hasn't already)?

Both probably, yes. Mounts are improving, cameras and good quality optics are (slowly... in some respects) getting more affordable and I expect that trend to continue.

Also, I think collaborative imaging and data analysis will become more popular as online services and software become better and easier to use.

The quality of our skies, however...


There is only so much that sensors can improve.

We are already very close to the limits of what is physically possible. A read noise of 0e- would be ideal, and modern CMOS sensors already have read noise close to, or in some cases even below, 1e-. Thermal noise of 0 is pretty much already there with set-point cooling (not quite, but it is probably the least impactful noise source at the moment). QE of 100% would be ideal, and most current sensors are at 50% or above. This field is probably the fastest changing at the moment, driven by other applications (and the size of the sensor market in general).
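As a rough illustration of how those noise sources relate, here is a minimal sketch of a per-sub noise budget, combining the sources in quadrature. The numbers are made up by me for illustration, not taken from any particular camera:

```python
import math

def total_noise_e(signal_e, sky_e, dark_e_per_s, read_noise_e, exposure_s):
    """Combine the main per-sub noise sources in quadrature (all in electrons)."""
    shot_var = signal_e + sky_e              # photon (shot) noise variance
    thermal_var = dark_e_per_s * exposure_s  # dark-current (thermal) noise variance
    read_var = read_noise_e ** 2             # read noise variance
    return math.sqrt(shot_var + thermal_var + read_var)

# Illustrative 60 s sub on a cooled low-read-noise CMOS camera
noise = total_noise_e(signal_e=100, sky_e=400, dark_e_per_s=0.002,
                      read_noise_e=1.0, exposure_s=60)
print(round(noise, 2))  # shot noise dominates; read and thermal barely register
```

With values in this regime, pushing read noise from 1e- to 0 changes the total hardly at all, which is the point being made above.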

Possible further advances would include:

- Scope arrays (not many people use these at the moment, but the benefits can be quite large)

- Related to the point above, and also mentioned in the previous post: collaborative imaging (already possible, but not widespread due to a lack of "support infrastructure").

- Advances in software and processing algorithms (there is quite a bit of room for improvement there: per-sub deconvolution, sinc-based wavelets for sharpening, context-sensitive denoising that uses the stack's pixel distribution and the star PSF to find the maximum-probability true pixel value and minimise error, ...).

- Maybe further down the road: amateur adaptive optics (higher order; we already have the first order in active optics).
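A minimal sketch of the "stack pixel distribution" idea: sigma clipping is the simplest technique in that family, used here purely as an illustration (the post hints at more sophisticated, PSF-aware methods). Each pixel's stack of values is treated as a sample distribution, and outliers are rejected before estimating the true value:

```python
import statistics

def sigma_clip_pixel(values, kappa=2.5):
    """Estimate one pixel's true value from its stack of sub values,
    rejecting outliers (satellite trails, cosmic rays) beyond kappa sigma."""
    mean = statistics.fmean(values)
    sd = statistics.pstdev(values)
    kept = [v for v in values if abs(v - mean) <= kappa * sd] or values
    return statistics.fmean(kept)

# Ten subs of one pixel; one sub caught a satellite trail (value 900)
stack = [101, 99, 100, 102, 98, 900, 100, 101, 99, 100]
print(round(sigma_clip_pixel(stack)))  # outlier rejected, estimate near 100
```

Real stacking software iterates this and works per pixel across the whole frame, but the principle is the same.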


Lucky imaging of DSOs? As sensors get ever more sensitive and less noisy I wonder how deep we'll be able to go with that. Deep enough to image faint fuzzies with an alt-az mount at 0.5"/px in UK conditions? Give it 20 years and I wouldn't be that surprised.

Edit - fair point though, vlaiv; there is a limit to this (though with zero read and thermal noise, shot noise becomes irrelevant given enough subs, so maybe still possible).

Finding a dark site to image from might be a challenge though :(


1 hour ago, billyharris72 said:

Lucky imaging of DSOs? As sensors get ever more sensitive and less noisy I wonder how deep we'll be able to go with that. Deep enough to image faint fuzzies with an alt-az mount at 0.5"/px in UK conditions? Give it 20 years and I wouldn't be that surprised.

 
1 hour ago, vlaiv said:

We are already very close to limits of what is physically possible - read noise of 0 - modern CMOS sensors already have read noise close or even in some cases less than 1e, thermal noise of 0 - pretty much already there with set point cooling (not quite but probably the least impacting noise source at the moment).

 

These.

Read noise, download times, CPU power & software will continue to get better and better. I can see a bright future for 'Lucky DSO' imaging, which will obviously negate the requirement for good tracking.

I can see a large, fast Dob on an equatorial wedge as a great future setup.


Interesting question. I wonder if curved sensors will become popular, or perhaps sensors permanently integrated within scopes for either imaging or real-time viewing. The scope itself might change too, at least for reflectors, with flat "digital" mirrors using steerable micro-mirrors, so that the figure/focal length etc. could be changed at the flick of a switch.

Alan 


Great ideas, thanks. I shall have to look up some of those suggestions. What is the point of a 'scope array, and is lucky DSO imaging a bit like processing video of solar system objects?

The list so far:

  • Remote imaging / sharing time on commercially launched mini-satellites carrying "Mini-Hubble" scopes.
  • Collaborative imaging and data analysis.
  • Incremental improvements to mounts, optics & cameras, with reductions in cost.
  • Telescope arrays.
  • Advances in software and processing algorithms.
  • Higher-order adaptive optics.
  • Lucky imaging of DSOs.
  • Read noise, download time, CPU power & software improvements.
  • Curved sensors, or perhaps sensors permanently integrated within scopes.

Cheers,

Pete.


A telescope array is just a fancy way of saying that you mount two or more scopes on your mount (or mounts) and use them in parallel.

Imagine having a serious AP setup with a mount of 100kg capacity; instead of using one large OTA on it (weight and cost), you put on 4 smaller scopes. A 20" scope is still a very expensive affair, and will probably have a focal length not usable on anything but the tiniest targets. Instead, put on 4x 10" scopes (much more affordable); these will work fine resolution-wise with the new small-pixel cameras, and the amount of light gathered per unit time is the same.
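The "same light per unit time" claim is simple aperture-area arithmetic (ignoring central obstruction and differing optical losses, which would shift the numbers slightly):

```python
import math

def aperture_area_sq_in(diameter_in):
    """Unobstructed light-collecting area of a circular aperture."""
    return math.pi * (diameter_in / 2) ** 2

four_tens = 4 * aperture_area_sq_in(10)  # four 10" scopes
one_twenty = aperture_area_sq_in(20)     # one 20" scope
print(math.isclose(four_tens, one_twenty))  # True: identical collecting area
```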

The flexibility of an array lies in having the option to do 4x lum, or 1x lum, 1x R, 1x G, 1x B, at the same time. You can do broadband and narrowband of a target simultaneously. If you set things up so that the scopes can point at the same place or at slightly different places, you can put all 4 scopes on the target at once to increase total exposure time, or do a simultaneous 2x2 mosaic (or 2x1 / 1x2, with two scopes per mosaic tile).

Nothing stops you from putting even more scopes on a mount, or having multiple mounts in sync.

There are a couple of drawbacks, like the same satellite trail / airplane appearing on 4 subs instead of one, any guide errors repeating across subs, ...

Lucky DSO imaging is something that can already be done now, to some extent.

The only difference between one 10-minute exposure and 10 x 1-minute exposures is that the latter will give you a lower SNR because of the camera's read noise (otherwise the result is the same). When you have a camera with extremely low read noise, the difference between the two approaches fades away and they produce the same SNR.

In lucky DSO imaging you take 1-2 second exposures instead of long ones, integrating for the same total amount of time as you would with long exposures. Why take this approach at all? You can do the things you normally do with subs when imaging - namely, discard frames with too much seeing-induced blur or guiding error. With such short exposures you don't even need guiding, and you can cherry-pick subs for your result stack from moments when the seeing is at an acceptable level, thus creating much sharper images than you would be able to with long exposures. Currently it is still not that feasible, since read noise is not yet low enough: you can do it, but to produce a good result you need a longer total integration time than doing it the "old-fashioned" way.
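The read-noise argument above can be put into numbers. A minimal SNR sketch for a stack of identical subs, with illustrative signal and sky rates chosen by me (not from the thread):

```python
import math

def stack_snr(n_subs, sub_exposure_s, signal_rate, sky_rate, read_noise_e):
    """SNR of a stack of n identical subs: target signal over the
    quadrature sum of shot, sky and read noise across the stack."""
    sig = signal_rate * sub_exposure_s           # target electrons per sub
    sky = sky_rate * sub_exposure_s              # sky electrons per sub
    per_sub_var = sig + sky + read_noise_e ** 2  # noise variance per sub
    return n_subs * sig / math.sqrt(n_subs * per_sub_var)

# Same 600 s total: one long sub vs 300 x 2 s "lucky" subs
long_snr  = stack_snr(1,   600, signal_rate=0.5, sky_rate=2.0, read_noise_e=1.5)
lucky_snr = stack_snr(300, 2,   signal_rate=0.5, sky_rate=2.0, read_noise_e=1.5)
zero_rn   = stack_snr(300, 2,   signal_rate=0.5, sky_rate=2.0, read_noise_e=0.0)
print(round(long_snr, 1), round(lucky_snr, 1), round(zero_rn, 1))
```

With 1.5e- read noise, the 300 short subs pay a real SNR penalty against the single long sub; with zero read noise, the two approaches come out the same, exactly as described above.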



I wonder how many years eyepieces have left. The optical system in DSLR cameras is already being superseded, with the mirrorless models using the sensor and a miniature display, and a similar system would offer lots of benefits over the compound lenses currently used in a standard eyepiece.

Alan


Hi Alan,

We regularly have thirty or more members turn up to our monthly society meetings; I doubt we would have had even half that number twenty-five years ago (if the society had been running then). I put that increase in participation in the hobby down largely to the instant gratification of digital methods. I remember the excitement I felt at 12 years old when I swung a pair of binoculars towards M31 for the first time, and there will always be diehards who like peering at faint smudges, but they will be a diminishing minority.

Pete.


There’s an interesting progression which links visual observing and astro photography:

  • VA (video astronomy)
  • EAA (electronic assisted astronomy)
  • NV (night vision)

Also, there is now at least one offering of an integrated EAA / telescope package. The short-exposure results available from these technologies are really quite amazing, and link to some of the above suggestions too.

I think there’s a story there. 

