
rorymultistorey

Members • Posts: 113 • Days Won: 1

Posts posted by rorymultistorey

  1. Just to say a MASSIVE thx to this thread. My new Threadripper AMD desktop did not work with my old USB2 ASI120MM. Driver issues. Cost me the best part of a precious night of imaging.
    SOLUTION (as said above): download this firmware update (Windows): https://download.astronomy-imaging-camera.com/download/usb-camera-windows/?wpdmdl=258 and update to the ASI120MM-compatible firmware option. You lose a little bit in the framerate department, but that's no biggie when guiding.

     

    Thx so much all.

  2. 15 minutes ago, tomato said:

    Probably a question for the BAT forum, but will there be a nominated single target to image? Presumably that will be required if we are going to get sufficient data on one target, but that is going to be tricky if the imagers signed up are all over the globe.

    And M106 will be poorly placed from the UK when Astro darkness returns.

    Actually we're having a town hall about choosing the next target tomorrow at 8pm in the "bat chat" channel. You'll need to have passed the bat exam before you can join, which is as easy as dropping a pic that you've taken into the bat exam channel. One of our team will then let you in... hold on a minute, you're already in, aren't you? You should have had a notification about the meeting.

     

    • Thanks 1
  3. 15 minutes ago, cfinn said:

    I have read the transition to high conversion gain is actually at gain 56 (graph is a bit misleading) but I think for short exposures of a few seconds you may as well take it all the way out to gain 100 in mode 1, then you’ll be close to 1e read noise. I think the most important thing for this kind of imaging is that we minimise read noise as much as possible. Supposedly this sensor can even go down to 0.7e read noise or thereabouts at even higher gain. The Altair variant (26M) has something called “ultra high conversion gain mode” which is supposed to enable this, but I see no such option on the QHY or ZWO cameras at the moment.

     

    This is very interesting, I didn't know about the Altair model... Also, noob question: what are the different modes/different coloured lines about?

  4. 1 hour ago, vlaiv said:

    Here is what I believe is wrong.

    Millisecond exposures help freeze the seeing. Atmosphere is turbulent and constantly changing. Depending on aperture size - there is small window in which it is for all intents and purposes "stationary". For amateur sized telescopes in better/good seeing it is about 5-6ms (maybe even up to 10ms on nights of very good/excellent seeing).

    Any longer than this and things start to average out - "motion blur" starts to happen on PSF. It is not coincidence that measure used to describe the seeing is "FWHM of two second integration". Two seconds is enough for things to average out (those millisecond changes) and provide steady seeing figure that won't change much between successive exposures.

    While DSO lucky imaging works - it is still limited by this averaging. In planetary imaging (if done properly and exposure length kept short) we don't have this average - in fact, we keep only very small subset of valid frames - and even with them - we can actually choose to stack part of frame. This is because seeing PSF varies strongly over even short distances (I think about 10 arc seconds away you can have entirely different PSF - one blurred enough not to be included in stack - while other perfectly "calm"). In 2 second exposure - we can expect that FWHM of all stars in the image will have very close value because all of them have been averaged out versus short exposures.

    I think this would be an interesting place to put some real data into this discussion. I'm currently very busy - I'm finishing a new house (finishing is rather a strange word - a better expression would be finishing just enough so I can move in :D ) and am about to move in the next 2-3 weeks, so can't be of much help there - but I suspect you already have the needed data?

    How about taking one of your lucky sessions that has 1-2 seconds long exposures and doing stats on FWHM? Mean value, standard deviation of measured FWHMs - maybe even a graph of how it changes over time?

    That would give people a good idea of what they can expect from lucky imaging, how many frames they can expect to keep, and a sense of FWHM values in general.

     

    Yes of course.😊 The WHOLE point of the Big Amateur Telescope is to test stuff out, and that is what we are setting up to do. It's a journey we are going on together. So far we have 60 amateurs from all over the world, many very knowledgeable and a few with surprisingly big scopes (like 50cm). (We might even be getting one of these big scopes paired up with a large-pixel, low-read-noise full frame camera, which potentially could have a go at 10ms imaging on the brighter targets - wow.)

    BTW, according to the guy who images at Keck, there is still a benefit from 2 second exposures. Not as much benefit as millisecond exposures, but it is significant.
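    For anyone who wants to try vlaiv's suggestion above, here is a minimal sketch of the FWHM stats, assuming you have exported one FWHM measurement per sub (in arcseconds) to a single-column CSV called fwhm.csv - the filename, the export step and the 2.5" cut-off are all hypothetical; use whatever your stacking software produces:

    ```python
    import csv
    import statistics

    # Hypothetical input: one FWHM value (arcsec) per sub in a single-column CSV.
    with open("fwhm.csv") as f:
        fwhm = [float(row[0]) for row in csv.reader(f) if row]

    print(f"subs measured : {len(fwhm)}")
    print(f"mean FWHM     : {statistics.mean(fwhm):.2f} arcsec")
    print(f"std deviation : {statistics.stdev(fwhm):.2f} arcsec")

    # Fraction of subs kept with a simple 2.5 arcsec cut-off (threshold is illustrative)
    kept = [v for v in fwhm if v <= 2.5]
    print(f"kept below 2.5 arcsec: {100 * len(kept) / len(fwhm):.0f}%")
    ```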

     

     

     

     

    • Like 1
  5. 13 hours ago, vlaiv said:

    Don't worry about that. I'm sorry if any of my comments sounded malicious in any way.

    Indeed, I was trying to point out some of the things people involved in the project need to be aware of in order for project to work to its full potential.

    I know lucky DSO imaging works. There is no question about it really. However, I'd like to point out that results that I've seen come from rather large telescopes.

    Take for example Emil's work:

    https://www.astrokraai.nl/viewimages.php?id=266&cd=7

    There is a simple explanation for this. On one hand - achieved resolution depends on aperture size. I'm not talking here about planetary critical sampling, which is probably x3-4 higher than deep sky sampling. Even with DSO imaging - aperture is important for resolution. This is because different PSFs convolve the image with compounding effects. One PSF is mount tracking / guiding performance. If you use short exposures, you'll be minimizing the impact of that as there is less chance for the mount to drift in a short time. The other is seeing - that is addressed with selection of subs, choosing those where the seeing impact is the least.

    Third is aperture. Here - you don't really have any options except to change the scope. It's safe to assume that people won't be doing that for purpose of this project, so they are stuck with PSF of scope that they have. 4" will have double blur of 8" scope due to aperture alone, which in turn will have half that of 16" scope.

    The better the seeing, or the better the selection of subs, the more impact the above has on final resolution. For telescope sizes that we usually use in imaging, aperture has a rather small effect compared to tracking and seeing, but once you minimize those two, as in lucky imaging, aperture suddenly becomes important.

    Second important point with large aperture telescopes is - once you set your working resolution / sampling rate, speed of your system depends solely on aperture size. This enables you to get large signal in short amount of time - beneficial both for accurate stacking and also to overcome read noise.

    In the end, here is interesting resource that people should examine:

    https://www.meteoblue.com/en/weather/outdoorsports/seeing/london_united-kingdom_2643743

    This website gives forecast of seeing in arc seconds FWHM.

    It would be good to check forecast value against actual measured values. Care must be taken to subtract part that is due to aperture size. If forecast is reliable (and I think it is - at least from planetary imaging point of view and I don't see why it will be different here) - then it will be good indicator of results that can be expected.

    You can't really expect to get 2" FWHM if forecast shows something like this:

    [attached: meteoblue seeing forecast screenshot]

    and you are using 4" scope (which itself has ~1.12" FWHM contribution due to aperture).

    By the way, aperture FWHM can be calculated by using 2.355 * 0.42 * lambda / aperture expression (which will give you radians if you put same units for aperture and wavelength  - you can put 550nm for wavelength), and you can combine two by adding them in quadrature:

    sqrt(FWHM_seeing^2 + FWHM_aperture^2)

    Here we are omitting the contribution due to the mount and assuming a perfect aperture.

    vlaiv, first off thank you for being such a gentleman. Sometimes I am an idiot. I would put a ruder word in here in place of idiot but, you know, kids and all...

    The other thing is that it feels like you think I don't know this stuff, but I guess you're not actually talking to me. However, I do worry that you're missing the big point. The big point is that the biggest source of blurriness dominates all the others (that's the sqrt(FWHM_seeing^2 + FWHM_aperture^2) equation talking). If you boil it all down to something really simple then you can say, like I did in the video, that the biggest source of blurriness for a telescope larger than 6 inches in diameter is - assuming it's a good scope that has been well collimated - more often than not the seeing. And as you know, I'm interested in overcoming the seeing. One of my members is directly imaging exoplanets at the Keck observatory and pointed me towards one of his colleagues' rather interesting scientific papers on lucky imaging: https://www.ast.cam.ac.uk/sites/default/files/nlaw_lucky_thesis.pdf We can't shoot millisecond exposures of course, but even second-long exposures dramatically reduce the blurring from the seeing.

    For me it's like us amateurs have only been imaging in 2nd gear. It's exciting to think that we can go much further. Many of our members don't have scopes that are sharp enough to lucky image with (collimation is a big problem actually) and many don't have cameras with a low enough read noise to lucky image with. I'm still keen that they join in because we will help them improve, advise them on buying better equipment, and we can use their long exposure non-lucky imaging data to help reveal detail in fainter structures that are too dim to see through lucky imaging. Some have the equipment but don't yet have the skills to pull off lucky imaging; again, we can help them.

    This is exciting. And if us amateurs pull together and share our data we can, to an extent, negate the inherent problems associated with short exposure deep space astrophotography, and that will give us the power - especially on the bright targets - to match or even better the 1.1 arcsecond FWHM that giant scopes like the 4m Mayall on Kitt Peak achieve.
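    As a quick sanity check on vlaiv's expression quoted above, here is a minimal sketch in plain Python (the 100mm aperture and 2.0" seeing figure are just illustrative numbers):

    ```python
    import math

    def aperture_fwhm_arcsec(aperture_m, wavelength_m=550e-9):
        # vlaiv's expression above: FWHM = 2.355 * 0.42 * lambda / aperture (radians)
        return math.degrees(2.355 * 0.42 * wavelength_m / aperture_m) * 3600.0

    def combined_fwhm(seeing_fwhm, aperture_fwhm):
        # Seeing and aperture blur added in quadrature; mount contribution omitted
        return math.sqrt(seeing_fwhm**2 + aperture_fwhm**2)

    ap = aperture_fwhm_arcsec(0.1)            # 4" (~100 mm) scope -> ~1.12"
    print(round(ap, 2))
    print(round(combined_fwhm(2.0, ap), 2))   # 2.0" seeing + aperture -> ~2.29"
    ```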

     

  6. 16 minutes ago, admin said:

    We agree. 

    Anyone who has spent any time at SGL will know vlaiv is an experienced astronomer respected for his knowledge and contribution to the community. 

    @rorymultistorey You are welcome to use SGL to publicise your project and raise support but we will not allow personal attacks. 

    I am truly sorry. Sorry vlaiv. I saw red. I suspect it had as much to do with other things happening right now as with anything you said. Either way I was out of order.

    • Like 7
    • Thanks 1
  7. On 02/07/2021 at 12:09, vlaiv said:

    Are you sure about this?

    Back of the envelope calculation suggests that you'll be getting about 0.4e/px/s from sky. In 5s exposure that means about 2e of background signal or about 1.41e of LP noise. Hardly swamping 1e of read noise.

    You need about 1 minute of exposure to truly swamp read noise (5:1 - LP to read).

     

    I'd just like to say to other readers who are unsure about whether lucky imaging works or not: check galaxies on Astrobin and see how the sharpest ones had exposure lengths of just a few seconds. The proof is in the pudding.
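    To make vlaiv's back-of-the-envelope numbers above concrete, here is a minimal sketch (the 0.4 e/px/s sky rate, 1 e read noise and 5:1 ratio are his figures; the rest is just the arithmetic):

    ```python
    import math

    def exposure_to_swamp_read_noise(sky_rate_e_px_s, read_noise_e, ratio=5.0):
        # Exposure length at which sky shot noise reaches `ratio` x read noise:
        # sqrt(sky_rate * t) = ratio * read_noise  ->  t = (ratio * read_noise)^2 / sky_rate
        return (ratio * read_noise_e) ** 2 / sky_rate_e_px_s

    print(exposure_to_swamp_read_noise(0.4, 1.0))  # 62.5 s, i.e. "about 1 minute"
    print(math.sqrt(0.4 * 5))                      # LP noise in a single 5 s sub: ~1.41 e
    ```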

     

     

  8. 3 hours ago, cfinn said:

    @rorymultistorey I could potentially be of some help here. Currently working with an Esprit 120 - 840mm focal length @ f7 and a QHY 268M. Not the best scope for the job (a fast 10”+ aperture scope would be ideal), but the camera is probably as good as it gets right now. If I operate in high gain mode at maximum gain, then I am down to 1e read noise, which in my suburban skies (SQM 20) means I am swamping read noise in luminance after around 5s. My image scale is 0.92”pp, and if I can get my EQ6-R guide RMS down to 0.4” or so then I can support that image scale, which would critically sample a FWHM of 1.5-2” or thereabouts. The question is then just whether or not I can achieve that FWHM with a choice selection of 5s subs. Your video certainly suggests a meaningful resolution improvement should be possible!

    Charles

    Sounds great. And you're absolutely right about having the right camera for the job. I found that in London (Bortle 8/9) with my f6 Newt I had to shoot 20 second subs to get enough stars to stack (around 20 is enough), but when I went to a Bortle 5 field outside London, 5 second subs worked fine. Your camera's pixels are bigger than mine (ASI178), and your scope is probably soaking up a similar number of photons. The best thing to do is forget about what the naysayers tell you and just try it. I have APP loaded on my laptop, so after I've taken a few subs I can quickly check that they stack. I think your Esprit will work very well.

    • Like 1
  9. 9 hours ago, badhex said:

    No problem! I think they rather enjoyed it! 

    One more question - whilst I can't help with imaging, I might be able to help with IT. How much data are we talking about needing to store/upload? Or do you have that bit sorted?

    A chap called @geeks on the server is heading that up. At the moment I'm just using Google Drive because it's so user friendly, but when we properly get going I don't know what we're going to switch to.

    • Like 1
  10. 12 hours ago, tomato said:

    I think the approach is designed, at least in part, to get around the poor seeing conditions most amateur imagers encounter compared to professional telescopes located on the top of mountains.
     

    I totally respect the theory behind the limitations attributed to the Lucky Imaging approach to DSOs, but having  watched the Astrobiscuit video I have been sufficiently impressed with the results of Rory’s experiment to try and make a contribution.

    Uploading the humongous amounts of data generated will probably be my biggest challenge (after the weather of course)😊

    Thx, and you're totally right apart from one thing: finding quality astrophotographers is harder than finding the IT wizards who sort out how to upload the data 😊

     

    • Like 2
  11. 14 hours ago, doublevodka said:

    Really looking forward to seeing the results off the back of this, definitely an interesting concept and a great way to follow on from the other budget astro videos 👍

    This hobby does seem to attract some "naysayers" though! "You can't do it without xyz kit, without perfect seeing and only on Thursdays!" 🤣 although SGL is pretty free of that compared to some other forums that shall not be named 😉

     

    SGL is a really great forum. Hats off to FLO for making the community a better place. Cloudy Nights has some agenda that I don't quite understand and is full of elitist astronomers. They banned me for posting this request for imagers, for instance...

    To be fair, it's not going to be easy, and you do really need a modern CMOS camera to make it work with the kind of scopes we can afford, BUT it will work and ultimately the naysayers will have to admit that they are wrong 🤣

     

    • Like 1
  12. 14 hours ago, skybadger said:

    I'm really interested to get clarity on what the aims are. It seems like it's to have high resolution, high snr images using small/amateur sized telescopes. 

    The issues I see are

    tracking - what exposure length keeps tracking errors below the target scale

    seeing - can we get enough pictures sampled better than the target scale

    resolution, by combining pictures from smaller scopes can we get sufficient snr to discern and resolve features approaching images from larger scopes

    collimation - can we educate enough people to get the most out of their systems to contribute. 

    optical errors - can we combine images from a huge  variety of image scales and their specific optical aberrations and combine the images in a meaningful way, that is not disproportionately compute intensive. Do current generations of stacking programs model the image plane aberrations for stars in each image as part of the  stacking process or does this require a new approach ? Have you got a feel for this new approach ? 

    Finally, what is meant by 'lucky imaging'? In planetary it's all about freezing the seeing to remove the atmosphere from the optical limitations and stacking those few high quality, single-sourced images. In low SNR, long exposure deep sky, you must mean something else, right?

    it's not lucky imaging if you are still exposing at 30 seconds per shot because you haven't broken any of the limits outlined above. 

    I'm not knocking any of this, crowd sourcing imagery and pipeline processing is a great tool; just trying to understand the headline aims and the underlying physics. 

    mike

     

     

     

    I'm going to have to be brief... (time). The heart of the theory goes like this: if you had a magical camera with zero read noise, you could take millions of microsecond exposures over the course of a minute, stack them up, and your stack would have the same SNR as a single 60 second exposure. Obviously we don't have magical cameras yet, but we're getting close. Modern CMOS sensors manage about 1e/pix of read noise. That read noise is low enough for us to stack thousands of subs that are only a few seconds long and get good results.

    In the BAT team we have a guy who images at Keck. He's produced a graph that shows you start to see the benefits of lucky imaging when your exposures drop below 10 seconds. You see a really good increase in resolution when you shoot exposures of about a second. And in the future, if the next generation of CMOS cameras reduces the read noise further, then we will be looking to shoot at about 1/10th second exposures, because that's when you get even more benefits from lucky imaging.

    That's the theory. The reality is that most folks have issues with their setup that need to be sorted before their setup is sharp enough to even begin to think about lucky imaging!
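    To illustrate that read-noise trade-off, here is a minimal sketch (the target and sky rates are made up purely for illustration; only shot noise and read noise are modelled, dark current is ignored):

    ```python
    import math

    def stack_snr(target_rate, sky_rate, read_noise, sub_length, total_time):
        # Signal and sky noise scale with total integration time;
        # read noise is paid once per sub.
        n_subs = total_time / sub_length
        signal = target_rate * total_time
        noise = math.sqrt(signal + sky_rate * total_time + n_subs * read_noise**2)
        return signal / noise

    # Illustrative rates: 0.5 e/px/s target, 0.4 e/px/s sky, 1 e read noise, 60 s total
    for sub in (60, 5, 1, 0.1):
        print(f"{sub:>5} s subs -> SNR {stack_snr(0.5, 0.4, 1.0, sub, 60):.2f}")
    # With ~1 e read noise, 5 s subs give up very little versus one 60 s exposure,
    # while 0.1 s subs give up a lot - which is why lower read noise matters there.
    ```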

  13. 9 hours ago, vlaiv said:

    Indeed - few people manage it, but I suspect it is down to few things:

    1. Optics.

    For that kind of resolution - namely 2.5" FWHM, you need good optics with enough aperture. Using an 80mm ED doublet and expecting 2.5" FWHM stars in an OSC image is not really a reasonable thing to do. Similarly, using a 6" Newtonian that is not properly collimated or whose CC spacing is not perfectly dialed in (or using a CC that produces spherical aberration) is not going to do it.

    2. Mount / guiding

    Most people don't have enough guide resolution to precisely measure their total RMS because they are using small 50mm class guide scopes. These are fine for about 1-1.5" RMS guiding, but you need sub 1" RMS for 2.5" FWHM stars. Mount must be able to do it and guide resolution must be able to measure it properly.

    3. Seeing of course. It is only doable on a night of good seeing - like <1.5" FWHM seeing (two second exposure).

    Here is sort of a guideline:

    150mm with 0.8" total RMS and 1.5" FWHM seeing will produce 2.5" FWHM stars if optics is diffraction limited

    As for sampling rate - well, you can choose any sampling rate you want, but if you aim for say 2.5" FWHM stars - then you don't need to sample below 1.5"/px as you'll be over sampled. Many people sample at higher resolutions than that - and that wastes SNR.

    There is a simple relationship between the two (this is actually an approximate relationship, as we approximate the PSF with a Gaussian of a certain FWHM) - sampling_rate = star_FWHM / 1.6

    When someone is using say 0.86"/px - well if their stars are not 1.376" FWHM - then they are over sampling.

    Thx dude. You know this ain't my first time around the block. I disagree with the old school view of sample rate. Take planetary imagers, for example. That's all I'm going to say on the matter. Respectfully, you go your way and I'll go mine.
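    For readers who do want to try vlaiv's rule of thumb quoted above, a minimal sketch (the FWHM values are just examples):

    ```python
    def sampling_rate_for_fwhm(star_fwhm_arcsec):
        # Gaussian-PSF approximation quoted above: sampling_rate = star_FWHM / 1.6
        return star_fwhm_arcsec / 1.6

    print(sampling_rate_for_fwhm(2.5))    # ~1.56 "/px for 2.5" FWHM stars
    print(sampling_rate_for_fwhm(1.376))  # ~0.86 "/px, the example in the quote
    ```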

  14. 51 minutes ago, vlaiv said:

    So between about 3"/px and 1.5"/px as far as sampling rate goes?

    That should be easily attainable with 6" scope, good mount and decent seeing without lucky imaging.

    Yes, it should be obtainable... easily. And yet few manage it. There is no point in even trying lucky imaging until you reach this level of sharpness with your setup. Also, I just work in FWHM of stars in arcseconds, so I don't really know what you mean by "3"/px and 1.5"/px as far as sampling rate goes". I think about 0.5 arcseconds per pixel is probably quite good, but like I said, I suspect I'm not understanding what you mean.

     

  15. 15 minutes ago, badhex said:

    Classic Terry 😂 The Foucault tester looks exactly the same as it did seven or eight years ago and AFAIK he's had it for most of the life of the class. What a legend. Emailed a few of them after I saw the video, was nice to see everyone again - it actually made me quite homesick! 

    Please can you thank them from me. Terry didn't want to give me their emails, which is fair enough.

    • Like 1
  16. 1 hour ago, tomato said:

    Great video, your results with the lucky imaging show real promise, but getting our atmosphere to behave itself even for just 5 seconds at a time is asking a lot.

    I have a similar fledgling project underway with @Tomatobro entitled the SNSLA (the Shropshire Not So Large Array), but I’d be happy to contribute to the BAT if my kit is compatible. My Esprit 150/ASI 178 combo can image down to 0.47 arcsec per pixel, but small targets only.

    Your kit is amazing. I'd absolutely love to get you involved. I can make a special channel just for the Shropshire Not So Large Array that only you and your buddies can see and edit etc... And of course you'll be able to access all the other channels that we're putting together to help build the BAT too. Together we're stronger. I think the platform we're setting up has the foundations to grow. I have a great team all working for free on this project - one guy images at the Keck observatory - but to succeed we need to tempt the very best astrophotographers... not necessarily the biggest scopes, but folks who really understand how we can make this project fly.

    • Like 3
  17. 6 hours ago, vlaiv said:

    What is the resolution you are aiming for?

    So we are splitting our imagers into two camps. The regulars are aiming to get the FWHM of their stars to less than 5 arcseconds wide, and the elite lucky imagers are currently aiming to go below 2.5 arcseconds. Obviously we will start to push for more resolution soon. We haven't actually started lucky imaging yet, what with it not getting dark and all, but it's clear a lot of folks need help with collimation etc. before they can even start thinking about lucky imaging. We currently only have a handful of ELITEs who are nearly ready to push their kit to attain higher resolutions, and I am very keen to find more. It would take me too long to explain everything here; suffice to say that there is a mountain to climb, but we can see a path to the top. I'm very lucky to have a good team helping. Now we need to get in more imagers who know what they are doing.

     

    • Like 1
  18. 8 hours ago, DaveS said:

    This looks to be a job for CMOS rather than CCD. Not sure I have anything that fits the bill, though the TS apo with my QHY266M might do, but at barely 0.85"/px.

    Possibly my old Trius 694 on the ODK would be interesting, giving sampling at near the Rayleigh limit, but a stupidly small FoV. A QHY 368 would be a similar resolution but at nearly £4k a bit too much.

    You sir know what you're talking about. I would really love you to join. BTW, do you mean the 268, not the 266? I am very keen to get more folks in who know their onions. 0.853"/px is fine.

    • Like 1
  19. Hey folks, 

    If you can shoot deep space targets well then I'd love you to join the BAT. Our goal is to gather amateurs from all over the world, for all of us to shoot the same target, share our data, and produce images that will rival big professional scopes. We are training folks up to shoot high resolution deep space images using the LUCKY IMAGING technique. The resolution we are able to achieve is really exciting, but for the BAT to work we need lots of data, and for that we need more good quality astrophotographers to join. The BAT members are working together on my Discord server, which is excellent for sharing ideas and working through problems together. If you want to join up, here is the invite: https://discord.gg/WK8qmrqcpq
     

    This is the video which really set the whole thing off and explains how us lowly amateurs are able to achieve resolution similar to that of multi-million-dollar telescopes.

     

    • Like 13