Everything posted by Whirlwind

  1. There are a lot of reasons, though, why people interested in the hobby might not be able to image effectively, and remote imaging can fill that void. For example:
     • You work night shifts, have very early mornings, or work abroad frequently
     • Your home residence is not conducive to imaging (you live in a flat in the city)
     • You have a medical condition, a bad back, etc. that makes it difficult to do these things yourself anyway
     • You live in the UK and would prefer your equipment didn't spend more time as a towel rack than imaging
     • You prefer more science-based work (e.g. observing transits) where the UK weather is not conducive to this sort of work
     • You live in northern Scotland and would prefer to image some summer targets
     • You can use it to top up data you might have missed in the UK (e.g. you managed to get the L, R and G but not the B!)
     I get the point that for some the challenge is to get everything just right, but some just like to have images to show for it. I do find the cost of iTelescope expensive though. For the 'lowest grade' (and I say that with some hesitancy) you are still looking at £30 per hour, so a whole night's imaging would be close to £250.
  2. Seems that you are getting lots of lovely imaging done... do you have a cloud gun? My suggestion would be to process the two elements separately: mask off the galaxies and the background in turn, process each, and then recombine.
  3. Yeah, this test is probably not ideal in the way it is being shown. The KAF-8300 CCD has higher read noise and the image appears to be narrowband. To truly get the best out of the CCD you need exposures long enough that the other sources of noise overwhelm the read noise. With the low read noise of the CMOS this doesn't take very long, and 60s will do it. With CCDs, narrowband needs much longer exposures of 20-30 minutes to overwhelm the read noise. A fairer test of achievable results would be to compare 4 x 1200s exposures vs 80 x 60s. If you compared a 20-minute CMOS exposure to a 20-minute CCD exposure the results would probably favour the CCD. What this really demonstrates is that CMOS allows us to image with less accurate mounts, as any time lost to guiding/worm irregularities can easily be discarded. CCDs need better mounts so that they can expose for longer (especially in narrowband). They both have their pros and cons. CMOS also tends to require much more aggressive dithering, from what I have read, to combat fixed pattern noise, and around bright stars the micro-lenses can give strange diffraction patterns. But both can give excellent results if used in a way that plays to their strengths and mitigates their weaknesses.
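To put some rough numbers on the read-noise argument, here is a back-of-the-envelope sketch. All the target, sky and read-noise figures are made up for illustration; it just compares the stacked signal-to-noise of a few long subs on a ~9 e- read-noise CCD against many short subs on a ~1.5 e- read-noise CMOS for the same total time:
```python
import math

def stacked_snr(signal_rate, sky_rate, read_noise, sub_length, n_subs):
    """Approximate SNR of a stack of identical subs (object + sky shot noise + read noise)."""
    signal = signal_rate * sub_length * n_subs      # total object electrons
    shot = signal                                   # object shot-noise variance
    sky = sky_rate * sub_length * n_subs            # sky shot-noise variance
    read = (read_noise ** 2) * n_subs               # read-noise variance, paid once per sub
    return signal / math.sqrt(shot + sky + read)

# Hypothetical narrowband numbers: faint target, dark sky (electrons/second/pixel)
target, sky = 0.05, 0.2

print("CCD  4 x 1200s:", round(stacked_snr(target, sky, 9.0, 1200, 4), 2))
print("CMOS 80 x 60s :", round(stacked_snr(target, sky, 1.5, 60, 80), 2))
# The same CCD forced into short subs shows the read-noise penalty
print("CCD  80 x 60s :", round(stacked_snr(target, sky, 9.0, 60, 80), 2))
```
With these invented numbers the 4 x 1200s CCD stack and the 80 x 60s CMOS stack come out roughly even, while the CCD forced into 60s subs falls well behind, which is the point about exposure length rather than sensor quality.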
  4. If you are imaging an object that has repeatable cycles (e.g. a transit) it is best to observe multiple periods and then phase fold the data onto the identified period. This way you combine the data in phase and get a better signal to noise. That can be more tricky in the UK because of the weather and when a transit might occur. Yes, heliocentric (HJD) is much better to use. Barycentric (BJD) is better still (although there isn't yet a standard determination for BJD), but it is only really important for very time-sensitive observations such as pulsars. I'm not sure why anyone would want data in plain JD for anything that might be periodic. It means you can be up to about 15 minutes out depending on the time of year, your observing location and where the object is located. If you want to compare observations over long time frames (e.g. if there are subtle changes over time) then using JD will introduce erroneous timing issues.
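For reference, the folding step itself is trivial once the times are on a consistent system. A minimal numpy sketch (the period, epoch and flux values are invented, and the times are assumed to already be converted to HJD):
```python
import numpy as np

def phase_fold(times_hjd, values, period, epoch):
    """Fold a time series onto a known period; times, period and epoch in days (HJD)."""
    phase = ((times_hjd - epoch) / period) % 1.0
    order = np.argsort(phase)
    return phase[order], values[order]

# Hypothetical transit photometry gathered over several nights
t = np.array([2459000.41, 2459003.56, 2459006.70, 2459009.85])
flux = np.array([0.998, 0.989, 0.991, 0.999])
phase, folded = phase_fold(t, flux, period=3.1416, epoch=2459000.40)
```
Points from different nights then land at the phase where they belong, so the in-transit samples stack up instead of being scattered through the run.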
  5. It's a lovely image. My only comment is that you may have clipped the black point a bit too aggressively, which makes the contrast look a bit too stark. In comparison, your M51 looked better in this regard. On the other hand, I always find M33 a pain in the proverbial to process correctly for some reason.
  6. That's OK, it's never entirely clear when people say they have no concerns over budget. However, based on your description I would be wary about an EQ6. They are not light; you are looking at 23-24kg for the mount and tripod. Unless you are a weightlifter you are likely to strain something if you try to carry that out with the telescope/camera/counterweights, and that is before you consider that it is not packaged in a conveniently liftable manner. It may seem OK during the day on this forum, but try lugging about 25kg around in the dark and cold at 3am. Overly heavy equipment is a big factor for those stripping down and setting up every night, especially if the weather is uncertain. Given the limits you have stated, I would suggest you look at the Vixen SXD2. It is lightweight and very easy to move, with what appears to be decent tracking. It easily accommodates your scope and should have the flexibility to go up to something similar in size to the C9.25 Edge, which would be more than enough focal length for most. It is also within your budget limit, and you are likely to be able to carry the whole setup out in one go.
  7. If you want to get into astrophotography then the mount is the most important part of your setup. With a good mount most telescopes/cameras can give you a good image once you have learnt the processing steps, but there is a learning curve. It also depends on your circumstances. Do you have a permanent setup, or will you be setting up and stripping down each night? If the latter, something less weighty would be useful - the quicker you can set up, the more likely you are to use it. Are you technology-averse or mechanically averse? Some people like to tinker with the mechanics but don't like a lot of software gizmos, and vice versa. If money really is no object and you do want to get seriously into astrophotography then my suggestion would be to skip the above mounts. Not that they are bad, but they can still be a challenge to get working right out of the box and may need refining. If you want a long-term mount and money really is no object then I'd also consider the following (they all have pros and cons):
     • Mach2GTO (Astro-Physics), or a Mach1GTO if you can find one used (US based, so some issues if there ever is a problem)
     • 10Micron GM1000 HPS
     • Avalon Instruments M-Uno
     • Software Bisque MyT
     Or, if you are after something more lightweight, the Vixen SXP2 could also fit the bill. It really depends on how much you are willing to spend...
  8. Cool, great to see that you have found a solution. I suspect what this is doing is reducing the dynamic range (so most of the walking noise sits within one or two levels) and the offset is then slightly clipping this out, leaving just the small remaining edges of the walking noise. It may be worth playing with the parameters slightly, starting from this point, to see whether changing the gain/offset gives any further improvement (for example, removing the remaining trace of the noise).
  9. In terms of the video, a lot of software will let you play back the blinked images, which would show how frequently it happens and what is going on between each image. I'm unsure about colour CMOS cameras in terms of actual settings. I think noise in CMOS is meant to be generally quite low, so excess cooling may be unnecessary? It is strange that it has gone from not being a problem to suddenly being one, which suggests to me that some change in the setup is causing the issue.
  10. It might help if you posted the video here, as it may help in analysing whether there are any issues. Have you changed the settings on the camera recently (e.g. gain, cooling, etc.)?
  11. I think it is better to confirm this first. Have you blinked the sequence of images without star alignment to see whether there is any movement? I am unsure whether SGP can manage the dither to the above degree of requirements, but increasing the dithering pixel step size/duration enough to wind in the backlash should help. Are you guiding at all, or is it very short subs stacked? Sometimes introducing a slight imbalance that works against the backlash can also help - though this depends on where you are imaging. Effectively the slight moment on one side acts to 'press against' the backlash.
  12. Fixed pattern noise is difficult to post-process out because it isn't random per se, so it is difficult for software to identify (it looks more like 'real' data). Dithering is meant to resolve this because it adds a random shift to each image, so the fixed pattern noise then becomes apparent to the software as the noise it is. Fixed pattern noise arises from the read electronics, so it can be more prevalent in CMOS cameras because of the low read noise and the tendency to use very short exposures. Increasing exposure durations so that thermal noise overwhelms the read noise would be one way to overcome this. You could also image with the camera slightly warmer to increase the thermal noise a bit. For this image it may be worth checking that your darks don't need updating, if you are using them. Also try blinking your images to check that dithering is actually working as expected. If you have a lot of backlash in the mount then you may think you are dithering, but in one direction it isn't actually moving and is just taking up the backlash. The 'rain pattern' might suggest dithering isn't working in one direction.
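As a toy illustration of why the dither matters (all the noise levels here are invented), a quick simulation of stacking registered subs with and without a random dither offset, with a static pattern baked into every frame:
```python
import numpy as np

rng = np.random.default_rng(0)
frames, size = 20, 64

# Hypothetical sensor: a static fixed pattern plus random photon/read noise
fixed_pattern = rng.normal(0, 5, (size, size))

def registered_sub(dither):
    """One registered sub: flat sky + random noise + the fixed pattern displaced by the dither."""
    sky = rng.normal(100, 3, (size, size))      # photon noise around a flat sky level
    read = rng.normal(0, 2, (size, size))
    return sky + read + np.roll(fixed_pattern, dither, axis=(0, 1))

no_dither = np.mean([registered_sub((0, 0)) for _ in range(frames)], axis=0)
dithered = np.mean([registered_sub(tuple(rng.integers(-5, 6, 2))) for _ in range(frames)], axis=0)

print("residual noise without dither:", no_dither.std().round(2))  # the pattern survives the stack
print("residual noise with dither   :", dithered.std().round(2))   # the pattern averages away
```
With a real scene you would dither the mount and then re-register on the stars, which has the same effect of moving the pattern around between subs.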
  13. Is this just for imaging or visual as well? I ask because if you want a longer focal length of about 1600mm and it is just for imaging, then why not, say, the VXL12"? That would be less burdensome mount-wise, and aperture is less important for imaging than for visual. It's also not as costly, so it would leave more funds for the mount (which is always the critical part of an imaging setup). You would save about £2k. That could then go towards upscaling the mount (which might open up things like the MESU) and/or a cooled colour CMOS camera, which would also likely improve your images. If you are committed to going for this weight capacity, the only other mount in this capacity and price range that I know of is the JTW OGEM, but it is relatively untested (the base version is similar to the CEM120 in price). On the other hand, if you want a visual telescope as well then a 16" will work wonders, and that changes your mount choices.
  14. No, you can't control the three cameras in the fully automated route. That only allows you to use two cameras (guider/main imaging camera). In the advanced version you can control three cameras, though: you can set up one as the guider and two as imaging cameras. You can set a sequence (e.g. LRGB) for each of the imaging cameras and then start them independently whilst guiding, so it is a semi-automated sequence using only one piece of software on one computer. You can't dither, though, because the sequences aren't linked. In principle, as it has a scripting language built into the software, you could set up a script that allowed dithering if you are software-minded.
  15. Unfortunately there are too many unknowns in both plots to really understand what the differences are. For a properly scientific comparison they would need to be measured in the same way to ensure that the data are comparable. Statistically, assuming a perfect test, you should get a distribution of errors over time. If the CEM data is almost an instantaneous sampling and the MESU data was averaged (say over 5-10 second exposures), then from simple statistics you would expect more variance in the CEM data - really it should be a median or average over the same timeframe. Hence I don't think we can make much of the comparison without some real error bars on the data points.
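A quick synthetic illustration of that sampling point (the 1" RMS error and the 10-sample averaging window are made-up numbers): the same underlying error looks much tighter when each reported point is an average.
```python
import numpy as np

rng = np.random.default_rng(1)
error = rng.normal(0, 1.0, 100_000)     # hypothetical instantaneous tracking error, arcsec

instantaneous = error                                 # one reading per sample
averaged = error.reshape(-1, 10).mean(axis=1)         # each reading is a 10-sample average

print("scatter of instantaneous readings:", round(instantaneous.std(), 3))   # ~1.0
print("scatter of averaged readings     :", round(averaged.std(), 3))        # ~0.32
```
Real tracking error is correlated in time, so the reduction would be smaller than in this white-noise case, but the direction of the bias is the same.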
  16. There are some strange diffraction spikes on the brighter stars in that Facebook image, but I am unsure whether this is a telescope issue. I also can't see anywhere what the possible backfocus range is.
  17. Have you tried the Wayback Machine? You can get access to version 6.3.1 from here (not sure what version you are using?): https://web.archive.org/web/20161224162028/http://qsimaging.com/software.html It appears that 1st January 2017 is the last time it took a snapshot where you could download the files directly; after that it always said to contact customer support. I did have a link to the beta testing site, but that seems to have been killed off since they updated the website. Thanks, Ian.
  18. I think you may be correct that the filter has an effect, but perhaps not in the way you are thinking. I would suggest the following reason for an enhanced effect with narrowband filters. It appears to be generally accepted that the diffraction pattern comes off the microlenses. However, a lens will likely diffract different wavelengths by differing amounts. In a broadband image you have many different wavelengths that are each diffracted by a slightly different amount, so they all overlay at the sensor in slightly different ways. As the pixels (to a broad assumption) detect these wavelengths roughly equally, the patterns merge and form one continuous 'blob' (highly technical term). Hence you likely get a broadened star but no visible pattern. Once you place a narrowband filter in the imaging train you have one specific wavelength with one specific diffraction pattern, and as there are no other wavelengths reaching the sensor you get to see that diffraction pattern directly. In effect the broadband image is like taking thousands of individual narrowband images, each with a slightly shifted pattern, and overlaying them so they merge. As such you can clearly see the effect in narrowband but not in broadband images, even though it is still there. This should be testable with consecutively broader filters: if you started with a 3nm Ha filter and then repeated the same exposure with a 5nm, 6nm, 9nm, 12nm Ha and so forth, then as you broadened the wavelength range you should see the effect 'fade away'. It might also explain why some people don't see the effect - they are using wider narrowband filters compared to someone who easily sees it in a Chroma/Astrodon 3nm.
  19. I suppose you could put the larger CCD (I'm assuming the Atik?) on the 6 inch and then bin the 460 (I assume) until you get roughly the same "/pixel. Then image the same object in 3nm narrowband; that gives you the calibration for the CCD sensitivity. Then run the same exercise with the two telescopes, again making the "/pixel roughly equivalent, compare the flux of a bright but non-saturated object (a star is probably best) and divide out the response difference between the CCDs. Of course that means multiple strippings of setups and retesting, and imaging time is valuable.
  20. I didn't read the article myself, but this in itself should not be a surprise. I'm assuming the resolution of the test between the 14 inch and the 6 inch was similar overall? The benefit of a larger scope is that, at the same arcsec/pixel, you collect more flux per pixel in the same amount of time. So if you have 0.8"/pixel on the 6", then on a 14" at the same resolution (and assuming the same sensitivity) you'll get more flux on each pixel, and fainter details should show up sooner for less time observed. Usually, though, longer focal length instruments have much smaller "/pixel and spread the light out further, so you don't see this gain from the aperture. I think this is sometimes where the 'more resolution' idea comes from. There is likely a limit in any sky to how well resolved you can make an object (without fancy tech and telescopes at 8000ft anyway). However, you do get to the fainter stuff quicker *at the same "/pixel*, which might make it seem like you are getting higher resolution.
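To make that concrete, a small sketch with assumed apertures, focal lengths and pixel sizes chosen so the two scopes land at almost the same pixel scale (scale in "/pixel = 206.265 x pixel size in microns / focal length in mm):
```python
def pixel_scale(pixel_um, focal_mm):
    """Pixel scale in arcsec/pixel for a given pixel size (um) and focal length (mm)."""
    return 206.265 * pixel_um / focal_mm

# Assumed setups: 6" at 900 mm with 3.8 um pixels vs 14" at 2500 mm with 10.5 um pixels
scopes = [
    {"aperture_mm": 150, "focal_mm": 900, "pixel_um": 3.8},
    {"aperture_mm": 356, "focal_mm": 2500, "pixel_um": 10.5},
]

for s in scopes:
    scale = pixel_scale(s["pixel_um"], s["focal_mm"])
    # Flux per pixel scales with collecting area times the patch of sky each pixel sees
    relative_flux = (s["aperture_mm"] ** 2) * (scale ** 2)
    print(f'{s["aperture_mm"]} mm aperture: {scale:.2f} "/px, relative flux per pixel {relative_flux:.0f}')
```
With these made-up numbers both scopes sit near 0.87"/px, but the 14" puts roughly five to six times more light on each pixel, so the faint stuff appears in a fraction of the integration time.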
  21. However, all of these are going to be dependent on the system being used. The FSQs are well-corrected sealed units, so there is less likelihood that these will show some form of aberration. There are likely a number of factors that come into play. What we do know is that the size of the shadowing depends on the distance to the sensor: the further away, the larger it becomes, but also the more diffuse, as the effect is spread out over a larger area of the CCD. The reason we don't see dust on an objective is that it is so far away it has effectively been 'diffused' to the point of not being visible. On that basis it becomes a question of percentage change, where moving the sensor relative to the dust makes a noticeable difference. If the distance between CCD and dust was 50mm and focusing changed that by 1mm, then the change in the dust shadow would be more significant than with a much larger spacing (say 150mm). It will also depend on the pixel size, as you become more sensitive to such changes as pixels get smaller. Hence it may not be that these effects aren't there in remote setups, just that they aren't noticeable as those setups are designed. More flexible setups may find they are more susceptible to this kind of fluctuation. edit - apologies for the typos, I hate typing on an iPad.
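As a rough geometric sketch of the distance argument (the f-ratio, spacings and mote size are assumed numbers): the out-of-focus shadow of a mote sitting a distance d in front of the sensor in an f/N beam is roughly d/N across, so the same 1mm of focuser travel is a bigger relative change when the dust is close to the sensor.
```python
def shadow_diameter_mm(dust_to_sensor_mm, f_ratio, mote_mm=0.05):
    """Approximate out-of-focus shadow size of a dust mote in a converging f/N beam."""
    return dust_to_sensor_mm / f_ratio + mote_mm

f_ratio = 5.0
for near, far in [(50, 51), (150, 151)]:      # 1 mm of focuser travel at two spacings
    before = shadow_diameter_mm(near, f_ratio)
    after = shadow_diameter_mm(far, f_ratio)
    change = 100 * (after - before) / before
    print(f"dust at ~{near} mm: shadow {before:.2f} -> {after:.2f} mm ({change:.1f}% change)")
```
The absolute change is the same 0.2mm in both cases, but it is a roughly three times larger fractional change for the closer spacing, which is where the flat mismatch would show first.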
  22. It probably depends on which side of the focuser the flattener is. If you have the option of putting the flattener on the telescope side, versus a design where there is a set of lenses much nearer the CCD (e.g. an Edge), then any dust on those lenses may mean you need separate flats when you change the focus position, because you are so much closer to the dust. However, I imagine that in most cases it isn't noticeable if you change focus position several times over one night, as the stacking will likely reject the variations. It only becomes obvious when the flats are done at one position and the lights at another. My general suspicion is that these effects arise when there is some slight motion because something isn't quite locked down as tightly as you think, resulting in a slight shift and a residual 'ring'.
  23. If you want some online tests of the 120/150 then here are some: http://interferometrie.blogspot.co.uk/ Scroll down a bit to get to the ESPRITs.
  24. Have you definitely ruled out a guiding issue by seeing whether the same thing happens when the Lodestar is observing but not sending guiding corrections? Your explanation appears cyclical, which I would not expect to arise from a shift of the Lodestar - I would have thought that would be more 'jumpy', as the stress overcomes the resistance. A cyclical effect implies a cyclical cause, which would point to something gear-related. What are your guiding exposure and mount correction rate?