Everything posted by vlaiv

  1. The point of this is:
     - dithering will tell the mount to randomly move to a new position
     - it will then wait for the mount to settle with guiding.
     Two things can happen if you have high backlash:
     - part of the random move will be "spent" on clearing backlash if the dither is in the direction of the backlash (this can be handled by the dither "size" - you can tell PHD2 how much you want to dither)
     - once guiding resumes - it might not work properly until any residual backlash clears. This will only impact settling time, as the mount waits until guiding picks up and then the exposure starts again.
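The backlash interaction described above can be sketched as a toy model (my own illustration, not PHD2 code - the function name and numbers are assumptions):

```python
# Toy model: how DEC backlash eats part of a dither move.
def effective_dec_move(commanded_px, backlash_px, reversing):
    """Return how far the mount actually moves on the sky in DEC.

    If the dither reverses the direction of the last DEC motion,
    the first `backlash_px` of the command is spent taking up gear
    slack and produces no movement on the sky.
    """
    if not reversing:
        return commanded_px
    return max(0.0, commanded_px - backlash_px)

# A 3 px dither against 5 px of backlash moves the scope not at all -
# guiding then has to clear the remaining slack before it settles.
print(effective_dec_move(3.0, 5.0, reversing=True))   # 0.0
print(effective_dec_move(3.0, 5.0, reversing=False))  # 3.0
```

This is why a larger dither "size" setting helps: the commanded move then exceeds the backlash and some real motion survives.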
  2. PHD2 docs say that you can still use it but might not want to:
  3. No, both are equally false color. The difference is in the color gamut - one is inherently bicolor and the other tricolor. Much more color variety can be produced with three components as opposed to two. This is generally good practice if Ha represents the object well. Most objects have an Ha component everywhere, but there are some objects where Ha is very separated from OIII, and with those objects you will lose out if you use Ha alone as luminance. The best way to tell is to stretch both Ha and OIII as mono images and just look at whether the OIII signal appears where the Ha appears as well - if they match in location (Ha is likely to be much more spread out) - then you can use Ha as luminance. The other way would be to just do it and see if you like the result. It is a myth that you can properly image narrowband with the moon out. Yes, narrowband helps with LP and moonlight - but that simply does not mean that the results will be anywhere near as good as in proper darkness. The narrower the passband of the filter - the more it helps. 7nm is not very narrow.
  4. The point is for it to be random, so that fixed pattern noise does not create recognizable patterns - like walking noise. Dithering only in RA is more likely to cause spread along a single line, even if it is random - and that will create streaking artifacts. Although you guide in one direction only in DEC - you can still use dither. PHD2 should be able to move the mount if you put in backlash compensation; it will just take a bit longer for guiding to settle - so that it again uses up all the backlash and starts guiding on one side only.
  5. Dithering is simply moving the scope a few pixels between each exposure in a random way. It helps spread fixed pattern noise around and results in less noisy images for the same parameters (the effect may or may not be visible to the naked eye, but it is measurable) and is indeed an advised thing to do. You simply tell your guiding software to dither between exposures, and the communication between the guiding and imaging software should do the rest (hold the next exposure until guiding settles down).
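What "random way" means in practice can be sketched like this (parameter names are mine, not PHD2's - just an illustration of the idea):

```python
# Sketch: a dither is a small random offset (in guide-camera pixels)
# applied between exposures; each frame lands on slightly different
# sensor pixels, so fixed pattern noise averages out in the stack.
import random

def dither_offsets(n_frames, max_px=3.0, seed=1):
    """Generate a random (RA, DEC) pixel offset for each frame."""
    rng = random.Random(seed)
    return [(rng.uniform(-max_px, max_px),
             rng.uniform(-max_px, max_px)) for _ in range(n_frames)]

for dx, dy in dither_offsets(4):
    print(f"move RA {dx:+.2f} px, DEC {dy:+.2f} px, settle, expose")
```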
  6. Flats themselves won't be a problem for scope as they are flooded with light and small additional light leak is not going to impact flat itself much (as long as good flat signal is shot in short amount of time - not enough for light leak to build up). Good/strong flat panel solves this. What is more problematic is: - darks shot on the scope. Solution here is to do dark library with camera taken off scope and protected from light ingress (cover placed on camera and shielded from IR with something - people use aluminum foil, I just place it face down on wooden desk after putting rubber/plastic cover on). - Lights. These are shot at night, but there are small sources of light that can find the way into the scope. Try to do some LP management. If you have laptop next to the scope - make sure it is facing away from telescope. Turn down brightness of the screen if you can. Don't put anything highly reflective next to laptop screen - wear dark clothes if you sit at the laptop during imaging. Cover any leds that might be shining - like power adapter leds, mount led - whatever creates light next to the scope. If you can - shield yourself from local lights. Like house or street lights.
  7. An image does not get noisier with exposure time - only less noisy. This is true for any sensor. DSOs can be imaged with planetary cameras - same as with any other camera - except that the sensor size is very small, which limits their use. This image was taken with an ASI185MC guide / planetary camera. The camera is 1944x1224 and I was using a long focal length instrument, so I had to do a mosaic of 2x2 panels - a seam can be noticed in the image, and the total exposure divided over 4 panels was not enough to go very deep - but I did manage to get NGC7331 and the small neighboring galaxies.
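The reason exposure only makes things less noisy is that signal grows linearly with time while shot noise grows as its square root. A minimal sketch (the rates and read noise below are illustrative assumptions, not specs of any camera):

```python
# SNR of a single exposure under a simple shot-noise + read-noise model:
#   signal = rate * t,  noise = sqrt(signal + sky + read_noise^2)
import math

def snr(signal_rate, sky_rate, read_noise, t):
    """Simple per-pixel SNR for one exposure of t seconds."""
    signal = signal_rate * t
    noise = math.sqrt(signal + sky_rate * t + read_noise ** 2)
    return signal / noise

print(snr(1.0, 5.0, 2.0, 30))    # short sub
print(snr(1.0, 5.0, 2.0, 300))   # 10x longer: roughly sqrt(10) x better
```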
  8. If it talks like a duck and walks like a duck .... Just make sure you respond the same thing over USB as is expected from ZWO EFW
  9. Sooner or later everyone wants to do a panel of M31. It starts with 2 panes, then it grows with every new season. By the way - @PatrickO - an alternative would be to use a shorter FL lens as well. I would still use something like a 2x2 mosaic and bin the data you get from the camera to make the pixels larger. You can get an old M42 lens like the Helios / Biotar 58mm very cheap - it would have a very usable FOV at that FL (about 5 degrees - that would frame M31 nicely), and stopped down to F/4 it would not have any aberrations. Here is what M31 looks like with 4.8um pixel size and a Samyang 85mm lens (this is actually a crop - the FOV is very nice with this combination and the ASI178). Again - I had to bin the pixels as the 2.4um pixels of the ASI178 are way too small for this lens, and even when binned x2 - the image still is not as sharp as it could be (in part because I left the lens wide open at F/1.4, so there is a lot of chromatic aberration - btw, this is the green channel only, rendered as a mono image - stretched very hard just to see how much data there really is).
  10. From the ZWO website: Maybe you could use HID sniffing software to see what sort of communication the driver has with the device. It must be pretty generic for a HID device.
  11. I'm really interested in this part. How do you plan on making the filter drawer automatic? In any case, since I can't help much with the original query, I can only suggest an alternative - which might be the easiest way to reverse engineer the ZWO EFW protocol (which I'm betting is some sort of standard). Look up the Linux / INDI drivers for the ZWO electronic filter wheel - they have to communicate with the EFW itself over USB, and you will be able to read the code and figure out the protocol without the need to disassemble anything (which might be against the terms of use anyway).
  12. So is bigger sensor Well, at least if you don't subscribe to that "Time is money" thing :D Mosaics do wonders ...
  13. I would phrase it like this: you need a bigger sensor. This holds true regardless of the target, as lenses are usually not sharp enough to justify such small pixels and resolution will suffer. Since this sensor is 1304 x 976 - it will produce images in the 640 x 480 px range, and that is very low res by today's standards.
  14. A couple of ways:
      - You can use arc seconds per pixel and camera resolution. With the 3.75um pixel size of the ASI224 and 500mm FL you get 206.265 * 3.75 / 500 = ~1.547"/px (which is, by the way - waaay oversampled for any lens). Now, 1304 x 976px at that resolution gives 1304 * 1.547 x 976 * 1.547 = ~2017" x ~1510", which converts to 2017" / 60 = 33.62 arc minutes by 1510" / 60 = 25.16 arc minutes (dividing by another 60 would give you degrees).
      - The second way to calculate it is trigonometry - 2 * arctan(half_width / focal_length). The width of the sensor is 4.9mm and the focal length is 500mm, so you need 2 * arctan(2.45mm / 500mm). Arctan of 2.45 / 500 is 0.2807 degrees; 2 * 0.2807 degrees = 0.5615 degrees, or if you want that in arc minutes - multiply by 60 to get 0.5615 * 60 = 33.69 arc minutes.
      Using the first method we get 33.62 arc minutes and using the second we get 33.69 arc minutes. That is simply because the width of the sensor is not precisely 4.9mm and the pixel size is not precisely 3.75um - otherwise we would have an exact match: 3.75 * 1304 = 4890um, not 4900um (or 4.89 vs 4.9 in mm), so there is a small difference due to rounding of the sensor specifications. And of course - you can use an online tool if you don't want to remember all the math involved (which is just basic trigonometry really).
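Both calculations above fit in a few lines of code (sensor specs taken from the post: ASI224, 3.75um pixels, 1304x976, ~4.9mm wide, 500mm FL):

```python
# FOV two ways: plate scale * pixel count, and trigonometry.
import math

PIXEL_UM, FL_MM = 3.75, 500.0
W_PX, H_PX = 1304, 976

# Method 1: plate scale in arcsec/pixel, then multiply by pixel count.
scale = 206.265 * PIXEL_UM / FL_MM            # arcsec per pixel
fov_w_arcmin = W_PX * scale / 60.0
fov_h_arcmin = H_PX * scale / 60.0

# Method 2: 2 * arctan(half_width / focal_length) on the 4.9mm width.
fov_w_trig = 2 * math.degrees(math.atan(2.45 / FL_MM)) * 60.0

print(f"{scale:.3f} arcsec/px")                            # ~1.547
print(f"{fov_w_arcmin:.2f} x {fov_h_arcmin:.2f} arcmin")   # ~33.62 x 25.16
print(f"{fov_w_trig:.2f} arcmin")  # ~33.69 (4.9 vs 4.89mm rounding)
```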
  15. Yes, I think that the Baader focuser also has a threaded connection. It is just a matter of thread dimensions and other accessories. I already have an M68 thread on the focuser of one of my other scopes, but it can't rotate. The above focuser comes with an M63 thread and can rotate. I figured that I would be able to swap things between the two scopes with this accessory: https://www.teleskop-express.de/shop/product_info.php/info/p9781_TS-Optics-360--Rotation---Thread-Adapter---M63-to-M68--M54-and-2-.html Now I can exchange things between scopes and have threaded attachment and rotation on both my imaging scopes - the RC8 and the 80mm F/6 APO.
  16. Ideally you would want: lights and matching darks (which you already have), and flats and matching flat darks. You don't seem to have flat darks - but from what I've seen, your bias subs are in fact of the same exposure as the flats, right? That would make them flat darks. Even if they are not fully matched in exposure length - it is better to use bias as flat darks than to skip them completely.
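The calibration chain this implies can be sketched on single pixel values (real software does this per pixel over whole frames; the numbers are purely illustrative):

```python
# calibrated = (light - dark) / normalized(flat - flat_dark)
def calibrate(light, dark, flat, flat_dark, flat_mean):
    """Calibrate one pixel value using matching darks and flat darks."""
    master_flat = (flat - flat_dark) / flat_mean  # ~1.0 at full illumination
    return (light - dark) / master_flat

# Vignetted corner: the flat shows 80% illumination there, so the
# corner pixel gets boosted back to its true value.
print(calibrate(light=850.0, dark=50.0, flat=8050.0,
                flat_dark=50.0, flat_mean=10000.0))  # 1000.0
```

Note that if the flat dark (or bias standing in for it) is skipped, the offset stays inside the master flat and the correction factor comes out wrong - which is exactly the over/under-correction discussed in the surrounding posts.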
  17. I solved that issue on mine with this: https://www.teleskop-express.de/shop/product_info.php/info/p5769_TS-Optics-2-5--Rack-and-Pinion-Focuser---supports-up-to-6-kg---travel-95-mm.html
  18. Actually - hold on a second. Did you use the same bias you shot for the flats - for your lights, instead of darks? If you do that - this happens: Now, I'm not sure if it is down to the flats or if it has something to do with the bias, but something strange is going on here. It is not a single concentric pattern you are getting - but it looks like every sub has some of it, and then after stacking multiple concentric rings form instead of just one dark ring. Ok, so yes - do try dedicated darks so we can see if that is the problem. After that - I would look into that coat and how well it really shields the scope from light leaks.
  19. Here is what I would recommend. Flats look ok but you are missing darks. You need matching darks for the lights in order to get good calibration - as is, you have over-correction because the dark signal is not removed from the image. When I do regular calibration this happens: The second thing I would recommend is to simply skip median stacking and use a regular average for everything. Just to diagnose things, do the following:
      - get matching darks for your light exposure. Luckily - you can do this at any time, as long as you match the conditions on the night of shooting (gain, offset, temperature).
      - turn off background normalization
      - use a simple average for all stacking (flats, darks, bias - all of it)
      See what sort of result you get.
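Why a plain average is a sensible diagnostic baseline: averaging N subs keeps the signal and shrinks random noise by roughly sqrt(N). A toy sketch with simulated pixel values (all numbers are assumptions for illustration):

```python
# Average stacking: noise drops as 1/sqrt(N) while signal is preserved.
import random
import statistics

random.seed(0)
TRUE_VALUE = 100.0
# 16 simulated subs, each 1000 pixels of signal + gaussian noise (sigma=10)
subs = [[TRUE_VALUE + random.gauss(0, 10) for _ in range(1000)]
        for _ in range(16)]

# Average stack: mean of each pixel position across all subs.
stack = [sum(pix) / len(pix) for pix in zip(*subs)]

print(statistics.stdev(subs[0]))  # single sub noise, ~10
print(statistics.stdev(stack))    # 16 subs -> noise / 4, ~2.5
```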
  20. Indeed, it would be good to see both raw light frames and raw calibration frames (one of each - whatever you are using: flats, darks, bias, etc ...). The image you presented has some clipping so things can't be seen clearly - but it looks like you have some mismatch in your flats - maybe something moved between lights and flats, or you changed focus position. Here is the suspect part of the image. The background appears to have an "emboss" effect on it - this usually happens with camera rotation, for example, or a shift of some sort.
  21. Looking at your images and camera orientation - I think that the main culprit for the star trailing is not polar alignment but rather the periodic error of the mount. From the first image I conclude that you tried to orient your FOV so that DEC is up/down on the sensor and RA is left/right. Not sure how accurately you managed that, but if you look at your trailing - it is mostly in the left/right direction - which would mean it is in RA, and that means - periodic error. Polar alignment error creates trailing in DEC, while periodic error creates trailing in RA. You really need to take about 8 minutes of consecutive 30s subs to get an idea of how your periodic error behaves (I think the EQ6 mount has a worm period of about 480s give or take - that is 8 minutes). Some of those subs will have trailing, others won't, because periodic error is not uniform in nature. What you think is due to better balance - could just as well be due to a "calm" part of the periodic error curve.
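The "some subs trail, some don't" behaviour falls out of a simple sinusoidal PE model (the amplitude here is an assumption, not measured EQ6 data):

```python
# Toy periodic error model: PE as a sine with the worm period.
# Drift *within* a 30s sub depends on where in the cycle it lands.
import math

AMPLITUDE = 15.0   # semi-amplitude in arcsec (assumed)
PERIOD = 480.0     # worm period in seconds (~EQ6)

def pe(t):
    """RA position error in arcsec at time t."""
    return AMPLITUDE * math.sin(2 * math.pi * t / PERIOD)

def drift_in_sub(start, length=30.0):
    """How much RA drifts during one sub starting at time `start`."""
    return abs(pe(start + length) - pe(start))

print(round(drift_in_sub(0.0), 2))    # steep part of the curve: big drift
print(round(drift_in_sub(105.0), 2))  # sub straddling the peak: ~none
```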
  22. They actually correct things. They correct for coma and field curvature. Here are a couple of spot diagrams of the Starizona corrector vs no corrector, and the Starizona vs a regular one (Celestron F/6.3 reducer): It might not give pinpoint stars, but at least they are round up to say 27mm diameter (which is APS-C size +/- a mm).
  23. I don't have any experience with SCTs and their reducers / correctors. The above one seems to correct up to APS-C size - not sure if the claims are correct or how good it is. Then there is this (I just ran into it after a bit of research on what can be used with SCTs to correct the field): https://www.teleskop-express.de/shop/product_info.php/info/p732_Baader-Alan-Gee-f-5-9-Telecompressor-Mark-II-for-Schmidt-Cassegrains.html However, here is an interesting answer from the Baader team: It seems that no SCT corrector will do the job properly (with the exception of that Starizona one, which is expensive) and that the EdgeHD is the way to go for good field definition if one wants to use an SCT.
  24. https://www.highpointscientific.com/starizona-sct-corrector-4-0-63x-reducer-and-coma-corrector-sctcorr-4-1
  25. It really depends how much post processing you want to do, or rather - what kind. From the DSS technical documentation: I personally prefer a third option - not implemented in DSS - which deals with both background LP and transparency differences (the version that I use also deals with gradients) - best described as a linear fit (the above is a "constant fit").
      - Per channel background will normalize frames against the reference frame - keeping the color balance of the reference frame.
      - The RGB one will normalize frames against the reference - but will also neutralize the background (which might not be a good thing since it skews other colors if not done carefully - and the above warns about that, possible loss of saturation being an issue).
      Go with Per channel as your first choice - you can always restack with the other option and see the differences for yourself.
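The "linear fit" normalization mentioned above can be sketched as ordinary least squares between a frame and the reference: the slope absorbs transparency differences, the intercept absorbs the background LP offset (toy data, not real frames - this is my illustration of the idea, not DSS code):

```python
# Linear fit normalization: find a, b so that a*frame + b best
# matches the reference frame (least squares on pixel values).
def linear_fit(frame, reference):
    n = len(frame)
    mx = sum(frame) / n
    my = sum(reference) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(frame, reference))
    var = sum((x - mx) ** 2 for x in frame)
    a = cov / var
    b = my - a * mx
    return a, b

# Frame = reference scaled by 0.8 (worse transparency) + 30 (more LP):
reference = [10.0, 50.0, 200.0, 1000.0]
frame = [0.8 * v + 30.0 for v in reference]
a, b = linear_fit(frame, reference)
print(round(a, 3), round(b, 3))  # 1.25 -37.5 (inverts the 0.8 and +30)
```

A "constant fit" only solves for b, so it can match backgrounds but not transparency; solving for both is what makes the linear fit more general.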