Everything posted by symmetal

  1. Sorry Tony, I missed your posting. They are 3D printed tube rings in PLA+, which I designed in Blender, along with the dew shield. They allow the scope to sit upright or on its side on a table without risk of it toppling. They are 2cm thick and push against the ends of the dovetail bars, with the aluminium rods holding them together. The inside rim is lined with 1mm thick neoprene sheet to grip the tube. An Ender 5 Plus printer is just big enough to print them. 🙂 I realised afterwards that I could use the holes to attach the dew shield with suitable 3D printed plugs, which works well, though I have since redesigned the dew shield to attach to the two lugs on either side of the tube front that the Celestron tube end cap fits onto, so it's just push the dew shield on and twist to latch it. It does have a 3D printed end cap too. 🙂 I made a conical front cover as well, attaching the same way, which fits over the camera when the scope's indoors. The black tape is just to protect the paint on the tube ends from being rubbed. One of the mounting attachments has the cutout for the locating pin to ride in. The third attachment at the bottom (without the cutout) is just to stop the cover, or dew shield, from flopping forward in use, as all three attachments rest against the end of the tube. It took a lot of trial-and-error prints to get the attachment cutout location correct. Alan
  2. No problem. 😊 Yes, I forgot to mention it's worth clearing out EQMod's pointing data before each session, although selecting Dialog Mode, as I mentioned previously, helps to mitigate this. With platesolving, the pointing data saved in EQMod is more of a hindrance than a help. It's a pity there isn't an option to disable it. I went over to using GSS Server a year or so ago instead of EQMod and it works fine without these pointing-model slewing problems. It also gives a nice 3D picture of where your scope is looking. 😁 Alan
  3. You can just run the database exe file from wherever it was downloaded to. As long as you've installed ASTAP, when you run the database exe it will ask where you want the databases installed, and will fill in the ASTAP directory, C:\Program Files\ASTAP, by default, so just hit return. You don't need to manually run anything in the ASTAP folder once it's all installed, as SGP should run it automatically when you click Solve and Sync. Glad it helped. 🙂 You don't need to see the target visually, just as long as SGP can get a usable image. In the Control Panel Plate Solve tab, click the Solve & Sync button and it will take an exposure with the length and binning as set. I recommend binning 2x2 so that a faster exposure can be used. Depending on your scope and the filter used, the exposure will likely be around 2 to 5 secs for LRGB or around 30s for narrowband. When solved, the current location will be displayed along with the camera rotation angle. If you want the camera image long side to be aligned with RA then you want the camera angle to be 0 or 180 degrees (it will of course rotate by 180 after a meridian flip anyway, so being one or the other is fine). If it reports say 352 degrees then you need to rotate the camera by 8 degrees, or just guess it, and plate solve again to see how much closer you are. The direction you need to rotate the camera to increase the rotation reading depends on what type of scope and camera you have, so it's best to do a small rotation at first and see if the resultant plate-solved camera angle has increased or decreased. Then you'll know for the future. 🙂 When you select Centre Target, SGP will keep running Centring up to the number of times specified, or until the image centre is less than 10 pixels from the correct location, using the settings shown, though you can set them however you wish. Ignore the rotator error box setting, as SGP will too, once you've told it to as I mentioned in my previous post. Alan
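The camera-angle adjustment described above is just modular arithmetic. As a sketch (this helper is purely illustrative, not part of SGP), the rotation needed to bring a solved camera angle to the nearest of 0 or 180 degrees can be computed like this:

```python
# Hypothetical helper: given the camera angle reported by a plate solve,
# find the smallest rotation that aligns the sensor's long side with RA
# (i.e. reaches 0 or 180 degrees, which are equivalent targets).

def rotation_to_ra_aligned(solved_angle_deg: float) -> float:
    """Return the signed rotation in (-90, +90] that brings the solved
    camera angle to the nearest of 0 or 180 degrees."""
    offset = solved_angle_deg % 180.0   # 0 and 180 are the same target
    # Rotate whichever way is shorter.
    return -offset if offset <= 90.0 else 180.0 - offset

print(rotation_to_ra_aligned(352.0))  # the 352-degree reading above needs +8
```

The sign convention is arbitrary, which matches the advice in the post: the physical direction that increases the reading depends on your scope and camera, so test with a small rotation first.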
  4. A little bug is that the Framing and Mosaic tool by default expects a rotator to be present even if one hasn't been specified in the equipment list. When you click Create Sequence, on the next panel deselect the Rotate or Validate Camera Angle box as below. Your sequence will then start without the auto-rotator error you've been getting. An unfortunate side effect is that it will then not display the camera rotation angle when it's centring, so before running the sequence do a manual plate solve, which will display the camera angle. I always manually slew to the first target and do a plate solve just to check the rotation angle and adjust it if necessary. I assume you're using PlateSolve2 as the SGP plate solver. This can be a bit troublesome if you're quite a way off target or haven't edited its settings to make it more tolerant of star sizes. I suggest you use ASTAP as the plate solver. This is an option in the Control Panel's plate solve tab. You need to install ASTAP from here first. Just download the two files indicated to the same directory and run the astap_setup.exe file. I assume you're using 64 bit Windows. If you use the suggested default directory then SGP will find it right away. This should eliminate your plate solve fail problems. Ensure you've set the correct camera scale (arcsecs / pixel) in the Control Panel camera tab (below), or the plate solvers will likely fail. This may be why PlateSolve2 is failing, but I suggest you use ASTAP anyway. Hope this helps. 🙂 Alan
  5. In EQMod (or EQAscom) have you set your sync mode to 'Dialog Mode' and not 'Append On Sync'? Append on Sync can cause behaviour similar to what you have, though it can happen anywhere and not just near the meridian. I'm curious as to why you don't let SGP do the slewing and centring, as the Mosaic and Framing Wizard makes it very easy to do. As it uses EQAscom syncs like CdC does, you can see if the slewing behaviour is any different and so narrow down which program's causing the error. Alan
  6. Set the exposure to 5ms to help 'freeze' the seeing, which will also allow a higher framerate; 20ms is too long really. Set the gain very high. It will look awfully noisy on the preview but don't worry about that, as stacking hundreds of frames will fix it. Using the gain, try to get the histogram max value between 50% and 75%, but it doesn't matter if it's lower. Set camera 'high speed mode' to on. This will almost double your framerate. High speed captures use 9 or 10 bits resolution rather than the camera default of 12 bits. As you're recording in 8 bit that won't affect the quality. Stacking will increase your actual 8 bit resolution to around 12 bits anyway. For this reason don't select 16 bit recording, as it'll have no effect on image quality but will reduce your frame rate. The ASI120 mini is USB2 only so I'm not sure what your max framerate will be, but 200fps is normal using USB3 with a planet-sized ROI. The brightness setting is the camera offset. Setting it to 0 will likely clip your sky background, so increase it until the histogram peak (the sky background) is just clear of the left edge of the histogram. Using a high gain it may not be easy to see through all the noise, so reduce the gain while checking it. Duration can really be up to 2 minutes for Jupiter, due to planet rotation, and longer if you don't have perfect seeing or a large scope, so 3 or 4 mins should also be fine and give a better chance of getting enough good frames to stack. It's not in your list, but ensure gamma is off in the camera settings. Having it on will significantly reduce your frame rate. Alan
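The claim that stacking recovers bit depth can be sketched with an idealised back-of-envelope formula (an assumption for illustration, not from the post: with uncorrelated noise, averaging N frames raises SNR by √N, which is roughly 0.5·log2(N) extra effective bits):

```python
import math

# Idealised estimate of effective bit depth after stacking N frames.
# Assumes random, uncorrelated noise; real gains are somewhat lower.
def effective_bits(capture_bits: int, n_frames: int) -> float:
    return capture_bits + 0.5 * math.log2(n_frames)

print(effective_bits(8, 256))  # 256 stacked 8-bit frames -> 12.0 bits
```

This lines up with the remark above that stacking hundreds of 8-bit frames gets you to around 12 bits, so 16-bit recording buys nothing here.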
  7. 50m mains extension reels are very common. There's no problem if you take basic safety precautions with mains power outdoors. As the power you're consuming is low as far as mains power is concerned that won't be an issue either. Alan
  8. It looks more complicated than it is. If you want to read the theory behind it then these two extensive threads should explain all. And it's vlaiv approved. 🙂 https://stargazerslounge.com/topic/393694-swamp-read-noise/ https://stargazerslounge.com/topic/391776-gain-on-533mc-pro-with-fast-lens/ If you're using the default offset 50 then the table as shown is fine, and you're most likely using gain 100 all the time, to get the benefit of HCG, so that one line entry is all you need be concerned with. The 717 ADU is just a general figure to aim for (to avoid too much under or over exposure); it needn't be exact. When the Moon's around, the sky background will be brighter, hence more sky background noise, and you will reach these read-noise-swamping ADU values at much shorter exposures, but it's best to keep using the same exposure you determined under optimum conditions, to save keeping track of different exposures all the time. You can quickly determine your optimum exposures, rather than trying test exposures, by examining one of your raw subs from each filter. I initially thought you had an OSC ASI2600, which is why I mentioned the OSC methods, but as you have a mono camera it's easier. First look at your master bias and read off its average value. The camera offset contributes most of this bias, and you want to remove this bias, containing the offset, from your calculations and just determine the ADU values added by your sky background. On my camera this is 503 ADU. Note that 500 of this figure is purely the offset, as adding 1 offset value adds 10 ADU to your image. Just using 10 times your offset would be pretty close to this bias figure anyway. Then read the average sky background ADU value displayed when moving your mouse over the image, as any capture program should display. Say a 10 min exposure gives a 1000 ADU sky background: subtract the bias signal, 1000 - 503 = 497; subtract the bias from the optimum swamping ADU value, 717 - 503 = 214. Optimum exposure is then 214 / 497 * 10 mins = 4.3 mins. These calculations are mainly useful for LRGB imaging. For narrowband you will likely not achieve these optimum exposure ADU values with 10 mins, particularly with Oiii and Sii, unless you're in a light polluted sky, so 10 mins is possibly worth sticking to. For LRGB much shorter exposures can be used. As I mentioned, each of R, G and B will have an optimum exposure, but just choose one exposure for all 3. Perhaps use blue to determine what that is. Optimum luminance exposure will be around 1/3 of this length. If you wish I can give you a copy of the Excel spreadsheet I posted above, which also includes the calculations to do the above sums. You can tailor it to any camera by filling in the table from the ZWO data sheets. 🙂 Alan
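The worked example above can be put in a small script. The 503 ADU bias and 717 ADU target are the figures quoted in the post for that particular camera, gain and offset; substitute your own values:

```python
# Sketch of the exposure scaling described above: sky signal grows
# linearly with exposure time, so a single test frame is enough to
# scale to the read-noise-swamping target.
# Defaults are the post's figures (bias ~503 ADU, target 717 ADU at
# gain 100 / offset 50) -- assumptions for any other camera.

def optimum_exposure(test_exposure_min: float, test_sky_adu: float,
                     bias_adu: float = 503.0,
                     target_adu: float = 717.0) -> float:
    sky_signal = test_sky_adu - bias_adu      # bias/offset removed
    target_signal = target_adu - bias_adu     # same for the target
    return target_signal / sky_signal * test_exposure_min

# 10 min test frame showing a 1000 ADU sky background:
print(round(optimum_exposure(10, 1000), 1))  # -> 4.3 (minutes)
```

As noted above, this is mainly useful for LRGB; narrowband subs will often never reach the target ADU in a practical exposure length.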
  9. That's good to hear. 🙂 Alan
  10. The optimum exposure time is reached when the camera read noise is swamped by the sky background noise. A sky background noise of 5 x the read noise makes the read noise contribution to your image negligible and it can be ignored. Once the read noise contribution is negligible there is no advantage in exposing for any longer, and it's best to start another sub. Exposing longer than necessary makes any tracking errors more noticeable, along with more possible star bloating. I've previously made a chart for the ASI2600, which I assume you're using as you say ASI200 in your first post. 🙂 You just need to expose until the sky background ADU level, as shown by most capture programs when you hover over the image background, is at least the ADU value indicated. I've made it for different gain settings, though with modern ASI cameras, where HCG mode is only enabled above certain gain settings, fixed gains are more generally used. If you're using the default gain 100 and offset 50, then a sky background of around 717 ADU determines how long you need to expose for to swamp the read noise by a factor of 5. This ADU value is independent of the focal ratio or filter used, as a slower scope will need to expose for longer anyway to reach this sky background ADU value, likewise with narrowband filters. In theory separate optimum exposures are needed for the R, G and B filters, but that's too much hassle, so choose one exposure for all, perhaps based on blue. Then make the luminance exposure about 1/3 of the colour exposures. With an OSC you can only use one exposure anyway, and the median value of the image, usually displayed in the image parameters, is generally a good figure to use as the sky background ADU value, unless you want to examine the separate RGB pixels. For the test exposure to determine your optimum exposure, choose a sky with little nebulosity and no Moon, and use this for all your future imaging with this camera and gain setting.
🙂 I forgot to mention that higher light pollution will mean shorter exposures before this sky ADU is reached, compared to a dark site, so if you move to a site with different light pollution you'll need to redo your test exposures. Oh and great image by the way. 😊 Alan
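For anyone wanting to derive a target ADU for a different camera or gain, here is a rough sketch of the 5x swamping criterion. The read noise and e-/ADU figures below are assumptions, loosely in the ballpark of an ASI2600 at gain 100, not values quoted in the post:

```python
# The 5x swamping criterion: sky shot noise = sqrt(sky_electrons), so
# requiring shot noise >= swamp_factor * read_noise means
# sky_electrons >= (swamp_factor * read_noise)^2. Convert to ADU and
# add the bias level the capture program will show on top.

def target_sky_adu(read_noise_e: float, e_per_adu: float,
                   bias_adu: float, swamp_factor: float = 5.0) -> float:
    sky_electrons = (swamp_factor * read_noise_e) ** 2
    return bias_adu + sky_electrons / e_per_adu

# Assumed figures: 1.5 e- read noise, 0.25 e-/ADU, 503 ADU bias.
print(round(target_sky_adu(1.5, 0.25, 503)))  # -> 728, near the 717 quoted
```

With these assumed numbers the result lands close to the 717 ADU figure in the post; the exact value depends on the camera's measured read noise and gain from the ZWO data sheets.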
  11. Inductors used in DC power converters like buck converters can give a high pitched whine if their internal windings are not wound tightly enough, or they're not anchored to the PCB properly. They try to vibrate at their operating frequency, generally around 100kHz, and a resonant sub-harmonic of the PCB or coil structure can create a lower pitch noise in the audible spectrum. It's likely the camera has one or more of these DC converters to power the sensor and interface circuitry from the main 12V input. It's usually just a nuisance, but if it's getting louder it implies something is getting looser. As you've had it apart to change the fan, while it's powered can you push against any of the exposed PCBs or large components with something non-conducting, like a cotton bud or cocktail stick, to see if the noise changes or stops, and so isolate the cause? It may just be that a screw holding a PCB with inductors mounted has come loose, and the PCB itself is vibrating. Alan
  12. Autostakkert is generally the preferred choice for stacking videos to create a single image. By default it loads only uncompressed videos like .ser or .avi from dedicated astro cameras. If your video is from a DSLR, usually mp4 compressed, then you'll need to install ffmpeg.exe into the Autostakkert folder. I've given details on how to do that in the answer to this query here. You can then load pretty much any video format into Autostakkert directly. Alan
  13. Thanks glafnazur. 😊 You spent a lot of time checking for other 'horned' stars and it looks to be just an optical effect at the frame edges. Pity. Most odd that it's only reddish stars giving this blue effect though. I'll check my other images to see if they also show this. Maybe it's the hazy conditions that have shown it up. Yes, I'm very impressed with the edge star shapes with the RASA 11, especially with a full frame sensor, once you've got the tilt/spacing set up, which is rather critical. Alan
  14. Thanks very much. 😊 I put off getting Pixinsight for many years, but gave in when BlurXTerminator was released, and once you follow some tutorial videos, its way of working is not so strange. 🙂 Alan
  15. Thanks Adam. 😊 Binning the image using IntegerResample in PI helps greatly with the noise such that 1 hour total exposure is really all that's needed for NB, especially Ha. I only bin it to keep the image size down and to stop PI moaning that the 'resulting image will be very big. Do you wish to continue?', especially with mosaics. 🙂 A full frame sensor still gives a big image even when binned. There was a report on CN showing that binning in the camera with CMOS cameras gives significantly softer images than using software binning later. I'm surprised that the RASA 11 doesn't seem to show any significant gradients on any filter, even when a partial Moon is around. My FLT98 images always had small gradients even on moonless nights. Yes, the bird is aiming for its parent star, responsible for its existence. 😉
  16. Taken over two and a bit nights at the beginning of the month, this is a two panel mosaic taken with a RASA 11 v2 and ASI6200MM camera on an EQ8. Conditions weren't great, with a haze visible below about 30 degrees elevation, so there is a bit more flaring than normal on some stars. I had to discard many frames where it was too bad, so this is around 60 to 90 min per filter per panel. Binned 2x2 and processed in PI and PS. A mixed SHO and Foraxx palette was used. I actually did no background extraction, as efforts with ABE and DBE made it look worse, so I'm pleased with how it turned out. PhotometricMosaic did a good job on blending the panels. It was going to be a three panel mosaic to include Sh 2-170, the 'Little Rosette Nebula', which forms the dot of the question mark, but after four hours of hazy images I gave up and had to leave it out. Central crop: just creeping into frame at the top is a planetary nebula labelled as PN A66 1 in Aladin's Simbad. Towards the top right of open cluster NGC 7762 is another unnamed cluster centred between two bright yellow and blue stars. Above this is a star 'with blue horns'. Aladin has the star labelled as V* V712 Cep, a long period variable, but no horns are visible near it. A mystery. 🤔 Alan
  17. DBE (Dynamic Background Extraction) would normally be used before you stretch the image, placing sample points around the edge of the image on areas you want to be a neutral dark grey. Whether this works well on stretched images I don't know. You will need to adjust the 'Tolerance' and 'Minimum Sample Weight' in order to place samples on highly coloured background areas. Don't forget to select subtraction as the 'Target Image Correction' mode. Don't use auto-generated samples, which creates a sample grid over the whole image, but place the samples manually. As it's a starless image you can make the 'Default Sample Radius' higher, like 15 to 20 pixels. Image centre is looking good by the way. 🙂 Alan
  18. Looking at your photos and diagrams I agree with your last post on the DB9 wiring. 🙂 The unused red and blue wires on the RJ12 cable I would desolder, cut off the bare wire ends and insulate with tape or preferably heatshrink, particularly if they are connected to 5V or ground on the RJ45. It may be worth shortening this cable anyway, as it's likely far longer than it needs to be. I said a couple of posts ago that the stepper motor RJ45 pinouts for your original adapter and the temperature adapter were different, but that's not right as they are the same. Not sure how I read that wrong, but that sorts out the confusion I had. 😁 The RJ45 has +5V and Gnd pins (for the temp probe) so it's lucky you insulated the unused pins in your working adapter. The original diagram says N/C on those pins, which just means no connection is needed for the adapter, not that they are unconnected on the controller side of the RJ45. Alan
  19. The pin numbering of RJ11 and RJ12 connectors is a bit of a minefield, as the plug pin numbers are the reverse of the socket numbers. 😬 If you look on the internet you'll find both number orders applied to the plugs, as some people took the socket numbers and assumed they were the same as the plug, and vice-versa, which would seem logical. I would take the lid off your temperature probe adapter and see which 4 pins of the RJ12 socket are actually wired, and then see which colours on the cable they correspond to when your RJ12 cable is plugged in. Using the RJ12 pin connection from the diagram, rather than its pin number which may be wrong, here's what your adapter cable should be like:
RJ12 Coil 2B to DB9 pin 1
RJ12 Coil 2A to DB9 pin 2
RJ12 Coil 1B to DB9 pin 3
RJ12 Coil 1A to DB9 pin 4
It's not critical if you get the wiring incorrect. The worst that will happen is the stepper won't move at all (as the coils will effectively be open circuit), or it turns in the wrong direction if one of the coil pairs is reversed. If both coil pairs are reversed it will work as normal. 😉 Alan
  20. You can get these solderless DB9 connectors, but soldered would be preferable if you're well trained 😁, as it's one less connection in the chain. Alan
  21. Just check it's RJ12 (6P6C) in its spec similar to this, as a 'standard' phone extension cable may be RJ11 (6P4C). Alan
  22. Hi Adam, In your original post you were looking for the pinout of the adapter you had in your Optec box, which you thought was RJ45 to DB9 but turned out to be an RJ11 to DB9. Is it actually an RJ12? They are both the same physical connector, but the RJ12 has all six pins installed while the RJ11 only has the centre four pins, with the outer pins blank. Is this perhaps the adapter that works plugged into the temp probe box, using an RJ12 to RJ12 cable to connect to the DB9 focuser? The thing that confuses me, though, is that the pins used for the focus motor in the RJ45 pinout on the temp probe adapter are different from the focus motor pins of the RJ45 adapter you newly wired. If the temp probe adapter is to be placed between the controller and the focus motor, shouldn't the RJ45 pinout be the same as far as the motor pins are concerned? Or is the RJ45 on the newly wired adapter connected to a different RJ45 port on the controller? 🤔 Personally, instead of an RJ12 to DB9 adapter, if I didn't have one, I would just get an RJ12 to RJ12 phone cable, which are cheap, cut off one end and solder a DB9 connector in its place to connect to the focuser. Alan
  23. Very impressive. 😊 I like the white pillars behind the yellow ones with the blue 'sky' in the background. Personally I don't think 'correct' RGB stars actually improve the image as they tend to imply that the background colours are real. Alan
  24. That's not bad for just an hour. 🙂 For comparison here's a similar view in H-alpha to see if yours adds anything different. Note the RASA in NB produces circular diffraction rings on bright stars if a semicircular cable routing is used, the ring spacing depending on wavelength. What H-beta did you use Steve? Alan
  25. Astronomik do an H-beta filter in their visual filter range, at a reasonable price, with a pass band of around 14nm, so I thought it could be interesting to try it on the RASA to settle the argument 😁 as it should capture the fainter data reasonably quickly. At that pass band the fast optics may not be such a problem, compared to their 6nm filters. Wondering what the difference between visual and imaging narrowband filters was, it turns out that the visual filters don't block IR and need an additional IR blocking filter if used for imaging. This is difficult on the RASA 11 with a mono camera, as there's nowhere to fit an extra filter when using the Baader UFC system. I could use it with an OSC camera with its built-in UV-IR cut protection glass I suppose, as the green should catch most of the H-beta. Alan