Everything posted by discardedastro

  1. My next build is very likely to be based around the 3900 - it's an absolute no-brainer for price/performance against Intel now. PixInsight is naturally very good at parallel processing if you've got the IO to match, and the PCIe 4.0 drives now available with Zen 2 will satiate the requirements of a 32-core part running on 32 images at once. (Intel will get there eventually, no doubt, though for the last ten or so generations they've heavily segmented high-IO parts into "enthusiast" SKUs at four times the price.)
  2. This is the first image where I've gone really deep on a single filter over a few nights. Had plenty of fun processing data across meridian flips with the moon about - lots of gradients and oddities. 7.5 hours or so of data in Ha (7nm Baader). 200PDS, EQ6R, guided with PHD2, captured with INDI/Ekos. Processing was in PixInsight. Each night went individually through calibration, cosmetic correction, and subframe selection and weighting. Then I registered everything, calculated local normalisation, and integrated with local normalization and linear fit rejection. DBE'd carefully, though I couldn't get all of the gradients out without affecting signal. Used MureDenoise, deconvolved with a 350-star-sampled DynamicPSF and some light masking/support masks for local deringing, then further denoising with MultiscaleLinearTransform, HistogramTransformation, CurvesTransformation, a light LocalHistogramEqualization, and TGVDenoise to finish. If I get some OIII/SII filters I'll have a go at revisiting this and combining the images.
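For anyone wondering what the rejection step actually does during integration, here's a minimal numpy sketch of iterative sigma clipping - a simplified stand-in for PixInsight's linear fit / Winsorised rejection, not the real implementation:

```python
import numpy as np

def sigma_clip_stack(frames, sigma=2.5, iterations=3):
    """Average a stack of registered frames, iteratively rejecting pixels
    more than `sigma` standard deviations from the per-pixel stack mean.
    A simplified stand-in for PixInsight's rejection algorithms."""
    stack = np.asarray(frames, dtype=float)   # shape: (n_frames, h, w)
    mask = np.ones_like(stack, dtype=bool)    # True = pixel currently kept
    for _ in range(iterations):
        kept = np.where(mask, stack, np.nan)
        mu = np.nanmean(kept, axis=0)         # per-pixel mean of kept values
        sd = np.nanstd(kept, axis=0)          # per-pixel spread of kept values
        mask = np.abs(stack - mu) <= sigma * np.maximum(sd, 1e-12)
    return np.nanmean(np.where(mask, stack, np.nan), axis=0)
```

With a decent number of subs this rejects satellite trails and cosmic ray hits while leaving well-behaved pixels untouched; the real linear fit rejection additionally fits per-frame scaling before rejecting, but the core idea is the same.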
  3. Absolutely. It's also a great way for amateurs who don't know how to really describe or talk about what they're seeing to share their experiences. I'd almost see imaging as a gateway for visual.
  4. I'm not a lunar/planetary imager, but when I've tried, the workflow is totally different to DSOs, with different software. SER is basically a video format, since the overhead of all the separate image files adds up quickly with fast capture for lucky imaging. I've tried using PIPP to pre-process the video into the best frames (which then yields TIFFs etc.) and clean some things up, then AutoStakkert! or similar to do the stacking - Registax can be used for this too, I think, though I've never tried it.
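The "best frames" selection can be sketched with a cheap sharpness metric - the variance of a Laplacian - which illustrates the general idea behind quality sorting in lucky imaging (the actual metrics in PIPP/AutoStakkert! are more sophisticated; this is just a plain-numpy illustration):

```python
import numpy as np

def sharpness(frame):
    """Variance of a 4-neighbour Laplacian: a common, cheap sharpness
    metric - blurred frames score low, crisp frames score high."""
    f = np.asarray(frame, dtype=float)
    lap = (f[1:-1, :-2] + f[1:-1, 2:] + f[:-2, 1:-1] + f[2:, 1:-1]
           - 4.0 * f[1:-1, 1:-1])
    return lap.var()

def best_frames(frames, keep_fraction=0.1):
    """Return indices of the sharpest `keep_fraction` of frames, best first."""
    scores = [sharpness(f) for f in frames]
    order = np.argsort(scores)[::-1]          # highest score first
    n = max(1, int(len(frames) * keep_fraction))
    return order[:n].tolist()
```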
  5. Didn't know about MureDenoise, will take a look! I did DBE with corners and a few points that were relatively neutral in terms of overall contrast in the image, probably 10 points total, then did ABE near the end, which did a pretty good job in my eyes of preserving the finer details while improving overall contrast. I'm hoping to get some more time on this target through the week if the cloud forecast stays as is - would be nice to have another two filters but need to upgrade the EFW mini to a full EFW so it's a fair bit of cash even just on the 8.5nm filters from Baader.
  6. Hydrogen alpha only, 30x300s for 2.5h exposure last night. Postprocessing in PI; cal, subframeselector (all good, so just marking weights), cosmetic correction, staralign, localnormalization, integration, very light DBE, MLT denoise, ABE, histogram+curves, TGVdenoise. Was a bit twitchy about background extraction on this, given the nebulosity in the background, but I think this is quite a pleasing result.
  7. FWIW, I've spent about two afternoons doing backlash adjustment now and found the process quite straightforward: wobble things as you drive them to find the spot with the most play, undo the four big bolts on the worm bodies ever so slightly, adjust the two grub screws, re-tighten the four bolts, and drive the axis through 360 degrees to ensure there's no binding. I've attached the instructions for it. At 0.5x guiding rate PHD2 only uses a 400ms backlash compensation pulse. I'm curious when you say the guiding software is fighting with the firmware - could you expand a little more on the behaviour you're seeing and how you have things set up? SW_EQ6-R_backlash_adjustment.pdf
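As a sanity check on that 400ms figure: at the sidereal rate of roughly 15.04"/s, a guide pulse covers guide rate × sidereal rate × duration of sky, so 400ms at 0.5× is about 3 arcseconds:

```python
SIDEREAL_ARCSEC_PER_S = 15.04  # apparent sidereal rate, arcsec per second

def pulse_arcsec(pulse_ms, guide_rate=0.5):
    """Sky motion commanded by a guide pulse of `pulse_ms` milliseconds
    at `guide_rate` x sidereal."""
    return guide_rate * SIDEREAL_ARCSEC_PER_S * pulse_ms / 1000.0
```

So the compensation pulse is only ever asking for a few arcseconds of motion - small enough that a well-adjusted worm should take it up without visible overshoot.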
  8. The EQ6-R's alt bolts are at least accessible, but the bolts are still not terribly granular - I find it very hard to achieve fine polar alignment. I'd say that my EQ6-R's a fine mount, insofar as it's worked well when aligned well. I've fettled it a bit to reduce backlash, and it does demand perfect balance, but once you're there it'll perform as advertised. A used EQ8 at that price point would certainly be compelling if I were buying now. I think beyond the EQ6-R/8/CEM60/80 level of performance (stepper motors, belt+worm drive, encoder-equipped or not) you're realistically looking at friction drive mounts like the Mesu, which will break your budget. That said, I've got an EQ6-R Pro and still have non-round stars with 0.6" guiding at 0.5"/px imaging scale - I'm not sure I'd leap to the mount's performance alone as a diagnostic of oval stars. In my case I'm looking at my guiding (probably moving to an OAG + higher-res guide camera).
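For reference, image scale drops straight out of pixel size and focal length - 206.265 × pixel size (µm) ÷ focal length (mm) - and the 0.5"/px figure above checks out for a 2.4µm sensor (e.g. the ASI183) at the 200PDS's 1000mm focal length:

```python
def pixel_scale(pixel_um, focal_length_mm):
    """Image scale in arcsec/pixel.

    206.265 is arcsec-per-radian (206265) with the um/mm unit
    conversion folded in."""
    return 206.265 * pixel_um / focal_length_mm
```

The same formula is handy for guide setups: it tells you how many guide-camera pixels a given correction corresponds to.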
  9. Paracorr to replace my MPCC MkIII on the 200PDS, since that's probably causing a few issues in my imaging (lopsided field and some astigmatism). Then it's saving for the obsy, in theory. To add to the chaos, though, my PC's getting a bit tired, so that'll take precedence - this one's had a good run though, since 2012!
  10. I've had issues with Ekos getting wedged on a job before. Usually the capture module kicks off autofocus, I stop or pause the job while it's doing it to let me make an adjustment to the sequence, and on resuming the sequence it just sort of sits there. Normally I just restart Ekos - it only happens once a night at most, and if I set up the capture sequence early on and don't touch it it seems entirely reliable.
  11. Tanqueray tonight. It's the only way to cope with the politics. Definitely going to have another crack at this target and get some more data, I think. When clouds permit...
  12. I do, but I always feel a bit wary of oversaturating the image - it's a bit hard to judge the balance through LRGBCombination, though. I used a slightly nonlinear saturation curve to avoid over-exaggerating the reds too much. I'll maybe have another play with the combination of L and RGB tomorrow. I also don't have a colour-calibrated monitor, so a bit wary of what I see sometimes; when I replace this desktop I'm also planning to upgrade the monitor and get a calibration device. Very happy with the sharpness of the image, though, which has been an ongoing challenge for me. Guiding is working well and producing sub-arcsecond results - I think the only thing I can do to improve on that is going to be an improved guide camera and OAG (though my very-very-rigid guidescope mount doesn't seem to be suffering from any differential flexure - they're solid milled blocks of alu which are friction-fit onto the guidescope and bolted onto an ADM dovetail setup). Then a better mount, of course! The deconvolution for this was 50 rounds w/ 3 wavelet layers, and I might be able to improve on the result - I didn't use local support as I did all of the processing of this image from raw frames to finished photo on two fairly short coach journeys, and I didn't have time for it. I did do a 200-star DynamicPSF round to get as good a PSF as I could, and did mask the processing for all lights to avoid artefacts in dark areas. Edit: I just tweaked the JPEG, in lieu of having the real data to hand (it's on my laptop and I've had too much gin to muck about getting that going):
  13. Very odd. At risk of teaching you to suck eggs, could it be the clutch not providing enough grip? Nice and easy to test, at least... If the worm's meshing properly and the cog on the worm is happy and being driven by a tight belt then there shouldn't be opportunity for problems mechanically. I'd be turning my attention to the stepper or drive circuits, though that starts to get hard to test...
  14. Not that much light (recurring theme - I do need to pick a good target and focus on it for maybe 10, 20 hours) but this turned out alright. Light from two nights in August and September. M101 (Pinwheel Galaxy). 48x300s subs total, of which about 15 Lum. Baader filters, 200PDS, EQ6-R Pro, PHD2 guiding on a Primalucelab 60mm guidescope. ASI183MM-PRO imager, ASI120MC guider. Processing followed my usual DSO routine in PixInsight: stacked with local normalization, cropped, DBE'd, deconvolved with DynamicPSF + mask, combined RGB, ABE, background neutralization, SCNR, photometric colour cal, MLT denoise, histogram transform, LRGB combination, final curves + a bit of saturation before TGVDenoise.
  15. It is certainly mechanically possible, more or less depending on exactly what you've got on the mount and where. Reducing guiding speed could help, yes. Those corrections also look quite large - it may be worth tweaking the Dec algorithm to prefer more, smaller movements over larger ones (a smaller min-move would be a good start). I'd start there and then look at reducing the guide rate to 0.3x or 0.2x if the problem persists.
  16. Absolutely wonderful - I just started capturing data for this (a very cloudy night) and now I know what sort of image I'm aiming for. The fine detail in the inner clouds is superb.
  17. I've had similar issues and it all came down to balance and PA accuracy. Balance your mount and scope as perfectly as you can. If you're using a Newtonian and don't need to look down the scope (imaging use only), rotate the 'scope in its tube rings so that the focuser points towards the counterweights, to improve dynamic/radial balance. Take your time on this and get it right, especially for the RA axis. Start with PHD2's static polar alignment to get a rough PA, and then use drift alignment to get it spot on. Just resign yourself to spending 20 minutes before you do any imaging doing that, just as you'd spend 5 minutes collimating if you're using a Newt. Drift alignment was a bit confusing initially but has gotten me to very good PA very quickly in recent nights. There are other good checks to do - use PHD2's manual mount controls to ensure you can actually see movement in all four directions (N/S/E/W). PHD2 can also do a star-cross test, though that's not quite worked for me sometimes even when my mount's set up right. These will confirm good motion via your guide input. PHD2's guiding assistant is also well worth doing. I'd do a low-Dec calibration (and do the meridian flip calibration, too, so you get that done at low Dec), then use the guiding assistant (measure for 3-5 minutes) to work out what sensible min-move parameters will be and measure the relevant backlash.
  18. 0°C is a pretty standard lower operating temperature for most consumer-grade electronics; industrial-rated parts typically go down to -40°C. For telecoms cabinets and sites without a lot of kit in them we actually have heaters to keep things warm enough in winter, though usually self-heating does fine. If you have a power outage and the kit powers down, though, you still need to be able to warm things up prior to start-up! I'd be stunned at anything not being happy at 10 or 5°C. What was the dew point? Could've been condensation-related?
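Dew point is easy to estimate from air temperature and relative humidity with the Magnus approximation (the coefficients below are one common set; accuracy is a few tenths of a degree over normal outdoor conditions):

```python
import math

def dew_point_c(temp_c, rel_humidity_pct):
    """Approximate dew point in Celsius via the Magnus formula,
    with coefficients b=17.62, c=243.12 degC (valid roughly -45..60 degC)."""
    b, c = 17.62, 243.12
    gamma = math.log(rel_humidity_pct / 100.0) + b * temp_c / (c + temp_c)
    return c * gamma / (b - gamma)
```

If the kit's surface temperature falls below that figure, condensation forms - which would explain flaky behaviour well above any rated minimum temperature.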
  19. It's only a small amount - LRGB was about 4.5h, Ha 45 minutes - but a fairly fast/large aperture (200mm f/5). NBRGB I tried but was unhappy with the results - until on revisiting it today I turned off the STF I had applied and realised the blueish tint was just a result of that. NBRGB with Ha for R gives this quite pleasing result: There's quite a lot of Ha in the background and this works really well I think. I'm not sure what magic is going on under the hood, but it's working.
  20. Hi all, I'm shooting with a Newtonian, so diffraction spikes are naturally a part of things. However, in Ha I'm seeing far less in the way of spikes, even with a threefold exposure time over my L frames. Consequently, combining Ha with RGB using PI's LRGBCombination (with Ha as L) gives odd results: The Ha on its own (after curves and some more post, but you get the idea): L, similarly processed: And the RGB without either mixed in: Any ideas on how I might go about successfully creating a HaRGB image from this data?
  21. Skies look alright here but I'm properly dead on my feet from last night so a nip of whiskey and off to bed for me. Last night's imaging came out well though: Some high cloud about and poorer transparency here than last night, but best of luck to everyone having a crack at it tonight...
  22. Yep - I oversample and it works. Obviously matching sampling is best, but for doing high-detail things like galaxies it's a fab setup. A bit useless for widefield, though! Edit: Also, yes, you'll want to replace the focuser. Just motorising it will work for lighter loads, so with a mini filterwheel you might get away with it, but I swapped for a Baader Diamond Steeltrack: https://www.talkunafraid.co.uk/2019/08/a-focused-approach/ has some details/photos. The ZWO EAF or PrimaLuceLab Sesto Senso will motorise either focuser without breaking the bank.
  23. Very happy with this as the first image of the season - this was all taken in one night with my usual 200PDS/ASI183MM rig. About 4 hours overall integration. Full gory details of capture are on the technical card at https://www.astrobin.com/422916/ Cropped, rotated closeup: Stars aren't quite perfect in shape, but I'm imaging at 0.5"/px to make life difficult for myself. Processing was with PixInsight and followed the "usual" mono LRGB route:
     • Calibration (25 flats, at least 10 darks)
     • Blink
     • SubframeSelector with some custom weighting, written to FITS headers
     • CosmeticCorrection
     • StarAlignment
     • LocalNormalization at 256 scale
     • ImageIntegration using linear fit/Winsorised sigma rejection as appropriate, with SubframeSelector weighting
     • Crop
     • DBE
     • DynamicPSF, sampling from 300 stars down to 25
     • Deconvolution using a local support mask (generated from layers 2-4 of a 5-layer MultiscaleLinearTransform + StarMask with scales wound down and small-scale cranked up appropriately) and the PSF from DynamicPSF
     • ChannelCombination
     • BackgroundNeutralization
     • PhotometricColorCalibration
     • SCNR
     • MultiscaleLinearTransform to denoise
     • HistogramTransformation
     • LRGBCombination with chrominance noise reduction
     • CurvesTransformation
     • TGVDenoise in L*a*b mode
  24. +1 to setting up well in advance - I started off my session with a long Ha exposure, there's better contrast in Ha during twilight anyway (I think - seems so) and the long exposure highlights any PA/guide issues before I get into it.
  25. I can definitely confirm the 200PDS will work great for imaging, but you'll need a chunkier mount to carry it. When I was starting out I spent the cash on the mount (an EQ6-R Pro) and got a "cheap" 200PDS - turns out the 200PDS is actually pretty good! Imaging is all about guided pointing accuracy and tracking stability, really. The 200P, having a longer focal length, will make high demands of your mount - shorter focal lengths like those found in smaller (and lighter) refractors will let you get into imaging with a smaller mount, so that's worth considering. You'll still need reasonably good performance from the mount, but there are plenty of people imaging on EQ5s with small scopes. Planetary/lunar won't make the same demands, since you're using much shorter exposures (typically sub-second versus hundreds of seconds), so guiding and tracking error barely factor in - so long as you can stay pointed at the target well enough and get a good enough polar alignment, you can fix the rest in software.
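To put numbers on that last point: star trailing within a single exposure is just uncorrected drift rate × exposure length ÷ image scale (simple kinematics, nothing mount-specific), which is why millisecond planetary exposures shrug off tracking errors that would ruin a 300s DSO sub:

```python
def trailing_px(drift_rate_arcsec_per_s, exposure_s, pixel_scale_arcsec):
    """Star trailing in pixels accumulated over one exposure for a given
    uncorrected drift rate - shows why short exposures tolerate far worse
    tracking than long ones."""
    return drift_rate_arcsec_per_s * exposure_s / pixel_scale_arcsec
```

With an (illustrative) 0.05"/s drift at 0.5"/px, a 300s sub trails 30 pixels - ruined - while a 5ms planetary frame trails half a thousandth of a pixel, far below anything visible.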