
rickwayne

Members
  • Content Count: 134
  • Joined
  • Last visited

Community Reputation: 132 Excellent

About rickwayne
  • Rank: Star Forming

Profile Information
  • Location: Madison, Wisconsin, USA
  1. APP is Astro Pixel Processor. Not as many features as PixInsight but pretty straightforward to learn and use. Gives IMO excellent results. Reputedly one of the very best packages for stitching mosaics.
  2. I'm particularly fond of your Pleiades, myself. Ain't that APP a grand little program?
  3. Don't neglect KStars/Ekos for automated acquisition. It's very full-featured, runs on a Pi, Mac, Windows, or Linux, is open source, was built from the ground up for remote operation, and sequences and automates dome control, heaters, park/unpark, meridian flips, guiding, mosaics, and more.
  4. Likewise The Forum That Dare Not Speak Its Name* has plenty of downloadable data in the Beginning And Intermediate Imaging forum. *Cloudy Nights. Shhhh!
  5. This is a case of "easy to learn" vs. "best results". At that focal length you will likely get smaller RMS numbers with an OAG eventually, but a small guidescope, rigidly mounted, will probably get you guiding more easily. You can probably reuse the camera for the OAG, so the only sunk cost is the guidescope itself, which you can resell for half its price or better. Having *any* guiding is likely to be a big enough improvement that a separate scope will serve you well (unless you're fighting a non-rigid mount, mirror flop, etc.).
  6. Great color in the nebulosity -- but, um, are there really that many purple stars? (Spoiler: there are not.) If you could remove the stars (e.g. with starnet++), process the nebulosity separately for this excellent result*, then add the stars back in, it would be even better. I'm totally unfamiliar with Luminar; would your workflow allow that kind of manipulation? The detail and composition rock as well. *Or you could remove the stars from your final result here, grab them from your original image before much processing, and stick them back in that way.
  7. What did you use for processing?
  8. Nice! And hello from far-away Madison!
  9. The various sub-disciplines have very different requirements, especially planetary vs. deep-sky, and even within deep-sky you might have different needs depending on your targets (big emission nebulae? Small galaxies?). What would you like to image? What level of light pollution are you working in? And what sort of optics do you have at your disposal?
  10. You could also give starnet++ a try for star removal. I was able to get much better results with that right off the bat than from Photoshop tweaks. As vlaiv says, after you've registered your NB and RGB images and run starnet++ on the latter to get a no-stars layer, you can do a "difference" blend mode in Photoshop and "stamp visible" to create a stars-only layer. Process that and your NB stuff, then add the stars back on top and set blend mode to "lighten". (There's a rough numpy sketch of this blend arithmetic after this list.)
  11. +1 on 183MM-Pro + ~350mm. I really like mine a lot. Resolves fine nebulosity detail visibly better than my DSLR. Confident it will do better still once my (*@&% guiding is sorted.
  12. Sorry, I can't find it. My best memory of their analysis -- which was supported by data, but you only have my word for it -- was that "seeing wobbles" are such high-frequency phenomena that you'd need absurdly fast exposures, AND processing, AND mechanical response to the pulses, in order to chase them in the first place. Certainly the Jupiter image is wiggling at much more than 1 Hz -- I'd guesstimate more like 10 or more -- so even 1-second exposures would be integrating quite a few of those wobbles. Mind you, I've no interest in flogging an angels-on-a-pinhead argument here, especially since I don't have the data myself. What I do advocate is that one be open to experimentation. If 3 seconds works for you, more power to ya. It's not as if I've got my own guiding dialed in yet, tell you that for free.
  13. I've seen a recent article that seemed to debunk the "chasing the seeing" concept pretty thoroughly and argued that you should use the shortest cycle that gave a sufficient SNR. I will try to rediscover it.
  14. Of course, the other option for imaging from light-polluted areas is to pick a narrowband wavelength which predominates in the light from your target, and use that as a luminance channel. It will emphasize whatever color its wavelength falls within (e.g. H-alpha will preferentially brighten the red areas of an emission nebula), but sometimes that's perfectly fine. If I'm shooting the North American Nebula, it's all red anyway. That won't do very well on, say, a reflection nebula. Using H-alpha for luminance on the Trifid is going to leave you with some pretty dim blue areas. I don't think the wavelengths of the three commonest narrowband filters map very well to creating a synthetic L that includes R, G, and B. Hydrogen-beta for blue, maybe, and O-III for green, but it's hardly ideal. (A rough sketch of the H-alpha-as-luminance swap follows this list.)
  15. Wowsers. Great image. ABSURDLY great image for your first NB!
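
A rough numpy sketch of the difference/lighten arithmetic from item 10, for anyone who'd rather script it than click through Photoshop. It assumes the frames are already registered and stretched, uses made-up file names, and reads FITS via astropy purely for illustration -- a sketch of the idea, not a prescribed workflow.

import numpy as np
from astropy.io import fits

def load_norm(path):
    # Read a FITS image and scale it to 0-1 floats.
    data = fits.getdata(path).astype(np.float64)
    return data / data.max()

original = load_norm("rgb_registered.fits")    # frame before star removal (placeholder name)
starless = load_norm("rgb_starless.fits")      # starnet++ output for the same frame
narrowband = load_norm("nb_processed.fits")    # separately processed NB layer

# "Difference" blend plus "stamp visible": subtracting the starless frame from
# the original (and clipping at zero) leaves a stars-only layer.
stars_only = np.clip(original - starless, 0.0, 1.0)

# "Lighten" blend: the per-pixel maximum drops the stars back on top of the
# processed narrowband data.
recombined = np.maximum(narrowband, stars_only)

fits.writeto("recombined.fits", recombined.astype(np.float32), overwrite=True)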
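And a similarly rough sketch of the H-alpha-as-luminance idea from item 14, assuming registered frames, an RGB stack stored as a (3, H, W) FITS cube, and scikit-image's Lab conversion; the file names are placeholders and a straight channel swap is only one simple way to build a synthetic L.

import numpy as np
from astropy.io import fits
from skimage.color import rgb2lab, lab2rgb

rgb = fits.getdata("rgb_registered.fits").astype(np.float64)   # assumed shape (3, H, W)
ha = fits.getdata("ha_registered.fits").astype(np.float64)     # assumed shape (H, W)

rgb = np.moveaxis(rgb / rgb.max(), 0, -1)   # to (H, W, 3), scaled 0-1
ha = ha / ha.max()

# Swap the lightness channel for the H-alpha data while keeping the RGB
# frame's colour information. L in Lab runs 0-100, hence the scaling.
lab = rgb2lab(rgb)
lab[..., 0] = ha * 100.0
ha_lum_rgb = lab2rgb(lab)

fits.writeto("ha_lrgb.fits",
             np.moveaxis(ha_lum_rgb, -1, 0).astype(np.float32),
             overwrite=True)

As item 14 says, this brightens whatever the narrowband line traces (red, for H-alpha), so it suits an emission nebula far better than a reflection nebula.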