
rickwayne

Members
  • Content Count

    183
  • Joined

  • Last visited

Community Reputation

173 Excellent

About rickwayne

  • Rank
    Star Forming

Profile Information

  • Location
    Madison, Wisconsin, USA
  1. Actually, I find that with Astro Pixel Processor I spend very little extra time processing the individual filters' outputs, at least up through calibration and integration. Once I tell it in the loading process which lights go with which filter (and, optionally, session), it just handles it and I get N integrated images, where N == number of filters. There is the process of compositing itself, of course, which is alien to OSC folks. But it seems most people do e.g. gradient reduction after compositing, rather than on the individual mono images. So yeah, it takes a little longer, but I'm definitely one of the "hurts so good" crew.
  2. I don't think you'll be sorry you went with mono. The flexibility of doing RGB, LRGB, HaRGB, monochrome Ha, bicolor narrowband, and three-or-better-band narrowband...it's just the nuts IMO. Since I'm very much still learning, the ZWO filters are by no means the limiting reagent for me; YMMV. If I'd waited until I could afford something better than their $130 NB filters, I'd still be waiting, instead of imaging.
  3. Oh, sorry! Linear processing is what you do before "stretching", aka contrast enhancement. Usually you wind up with better results if you do as much work as possible on the image before that happens. Calibration means using special frames like darks (for subtracting the thermal signal the sensor records even when there's no light), bias (for subtracting the signal the camera adds to even the shortest-possible exposure), and flats (for recording your optical train's and sensor's imperfections, such as vignetting, dust, or funky pixels). Then there's integration, aka stacking: averaging a bunch of sub-exposures to yield a smooth, noise-free (hah!) image. Light pollution usually casts some kind of gradient across the image, which should be removed, along with any overall color cast. Then the next step would be stretching: mapping the very narrow range of values that represent what you want to see (e.g., the tones in a nebula) onto the full range of values available in the final image. For example, the whole range from the darkest to the brightest parts of a nebula might only span a few hundred numbers in the calibrated, stacked image, but you want that to run from near-black to almost-white. That's the levels-and-curves part (there's a rough code sketch of it after the last post). Then comes more tweaking. You might apply extra stretching to particular bits, for example. Or do noise reduction. Or digital sharpening.
  4. As far as I can tell the D3400 is supported by the INDI library, which means you could run it with KStars and Ekos. That's a full-featured astronomy suite, runs on Macs and Linux too.
  5. I'm a big fan of Charles Bracken's book. If you're set on Photoshop, see if you can pick up a copy of the first edition of The Deep-Sky Imaging Primer; the (current) 2nd Edition does include a fair bit of info on Photoshop processing but concentrates more on PixInsight. 1st Ed. was heavier on PS, or so he says (I only have 2nd Ed.). I would heartily recommend that you do your linear processing in something other than Photoshop, though. I'm not even sure how you'd accomplish calibration with dark and flat frames in PS, for example. There are gradient-removal plugins, but really you'll get better images if you use a dedicated astro package to do calibration and integration at least. PS only has mean and median stacking, for example; the more sophisticated algorithms in DSS, Astro Pixel Processor, Siril, or PixInsight will make better use of your sub-exposures (there's a bare-bones sketch of one such idea, sigma clipping, after the last post).
  6. Under most conditions you can actually get away with less total integration time with LRGB than with OSC for equivalent results. You can short the color data in favor of concentrating on high-res (no debayering!) low-noise Ha or luminance. Or so say some experts I respect. Plus, as others note, the mono camera enables narrowband work down the road.
  7. This may be pretty orthogonal to your original request, since: I have the cooled version; I have the mono version; I'm running at 336mm focal length; and I'm strictly a DSO guy. OK? Caveats noted. I deliberately chose the tiny-pixeled 183 because my DSLR had whacking great pixels and I wanted as distinct an alternative as I could feasibly obtain. So now I've got a wide-field, low-noise OSC camera and a narrower-field, fine-image-scale one. YMMV, and almost certainly does. Know that the IMX183 sensors are notorious for amp glow. It calibrates out, yes, but it does add complexity -- if you use darks at a slightly wrong temperature, or forget which way bias or dark-flat frames go, you'll be left with a very visible pattern on your image after stretching (the basic dark-subtraction arithmetic is sketched after the last post). This is an argument for ponying up the extra cash for a cooled camera, so that all your frames are at precisely the same temperature and dark current can be correctly subtracted from everything. For what it's worth, here are a couple of narrowband and an RGB image shot with the 183. Focus and guiding were both a lot better on the Heart than on the other two.
  8. Thanks! This is WAY more solid info than I had. I'm still a little hungry for something deeper than "this isn't a problem with our filters because reasons", but I don't doubt their assertion -- nothing like a repeatable experiment to back one up! Plenty good enough to guide any notional future purchases, now I'm merely curious. And I have to go home, take the cover off my filter wheel, and peer at the physical object -- ZWO's spec sheet says 7nm, not 12, and I'm reasonably confident that's the product I have. Duh.
  9. I've heard plenty of folklore -- and repeated it myself! -- about the relationship between filter bandpass and aperture ratio, but I'd love to hear some quantitative information. I know that the angle of incidence can dramatically affect how much light passes through an interference filter. But specific nanometers and f-ratios I have not seen. My optical train is f/4.8 according to Stellarvue, and I'm using 1.25" 12-nm filters on a 1" sensor with 55mm of back-focus. Vignetting certainly isn't egregious -- I mean, if I stretch a flat till it screams I can certainly see some, but it calibrates out easily. Heck, I've got a 183; vignetting is the LEAST of my calibration problems! (If you don't get that joke, do a Google image search for "IMX183 amp glow"...and marvel.) There's a back-of-the-envelope calculation with the standard filter tilt formula after the last post.
  10. I should point out that I do understand that requirements may vary. For me, a laptop would have to have a 12-hour battery life to suffice as my main/only imaging computer. My multiday trips totally divorced from mains power are few, but they're CORE. The Pi-based gadgets have the advantage that the current-eating devices (i.e., those with screens) need only be operating while you're actively interacting. For the long hours of sequenced imaging, you can run your rig on the Pi's trickle of battery power. My Pi 4 gets its juice via a Waveshare stepper-motor control board, so I don't even have to have a converter -- just plug another barrel connector in. Sometimes when I've worked in the back yard, I've used the high-end MacBook Pro on which I develop software for a living. 2-second plate solves, I'm down with that! But then I have to worry about leaving it outside all night. If the Pi gets rained on or possum-gnawed, meh, it was $56.
  11. It really is a kick, isn't it? Orion is a great target, it's easy to get something amazing and you can spend a career getting something just a little bit better every time. I certainly haven't gotten an image of it yet that really satisfies me, but I've done a bunch I'm proud of nevertheless. And by "easy", please don't think I'm implying that you have a run-of-the-mill first result there. Yours is amazinger :-).
  12. Gonna put in a pitch for the Pi here. A Raspberry Pi 4B with 4 GB of RAM will set you back all of US$56. It has plenty of snort to do sequencing, guiding, plate solving, and autofocus while running a planetarium program too. A Pi 3, like the ASIAir, will struggle a bit doing all that. You can purchase turnkey software for US$49 from Stellarmate, or go free from the ground up if you prefer. There are at least two known-good platforms (Stellarmate and Astroberry) to choose from. For that matter, you can buy a Pi 3 and case with everything preloaded from SM for US$149. You can remote in wirelessly from a tablet, phone, or laptop, or connect an HDMI display and USB keyboard and mouse if you prefer. The whole thing will easily run all night off any battery that can power your mount and camera. I won't pretend I've never had any problems getting things working from time to time. (Few astronomers can!) But it largely Just Works.
  13. Dithering, in case the term isn't clear, is displacing the optical train by a few pixels' worth every so often, so the same pattern of photons isn't falling on exactly the same spots on the sensor frame after frame. Leaving everything fixed like that can lead to noise artifacts ("walking noise" is the one most commonly called out). Dithering is frequently and conveniently done in conjunction with autoguiding, because this is exactly what an autoguider does -- repoint the optics by a tiny bit (a fraction of a pixel) to compensate for mount imperfections or whatever during the exposure. The same software and hardware can be used between exposures to repoint by some number of pixels in a random direction, so that the resulting frames don't quite all line up and the noise artifacts are vanquished (there's a toy simulation of this after the last post). Stacking software has to be able to align images anyway (frame-to-frame pointing is never perfect to begin with), so it's easy for it to deal with dithered frames. It can also be done manually, by changing where the optics point by a little bit between frames, whether via a mount's hand controller (mildly tedious) or by nudging a ballhead's position by a fraction (SUPER tedious).
  14. Yeah, I was really struggling for a response to this, and eventually I had to drown my Inner Nerd and just make my response "he's right". I think where I was having trouble was conflating some similar issues. In case anyone else has the same confusion, here's me: "It's analogous to blowing out highlights, but in the other direction, right? Once you get to zero you're at the floor; you can't discriminate values below zero. So you're losing differentiation among the darkest tones." Which is true...for a single exposure. So, other things being equal, six photons are six photons, whether captured in one exposure or 200. As vlaiv is careful to caveat, not all other things are, in fact, equal. But once you get past read noise, they're equal enough (I've put some illustrative numbers on that after the last post).
  15. It's really amazing, isn't it? I still haven't gotten over the thrill of seeing something appear on the screen.
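To make the stretching step in post 3 concrete, here's a small Python/NumPy sketch. It is not how any particular package implements its stretch; the asinh curve, the black point, and the softening value are just illustrative choices.

```python
import numpy as np

def stretch(img, black_point=0.001, softening=0.02):
    """Map a narrow range of linear values onto the full 0..1 display range.

    img is assumed to be a calibrated, integrated image scaled to 0..1.
    black_point and softening are made-up illustrative values; real tools
    let you set the equivalent interactively with levels and curves.
    """
    shifted = np.clip(img - black_point, 0.0, None)  # set the black point
    stretched = np.arcsinh(shifted / softening)      # lift faint tones hard, bright tones gently
    return stretched / stretched.max()               # renormalise to 0..1

# A faint "nebula" occupying only a sliver of the linear range:
linear = np.random.normal(0.002, 0.0005, (100, 100)).clip(0.0, 1.0)
display = stretch(linear)  # now spans roughly near-black to almost-white
```

The point is just that a handful of closely spaced linear values get spread across the whole output range, which is what levels and curves are doing for you.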
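On post 5's stacking point: a plain mean lets one bad sub (satellite trail, cosmic-ray hit) leak into the result, while rejection algorithms throw the outliers away first. This is a bare-bones sigma-clipping sketch of the idea in NumPy, not the actual implementation in DSS, Astro Pixel Processor, Siril, or PixInsight.

```python
import numpy as np

def sigma_clipped_mean(subs, kappa=2.5, iterations=3):
    """Average a stack of sub-exposures, rejecting per-pixel outliers.

    subs has shape (n_subs, height, width). kappa and the iteration
    count are typical-looking values chosen purely for illustration.
    """
    data = np.ma.masked_invalid(subs.astype(float))
    for _ in range(iterations):
        mean = data.mean(axis=0)
        std = data.std(axis=0)
        # Mask pixels more than kappa standard deviations from the stack mean.
        data = np.ma.masked_where(np.abs(data - mean) > kappa * std, data)
    return data.mean(axis=0).filled(np.nan)

# Ten subs, one of which has a bright "satellite trail" across a row:
subs = 100.0 + np.random.normal(0.0, 5.0, (10, 50, 50))
subs[3, 25, :] += 5000.0
clean = sigma_clipped_mean(subs)  # the trail pixels are rejected, not averaged in
```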
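For post 7's "it calibrates out": the arithmetic is simply that a master dark shot at the same temperature and exposure length carries the same amp-glow pattern as the lights, so subtracting it removes the glow, while a mismatched dark leaves a residue behind. A simplified sketch follows; real calibration also handles bias, dark-flats, and flat exposure levels more carefully.

```python
import numpy as np

def calibrate(light, master_dark, master_flat):
    """Basic calibration: subtract the matched dark, divide by the normalised flat.

    All frames are plain 2-D arrays here. The master dark is assumed to match
    the light's sensor temperature and exposure time, so the amp-glow pattern
    in both frames is the same and cancels.
    """
    dark_subtracted = light - master_dark
    flat_norm = master_flat / np.mean(master_flat)  # normalise the flat to ~1.0
    return dark_subtracted / flat_norm

# Toy frames: amp glow modelled as a bright ramp toward one corner.
glow = np.outer(np.linspace(0.0, 50.0, 100), np.linspace(0.0, 1.0, 100))
light = 200.0 + glow + np.random.normal(0.0, 3.0, (100, 100))
master_dark = glow                       # matched dark: same glow pattern
master_flat = np.full((100, 100), 0.98)  # mild, even vignetting stand-in
calibrated = calibrate(light, master_dark, master_flat)  # glow cancels; sky + noise remain
```

Run the same thing with a dark whose glow is, say, 20% stronger or weaker and the corner pattern survives into the calibrated frame, which is exactly what shows up after a hard stretch.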
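On post 9, the quantitative bit I was after, as best I understand it: light hitting an interference filter off-axis sees the centre wavelength shift blueward by roughly lambda(theta) = lambda0 * sqrt(1 - (sin(theta)/n_eff)^2), where n_eff is the filter's effective refractive index, and in an f/N light cone the steepest rays arrive at about arctan(1/(2N)) from the axis. Here it is as a back-of-the-envelope calculation; the n_eff of 2.0 is an assumed "typical" value, since manufacturers rarely publish it.

```python
import math

def cwl_blue_shift_nm(cwl_nm, f_ratio, n_eff=2.0):
    """Estimate the blue-shift of an interference filter's centre wavelength
    for the marginal (steepest) ray in an f/N light cone.

    Uses the standard tilt formula lambda(theta) = lambda0*sqrt(1-(sin(theta)/n_eff)**2).
    n_eff=2.0 is an assumption; the true effective index varies by filter design.
    """
    theta = math.atan(1.0 / (2.0 * f_ratio))  # marginal-ray angle from the optical axis
    shifted = cwl_nm * math.sqrt(1.0 - (math.sin(theta) / n_eff) ** 2)
    return cwl_nm - shifted

# H-alpha at 656.3 nm through an f/4.8 light cone:
print(round(cwl_blue_shift_nm(656.3, 4.8), 2), "nm blue-shift at the cone edge")
```

With these assumptions the shift at f/4.8 comes out to a bit under a nanometre, comfortably inside even a 7-nm bandpass; it's only down around f/2 that the same arithmetic starts to chew seriously into narrow filters.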
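A toy simulation of why the dithering described in post 13 works. The banding pattern, offsets, and frame count are all arbitrary, and rolling the array stands in for the mount nudge plus the star alignment the stacking software does anyway.

```python
import numpy as np

rng = np.random.default_rng(0)

# A fixed pattern the sensor stamps on every frame (think warm pixels or faint banding).
pattern = np.zeros((64, 64))
pattern[::8, :] = 20.0  # horizontal banding every 8 rows

def aligned_sub(dither_rows):
    """One sub after star alignment: the sky lines up frame to frame,
    but the sensor's fixed pattern lands dither_rows away each time."""
    sky = 100.0 + rng.normal(0.0, 5.0, pattern.shape)
    return sky + np.roll(pattern, dither_rows, axis=0)

undithered = np.mean([aligned_sub(0) for _ in range(20)], axis=0)
dithered = np.mean([aligned_sub(int(rng.integers(-4, 5))) for _ in range(20)], axis=0)

# The banding survives averaging when every frame lines up the same way,
# but smears out once the pattern lands on different rows each frame.
print("undithered row-to-row spread:", round(float(undithered.mean(axis=1).std()), 2))
print("dithered   row-to-row spread:", round(float(dithered.mean(axis=1).std()), 2))
```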
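To put illustrative numbers on post 14's "once you get past read noise, they're equal enough": the usual shot-noise-plus-read-noise estimate, where splitting the same total time into more subs only adds one read-noise hit per sub. The signal, sky, and read-noise figures below are invented for the example.

```python
import math

def stack_snr(total_signal_e, total_sky_e, n_subs, read_noise_e):
    """Approximate SNR for n_subs equal sub-exposures of the same total time.

    total_signal_e and total_sky_e are electrons summed over the whole stack;
    read noise is paid once per sub, so it is the only term that grows when
    the same total time is split into more subs. Dark current is ignored.
    """
    noise = math.sqrt(total_signal_e + total_sky_e + n_subs * read_noise_e ** 2)
    return total_signal_e / noise

# Invented numbers: 3600 s total on a faint target under a modest sky, 2 e- read noise.
signal_e, sky_e, read_noise = 1800.0, 36000.0, 2.0
for n in (1, 120, 3600):
    print(f"{n:>4} subs totalling 3600 s -> SNR {stack_snr(signal_e, sky_e, n, read_noise):.2f}")
```

With these made-up numbers, 120 x 30 s gives up almost nothing to a single hour-long exposure, while 3600 x 1 s pays a visible read-noise penalty -- which is the "equal enough" part.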