Everything posted by rickwayne

  1. Excellent plan. I'm always in too much of a hurry to collect data and it has cost me.
  2. Fundamental recommendation: Buy and read Bracken's The Deep-Sky Imaging Primer, Lodriguss's Catching the Light, or Richards's Making Every Photon Count. Seriously. It's fulfilling to try stuff, but you will make better images sooner if you equip yourself with the prior-art science and engineering involved, much of which is not intuitive. Or at least it wasn't to my brain, which is that of a software engineer originally trained as a photojournalist. For example, you'll have a really good grasp of the Barlow question (well explained here, to be sure!). You doubtless are familiar with signal-to-noise ratios, but perhaps not with the sources of noise specific to astrophotographs and how to trade off the techniques for dealing with them. Or how to diagnose guiding and tracking problems. Or... well, read Charlie or Jerry or Richard for many, many more.
  3. Yep! So, first question: What do you want to image? (The Milky Way? The few big, bright objects like M42 or Andromeda? The Moon? Planets? Fainter deep-sky nebulae? Galaxies?) Even with top-of-the-line, absurdly expensive gear, most people who image, say, faint nebulae as well as planets have completely separate rigs for each. Next question: How? Is it enough to see an enhanced electronic image in the moment, or do you want to take pictures and process them for later display? (The former technique is called Electronically Assisted Astronomy, if you want to do some searches.) Third question: As Olly says, what have you got already? Finally: Broadly speaking, how much were you hoping to spend? All this is NOT to discourage you! Astro imaging does not require tons of expensive equipment. But if you tell us where you want to go, directions are much simpler to provide.
  4. Oh man, that is pretty much Peak Rick, there. Sorry all.
  5. OK, here's some science! I don't think this video has yet been name-checked in this thread, but try Dr. Glover's talk on deep-sky CMOS imaging. He shows the results for exposure, cooling, etc. Again and again the message is "good enough is good enough": 0 deg. C is a huge win over +20, -10 a bit better, -20 hardly distinguishable from -10. Diminishing returns step in quickly for almost everything.
  6. IIRC others have commented that ASIAir does not provide a user-settable offset.
  7. Exactly right. If you're not blowing out the highlights, enough integration time will let you resurrect really, really dim areas without changing the exposure time. If you think about it, even if you're only capturing 1 photon every 3 exposures from a given pixel's area, that still adds into the stack. The benefit of longer exposures is less read noise, since you're getting the same integration with fewer reads. Downside is degraded outlier elimination (e.g. satellites). Orion is one of the few targets in the night sky that is so bright that it does respond well to HDR techniques. You don't need nearly as much integration time for your highlights (i.e., the Trapezium) because highlights will be less affected by noise to begin with. I use the poor man's Photoshop HDR of simply putting the highlight image on top, giving it a layer mask set to black, and painting a gentle bit of white on the Trapezium with a 0% hardness brush set to pretty low opacity. If I'm impatient it looks like someone burned a hole in the image, if I'm not it's undetectable.
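The point about very sparse photons still adding into the stack can be checked numerically. Here's a toy simulation (all the numbers -- the 1-photon-every-3-subs rate, the 2 e- read noise -- are hypothetical, not from any real camera):

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate a pixel that receives, on average, one photon every 3 subs
# (rate = 1/3 photon per exposure), plus 2 e- of read noise per read.
n_subs = 3000
photons = rng.poisson(1 / 3, size=n_subs)      # faint Poisson signal
read_noise = rng.normal(0, 2.0, size=n_subs)   # per-read noise
subs = photons + read_noise

stack = subs.mean()   # average-stacked pixel value
print(f"stacked value: {stack:.3f} (true signal {1/3:.3f})")
```

Even though two out of three subs record nothing at all for that pixel, the stacked average converges on the true 1/3-photon signal.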
  8. OK, that PHD screenshot looks like you've got it goin' on. Whoo! As for your Orion image, let's see...well, Telescopius's M42 shot is so awful you can hardly tell which part of the nebula is in the FOV. Here's Andromeda for comparison. I'm assuming you were using your 80mm scope, anything else on your list and the crop would be even more extreme. The 462 is not particularly well suited to deep-sky work, as you say. (Though "Onward to Mars" is fairly apposite (-: )
  9. That's pretty close to correct. Dark flats always work; bias frames are a bit simpler to manage. Some cameras yield very inconsistent results at the short exposure times necessary for bias frames, so for those, dark flats are your only option (the ASI1600 is one of those). But it's not CMOS vs. DSLR -- plenty of CMOS cameras, such as my ASI183, do just fine with bias frames. There have been any number of online flamewars about this but trust me, those are the facts of the matter. Better yet, don't trust me, do your own experiments! I know I have, but that's just N=1, right? I prefer the convenience of shooting a crap-ton of bias frames once or twice a year to the necessity of shooting a new set of dark flats every time I change my flats exposure time, but that's just me. I run my flats with Ekos set to "auto", so the exposure time does vary from session to session (though I've never experimented to establish how much this would affect the end result). If you feel better about dark flats, go nuts.
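For what it's worth, the "crap-ton of bias frames" step is just a median stack of many minimum-length exposures. A minimal numpy sketch, with a made-up sensor (500 ADU bias level, 8 ADU read noise):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sensor: true bias level 500 ADU, read noise 8 ADU.
true_bias = 500.0
frames = rng.normal(true_bias, 8.0, size=(200, 64, 64))  # 200 bias frames

# Median-stack into a master bias; residual noise shrinks with frame count.
master_bias = np.median(frames, axis=0)

print(f"master bias mean: {master_bias.mean():.2f} ADU")
print(f"residual noise:   {master_bias.std():.2f} ADU")
```

Two hundred frames knock the 8 ADU per-frame read noise down to well under 1 ADU in the master, which is why shooting a big set once or twice a year is so convenient.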
  10. You may wish to combine that with test images in different parts of the sky. If your tilt changes due to the physical orientation of the tube, then something is likely flexing.
  11. It's not as if you'd have to replace the camera if you discovered that the scope really needs to be upgraded -- it would work just fine with the new scope. But if you're going for deep-sky, cooling is defs worth it. Cooling per se gives you advantages in reducing thermal noise (though with quickly diminishing returns, per Dr. Glover's video). But the big win is in consistent sensor temperatures, which ensures that your calibration frames are always spot-on. Of course I shoot with a 183, whose amp glow is the stuff of legendary horror tales. Still.
  12. I am tempted to try an experiment -- shoot RGB at normal sub-exposure times and accept that seeing will blur the image a fair bit, but then do as many gigabytes of luminance at video speeds as will fit on the disk. Getting the compute resources just to classify the frames and select the best ones, to say nothing of calibration and integration, would be a challenge. But maybe I could charm the necessary Amazon instances out of the University somehow. I mean, I have the administrative privileges, but jail has even fewer astrophotography opportunities than Wisconsin in December.
  13. This is just a theoretical observation -- I have only used lucky imaging for planets. But...what benefit is there to a 5-second image? That's enough time for a whole bunch of atmospheric turbulence cycles. I mean, yes, seeing distortions are cumulative, but they accumulate very very quickly to an asymptote. Now, if the goal is to reduce errors from guiding that's not up to the task, or tracking problems, OK, that makes sense. Those errors occur over much longer time scales. Ready to be educated, here.
  14. A potentially useful rule of thumb: If long exposures are not a problem for you in themselves, expose until you "blow out" a few dozen pixels or so at the top end, but no more. This ensures that each subexposure is as long as it can be, minimizing read noise. As Vlaiv is fond of saying, dynamic range isn't really a thing when you're stacking exposures -- you can still capture photons that arrive so seldom that some exposures miss them entirely! -- but this does use all of your camera's range. Intermittent tracking errors, satellites and aircraft, cosmic rays, and other gremlins argue for shorter exposures, since more subs give a wider statistical universe for outlier elimination. And there are significant benefits to standardizing on one or a very few subexposure lengths (fewer sets of darks, e.g.).
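That "few dozen clipped pixels" check is easy to script against a test sub. A rough sketch -- the 16-bit range and the toy frame are assumptions, not any particular camera:

```python
import numpy as np

def count_clipped(sub, bit_depth=16):
    """Count pixels sitting at the top of the ADC range in one sub."""
    full_scale = 2 ** bit_depth - 1
    return int(np.count_nonzero(sub >= full_scale))

# Toy 16-bit frame with a handful of deliberately saturated pixels.
rng = np.random.default_rng(1)
frame = rng.integers(0, 60000, size=(100, 100)).astype(np.uint16)
frame.flat[:30] = 65535   # pretend 30 bright star cores clipped

n = count_clipped(frame)
print(f"{n} clipped pixels")  # a few dozen: exposure is about right
```

If the count is zero you can afford a longer sub; if it's in the hundreds or thousands, back off.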
  15. The math for applying flats only works if they're normalized to a reference level from which the processing software can apply the corrections. A frame exactly the same as the flat, but with no light signal, is one way to do this. Another is to establish the absolute base signal from the sensor, with no signal (thermal or otherwise), and scale that. That's done by taking a number of dark frames at the minimum exposure time for which the sensor yields reproducible results, and stacking those to create a master bias frame. There is a perennially tiring debate about which is better -- dark flats require less extrapolation, bias frames don't have to be reshot every time you change exposure for your flats, tastes great, less filling, you get the idea. Some sensors (e.g. the Panasonic in the ASI 1600 series) do not yield reproducible results from very short exposures, so for those bias frames are not an option. For my 183, even with its famously wacko amp glow, bias frames work fine. I don't argue with anyone who advocates dark flats. They also work fine. For example, I use 1/8000 s exposures for my DSLR bias frames, and 1/1000 s on the 183.
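The normalization math reads roughly like this in code. A sketch of the standard flat correction against a synthetic vignette -- this is the generic formula, not any particular package's pipeline, and all the numbers are invented:

```python
import numpy as np

def calibrate(light, master_dark, master_flat, master_bias):
    """Generic calibration: subtract dark, divide by the normalized flat."""
    flat = master_flat - master_bias   # remove base signal from the flat
    flat_norm = flat / flat.mean()     # normalize to a reference level of 1.0
    return (light - master_dark) / flat_norm

# Toy demo: 20% corner vignetting in the flat is divided back out.
shape = (8, 8)
bias = np.full(shape, 500.0)
vignette = np.linspace(1.0, 0.8, 8)[None, :] * np.ones(shape)
master_flat = bias + 30000 * vignette
master_dark = np.full(shape, 510.0)
light = master_dark + 1000 * vignette   # uniform sky, dimmed by vignetting

cal = calibrate(light, master_dark, master_flat, bias)
print(cal.std() / cal.mean())  # near zero: the field is flat afterwards
```

Whether `master_bias` comes from true bias frames or from dark flats matched to the flat exposure, the arithmetic is the same; that's why both approaches work.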
  16. "No", cried the porcupine, "we must stick them with quills! It's the only way!" I run a Raspberry Pi on the scope and frequently remote into it with a tablet. Photos go straight onto a USB device plugged into the Pi; I only use iPad battery when I'm actually looking or controlling. I have less than US$120 in the Pi, case, motor-controller board (lets me run off same 12V as my mount), and software. Heck, I can use my phone if the tablet dies.
  17. No dumb questions! Yes, that is what integration time means in DSO jargon, the sum of all subs' exposure times. M33 is much dimmer, I have never tried to capture it without tracking but I reckon it would be much harder. Conveniently enough, M42 happens to host a lot of satellites this time of year from my location. Thanks Elon! (Only a little bitter there, since many satellites contribute much more to overall human well-being than my astrophotographs do.)
  18. Just centering the donut doesn't necessarily ensure optimal collimation. I'd add successively evaluating for on-axis coma closer and closer to focus. The Tri-Bahtinov is a great way to establish that the mirrors are parallel, even if they're not necessarily coaxial. Then again my brain is polluted with all sorts of Ritchey-Chrétien crud, so what do I know.
  19. 1) They're certainly sufficient to start. You won't get very long exposures without a tracking mount of some kind -- the "300 rule" is a reasonable heuristic. (Divide 300 by the focal length in mm to yield the maximum exposure in seconds that won't show star trails.) Your results will drive what you'll need. Between the extremely high contrast and the incredibly fine detail of astrophotos, any lens is challenged, so the fewer compromises, the better. Kit lenses are zooms (compromise) and inexpensive (more compromises). Stopping down a bit, or placing a mask with a circular hole in front of the lens to provide a perfectly round aperture, can help.
2) Definitely go for primes. Autofocus doesn't work for astro anyway, so you can look for cheap old used manual-everything glass. This image, for example, is from a 1970s-era 50mm macro.
3) Those are three pretty different kinds of targets! The Milky Way often does well with wide lenses to include the landscape, which (see "300 rule") means reasonably long exposures without trailing -- so no EQ mount needed. Planets are tiny and super-bright (they're in full sunlight), so the favored technique is to use long focal lengths and shoot video at the highest frame rate possible, out of which the processing software selects and stacks the frames with the least atmospheric disturbance. (I use 1600 or 2432mm telescopes for planets.) You can get away without an EQ mount, though it's a bit tedious to repeatedly reframe. For deep-sky objects (e.g. nebulae), except for the very brightest, some kind of tracking EQ mount is an absolute requirement. And the answer to "how much does a tracking EQ mount cost?" is, of course, "How much ya got?" You can build a barndoor tracker for very little that will do short exposures up to 200-300mm; you can buy a camera tracker that will yield somewhat longer exposures and handle longer focal lengths; or you can get anything from a US$800 to US$10000 equatorial mount, whose performance scales nonlinearly with price.
I went with the US$800-some option for my first mount and did pretty well from 350-900mm.
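The "300 rule" is simple enough to put in a function. One caveat I'll add that isn't in the post above: the constant is a matter of taste, and many people use 500 for full-frame sensors and 300 or less for crop sensors or pixel-peeping.

```python
def max_untracked_exposure(focal_length_mm, rule=300):
    """Longest exposure (seconds) before stars visibly trail on a fixed
    tripod: divide the rule constant by the lens focal length in mm."""
    return rule / focal_length_mm

for fl in (14, 50, 135, 300):
    print(f"{fl:>4} mm -> {max_untracked_exposure(fl):5.1f} s")
```

So a 14mm wide-angle buys you around 20 seconds, while at 300mm you're down to a single second -- which is why long focal lengths demand tracking.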
  20. USB 3 is notoriously picky and tricky -- plenty of folks have trouble with much shorter-cabled setups. For example a lot of CEM70G users struggled early on before the community figured out that short, high-quality cables were the only way to reliably connect to that mount. Sounds like a perfect application for a single-board computer at the scope, together with WiFi or Ethernet for remoting in to it. Basically what I do with my Pi when I want to park my [removed word] in my nice cozy recliner. For me it's a Pi 4 running StellarMate OS, but lots of folk use Intel NUC or similar Windows critters. Nice thing about INDI is you can either run the client app (KStars/Ekos) on the mini and remote in to its desktop with just about any device, or run just the INDI server and device drivers "headlessly" on the mini and run your client on your inside computer, connecting the two over the net.
  21. M42 and M31 are very Zen targets -- some of the easiest to image in the night sky, and yet with endless challenge for the experienced imager. If you can get 60 seconds, that's probably enough for just about anything you're likely to encounter starting out. (Criminy, one night when my autoguiding didn't work I got this with 60-second exposures through narrowband filters. APOD? No. Recognizably the Veil? Absolutely.)
Do go by the histogram and not by the appearance of the sub-exposure, though. Ideally you want no or very few pixels at 0, and a small number of maxed-out ones as well. No amount of stacking will recover out-of-range detail on either end (yes, yes, if a photon hits once every ten frames that's recoverable... work with me here, OK?). And, as newbie alert points out, you can use HDR techniques to expand the dynamic range if needed.
Mathematically speaking, the best SNR you can obtain comes from the smallest number of exposures -- that is to say, longer is better. But as you've already discovered, there are practical limits. Even with a perfect mount, a two-hour exposure would still blow out the bright areas (most frequently seen as a loss of star color, since for most DSOs stars are the brightest thing in the frame), and suffer transient issues such as satellites and aircraft. If you have 200 exposures and one has a bright line through it, the software can statistically eliminate it with little loss. If you have one long one... not so much.
The question of diminishing returns crops up over and over again in astro, whether it's total integration time, aperture, setpoint for cooled sensors, number of dark frames... You can do the math to figure out exactly how long your exposures should be to keep the read noise below a given percentage of the total noise, but that percentage is arbitrary to begin with. So... good enough is good enough. Integration time rules.
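That "read noise below a given percentage" calculation can be sketched directly from total noise² = sky_rate·t + RN². The camera and sky numbers below are invented for illustration:

```python
def min_sub_exposure(read_noise_e, sky_rate_e_per_s, max_rn_fraction=0.05):
    """
    Shortest sub length (seconds) for which read noise inflates the total
    noise by at most `max_rn_fraction` over pure sky-background noise.
    From sqrt(sky*t + RN^2) <= (1 + f) * sqrt(sky*t), solve for t.
    """
    k = (1 + max_rn_fraction) ** 2 - 1   # allowed ratio RN^2 / (sky*t)
    return read_noise_e ** 2 / (k * sky_rate_e_per_s)

# Hypothetical camera: 2 e- read noise, 1 e-/s/pixel sky background.
t = min_sub_exposure(2.0, 1.0)
print(f"subs of about {t:.0f} s swamp the read noise")
```

Note how arbitrary the 5% threshold is -- loosen it to 10% and the required sub length drops by half or more, which is exactly the "good enough is good enough" point.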
  22. Histogram: The commonest pixel value is the background sky, so that should be the biggest peak. If you place that peak at 1/4 to 1/3 of full scale, you should acquire even the dim nebulosity in each sub-exposure.
ISO: DSLRs differ, but many of them are "ISO-invariant": within a certain range, an image shot at one ISO and "stretched" to be as bright as one shot at double or quadruple the setting will appear basically identical. Totally backwards from what we're used to in terrestrial photography, but honest, it's true. So just use the longest exposure time that reliably yields good subs (i.e. small round stars), and dial the ISO up or down to get the histogram peak where it should be.
Totally concur with Olly that it's counterproductive to tweak subexposure times and ISOs. Find a pair of values that yield reasonably good results and just stick with them. Because... ...integration time matters way, way, way more than the number of photons a single sub receives. WAY more. Yes, there are corner cases and read noise to minimize and entire books on the subject, but four hours of 1-second exposures will beat a half-hour of 10-minute exposures every time. Again, not intuitive, but true. The question of whether 10 one-minute exposures are better than four 2.5-minute exposures is a red herring. Find a basic combination of exposure and ISO that yields reasonable results with your equipment and then just hammer it with as many minutes/hours/days of integration time as you can stand. (Among other things, it greatly simplifies processing to need only one set of calibration frames.)
The Deep-Sky Imaging Primer has a really excellent chapter explaining all this, covering signal/noise in detail and explicitly running examples of short vs. long subs. You don't need a ton of math to follow it, but if you like math, he shows his work. Hard to recommend that book highly enough for anyone starting out.
I don't aim quite as loftily as Olly most of the time. My rule of thumb is more like "One hour marginal, two hours OK, four hours better". You can look at our image galleries and set your own level of "acceptable".
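Finding where the background peak sits is scriptable, too. A sketch against a synthetic 16-bit sub -- the bin count and the toy sky level are arbitrary choices, not anything standard:

```python
import numpy as np

def background_peak_fraction(sub, bit_depth=16):
    """Locate the histogram's biggest peak (the background sky) as a
    fraction of full scale. Aim for roughly 1/4 to 1/3."""
    full_scale = 2 ** bit_depth - 1
    hist, edges = np.histogram(sub, bins=256, range=(0, full_scale))
    i = hist.argmax()
    peak_center = (edges[i] + edges[i + 1]) / 2
    return peak_center / full_scale

# Toy sub: sky background around 30% of the 16-bit range.
rng = np.random.default_rng(3)
sub = rng.normal(0.3 * 65535, 800, size=(200, 200))
frac = background_peak_fraction(sub)
print(f"background peak at {frac:.0%} of full scale")
```

If the reported fraction drifts below a quarter, bump the ISO or lengthen the sub; well above a third, back off.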
  23. That is certainly true for deep-sky objects -- by the rules of thumb for star trailing, you could get maybe half-second exposures at best unless your mount tracks. For many dim deep-sky objects, that's just not anywhere near reasonable, we tend to talk in terms of hours of integration time and so you'd be looking at thousands and thousands of sub-exposures to stack. Orion is a bit of a special case -- so bright that it breaks the rules. I got this with a grand total of 12 minutes, admittedly from Bortle 1 skies. You could certainly image the Moon very nicely. 750mm is a tad short for planetary work but with a Barlow or two you could make it happen. The standard technique for those brighter objects is to take video (FireCapture is the standard, dunno if it would work with your camera) and then use stacking software that pulls out the best bits of the best frames and stacks them for noise reduction. AutoStakkert! is the leader there. Finally, you sharpen the daylights out of the result. I've gotten more-or-less acceptable results out of Photoshop, but Registax is what the cool kids use.
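The "pulls out the best bits of the best frames" step boils down to scoring every video frame with a quality metric and keeping the top fraction. Here's a crude stand-in for that idea -- real tools like AutoStakkert! use far more sophisticated metrics and also align and stack patches, none of which this sketch does:

```python
import numpy as np

def sharpness(frame):
    """Crude quality metric: mean squared image gradient.
    Atmospheric blur lowers the score; crisp moments raise it."""
    gy, gx = np.gradient(frame.astype(float))
    return float((gx ** 2 + gy ** 2).mean())

def best_frames(frames, keep_fraction=0.1):
    """Indices of the sharpest `keep_fraction` of frames, best first."""
    scores = np.array([sharpness(f) for f in frames])
    n_keep = max(1, int(len(frames) * keep_fraction))
    return np.argsort(scores)[::-1][:n_keep]

# Toy run: nine blurred frames and one sharp one; keep the top 10%.
rng = np.random.default_rng(7)
sharp = rng.normal(0, 1, (16, 16))
blurry = (sharp + np.roll(sharp, 1, 0) + np.roll(sharp, 1, 1)) / 3
frames = [blurry] * 9 + [sharp]
idx = best_frames(frames, keep_fraction=0.1)
print(idx)  # index of the one genuinely sharp frame
```

With thousands of video frames, even keeping only a few percent leaves plenty to stack for noise reduction.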
  24. A gentle correction: "autoguiding" refers to the process of continually taking short exposures with a separate camera and using those to nudge the mount during the long exposure with the imaging camera. Has nothing to do with pointing to a target.
  25. Yeah, either way you can make great images. When a decision is so evenly balanced that you get stuck, eh, trust your judgement: Your most careful evaluation indicates that you are equally likely to be happy either way. Three months later you're much more likely to be saying "I am having so much fun doing this" than "Gee this sucks, wish I'd made the other choice".