Everything posted by rickwayne

  1. Even simpler: pick up an LED tracing panel on Amazon or similar. Try shooting a few images with it turned different ways; if those images aren't different, the panel is consistent enough. The one I got runs off USB, doesn't need a diffuser at its lowest setting, and cost US$15. To shoot the frames, don't obsess about exposure or anything. Just get the histogram peak in the middle of the graph, and use that exposure henceforth. It merely has to be somewhere in the sensor's region of linear response. As was said above, bias frames are easy to manage and reuse, and work well with DSLRs. You have to have bias frames or dark flats for the flats to work correctly.
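The "histogram peak in the middle" rule is easy enough to automate once you're loading frames as arrays. A minimal sketch with numpy and synthetic data -- the function name and the 30-70% thresholds are my own choices, not from any package:

```python
import numpy as np

def flat_exposure_ok(frame, bit_depth=16, lo=0.3, hi=0.7):
    """Check that a flat frame's peak sits in the middle of the sensor's
    range, i.e. comfortably inside the linear-response region."""
    full_scale = 2 ** bit_depth - 1
    # Flat histograms are narrow, so the median is a fine stand-in for the peak.
    peak = np.median(frame) / full_scale
    return lo <= peak <= hi

# Synthetic 16-bit flat with its peak near 50% of full scale:
rng = np.random.default_rng(0)
flat = rng.normal(32000, 500, size=(100, 100)).clip(0, 65535)
print(flat_exposure_ok(flat))  # True
```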
  2. Oh, and the drift alignment stuff sounds correct. One thing, though: change the slewing speed on your mount so you can run longer exposures. It actually doesn't matter if you go off the edge of the frame so long as you get the timing right. So long as the ends of the "V" wind up in frame and you can distinguish one star from another -- easy in the example you show -- it doesn't matter if the point of the "V" is way off the edge. Ideally you want to run the same length as your exposures, but in practice it can be shorter.
  3. I should just know, after all the times I've looked at M31, which way is RA and which is DEC. And on my phone I can't analyze it. But you can. If the trailing is in the RA direction, you've got periodic error or drive train issues. Elongation in DEC, unless you're guiding, means polar alignment problems.
  4. I find PS a nearly indispensable tool for finishing touches for just about all my photography, astro included. But having done some soup-to-nuts processing with it, ugh, never again. Astro-specific packages such as Astro Pixel Processor, PixInsight, and SiRiL simply have the tools to streamline some of the specific problems we encounter. I dunno, maybe there's a plugin for doing mean stacking with winsorized rejection in PS. It's not that it's impossible to get great results with Photoshop, it's just that achieving them requires a lot of time, or a lot of expertise, or both. If you intend to do a lot of other photography, then your pixel editor matters a lot and you might want to put your limited budget there. If astro is a main goal, consider putting money into one of the dedicated packages, and use a cheap or free editor (The GIMP comes to mind). SiRiL + GIMP are a completely free solution providing advanced astro-specific processing and reasonably advanced tweaking of the result. I will probably have to go back to GIMP when I retire and can no longer use the University's PS license.
  5. That is quite the first result! If/when you start doing the same in dark skies, you may need to engage "Pro mode" or whatever your device calls it so that you can dial in some exposure compensation. The software will try to find a middle ground between the dark sky and the sunlit Moon, and thus overexpose the heck out of the Moon. If you can tell it "No, take your exposure reading right here" you'll be way ahead of the game.
  6. +1 on that. The INDI architecture allows you to run just the device drivers on a scope-side computer and run the actual observatory software on another one, if you like. I never use that, because if the network connection breaks, so does your session. I heartily endorse the idea of a hard-wired machine that will continue to run your session headless, even if networking breaks down. The Pi 3 was a little bit shy of the computing oomph to make that happen conveniently. The Pi 4 has oomph to spare. Any mini-PC or laptop, likewise.
  7. INDI is not limited to Linux and Mac. Runs on Windows too. I will make my customary Raspberry Pi/KStars pitch here, for a particular reason. There is no substitute for a good mount, and good mounts necessarily cost money. So if money matters, and you can save in other areas, you'll be able to afford a better mount. You are in approximately the same boat that I was in, I suspect, so forgive me the "I did it this way so you should too".

A Raspberry Pi 4 costs about US$65, a case with room for a HAT board will be maybe US$15 more, and the Waveshare Motor HAT board is US$25 last I checked. StellarMate OS is US$50; a microSD card, maybe US$12. So US$170 for a unit that will easily mount on your scope or on your mount, will run on less than an amp of current, and takes the same 12V input as the rest of your astro gear (even the same barrel connector). It will join your home WiFi, or you can run a Cat-5 Ethernet cable out to it. Both work with essentially no configuration. You can remote-desktop to it from Windows, Mac, Linux, or even mobile devices, and there's a free app to run it from the latter (similar to the ASIAir).

It will connect to just about anything, not just ZWO gear, and runs a complete astronomy suite out of the box (including the PHD guiding program). Yes, more astronomy programs are available for Windows, but you don't need more. If you have a complete set -- and believe me, KStars/Ekos is nothing if not complete! -- you can image without limitations.

The reason the Waveshare HAT board appears above is that it's a nice power adapter for the Pi (no need for a wall-wart), but more importantly it provides two stepper-motor outputs, and there's an INDI driver for it. A US$40 stepper, a bracket to hold it, a pulley, and a belt to go around your focus knob, and you have a complete autofocus system. I recommend StellarMate OS over the free suites (e.g.
Astroberry) or compiling your own, simply because it's cheap for the time and frustration saved, and the support is quite good. If you want to apply the same logic to the focuser, Ekos supports the ZWO EAF.
  8. You know, it was so easy to shoot and process I hardly feel as if I can take credit for it! I had been doing bicolor (Ha+OIII) images, and only recently acquired an SII filter; this is only my second target for it. I literally loaded the lights and calibration frames into Astro Pixel Processor and clicked "integrate" -- didn't mess around with selecting the best lights, changing parameters on normalization, nothin'. Then I ran APP's gradient/LP tool on the resulting Ha, OIII, and SII integrations and ran the "Combine RGB" tool. SHO 1 (Hubble) palette preset, calculate, done. Pick a stretch level, save out a TIFF. This version represents a little more work on the nonlinear data in Photoshop, just general "make it pop" stuff, but it was more like applying makeup than the reconstructive surgery that I resort to sometimes. A whirl through Topaz Denoise AI with some sharpening, and boo yah.
  9. One low-rent way to check wide-angle lens performance is to tack five sheets of a newspaper some meters apart on a wall, four corners and center. You can't work with infinity focus this way, but if you separate them enough, you can at least get a few meters back and get the lens into the middle of its focus range. Then ensure your camera is as exactly parallel with the wall as you can determine, focus critically on the center using live view, snap, and examine the results. You'll have to decide for yourself if the vignetting is objectionable. If you want to reduce or eliminate it, you'll need to use flat frames, and either flat darks or bias frames to calibrate those. Fortunately those are pretty easy to do. If you get yourself an LED tracing panel, you can just point the camera straight up and lay the panel right on the Samyang's hood (if you're doing this in daylight, you'll probably want to use masking tape or cardboard wrapped around the hood to prevent light leaks). You can buy expensive calibrated panels if you want, but the cheapo LED ones do quite well for this purpose. You may wish to stand off and shoot some photos of the panel and stretch the daylights out of them to uncover any irregularities in the panel itself, but since you'll only be using about 70mm out of the middle you should do fine. Shoot your flat so that the peak of the histogram (which should be quite narrow!) is in the middle third of the range and you're good to go -- no need to obsess over the "right" exposure, it just needs to be in the sensor's region of linear response. Bias frames, which are necessary for flats calibration to work correctly, are super-easy -- barely an inconvenience. Set your camera to the same ISO as you use for your light and flat frames, cover the lens (or just shoot with a body cap on), set the shutter as high as it will go, and crack off as many frames as you're willing to stack. I use 100 because why not, but 20 is probably enough and 50 ample. 
(Flat darks are just flats with the lens cap on; for most cameras, bias frames are a little more convenient, but either will serve to calibrate your flats.) It's really good to get into the habit of shooting and processing calibration frames; as you get into more challenging targets, it becomes more and more difficult to fix those problems in post without leaving telltale traces.
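The arithmetic behind all that calibration is simple enough to sketch in a few lines of numpy. This is a toy example with made-up numbers, not any particular package's pipeline: subtract the bias from both lights and flats, normalize the flat to unity mean, and divide.

```python
import numpy as np

def calibrate(light, master_bias, master_flat):
    """Bias + flat calibration: (light - bias) / normalized(flat - bias)."""
    flat = master_flat - master_bias      # remove the zero-light floor
    flat_norm = flat / np.mean(flat)      # unity-mean flat field
    return (light - master_bias) / flat_norm

# Toy example: a uniform sky seen through 20% vignetting in one corner.
bias = np.full((4, 4), 100.0)
vignette = np.ones((4, 4))
vignette[0, 0] = 0.8
flat = bias + 10000.0 * vignette          # the flat records the vignetting
light = bias + 5000.0 * vignette          # so does the light
cal = calibrate(light, bias, flat)
print(np.allclose(cal, cal[1, 1]))        # True -- the field is flat again
```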
  10. For folks in your position, I frequently suggest that they put the scope on hold for now and put that money into an upgraded mount. 200-300mm manual-everything lenses for DSLRs are a relative bargain; you can Google around for astro performance of specific lenses. That gets you plenty of good deep-sky targets, and lets you bring up your game with polar alignment, focusing, pointing, guiding, and very importantly processing. By the time you've got those skills honed to the point where the focal length of the lens is a limitation for you, you'll have saved enough for a telescope, and will have a great mount to put it on. An entry-level DSLR/lens setup on a good mount is a joy to use. The converse...not so much.
  11. I had used my iOptron CEM25P's power adapter to run my Raspberry Pi before, so I thought it was fine. Hah! I got problems like:
  • The Pi wouldn't accept my password
  • When it finally did, it plate-solved twice, but never again
  • Guiding was completely and literally off the charts
After two hours of fighting random Ghost In The Machine issues, I thought to swap the wall-wart for my customary battery, and the problems magically evaporated. There was literally not a single hitch in getting three hours of SHO data on the Cygnus Wall after that. I'm pleased with the result, but imagine how it would have looked with five hours of data!
  12. My starting point is to attempt to reproduce what's coming down through the optics. So I will shoot equal R, G, and B (though often significantly more luminance), composite them in Photoshop or Astro Pixel Processor, and adjust from there. That said, managing color balance is probably one of my weakest areas in processing, so I'd be eager to learn more. Now, if you were talking narrowband, that's a horse of another hue. The Ha signal is so much stronger than the others that unless you want a primarily monochrome image, or a very noisy one, you'll wind up shooting more OIII and SII.
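Here's the back-of-the-envelope version of that narrowband trade-off. Assuming a shot-noise-limited target, SNR scales as sqrt(flux x time), so matching the Ha channel's SNR means scaling exposure inversely with relative flux. The flux ratios below are made-up illustrative numbers, not measurements:

```python
def exposure_for_equal_snr(t_ha, relative_flux):
    """Shot-noise-limited: SNR ~ sqrt(flux * time), so matching the Ha
    channel's SNR requires time scaled by 1 / relative_flux."""
    return {band: t_ha / f for band, f in relative_flux.items()}

# Illustrative flux ratios relative to Ha on some emission-line target:
flux = {"Ha": 1.0, "OIII": 0.4, "SII": 0.25}
times = exposure_for_equal_snr(3.0, flux)  # 3 hours of Ha
print(times)  # {'Ha': 3.0, 'OIII': 7.5, 'SII': 12.0}
```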
  13. For focusing, the easiest hack is a Bahtinov mask -- it's a grating that you put in front of the scope that causes stars to have a spiky diffraction pattern. You can use it without any computer support, just magnify the live view and focus till you center the middle spike. They're pretty easy to make, but you can also buy them commercially for not much dough. It works best on a star, NOT a planetary disk, so the best practice is to point at a bright star, focus, REMOVE THE BAHTINOV MASK, and point at your target. The emphasis is sincere, but also a wry comment. I don't think an imager exists who (a) has used a Bahtinov and (b) hasn't forgotten at least once to remove it while imaging.
  14. Apparently my phone didn't post my earlier comment. What algorithm did you select? Are you using a backlash number, and how did you arrive at it? Polynomial is certainly quick, but occasionally fails for my rig, so I go with slow-but-sure Linear for unattended work. I put in a backlash number that I measured by successive reversals and observing the motion, but it would probably work fine without it. Certainly Linear is generous in its overshoots every time it has to reverse direction.
  15. You can definitely run sufficiently-capable software for processing on your iMac. PixInsight, Astro Pixel Processor, SiRiL, ASTAP...really no need for Windows. Honest. Deep Sky Stacker is the single Windows-only program I've ever wanted to run, and I found (better) replacements easily. People might say that there are more choices for Windows software, but what do you care about the dozen programs that you're not running?

As for acquisition, well. There's a certain tendency to advocate one's own choices, eh? Still, what I chose works very well, so here goes: Buy a Raspberry Pi. I started with a 3B, but a 4 is miles better. $65. Buy StellarMate OS to install on it -- a turnkey Linux operating system image with drivers and acquisition software pre-installed. $50. Use the Pi's built-in WiFi to control it via the StellarMate app on your phone or iPad, or a VNC app on your iPad, or even from your iMac. Or, if you are working someplace with mains power, you can plug a display, keyboard, and mouse into the Pi and use it like a desktop computer.

The control and acquisition software that comes with StellarMate can run a camera or a complete observatory. There are videos on everything, and the company (OK, the guy) is excellent with support. Don't let "Linux" scare you -- it works very similarly to Mac OS and Windows. (If you're willing to pay a bit more for an even more turnkey experience, they'll sell you the Pi, with a case, with everything preinstalled, for $229.) When I image from home I'll remote into the Pi from my MacBook; in the field, I use that MacBook until its battery gives out, then take out the iPad and use that for the rest of the night.

The biggest advantage to setting up this way is robustness -- the Pi is doing nothing but running the astronomy software, and you can mount it right on your rig so there's never a problem with pulling a USB cable out inadvertently and crashing a session. Next to that, for me, is its miserly power draw.
If you want to work in the field, the Pi will run all night off the same battery that powers your mount. I personally really like having the device I'm typing on NOT connected directly to the scope, since I'm clumsy, especially in the dark when I'm short of sleep.
  16. I've had it cycle half a dozen times when KStars and the mount were REALLY disagreeing. Usually it's three for me. I never do a star align on my mount, and I almost never leave it set up two nights in a row. So when I reassemble it, the mount is always a bit mistaken about where it's really pointing, and its first slew is invariably off. I believe that the Polar Alignment Assistant, like Capture and Solve, syncs the mount to the solution. Since it's going to start reasonably close, even a blind solve happens pretty fast and then you're GTG. [Edited to add] Ah, I have no idea whether it syncs the mount at each stop along the way. I suppose you could tell by watching the hand controller readout for changes.
  17. For free software, SiRiL and The GIMP are pretty good. I climbed the Photoshop learning curve long ago for my terrestrial photography, so that's what I use, but I use it for refining an astro image, not creating one. For most of my processing, Astro Pixel Processor is just right -- powerful, has almost all of the features I need, easy to learn, and easy to use. PixInsight still beats it for abstruse features but IMO you're much better starting off with APP and getting good images right away. Caveat: While APP support on their forums is quite good, the software itself is done by one guy, and version 1.083 has been in beta for over a year, with final release repeatedly promised very soon. As a software developer myself, I know that's never a good sign. Mind you, I really hope they pull through, it's great software and I want them to succeed. But I'd be lyin' if I said I wasn't at least a little concerned. BTW if you're working with a DSLR, the raw-conversion libraries also seem to be behind the M1 curve. Dunno if PI is working OK with those but APP definitely has a problem.
  18. That's a great little tutorial on calibration frames. I would add one explanation to it. Dark flats are mentioned, but there's a bit of hand-waving WRT whether you need them. DFs are an alternative to bias frames. Since flats are done with a fairly bright light source, their exposure times are quite short, so there's little to no thermal current in play. That's why you don't need to use dark frames to calibrate them. However, there are pixel-to-pixel variations, as well as the applied bias current, so you DO need some sort of baseline zero-light image. Bias frames can be used for this quite nicely. Advantage: You only have to shoot one set per gain/ISO that you use, and you can easily do it during the day or on a cloudy night. However, some cameras (IIRC the 294 sensors in particular, maybe the 1600?) deliver inconsistent results at such short exposure times. The alternative is to use dark flats, which use the same gain/ISO and exposure as your flats, but with no light. Disadvantage: If your flats exposure changes, you have to reshoot your DFs. (Indeed, for us mono imagers, you have to shoot and store a set for every filter, since the transmissivity, and so the exposure, differs between them. For me, that would be Ha, OIII, SII, L, R, G, and B!) I don't know if any of the common programs can scale one set of dark flats to different flat-frame exposures; if so, that would lessen the problem considerably. So if your camera takes consistent short-exposure images, bias + flats is a simpler way to go. If not, dark flats + flats is every bit as good. DSLRs are designed to deliver great images at very high shutter speeds, so I'd personally go bias; Your Mileage May Vary.
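If you do go the dark-flat route, the bookkeeping is the real cost: one master per (filter, exposure, gain) combination, versus one bias master per gain. A sketch of that lookup logic -- the names, structure, and string stand-ins for master frames are purely illustrative:

```python
# Dark flats must match the flat's filter, exposure, and gain exactly;
# a bias library only needs one entry per gain.
dark_flat_library = {}  # (filter, exposure_s, gain) -> master dark flat

def master_for_flat(filter_name, exposure_s, gain, bias_library=None):
    """Find the zero-light master to calibrate a flat, preferring an
    exactly-matched dark flat and falling back to bias if available."""
    key = (filter_name, exposure_s, gain)
    if key in dark_flat_library:
        return dark_flat_library[key]
    if bias_library and gain in bias_library:
        return bias_library[gain]  # OK only if the sensor is consistent at short exposures
    raise LookupError(f"No calibration for {key}; reshoot dark flats")

dark_flat_library[("Ha", 2.0, 100)] = "master_df_ha"  # stand-in for an image array
print(master_for_flat("Ha", 2.0, 100))                            # master_df_ha
print(master_for_flat("OIII", 1.5, 100, {100: "master_bias"}))    # master_bias
```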
  19. In terrestrial photography we don't have to stretch as we do in astro, so you don't notice the phenomenon. But if you shoot two images at the same shutter speed and aperture, one at low ISO and one at high, and then pull the exposure and contrast up on the low-ISO one to yield the same histogram as the high one's, you'll find nearly identical noise. In terrestrial work you'd just open the iris, extend the shutter speed, or add light to the scene, and say that the low-ISO image was just underexposed. So was the high-ISO one -- it's just that the camera's electronics did the stretching for you. In either case, you'll wind up with a noisier picture. In astro, of course, we're nearly always working at the maximum useful aperture, and we can't add light to the scene. So we extend the "shutter speed" by increasing the integration time, and turn up the gain by increasing the ISO and by stretching.
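You can demonstrate the point with a quick shot-noise simulation. Assumption up front: this models an idealized ISO-invariant sensor with no read noise, so ISO is a pure multiplier, which is not exactly how every real camera behaves.

```python
import numpy as np

rng = np.random.default_rng(42)

# Same scene, same shutter, same aperture: identical photon statistics.
photons = rng.poisson(lam=50, size=100_000).astype(float)

low_iso = photons * 1.0      # ISO 100: looks "underexposed"
stretched = low_iso * 8.0    # we do the stretching ourselves in post
high_iso = photons * 8.0     # ISO 800: the camera multiplies before saving

# Eight times the actual light instead (wider iris / longer shutter):
proper = rng.poisson(lam=400, size=100_000).astype(float)

snr = lambda x: np.mean(x) / np.std(x)
print(np.isclose(snr(stretched), snr(high_iso)))  # True -- same noise either way
print(snr(proper) > snr(stretched))               # True -- only more light helps
```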
  20. > Is it just down to the inability of a dslr to get darks sufficiently close in temperature to the lights ? No, although that can affect the results. DSLRs use different sensors, in the main, than astro cameras, and modern ones have very little dark current. My Pentax K-5iis is one of these; I don't bother with darks. Astro camera sensors are optimized for different parameters, and so are comparatively weak where DSLRs excel. Google "IMX183 amp glow" to see examples of my astro camera's dark current, for example! Software can compensate for differing temperatures to a degree, if the actual sensor temperature is recorded in the EXIF data from a DSLR ("dark scaling"). Bias is a bit of current added to the photosite wells to ensure that all of them report an at-least-slightly positive result. If the actual light fluxes onto two adjacent photosites differ by an amount that the sensor would represent as seven electrons, one might generate 5 electrons and the other 0 (it's really at a level that would result in -2 electrons, but that's a physical impossibility!). But when you read the sensor, the difference is only 5. By adding 10 electrons' worth of bias current, you raise the laggard above the unknowable floor of zero, so they're 15 and 8, restoring the relative values. A bias frame records this as well as any "floor" current that the sensor produces even when no light is on the sensor. Subtracting the bias frame value lets you correctly extrapolate where zero is for that photosite. It's especially helpful with flat calibration frames. Speaking of which, you should definitely be shooting and processing with those. Vignetting, dust, and other systemic problems boil right out of your photos that way.
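The worked numbers above, in a few lines, just to make the arithmetic concrete:

```python
# Two photosites whose true signals are 5 e- and -2 e- (noise can push a
# read below zero, which the hardware clips at the floor).
true_signal = [5, -2]
bias_level = 10  # electrons of bias added before readout

clipped = [max(s, 0) for s in true_signal]        # no bias: [5, 0]
biased = [s + bias_level for s in true_signal]    # with bias: [15, 8]
calibrated = [v - bias_level for v in biased]     # subtract the master bias

print(clipped[0] - clipped[1])  # 5 -- the true difference of 7 is lost
print(biased[0] - biased[1])    # 7 -- preserved by the bias offset
print(calibrated)               # [5, -2] -- true values recovered
```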
  21. Second image shows a bit of star elongation along the DEC axis -- if you're not guiding, that's a sign that your polar alignment is a smidgen out. You can do a drift test near the meridian and celestial equator, then nearer the horizon (still near the CE), to double-check before you start sequencing. I'm a fan of the DARV method myself for that (https://www.cloudynights.com/articles/cat/articles/darv-drift-alignment-by-robert-vice-r2760). Right on for your first couple of DSOs!
  22. Yep, at least for my standards a tracing pad works just fine. I wish it were continuously-variable brightness instead of just three settings, and I also wish it could be switched on and off by my computer instead of needing a human finger on the button (which would allow me to automate flat-shooting entirely). But for US$14, eh, I can deal.
  23. Great bit of maker-ism! What's translating between lin_guider and the faster/slower commands to the board? If it isn't an INDI driver, you might consider writing one -- you can use an existing driver as a skeleton and replace the business end with your code. That would give you many software choices, including PHD for guiding and Ekos for a SharpCap-style polar alignment routine. The imaging camera will undoubtedly work with it, and you might be able to find or write a driver for the guide cam. I mean, kudos to you for building it from the bottom up, but with those drivers you'd also have access to a complete observatory system if you want it. C'mon, you KNOW you're going to want to put your rig in a shed with a motorized roof controlled by the observatory scheduler... Or have you moved on in the interim?