Everything posted by rickwayne

  1. Hi, and welcome! For eyepieces, the magnification is the focal length of the telescope (900mm) divided by that of the eyepiece (25mm or 10mm), so a smaller number on the eyepiece means more magnification. Of course you don't get something for nothing: more magnification means more optical problems and more trouble with air turbulence. That last is confusingly called "seeing" -- you'd think the term would refer to how well you can see things (cloudiness, or transparency) but no, it's how much stars wave around when you try to photograph them. Planets are pretty tiny, so you have to magnify them a lot to see more than a spark of light. Jupiter and Saturn do amazingly well with quite small telescopes, but sadly you've just missed them; they're really close to the Sun in the sky now. You may be able to catch a peek just after sunset, though.
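     To put numbers on it: 900mm / 25mm = 36x, and 900mm / 10mm = 90x, so the 10mm eyepiece is your high-power one.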
  2. Not changes in the atmosphere as such, but effects from the amount of atmosphere you were shooting through, which (if I remember my Bracken) can be as much as 500km if the target is right on the horizon!
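     For a feel for how fast the path length grows as a target sinks toward the horizon, here's a minimal Python sketch of relative path length (airmass, 1.0 at the zenith) using the standard Kasten & Young (1989) approximation. The altitudes are just example values:

     import math

     def airmass(altitude_deg):
         """Relative atmospheric path length for a given altitude above the
         horizon, per the Kasten & Young (1989) approximation."""
         h = altitude_deg
         return 1.0 / (math.sin(math.radians(h)) + 0.50572 * (h + 6.07995) ** -1.6364)

     for alt in (90, 45, 30, 10, 2):
         print(f"{alt:2d} deg altitude -> airmass {airmass(alt):5.1f}")

     Right at the horizon that formula tops out around 38 times the zenith path -- which is why horizon-skimming targets shoot through so much air.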
  3. Those actually look like some sort of diffraction spikes to me. Huh. My only other thought: Is there much difference in altitude (angular height above horizon) between these images?
  4. Awww! You're very welcome! The kindness of strangers is certainly the only reason I've been able to have any success at all in this.
  5. The most successful astroimaging session of my life at that point, this was taken in Big Bend National Park in October 2018 in the darkest skies I've ever seen. I was digging through my back catalogue and said "Huh, I bet I can process that a lot better now". And it only took me an hour or so, too. Astro Pixel Processor -- I am saying. You can look in my astrobin for the old image, which had a far harsher contrast in the nebulae and clipped the background egregiously to black. Pentax K5-iis DSLR, 28x180" at ISO 800, Stellarvue SV70t, guided.
  6. There is an online calculator that will give you the theoretical numbers for field rotation, but I like the "try it and see" approach.

     There are several sources of noise in astro images. Read noise is added every time you read the sensor, so the more sub-exposures (aka "subs", "lights") you split a session into, the more of it you accumulate. At a certain signal level (incoming photon flux times exposure time) you'll have enough signal per sub that the read noise really is no longer a factor (often referred to as "swamping the read noise"). Of course, ya don't never get nothin' for free: longer exposures have their own problems. For one, every second your shutter is open is another opportunity for badness to happen -- tracking error, kicking the tripod, a firefly lighting off inside the scope's dew shield, satellites, aircraft. Too, your sensor can only register so many photons before it's filled to capacity, so that part of the image is blown out and unrecoverable. Strange that we struggle with that problem when we're imaging dim objects, but the stars are so much brighter than the nebulae that it's a real issue.

     The beginning-astro books have great discussions of this -- you don't have to follow the maths to get the gist and internalize the rules, but it really does help to be exposed. If you prefer video, Dr. Robin Glover, author of the SharpCap imaging program, has a great talk on exposure and related topics. Note that if you're shooting with a colour camera, the Bayer matrix in front of the sensor means "multiply by three". Don't worry, you'll know what I meant when you get there :-). My favorite of the beginner books is Bracken's The Deep Sky Imaging Primer. Others will recommend Richards or Lodriguss; you can't go wrong with any of 'em.
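     If you want to play with the numbers yourself, here's a minimal sketch of the "swamp the read noise" rule. The swamp factor of 10 is a commonly quoted rule of thumb (not a figure from Glover's talk specifically), and the read-noise and sky-flux values are made-up placeholders -- measure your own:

     def min_sub_exposure(read_noise_e, sky_eps, swamp_factor=10, bayer=False):
         """Shortest sub-exposure (seconds) at which the accumulated sky
         background is swamp_factor times the read-noise variance, so read
         noise no longer dominates. sky_eps is sky flux in electrons/pixel/s."""
         t = swamp_factor * read_noise_e ** 2 / sky_eps
         return 3 * t if bayer else t  # rough "multiply by three" for a Bayer sensor

     # Hypothetical numbers: 3 e- read noise, 1.5 e-/pixel/s of sky background.
     print(min_sub_exposure(3.0, 1.5))              # mono: 60 s
     print(min_sub_exposure(3.0, 1.5, bayer=True))  # colour: 180 s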
  7. I'm liking everything I see there. Great details, wonderful color in the emission portions, really good control of the tonal range. Nice!
  8. Thanks! I've heard from at least two or three other people, in other fora, with essentially the same experience. That's encouraging! Sigh. The iPolar app runs very nicely on my Mac, but it's sort of boring running it with no camera to connect to...
  9. Once my CEM70 ships (always assuming I'm not too old to lift it by then), I plan to do a little quantitative experiment on iPolar calibration. Once calibrated, asserts the manual, one needn't shoot two images every time one aligns; it's just turn on the iPolar, match dot to cross, and you're laughing. I'd expect the vicissitudes of using gear in the field -- thermal cycling, if nothing else -- would eventually perturb the relationship of the iPolar unit to the RA axis sufficiently to require recalibration. But I've no idea how long "eventually" is. (As in, "eventually iOptron will make more mounts and you'll get one, Mr. Wayne.") So I'm going to record alignment stats every time out -- PHD's estimate of PA at least, and a DARV if self-discipline wins over eagerness to get photons on sensors. Has anyone done an exercise like this, formally or informally? What's your experience with how long your iPolar camera calibration ("calibration the camera", as the manual puts it) tends to last?
  10. Wow, that software really has it going on, doesn't it? If I were starting out and looking for capture software I think I'd go for that first.
  11. Why not just use SiriL? It's being actively developed, and it's free. ASTAP is another very complete, free package with which you can analyze your images. I told it to analyze a recent set of my flats; screenshot of the result below. "Background" is the figure of merit. And of course if you were using KStars/Ekos, you could set a target ADU value and the flats-capturing module would cycle through exposures looking for one that yielded the value you sought :-).
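     If you'd rather script the same "background" figure of merit yourself, here's a minimal sketch assuming astropy is installed. The target value, tolerance, and filename are hypothetical examples, not anything ASTAP or Ekos prescribes:

     import numpy as np
     from astropy.io import fits  # pip install astropy

     def flat_adu(path):
         """Median ADU of a flat frame -- the 'background' figure of merit."""
         data = fits.getdata(path).astype(np.float64)
         return np.median(data)

     # Hypothetical 16-bit camera: aim for roughly half of full scale.
     target, tolerance = 32000, 3000
     adu = flat_adu("Flat_001.fits")  # hypothetical filename
     print(f"median ADU {adu:.0f}:", "ok" if abs(adu - target) < tolerance else "adjust exposure")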
  12. To be clear, the polar alignment tools in SharpCap, or Ekos, or NINA, or a Polemaster or iOptron's iPolar for that matter, are great. Certainly much better than the polar scope in my experience, and much more efficient than the fiddle-and-check method of drift alignment. But to confirm your polar alignment, the gold standard is actual observation of declination drift on the celestial equator.
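     If you like to see where the numbers come from: for a star on the celestial equator near the meridian, the declination drift rate is roughly the azimuth alignment error times the sidereal rate. A minimal Python sketch of that rule of thumb -- a worst-case-geometry approximation, not what PHD or any drift tool computes internally:

     import math

     SIDEREAL_RATE = 902.4  # arcseconds of RA motion per minute of time

     def pa_error_arcmin(drift_arcsec_per_min):
         """Rough azimuth polar-alignment error implied by the declination
         drift of a star on the celestial equator near the meridian."""
         theta_rad = drift_arcsec_per_min / SIDEREAL_RATE
         return math.degrees(theta_rad) * 60  # radians -> arcminutes

     # e.g. 1 arcsec/min of dec drift -> roughly 3.8 arcmin of azimuth error
     print(f"{pa_error_arcmin(1.0):.1f} arcmin")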
  13. And of course if you really want a test that's as close as possible to what you'll actually see while imaging, there's the DARV method. I would recommend using a more interactive tool to establish a good polar alignment, and then double-check with DARV. I use the polar alignment assistant in Ekos but it's not always super-repeatable, so it's good to have a backup check.
  14. PixInsight versus Photoshop: even within the astrophotography domain, these are tools peerless in their own little realms, rather stumble-footed outside them. Beyond the very considerable difference in ease of learning and ease of use (not the same thing), one is at heart a program for manipulating data scientifically, and the other is for editing and polishing visual images. Oversimplifying a bit, processing an astro image (should) involve four overall steps: calibration, integration, stretching, and the fiddly bits. 🙂 Because we enhance the contrast in weak signals so dramatically, problems tend to be magnified along with the image.

     Calibration is a process of compensating for known, predictable problems with the sensor (dark current, thermal glow, hot or cold pixels) and the imaging train (dust, vignetting). Astro software takes this in stride; Photoshop has essentially no tools for it.

     Integration is the mathematical process of averaging each pixel's value across a "stack" of (putatively) identical frames. Photoshop can provide this, in fairly cack-handed fashion (either a simple mean or a simple median); astro software has all manner of sophisticated tools to optimize the process (e.g. outlier rejection to eliminate Elon's little contribution to our hobby), pretty transparently to the user.

     Stretching is contrast enhancement: specifically, mapping tones which are very close together at the dim end onto a much broader and brighter range, without blowing out the brightest tones. Astro software has built-in tools for this, but Photoshop can do a very nice job. Before stretching, astro-heads refer to the image as "linear" data; "nonlinear" afterwards.

     The fiddly bits: overall editing to create an image pleasing to the artist. Localized brightening or darkening. Context-aware patching of artifacts. Emphasizing some parts of the image while toning down others, for dramatic effect. Photoshop is hands-down, nolo contendere, the class of the world at this, with a comprehensive array of tools that have been relentlessly honed by a large company for 32 years to be both effective and intuitive to use. There are purists who insist that scientific integrity demands that we not "lie" with our data in this fashion; indeed, that's basically the philosophy behind PixInsight. That's a perfectly valid position. For the rest of us...

     Of course it's a little more complicated than that. (Is gradient removal a "fiddly bit"? After all, it should be done on linear data.) But the general shape of the problem is valid, I assert. Astro tools are better at linear data and arguably easier to use for stretching, and Photoshop simply blows them away for artistic image editing.

     Now, PI is the most complete set of tools I know of, but IMO handing it to a beginner is akin to saying "If you're a real astrophotographer, you should prove it by climbing this cliff. Here's your blindfold and mittens." The core functions are available and, I'd argue, more accessible in other software. Astro Pixel Processor springs to mind, but Deep Sky Stacker, Photoshop, and a couple of plugins will certainly do the job. SiriL wins no points for the obviousness of its user interface, but it's perfectly straightforward, and it's free. All of which is to say, Ande, that understanding the process is the real key, and while any of these software stacks will help you establish an initial workflow, simpler is probably better to start.
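     To make the middle two steps concrete, here's a minimal numpy sketch of integration (a kappa-sigma-clipped mean) and stretching (a simple asinh curve). It illustrates the idea only -- it is not how PixInsight or any particular program implements either step, and the array shapes and constants are assumptions:

     import numpy as np

     def integrate(stack, kappa=3.0):
         """Sigma-clipped mean: reject outlier pixels (satellite trails, hot
         pixels) more than kappa standard deviations from the per-pixel mean."""
         mean, std = stack.mean(axis=0), stack.std(axis=0)
         mask = np.abs(stack - mean) <= kappa * std
         return (stack * mask).sum(axis=0) / np.maximum(mask.sum(axis=0), 1)

     def stretch(img, strength=500.0):
         """Simple asinh stretch: lifts faint tones strongly while compressing
         bright ones, mapping linear data into a displayable nonlinear range."""
         img = (img - img.min()) / (img.max() - img.min() + 1e-12)
         return np.arcsinh(strength * img) / np.arcsinh(strength)

     # stack: N calibrated subs as an (N, height, width) array
     stack = np.random.rand(10, 100, 100)  # stand-in data
     final = stretch(integrate(stack))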
  15. Wow, that is an AMAZING first image. It's really unfortunate that there are so many things we can't test or debug without using up dark-sky time, eh? I don't know if your setup will do this, but if you can feed your plate-solving software one of the lights from this session, that should let you test it in the daytime. Usually it's one of four things:
     • The field-of-view numbers aren't close enough to the real thing
     • The image scale number (arcseconds per pixel) is off
     • The initial coordinates fed to the solver are wrong
     • The solver can't find its index files, or not all of them are present
     Most solvers can do a "blind" solve with no coordinates at all, but it takes longer and is more likely to gack. How far did you manage to get with PHD?
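     On the image-scale point: it's worth sanity-checking the arcseconds-per-pixel figure you feed the solver. The standard formula is 206.265 times the pixel size in microns, divided by the focal length in millimetres; a minimal sketch with made-up camera numbers:

     def image_scale(pixel_um, focal_length_mm):
         """Image scale in arcseconds per pixel."""
         return 206.265 * pixel_um / focal_length_mm

     # Hypothetical setup: 3.8 micron pixels at 900 mm focal length
     print(f"{image_scale(3.8, 900):.2f} arcsec/pixel")  # ~0.87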
  16. If you're going to get into guiding right away, be aware that it's one more thing to hook up, mount, and go wrong. Not trying to dissuade you, but there are so many things to set up, debug, and learn about that simple is often better. It's amazing what you can accomplish with 30-second exposures!

     Guiding is one place where you can safely economize. There's absolutely no point going big on a scope or expensive on a camera -- quite the contrary, in fact. You want to keep the weight down; you only have around 15 pounds total to work with on an HEQ5. You should be fine with a 30-50mm scope and a nice cheap little (preferably mono) camera. You could start by using the finder shoe to mount the guidescope, but many folk encounter differential-flexure problems with that, so if you can, it's best to do something rigid (SVBONY make some guidescope rings that are super-inexpensive).

     Do you already have a DSLR or other interchangeable-lens camera? That would be great to start with. You'll probably need a coma corrector (if you go Newtonian) or a field flattener/reducer (if refractor). And you'll need something to control it all with. If you have a computer you can leave within USB range of the scope, and can run it for as long as you want to image, there you go -- it's just a matter of picking the software. Several free options are available, including NINA and KStars/Ekos. I work in the field a lot and wanted something that would run all night on a battery that would power the mount too, so I bought a Raspberry Pi, which does a great job. There is a free software suite for it called Astroberry which gives you a complete observatory system with KStars and Ekos. Others will no doubt recommend their favorites.
  17. If you have a gain control in the software for the guide cam, you can adjust that down too for use in the daylight (but remember to kick it back up for use at night -- ask me how I know 🙂). I've never actually guided with ST-4, but yes, that does sound correct.
  18. 1) Establish a rough guide-cam focus in daylight first, either with a sheet of paper held behind the guide scope or just by waving the guide cam back and forth; then you'll have some idea where the camera's sensor should lie. You can also do the fine focusing with the camera in daylight, probably easier than with stars, though stars are perfect for the very last refinements.
     2) Plate solving. I am saying, my dude. 🙂 Absent that, yes, an el cheapo red-dot sight is a real boon compared to looking at the feed from your camera and trying to figure out WTF. If you upload an image to nova.astrometry.net (assuming that's an option for you in the field), you can overlay an RA/DEC grid, get exact coordinates, and even see a map of where the frame lies in the sky. Because PLATE SOLVING! Honest.
     3) "The cable" -- are you using on-camera guiding (aka ST-4)? Is the cable one of the small phone-jack-connector types, RJ-11 or RJ-10 on each end? Your description is unclear to me, I'm afraid -- if you connect a cable from guide cam to mount, that would appear to account for both ends; for most cables, two is in fact the maximum number of ends 🙂. So I'm not sure where the "other" end came from that you plugged into the computer. You should know that there are two ways to connect guide cameras. One is to plug an ST-4 cable from the camera's dedicated guide port into the mount's dedicated guide port, and then have another connection from the mount to the computer. The other is to eschew the camera/mount connection and have one USB cable from the camera to the computer, and another USB or serial cable from the mount to the computer. The latter is usually called "pulse guiding" and is generally preferred, but by no means mandatory or vastly superior.
  19. The advantage to luminance on RGB targets is that you're collecting photons (roughly, for argument's sake) three times as fast as with any one of the color filters. And since human vision relies much more on luminance data than on color to discriminate detail, you can rapidly build up sharply-defined images whose color can be much fuzzier (and thus take less dark-sky time to acquire). Even binned 1x1, you can shortchange the color data and blur out the noise, and no one will be the wiser b/c their eyes are oohing and aahing over the detail that your luminance channel provides.
  20. I think that's a very reasonable attitude. The only reason I suggested it is that it takes no extra kit, and (again, depending on what you already have) perhaps no extra software either. As Einstein said, "Everything should be as simple as possible, but no simpler." If you find that you can usually get on-target with a minimum of ceremony, good on ya. If you find that you're struggling to get targets in view within a reasonable time (Bog knows I do), might be time to revisit the idea. It is something you can at least get mostly set up in the daytime, without spending dark-sky time cursing at it.
  21. Second the "buy it!" advice. Probably the best hardware bang for the buck in all of astrophotography. I don't understand why using Ha as luminance on an emission target would wash out the other colors -- by definition, luminance should not affect the color balance at all, though of course as image brightness goes up, the available range of saturation necessarily decreases. Getting that right in Photoshop is going to be more of a challenge, I admit. I suppose if you're aesthetically unmoved by false-color narrowband images, monochrome Ha will seem like the rankest heresy to you 🙂. But IMO you can do some really lovely stuff that way too.
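     For anyone curious what "luminance shouldn't affect the color balance" means in practice, here's a minimal numpy sketch of a luminance replacement that preserves each pixel's R:G:B ratios. It's a simplification -- real tools typically work in Lab or a similar space -- and every name and value in it is made up:

     import numpy as np

     def replace_luminance(rgb, lum):
         """Scale each pixel's RGB so its luminance matches lum, leaving the
         R:G:B ratios (the color) untouched. rgb: (h, w, 3); lum: (h, w)."""
         current = rgb @ np.array([0.2126, 0.7152, 0.0722])  # Rec. 709 weights
         scale = lum / np.maximum(current, 1e-12)
         return np.clip(rgb * scale[..., None], 0.0, 1.0)

     # rgb from the color stack, ha the stretched Ha master, both scaled 0..1
     rgb = np.random.rand(100, 100, 3); ha = np.random.rand(100, 100)
     lrgb = replace_luminance(rgb, ha)

     The clip at 1.0 is where saturation gets squeezed: as the new luminance pushes values toward full scale, there's less headroom left for color differences -- exactly the brightness-versus-saturation tradeoff mentioned above.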
  22. Since you're using a mono camera and filters, I assume that you want to do deep-sky work (nebulas and galaxies). You're asking a really huge question, which of course is fine and encouraged. But any answer you get on a forum like this is necessarily going to leave a LOT of information out. The single most succinct answer I can give you, that will waste the least amount of your time, is simply "read this book": The Deep-Sky Imaging Primer, 2nd Edition, by Charles Bracken. Other people will recommend Steve Richards' Making Every Photon Count, or one of Jerry Lodriguss's intro books. Fine books all. I'm sorry if it sounds like I'm brushing you off, because I'm really, really not. But you will need a base of knowledge to understand things like why we shoot multiple short exposures and stack them, or how careful you have to be with polar aligning your mount, and by far the most efficient way to gain that knowledge is to read one of these excellent, approachable books.
  23. Which books, if I might ask? Always looking for things to put on my gift lists 🙂
  24. Eyepiece projection is a bit of an odd duck -- usually much better (albeit lower-magnification) results can be achieved at prime focus, basically using the telescope as a big telephoto lens without the additional magnification that an eyepiece provides. Though if you're imaging something super-bright like planets, you might get away with it. If you want to see where the prime focus point is, point the scope at something bright (NOT THE SUN!) in the daytime, and hold a piece of paper behind it, moving it back and forth till you see a sharp image. You can do the same thing with your camera, with the lens dismounted and Live View switched on, of course, but a sheet of paper won't need expensive repairs if you bash it into the back of the scope while you're waving it back and forth. Most refractors will do prime focus with nothing but a T-adapter and a nosepiece to fit where the eyepiece normally goes; some might need extension tubes. It gets more complicated, of course (field flatteners, flattener/reducers, etc.), but that's the gist.
  25. I suspect that a recent cell phone would actually work better -- easier to handle, probably just as many megapixels, network-enabled out of the box, and a sensor with more recent technology. Not to mention that you might very well find a holder gizmo that will keep your cell steady behind the telescope. If you had something like a DSLR where you could remove the lens and mount it to your scope, then we'd be talkin'.