Posts posted by rickwayne

  1. Polar misalignment shows up as a steady drift along the DEC axis, as if you slewed the mount in DEC slowly during the exposure. Since the elongation in your image appears to be only along the RA axis, your polar alignment appears to be good. DARV or some other direct measurement of drift will tell you just how good.

    As for rotating your camera, that's merely a convenience for identifying which axis runs which way in the image. You can also plate solve an image to determine its orientation. Some people like to align their camera along the axes as a matter of course; I like to rotate to frame the image the way I want it to look unless I'm debugging something. (Admittedly, I'm nearly always debugging something! )
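    To put a number on it: to first order, the worst-case DEC drift during one exposure is just the misalignment angle times Earth's rotation rate times the exposure length. A quick sketch (the 10-arcminute error and 300 s sub are arbitrary examples, not measurements):

```python
import math

SIDEREAL_DAY_S = 86164.1                   # seconds
OMEGA = 2 * math.pi / SIDEREAL_DAY_S       # Earth's rotation rate, rad/s

def max_dec_drift_arcsec(misalignment_arcsec, exposure_s):
    """Worst-case declination drift (arcsec) accumulated during one
    exposure for a given polar-axis misalignment, to first order."""
    return misalignment_arcsec * OMEGA * exposure_s

# A 10-arcminute polar error over a 300 s sub:
drift = max_dec_drift_arcsec(10 * 60, 300)
print(round(drift, 1))  # about 13 arcsec, worst case
```

    Which is why a rough polar alignment is often good enough for short subs, but long unguided exposures demand better.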

  2. Or you can use the Focus module's HFR reading to quantify just how good your focus is. Unlike with a Bahtinov, you can't see directly how far you are from best focus, but you can pay heed to the numbers and just make them small. If you're reasonably consistent in how much you move the focus knob each time, you can pretend you're an autofocuser and seek the bottom of the "V" shape the HFR measurements make on the graph.
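    If you log a few HFR readings on either side of focus, you can even estimate the bottom of the V numerically by fitting a parabola through three samples that bracket the minimum -- the same trick autofocus routines use. A sketch with made-up focuser positions and HFR values:

```python
def vertex_of_parabola(xs, ys):
    """Best-focus estimate: x-coordinate of the vertex of the parabola
    through three (position, HFR) samples that bracket the minimum."""
    (x0, x1, x2), (y0, y1, y2) = xs, ys
    denom = (x0 - x1) * (x0 - x2) * (x1 - x2)
    a = (x2 * (y1 - y0) + x1 * (y0 - y2) + x0 * (y2 - y1)) / denom
    b = (x2 ** 2 * (y0 - y1) + x1 ** 2 * (y2 - y0) + x0 ** 2 * (y1 - y2)) / denom
    return -b / (2 * a)

# Hypothetical focuser positions (steps) and their measured HFRs:
best = vertex_of_parabola((1000, 1100, 1200), (3.2, 2.1, 3.0))
print(round(best))  # 1105 -- best focus lands a bit past the middle sample
```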

  3. DARV is a great way to double-check or fine-tune your polar alignment, since it measures the end result: drift in DEC. Which, as Vlaiv points out, you don't appear to have.

    As an ab-initio method for alignment, it's a little abstruse for most people -- you have to work out which way a given knob turn narrows or widens the V, both near the meridian (for azimuth) and near the horizon (for altitude). It helps to take good notes! But once you have those, it's easy enough to look up that, for example, when doing the meridian image, if the "dotted" leg of the V is on top, you adjust the azimuth knob on the right of the mount, and so on.

    Personally I really like the direct-feedback methods like Ekos Polar Alignment Assistant, Polemaster, iPolar, etc. But you can't beat drift alignment for reporting what is actually happening -- all those others are one sort or another of derived measurement.

    It is possible to introduce errors by overtightening things; gear meshing, for example, should be tight enough but no tighter.

  4. Don't be afraid to use the 2-sigma stretch -- it uglifies things, but points up problems nicely. In fact I use it all the time to identify gradients and such. I never save an image that way -- blecch! -- but it's a very useful tool. 

  5. Have to concur -- 2000mm on that mount would really be Crazy Town for deep-sky objects. I would counsel that you start with the 50, and teach yourself imaging and processing (including shooting and using calibration frames -- flats, flat darks or bias, and darks, if they're appropriate for your camera). Do it right from the get-go, is my advice.

    M31 is actually quite a large object, so you can frame it well with surprisingly short focal lengths, especially if you're willing to do a bit of cropping.
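    For a rough check before firing up Telescopius, the field of view is simple trigonometry. A sketch, assuming an APS-C-sized sensor (about 23.5 mm on the long side -- check your own camera's specs):

```python
import math

def fov_degrees(sensor_mm, focal_mm):
    """Angular field of view along one sensor dimension."""
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm)))

# APS-C long side behind a 50 mm lens:
print(round(fov_degrees(23.5, 50), 1))  # roughly 26 degrees
```

    Since M31 spans about 3 degrees, it fits comfortably at 50 mm, with plenty of room to crop.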

    Two great free tools for figuring these issues out are the Telescope Simulator in telescopius.com, and a desktop planetarium app like KStars or Stellarium (and there are other good ones too).

    And one terrific non-free tool: If you haven't already, invest in a book like Steve Richards' Making Every Photon Count or Charles Brackens' The Deep-Sky Imaging Primer. You will learn a TON about how this stuff works and that will enable you to avoid many of the classic beginner mistakes.

    BTW you can print a Bahtinov mask on paper, cardboard, or thin plastic and cut one out for yourself. Although a nice 3D-printed one that fits over your optics like a lens cap is a real convenience. There are so many details in this hobby, so much to keep track of and so much to do just right, that little conveniences turn out to be important.

    Welcome, and enjoy the journey.

  6. On 14/02/2022 at 16:37, vlaiv said:

    Maybe simplest explanation would be to show it?

    Stars look lovely but nebulosity is obviously much lower resolution than stars. It almost looks like water color painting rather than actual image with detail suggested by size of stars. That is the sign of heavy use of denoising that softens up image way too much.

     

    Gotcha. What is also referred to as a "plastic" look. I had not seen it in reference to how the stars appear. Thanks! 

  7. And another. I particularly like how I can use my Pi at the scope, or connect a laptop to the scope, or use my Pi at the scope and remote into its Linux desktop, or use the app on my phone to control it...it's quite the adaptable system.

    Astroberry is a good choice if you're a software person; you can also buy the turnkey StellarMate OS as noted above, which gives you the option of using a phone app to check in on the system or control it as it's sequencing.

    It's quite amazing what the software can accomplish now. The Pi 4 has plenty of snort to do plate solving, which enables fun tricks like polar alignment without a view of the northern half of the sky, or automatically zeroing in exactly on your target.

  8. I won't restart the Dark Flats vs. Bias wars here, but will note that for many camera and processing software combinations, one can indeed use a set of "bias" or no-signal frames instead of dark flats. They are shot as darks with the shortest exposure your camera can do consistently.

    There are indeed some CMOS cameras for which bias frames do not work well, but "most"? Citation needed! The only one I'm aware of is the Panasonic sensor in the 1600.

    Really it's a horse apiece for one-shot color cameras; as long as your flats are always done at the same exposure, there's little need to reshoot dark flats. You could make a case that thermal signal is a significant part of the value for dark flats but not for bias. In that case one set of bias frames and one set of darks for each temperature suffices, whereas you'd have to shoot and manage a per-temperature set of dark flats too.

    But flats are also pretty short exposures, so their thermal signal can be safely neglected.

    Mono imagers tend to rant more about bias since we often have a different flats exposure for each filter. In my case for my mono (CMOS!) camera that's one set of bias or seven sets of dark flats. Even I can do that math.

  9. We all indeed have our favorites. I use and recommend Astro Pixel Processor partly because it was recommended to me as a beginner by some very experienced, capable imagers who had used a variety of packages. It guides you through the basics of calibrating, stacking, and stretching, has a really terrific light-pollution/gradient tool that's simple to use and pretty much obviates the need for filters even from my Bortle 8 back yard. And its "happy path", just loading up files and clicking "Integrate", is likely to produce good results with the default settings. And there's support, which is pretty important for any of these programs.

  10. I'm not so sure about the increased guide exposure, once multi-star guiding is working. My experience bears out the theory: the turbulent cells in the atmosphere are actually small enough that your guide stars are almost certainly going to be in different ones. This averages out the seeing effects. It's still possible to go too short, of course, and a too-dim noisy image will challenge the algorithms. But one should definitely experiment with one's own setup; rules of thumb from before multi-star guiding was available may not apply.

    Now, sound from within the dwelling vibrating the window as a unit...OK, I'll grant you that this sitch might not be exactly canon!
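    The averaging argument is easy to demonstrate numerically: the centroid of N independently-jittered guide stars scatters about sqrt(N) less than a single star does. A toy simulation (units, star counts, and trial counts are all arbitrary):

```python
import random
import statistics

random.seed(42)

def guide_error(n_stars, sigma=1.0, trials=2000):
    """RMS of the mean measured position of n_stars, each displaced
    independently by seeing with standard deviation sigma."""
    means = [statistics.fmean(random.gauss(0, sigma) for _ in range(n_stars))
             for _ in range(trials)]
    return statistics.pstdev(means)

one = guide_error(1)
ten = guide_error(10)
# Averaging ten independent stars should cut the jitter by about sqrt(10):
print(round(one / ten, 1))
```

    The assumption doing the work is independence -- which holds when the stars sit in different turbulent cells, exactly as described above.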

  11. Ugh, I can't find that handy image that shows which way the stars point when you're too close or too far away from your flattener. I think the stars do a pincushion thing (streaks oriented radially) when you're too close and a barrel thing (streaks oriented around circumference of image) with too much spacing, but could be the other way 'round, too.

    Probably not your issue, but it's worth surfacing now and then for other folks who're trying to diagnose problems in image trains including a flattener.

  12. The very best way is to experiment. Fortunately this need only take a few minutes. Stop down the lens a bit, take a series of progressively-longer exposures, and examine each histogram as it appears. When the big peak of the histogram clears the left edge of the graph, stop -- that's long enough. You want your exposure to be long enough that you're not clipping any of the image to black (including the sky background), but otherwise as short as practical, to avoid problems with star trailing, satellites, airplanes, etc. Stacking integrates total exposure time, which is much more important than the length of a single sub-exposure.

    I've seen advice to put the top of the big histogram peak at 1/4 scale, 1/3, even 1/2, but all you really need is a little bit of flat no-signal space between its left edge and the edge of the graph. Too long a sub-exposure time (or too high an ISO) can give you the opposite problem, white clipping at the right side of the graph. Since stars are the brightest thing in astro images, that means your stars will lack color.
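    That left-edge check is easy to automate if your capture software exposes the histogram. A sketch, with an arbitrary five-bin "guard" zone at each edge:

```python
def clipping_report(histogram, guard=5):
    """Given a 256-bin luminance histogram, report whether the exposure
    clips to black or white. 'guard' is how many empty bins we'd like to
    see at each edge -- an arbitrary choice, not a standard."""
    black_clipped = any(histogram[:guard])
    white_clipped = any(histogram[-guard:])
    return black_clipped, white_clipped

# A toy histogram: big sky peak well clear of the left edge, nothing at the right
hist = [0] * 256
for i, count in [(40, 900), (41, 1200), (42, 800), (200, 5)]:
    hist[i] = count
print(clipping_report(hist))  # (False, False) -- exposure is fine
```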

    Shorter sub-exposures mean you need to shoot and stack more of them to achieve a given total integration time, which can impose practical limits -- nobody wants to manage 3,600 sub-exposures if they can avoid it!

    As for total integration time, shoot as much as you can stand to image and process. For deep-sky objects, more is really better, although you hit diminishing returns pretty quickly. But think hours, not seconds or minutes.

    Likewise, as drjolo recommends, have a squint at the stars to determine whether the f-stop suffices for your needs.

  13. Sorry, I know nothing about the camera per se. But I can at least offer you some guidance on the concepts.

    The tradeoff between high and low gain is between shorter exposures to achieve a given SNR, and "top-end" range. If you want color in stars and other bright areas of the image (e.g. a galaxy core or the Trapezium in M42), your exposures have to be short enough not to max out those values. High gain is rarely needed for one-shot color cameras; you can always make up the low end with more subs to yield more integration time.
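    The tradeoff is quantifiable: in the shot-noise-limited regime, stacking N subs improves SNR by sqrt(N), so the number of subs needed scales with the square of the shortfall. A sketch (the target and per-sub SNR figures are made up):

```python
import math

def subs_needed(target_snr, snr_per_sub):
    """Stacking N subs improves SNR by sqrt(N) when shot-noise-limited,
    so N = (target / per-sub)**2, rounded up."""
    return math.ceil((target_snr / snr_per_sub) ** 2)

# Halving the per-sub SNR (e.g. by running a lower gain) quadruples the sub count:
print(subs_needed(50, 5), subs_needed(50, 2.5))  # 100 400
```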

    From Altair's documentation, it looks as if HCG is just an extra multiplier that can be applied to gain values; how you set it in the application you're using may vary.

    Although higher gains paradoxically yield lower read noise, the effect is small compared to the loss of highlight detail; many people use "unity" gain so that one electron on the sensor means incrementing the output pixel value by one. That appears to be LCG and 200 gain for the 269, but I could be wrong.

    Offset is a means of protecting the low end from various kinds of noise. Since the sensor can't read anything less than zero electrons, and noise can affect the signal in either direction, in some cases a non-zero input plus the noise "clips" to a zero value, since negative ones aren't possible. So an offset adds a "cushion" to every pixel, below which signal+noise should always be > 0. Of course, you lose a bit of range this way too. I suppose you could call it the black point but it's really an entirely different concept; it has the effect of setting the black point below zero.
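    The zero-clipping effect is easy to simulate: clip a faint, noisy signal at zero and its average reads falsely high, while the same signal riding on an offset keeps its true mean. A sketch with made-up ADU numbers:

```python
import random
import statistics

random.seed(1)

def mean_after_clip(signal, read_noise, offset, n=20000):
    """Average ADU after the camera clips negative values to zero."""
    samples = (max(0, signal + offset + random.gauss(0, read_noise))
               for _ in range(n))
    return statistics.fmean(samples)

# A faint 2 ADU signal buried in 5 ADU of read noise:
no_offset = mean_after_clip(2, 5, 0)     # clipping biases the faint signal upward
with_offset = mean_after_clip(2, 5, 50)  # the 50 ADU cushion keeps noise symmetric
print(round(no_offset, 1), round(with_offset - 50, 1))
```

    With no offset the clipped mean comes out noticeably above 2 ADU; with the cushion, subtracting the offset recovers the true signal.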

  14. I'm with Olly on this one. You've got terrific optics in that 135; going for more reach may be very frustrating from the start. It is best to start with the simplest thing that could possibly work, and get your whole workflow going. This includes polar alignment, focusing, getting calibration frames, and very importantly, processing.

    Is that what I did? <Hollow laughter> No, no it was not. I spent a lot of time dinking around with all the complex bits wishing I'd started more simply.

  15. If you're not above a bit of wiring you could do worse than buy a LiFePO4 battery with a built-in battery-management system. Lithium iron phosphate chemistry has significant advantages, especially in number of charge cycles (2X to 10X Li-Ion) and, if you care, much less environmental impact. You can discharge them till they're flat and they come back for more; Li-Ion is much more sensitive in that regard. (Technically it's the BMS that keeps you from killing them, but it's built in.)

    I use a 20 Ah ExpertPower and it runs my Pi 4, focus motor, CEM70, and TEC cooler in the 183 for at least a couple of nights. Was something like $265. I added some cords I'd cut off power adapters ("wall-warts") that had the appropriate 5.5 x 1.2 and 5.5 x 1.5 barrel connectors, and also wired a cheap fuse holder and a little LCD multimeter into the circuit. Was well within my exceedingly modest DIY abilities.

    If you're running a dew heater you'd likely want a bit bigger battery.
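    Sizing the battery is simple arithmetic: usable energy divided by average draw. A sketch -- the 25 W load and 90% usable fraction here are assumptions for illustration, not measurements of my rig:

```python
def runtime_hours(capacity_ah, voltage=12.0, load_w=25.0, usable=0.9):
    """Rough runtime estimate: usable energy (Wh) over average draw (W).
    Load and usable-fraction defaults are illustrative assumptions."""
    return capacity_ah * voltage * usable / load_w

print(round(runtime_hours(20), 1))  # 8.6 hours at a steady 25 W
```

    Add a dew heater's draw to load_w and the runtime drops accordingly, which is why I'd size up.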

  16. It sounds as if computer control of your mount and camera is your first objective (though I certainly wouldn't argue with Olly that autoguiding is a big step up!). You are correct that one USB connection goes to the mount, and another to your camera (a third goes to the guide camera if you add that).

    You can go the scope-side attached route (StellarMate, ASIAir, Intel NUC) or use a laptop directly connected. Each has advantages and disadvantages. I am a huge fan of the former route, because I often image in the field on battery power.

    If you want to minimize tinkering, a prepackaged appliance like StellarMate or ASIAir is likely to be most satisfying for you. In either case, the hardware and software are already set up; just follow the directions to configure it with your particular equipment and you're laughing. I'm not saying that downloading ASCOM, installing a set of astro software, and getting device drivers working with it is hard, just that those are extra steps which are unneeded with an appliance. (Likewise for Linux and KStars/Ekos/INDI, if you're using something other than Windows.) You'll need another device (e.g. laptop or phone) to connect to and drive the software (although with the StellarMate you can actually plug an HDMI TV or monitor, mouse, and keyboard right into the appliance and use it like any other computer).

    If you would  like to save some money and don't mind assembling things, you can buy a Raspberry Pi 4, a case, a MicroSD card, and the StellarMate OS software for less than US$150. You download the software, burn it to the MicroSD card, plug that into the Pi, and you have essentially the same appliance that they sell for $229.

    If you prefer to install software on an existing laptop, you'll have to pick a package and follow their instructions. Popular ones include APT, NINA, and KStars/Ekos/INDI; there are certainly others too. I would recommend something more integrated, such as NINA, rather than trying to put together a bunch of separate programs.

    The least frustrating and simplest way to automatically point your scope involves "plate solving", which is software that can examine an image and determine its exact stellar coordinates. Again, StellarMate and ASIAir have this built in; little to no fiddling is required. Once you've told the software your camera's approximate field of view, when you issue a "goto" command it takes a stab at it by rotating the mount to what it thinks is the right position, shoots and analyzes an image, nudges the mount, and repeats until the target is exactly centered (and I do mean exactly).
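    The solve-and-nudge loop converges quickly because each iteration removes most of the remaining pointing error. A toy model of just the feedback logic -- no real mount or solver API here, and all the numbers are made up:

```python
def center_target(target, start, nudge_gain=0.8, tolerance=0.01):
    """Simulated goto refinement: each 'plate solve' reports the current
    pointing error (degrees), and each nudge removes nudge_gain of it.
    Returns the final pointing and the number of nudges taken."""
    pointing = start
    nudges = 0
    while abs(target - pointing) > tolerance:
        error = target - pointing       # what a plate solve would report
        pointing += nudge_gain * error  # an imperfect mount correction
        nudges += 1
    return pointing, nudges

# Start 2.5 degrees off-target and watch it converge:
final, n = center_target(target=10.0, start=12.5)
print(n, round(abs(10.0 - final), 4))
```

    Even with each nudge only 80% effective, a handful of iterations lands within a hundredth of a degree -- which is why plate-solved gotos feel so exact in practice.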

    The same software stack can also be used for polar alignment. My rig doesn't even have a polar scope or finder that I can look through. Once I have the tripod legs adjusted to level the mount, I'm all done with the crouching!

     
