Everything posted by wimvb

  1. FYI, if you load a sub and have it as the active window, you can use that to get the astrometric solution (get from image). You then apply the process to the integrated image. But do use the sub that was the reference during image registration. If you plate solve the integrated image, you can still get the data from (any) sub and use that as a starting point.
  2. At about this point in the discussion, it seems appropriate to suggest your next investments in AP: https://www.firstlightoptics.com/books/making-every-photon-count-steve-richards.html and https://www.firstlightoptics.com/books/dark-art-or-magic-bullet-steve-richards.html
  3. I don't think you need to crop that much. There are other methods to remove or reduce the effect. Do you see the reflections here? https://www.astrobin.com/vouq78/ The image was cropped to remove the worst part. Then I used desaturation to hide the rest. Btw, I noticed the same reflection effect while watching a Netflix series tonight. There it was definitely a bright light near the camera that caused it.
  4. The pattern can be caused by reflections off chip edges, bonding wires or what have you, within the camera and close to the sensor. Basically any light that can make its way into the sensor chamber. This was suggested by Adam Block in one of his YouTube videos.
  5. Agreed, if it's not on all subs, then it's most likely from a street light. Adam Block has a YouTube video where he shows a way to remove the effect without having to toss the affected subs. The method assumes you use PixInsight.
  6. Reflected light from a bright star outside the fov. The reflection extends all the way to the galaxy.
  7. I believe it was @ollypenrice who once said that filters are not about what they pass through, but rather what they block. A nb filter with higher OD will be better at blocking light pollution, and give you more contrast. Filters with high OD are more difficult to make, and are more expensive.
  8. The OD scale is not linear; the fraction of light transmitted outside the pass band is 10^-OD. This means that if OD = 0, all light is passed; if OD = 1, 10% of the light is passed and 90% is blocked; at OD = 2, 1% is passed and 99% is blocked; at OD = 3, 0.1% is passed and 99.9% is blocked, etc. OD is a measure of how much light outside the pass band is blocked. A good (high OD) nb filter will block much of the light outside the transmission band, and yield better contrast than a poor one (low OD). (There is a short worked example of this arithmetic after this list.)
  9. If you plan to use the original asi120 and not the mini version, make sure that it will fit. On my scope, I had trouble trying to fit the asi120 between the focuser and the filterwheel. I bought the asi290 mini, and it just works. You can use it with the standard zwo oag. The 290 is more sensitive than the 120. So far, I've had no trouble finding guide stars in the fov. Multi star guiding doesn't always work, though.
  10. Nice image. Have you tried some of the HDR processes available in PI? HDRMT, LHE, MMT. These can be used to adjust local contrast and still reveal some IFN.
  11. Very nice images, both. Re star formation, do you plan to add Ha at some point? This will very likely add another dimension to this fine image.
  12. Great! I think you got it, but another 3 hours or more to confirm it would be nice of course.
  13. Congratulations, winners. Well deserved.
  14. I believe that dither settings are under the "Guiding" tab. The Minimum Move setting here is only for guiding, not for dithering.
  15. I remember you writing about problems with the ZWO filter wheel some time ago. So, you never got that resolved? That's too bad, life's too short for that kind of problem.
  16. How do you plan your sequence? Because the weather can be somewhat unpredictable here (clear but high cloud, or just plain wool-blanket clouds), I always try to collect all three channels per night, and I usually have a repeating sequence of 10 x 4-minute exposures per channel plus 15 x 3 minutes for L. On good nights with low FWHM, I only collect L. I stretch the RGB image much less than the L image, usually only an arcsinh stretch followed by curves to lift the lower parts (a minimal sketch of such a stretch follows after this list). I try to keep the curve linear above the 0.5 point, so as not to overstretch the stars. I also have the advantage of a reflector, with virtually no chromatic aberration. (I still have some difference in FWHM between channels, but I suspect this comes from the atmosphere.) Happens to me too sometimes. I weed the stars in the PSF process with a blunt tool: I only keep stars that have a Moffat profile (excluding Moffat4 and above, as well as Gaussian profiles), an amplitude between 0.1 and 0.9, and a Mean Absolute Difference that does not vary too much.
  17. For deconvolution to work, you need an image with strong signal and low noise. I suspect that I live in an area with less light pollution than you, because I seldom need to use noise reduction on the L master. I also never use deconvolution on an RGB image, only on L. If I only have RGB data, I either extract luminance from that, or I combine the colour masters to create a synthetic L to work on (see the sketch after this list).
  18. Thanks. Being almost a quarter the price of an Esprit, the Mesu can handle four MN190s. 😁 That would be a beast.
  19. I suspect that cone error and polar misalignment are the main sources of field rotation in combination with a meridian flip, when imaging near the polar region. During frame alignment/plate solving after a meridian flip, you compensate for a rotation about one axis (the axis of cone error, or alt/az) by a rotation about two other axes (RA and dec). The best you can do is to check how much cone error and polar misalignment you have, try to minimise it, and crop the stacked image to get rid of what's left of the field rotation. It's annoying, if you have a small sensor and/or a long focal length, to have to cut away a large portion of an image, but there's only so much you can do about it. As for PHD calibration: unless you use ST4 guiding, you can and should calibrate away from the polar region. Do this near the meridian and near the celestial equator. If you use EQMOD, PHD will know where the scope is pointing and adjust the guiding accordingly. The TS 8" isn't exactly a small scope, and you should definitely consider mounting it with a Losmandy plate rather than a Vixen. The HEQ5 should be able to carry the weight, but your setup is sensitive to any mechanical instabilities. If you don't have the means to upgrade the mount, consider upgrading the dovetail and puck.
  20. Thank you both for the feedback. I highly appreciate it. I find that the addition of nb (Ha) is too often overlooked in galaxy images. Especially interacting galaxies, but also single galaxies, have regions where hydrogen clouds collapse and form new stars. When I did the research for this image, I saw that no image of this galaxy pair is complete without the inclusion of Ha. While possibly overdone, this image makes a statement and shows the physical processes that are going on here. I also wanted to highlight the fact that the interaction between the galaxies causes the hydrogen clouds to be more concentrated where that interaction is strongest. So, some "overcooking" was intentional. But, like @Rodd, I'm constantly aware that how the final image looks depends very much on the device that is used for viewing. To me this image looks very different on my tablet than it does on my computer screen. At the moment, I'm reprocessing this image with the intent to print it, which is an entirely new challenge.
  21. Now you make me blush. 😊 Great image, Rodd. And amazing that you get this crisp detail without any deconvolution.
  22. To decrease the effect of walking noise / outlier pixels, apply cosmetic correction on the calibrated images prior to integration in PI. CosmeticCorrection is under the "Image Calibration" section of processes. Tick the check boxes "Use Auto Detect" and "Hot Sigma", and lower the hot pixel rejection slider to 2. Save an instance of CC to the workspace and use it during the BatchPreprocessing script. I don't know if it is available in WBPP. (A conceptual sketch of what this does is included after this list.) Sorry, I don't see it in a single sub Ha OIII
  23. If dithering works, you should see a hot pixel distribution similar to this. If the hot pixels are along a line, you can expect walking noise. (A toy illustration of this follows below.)
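A few quick sketches to go with the posts above. First, the OD arithmetic from post 8; the loop below just restates the percentages from that post, and nothing in it is specific to any particular filter.

```python
# The fraction of out-of-band light a filter transmits is 10**(-OD).
for od in range(0, 4):
    passed = 10.0 ** (-od)      # fraction of out-of-band light that gets through
    blocked = 1.0 - passed      # fraction that is blocked
    print(f"OD = {od}: {passed * 100:g}% passed, {blocked * 100:g}% blocked")
```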
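Next, a minimal sketch of the arcsinh stretch mentioned in post 16, assuming an image already normalised to [0, 1]. The stretch factor and the simple per-pixel form are assumptions for illustration; PixInsight's ArcsinhStretch also protects colour, which is not shown here.

```python
import numpy as np

def arcsinh_stretch(img, stretch=50.0):
    """Gentle non-linear stretch; img is a float array scaled to [0, 1]."""
    # asinh is nearly linear for small values and compresses the highlights,
    # so faint signal is lifted without blowing out the stars.
    return np.arcsinh(stretch * img) / np.arcsinh(stretch)
```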
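The synthetic luminance mentioned in post 17 is, in essence, just a weighted combination of the colour masters. The equal weights below are an assumption for illustration; in PixInsight the same thing is usually done with PixelMath or ImageIntegration.

```python
import numpy as np

def synthetic_luminance(r, g, b, weights=(1/3, 1/3, 1/3)):
    """Combine the R, G and B masters into a synthetic L for deconvolution/denoising."""
    wr, wg, wb = weights
    return wr * r + wg * g + wb * b
```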
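For post 22, a conceptual sketch of what a hot-sigma style cosmetic correction does: flag pixels that sit well above their local neighbourhood and replace them with the local median. This is only an illustration of the idea, not PixInsight's actual CosmeticCorrection implementation; the 5x5 box and the crude global noise estimate are assumptions.

```python
import numpy as np
from scipy.ndimage import median_filter

def cosmetic_correction(img, hot_sigma=2.0, box=5):
    """Replace pixels more than hot_sigma above their local median."""
    local_median = median_filter(img, size=box)
    residual = img - local_median
    sigma = residual.std()                  # crude global noise estimate
    hot = residual > hot_sigma * sigma      # hot pixel mask
    cleaned = img.copy()
    cleaned[hot] = local_median[hot]
    return cleaned
```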
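Finally, a toy illustration of the walking noise point in post 23. A hot pixel sits at a fixed sensor coordinate, so after registration it ends up at the sensor position minus the frame offset. With pure drift the offsets, and therefore the hot pixel positions, fall along a line (walking noise); with dithering they scatter. The drift and dither amounts below are invented numbers purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subs = 30

# Slow linear drift between subs (in pixels), and the same drift plus random dithers.
drift = np.stack([np.linspace(0, 10, n_subs), np.linspace(0, 4, n_subs)], axis=1)
dithered = drift + rng.uniform(-3, 3, size=(n_subs, 2))

hot_pixel = np.array([100.0, 100.0])   # fixed position on the sensor

# Positions of the hot pixel in the registered frames:
print("drift only (a straight line):", (hot_pixel - drift)[:3])
print("with dithering (scattered):  ", (hot_pixel - dithered)[:3])
```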