Everything posted by wimvb

  1. wimvb

    M16

    Nice catch, Rodd. Is this cropped?
  2. Very nice widefield. Keeping better star colours with short exposures makes sense. Once you start to overexpose stars, all 3 channels will be maxed out = white. If the camera has enough dynamic range, it may be better to keep the exposure time short. But you risk losing faint detail.
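    To illustrate the point about overexposed stars: once any channel reaches the sensor's full well, the R:G:B ratios collapse and the star renders white. A minimal sketch, with made-up 16-bit counts for an orange-ish star core (not measured values):

    ```python
    # Why overexposed stars lose colour: all three channels clip to white.
    # The counts below are hypothetical 16-bit sensor values, for illustration.
    FULL_WELL = 65535  # saturation level of a 16-bit ADC

    def exposed(rgb, exposure_scale):
        """Scale the RGB signal by exposure and clip at the sensor's full well."""
        return tuple(min(int(c * exposure_scale), FULL_WELL) for c in rgb)

    star = (30000, 20000, 10000)  # R > G > B: the star looks orange

    short = exposed(star, 1.0)  # ratios intact, colour preserved
    long = exposed(star, 4.0)   # all three channels hit 65535 -> pure white

    print(short)  # (30000, 20000, 10000)
    print(long)   # (65535, 65535, 65535)
    ```

    Shorter subs keep the star cores below full well, which is why they retain colour, at the cost of the faint signal mentioned above.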
  3. From what I could find, the Orion has a small sensor with large pixels. With your 200PDS you are undersampled at roughly 1.7 arcsec/pixel. OK for nebulae, but imo too coarse for galaxies. At the same time, you probably won't fit nebulae on that small sensor. What ZWO camera are you considering? On my wish list it would be:
     1. ZWO cooled
     2. DSLR (new or second hand)
     3. ORION G4
     But as @Skipper Billy wrote: use a field of view calculator when evaluating various candidates.
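    The calculation behind a field-of-view calculator is simple enough to sketch. The 8.3 µm pixel size below is an assumption chosen to reproduce the ~1.7"/px figure for a 1000 mm focal length 200PDS; check the actual datasheet for any camera you evaluate:

    ```python
    # Sketch of the arcsec/pixel and field-of-view maths a FoV calculator uses.
    # Pixel size (8.3 um) is an assumed value, not from a specific camera datasheet.

    def image_scale(pixel_size_um, focal_length_mm):
        """Image scale in arcseconds per pixel: 206.265 * pixel size / focal length."""
        return 206.265 * pixel_size_um / focal_length_mm

    def fov_arcmin(n_pixels, pixel_size_um, focal_length_mm):
        """Field of view along one sensor axis, in arcminutes."""
        return n_pixels * image_scale(pixel_size_um, focal_length_mm) / 60

    # SW 200PDS: 1000 mm focal length
    print(round(image_scale(8.3, 1000), 2))  # ~1.71 "/px: coarse for galaxies
    ```

    A common rule of thumb is to aim for roughly a third to a half of your typical seeing per pixel, which is why ~1.7"/px counts as undersampled for small galaxies.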
  4. I've now made an MQTT version of the sky quality meter. In this version, the device no longer creates a webpage with the SQM readings, but publishes the timestamp and SQM readings to an MQTT broker. The code is in the github link provided in the original post. Here's what needs to be done:
     1. Install an MQTT broker. I installed Mosquitto on a Raspberry Pi I had lying around. Here's a good resource: https://randomnerdtutorials.com/how-to-install-mosquitto-broker-on-raspberry-pi/
     2. Install an MQTT client. For this I use MQTT.fx ( https://mqttfx.jensd.de/ ) on my Windows computer. There are several clients available, including Mosquitto.
     3. Get a simple MQTT client for the esp32: https://randomnerdtutorials.com/micropython-mqtt-esp32-esp8266/
     This version of the sky quality meter is not (yet?) compatible with INDI; you would need an INDI MQTT driver for that.
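    For anyone curious what the publish side looks like, here is a rough sketch. The topic name, broker address, and the `read_sqm()` stub are my assumptions for illustration, not the actual code from the github repo linked above:

    ```python
    # Sketch: bundle a timestamp + SQM reading and publish it over MQTT.
    # Topic, broker address, and read_sqm() are illustrative assumptions.
    import json

    def build_payload(timestamp, sqm_mag):
        """Bundle a timestamp and SQM reading (mag/arcsec^2) as a JSON string."""
        return json.dumps({"timestamp": timestamp, "sqm": round(sqm_mag, 2)})

    def read_sqm():
        # Placeholder: on the ESP32 this would read the light sensor and
        # convert the result to magnitudes per square arcsecond.
        return 21.35

    payload = build_payload("2019-11-02T22:15:00", read_sqm())
    print(payload)

    # On the ESP32 under MicroPython, the publish step would look roughly like:
    #   from umqtt.simple import MQTTClient
    #   client = MQTTClient("sqm-esp32", "192.168.1.10")  # assumed broker IP
    #   client.connect()
    #   client.publish(b"skyquality/reading", payload.encode())
    ```

    Any MQTT client subscribed to the same topic (MQTT.fx, mosquitto_sub, etc.) would then receive each reading as it is published.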
  5. Just remember that with a CMOS camera, you should never scale darks. Leave that option unchecked in the calibration process. You can, of course, create a dark library on a cloudy night in a darkened room. Less risk of light leaks that way.
  6. What's even more impressive is that the zwo filter wheel and oag fit in this scheme. No extra adapters needed, just attach one to the other. They seem to have thought this through.
  7. Some cameras have good enough sensitivity in the red part of the spectrum to capture H-alpha, even without removal of the ir filter. What camera do you use?
  8. Then you should try 200 frames, still just over one hour. Image quality (measured as signal-to-noise ratio) is determined by total integration time = number of frames × exposure time. If you want to catch more of the weak parts, you will need to increase the exposure time. And take flats of course. When I imaged with my Pentax DSLR + 135 mm lens, I took flats by pointing my camera at a door that was evenly lit. Even though the door was blueish, the flats worked. Just make sure no shadow is in the field of view. This method may leave some gradient, but will help you get rid of dust bunnies. Maybe useful until you get that lightbox.
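    The rule of thumb above can be put in numbers: for sky-limited subs with independent noise, averaging N frames improves SNR by √N, so what matters is total integration time. A small sketch with illustrative (not measured) values:

    ```python
    # Stacking and SNR: averaging N independent, identical subs gains sqrt(N).
    # The single-sub SNR of 5.0 is an illustrative number, not a measurement.
    import math

    def stack_snr(snr_single_sub, n_frames):
        """SNR of an averaged stack of n noise-independent, identical subs."""
        return snr_single_sub * math.sqrt(n_frames)

    # 100 subs vs 200 subs at the same exposure: doubling the frame count
    # buys a factor sqrt(2), i.e. about 41% more SNR.
    print(round(stack_snr(5.0, 100), 1))  # 50.0
    print(round(stack_snr(5.0, 200), 1))  # 70.7
    ```

    This is also why longer subs help with the faint parts: signal grows linearly with exposure while the faintest detail has to clear the read-noise floor in each individual sub.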
  9. And for good measure you can also wrap the front of the camera (with cap on) in a double layer of aluminium foil. Plastic caps are not light proof; infrared light in particular can make it through the plastic.
  10. Where's the fun in that? For me, processing raw data is where the challenge is in AP. Data capture is very much a mechanical activity, while processing is the artistic part of our hobby. The problem with this set of scripts is that you hand over control almost completely. One of the advantages of PixInsight is that it lets you take control of all the details in post processing. Daunting at first, but powerful once you understand. Besides, every image has its own challenges, which I don't think you can solve by applying a single script. Some of the shortcomings of the scripts shown in the video were the cropping of all four edges equally (that would have clipped your M42), leaving a stacking artefact on the right-hand side of the image, the green cast that could have been removed much more efficiently without creating a purple/magenta background, etc. For a beginner I can recommend the videos by Richard Bloch (just Google "Richard Bloch PixInsight"), and Kayron Mercieca (lightvortexastronomy.com), as well as our in-house guru Harry Page.
  11. I put a little more work into it: noise reduction, colour contrast, and star repair.
  12. I had a look at it myself, but unfortunately none of the vendors ship to Sweden. Over here, these usually are a lot more expensive. Btw, I can only recommend PixInsight, it has a different philosophy than other apps, but it has everything included. No need to get plugins or addons. It's well worth the money.
  13. First impression: great image. Unfortunately it still suffers from the dust bunny in the top half. Simple stretch and some HDR compression in PixInsight:
  14. Here's how I take flats. https://www.astrobin.com/326547/?nc=user IKEA Floalt + embroidery hoop with 4 layers of white cotton. The Floalt has intensity and light temperature control. Together with the automatic exposure control in Ekos/Kstars, it produces good flats.
  15. It depends. There is a thread in the imaging section about these. The consensus seems to be that if the leds are situated at only one side, illumination will be uneven, and it won't work. But if illumination is even across the surface, it will work if you cover the panel with white paper. Have a look here
  16. I have to agree. For deconvolution to work, you need to have a good signal-to-noise ratio. It can work without a mask, but it's a lot easier with one. If you increase the regularisation (number of layers and strength per layer) you can push deconvolution to target stars only. This is one way to reduce stars before stretching. On this image, I would mainly use it to do just that. More data under more favourable conditions is the best way to improve the galaxy.
  17. +1 This galaxy is usually the target of large reflectors, not refractors. Excellent result, and I admire the commitment.
  18. Good catch under those conditions. You may be able to squeeze a little more out of your data with deconvolution and star reduction.
  19. Congrats on your success. Incidentally, ngc 185 was also one of my first deep sky objects with computer controlled targeting + astro camera. It may not be the fanciest target out there, but it's one you have to do just because you can, imo.
  20. Nice! The inverted image shows more depth (3D) on my screen than the regular one. Probably due to the deeper shadows.
  21. Here's the image with only a crop and a basic stretch applied. The vignetting and dust bunny are obvious. DBE removes the vignetting, which makes the dust bunny even more obvious, as it becomes the darkest area in the image. And as received (stretched):
  22. Dynamic Background Equalisation: a tool in Pixinsight to remove gradients in the image. The shadow of dust near the camera sensor. This will cause a dark area in the image. Only flats will help even these out. "no can do". I toned it down as much as possible. It's just below the bright stars above the galaxy. I can post an image where it's still visible.
  23. Here's my attempt in PixInsight:
      Crop
      DBE x2
      Background neutralisation
      Photometric Colour Calibration
      Split luminance
      RGB:
        Star repair script
        Masked stretch
        Further star repair
      Luminance:
        Histogram transformation
        Curves transformation
        HDR transformation
        Curves transformation
      LRGB:
        LRGB combination
        Star reduction
      50% resampled
  24. Downloaded. Some first remarks:
      1. The target is a bit off centre, but it fits the fov very well.
      2. You need to take flats. There is strong vignetting, which, combined with large offsets between subs, creates a gradient that even DBE in PixInsight struggles with. I had to apply two passes of DBE to get a reasonably flat background. There is also a dust bunny in the image that can't be removed.
  25. I timed it just now again, from parked position. I used a previous capture of the Sunflower galaxy. It solved it in 7 seconds, but at the same time ASTAP failed to do the refinement solution with the ccd simulator. My local astrometry on the Rock64 took 13 seconds to solve the image from ccd simulator. I need more clear nights ...