Everything posted by wimvb

  1. True. But zwo can only blame themselves. They decided not to develop a proper driver for linux. Btw, my old asi120 (usb2) works under linux, but zwo don't guarantee it. It's a bit of a gamble. Then it shouldn't be too difficult getting this to work on a StellarMate. Phd2 at least is the same. 😉 The advantage of ekos/kstars and StellarMate is that you have everything in the same package. You only open one piece of software (ok, 2 if you use PHD2 and not the internal guider) to control all your hardware. And the computer out in the field is a lot cheaper than a Wintel based machine.
  2. It better be, they're both ASI (ZWO). 😀 Things would be worse if the guide cam were qhy. StellarMate is also an option, it's without the camera limitation.
  3. If by black box you mean a light tight box to shoot darks, then be careful. Such a box will also trap heat from your camera. Better to wrap the front of your camera in double layers of aluminium foil. Then in a dark room or cabinet/closet, which doesn't trap heat as much. Even a cooled camera will have to work harder to maintain its temperature in a box. Good luck.
  4. Never 😄 As said, each app can handle only one camera at a time. Both should be selectable in each app, but only one can be active in each app at any time.
  5. Increase the number of exposures. The exposure time should be long enough to clear the read noise floor. If your uncalibrated light frames have bands (read pattern), then you probably need longer exposures.
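The "clear the read noise floor" advice above can be put into a quick back-of-the-envelope calculation. A common rule of thumb (my assumption here, not something stated in the post) is that a sub is long enough once the accumulated sky signal per pixel is roughly 10x the read noise squared, so sky shot noise swamps read noise. The numbers below are hypothetical:

```python
def min_exposure_s(read_noise_e, sky_rate_e_per_s, factor=10.0):
    """Shortest exposure (seconds) satisfying sky_rate * t >= factor * RN^2,
    a common 'swamp the read noise' rule of thumb."""
    return factor * read_noise_e ** 2 / sky_rate_e_per_s

# Example with made-up values: 1.6 e- read noise,
# 0.5 e-/pixel/s sky background flux.
t = min_exposure_s(read_noise_e=1.6, sky_rate_e_per_s=0.5)
print(round(t, 1))  # 51.2 s
```

Under a darker sky (lower sky rate) or with narrowband filters the required exposure grows quickly, which matches the advice elsewhere in this thread to expose longer for Ha.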
  6. If you standardise the cooling, gain and exposure time, you can reuse darks. The exact settings depend on your setup and sky conditions. Start with a fixed temperature (I use -20 C) and gain, then experiment with time until you find an exposure time that works. You may need one set for L, one for rgb, and one for narrowband. CCD cameras may not need darks, nor may uncooled cameras, but for cooled cmos they are essential.
  7. Excellent progress. What camera gain do you use? With 3 min exposures, you are probably using unity gain or lower, I guess. The NA nebula is mainly an Ha region, only emitting at that wavelength. Luminance frames won't do you any good on that target. In fact, they can deteriorate your image, especially if you have light pollution. Even small amounts of lp will add noise but no more signal than either an Ha filter or a red filter. Better to use an Ha filter, and unbinned rgb for the stars. For Ha you either need a longer exposure time than for rgb, or use a higher gain setting, and shoot many more frames. In my experience, cmos images improve dramatically if you increase the number of exposures to several tens per channel. The lower corners of the NA image also seem black clipped. You will probably see a general improvement if you lower the black point while stretching (i.e., keep the left side of the histogram slightly above 0).
  8. Did you use darks to calibrate your light frames? If not, then that's the problem. There is a simple test you can do to see if your master flat is doing its job:
     1. Calibrate each flat sub with a master dark flat. (Dark flats can't be created with just a plastic lens cap; these let too much light through.)
     2. Stack the calibrated flats to create a master flat.
     3. Calibrate one raw flat frame with the master dark flat (subtraction) and the master flat (division).
     4. Stretch the calibrated flat frame and the master flat by the same amount. The single flat frame should show neither dust bunnies nor vignetting, while the master flat should show both.
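The flat-calibration test above can be sketched numerically. This is a toy simulation with made-up sensor values (the vignetting profile, pedestal level, and noise figures are all assumptions), just to show why the calibrated flat comes out uniform while the master flat keeps the vignetting:

```python
import numpy as np

rng = np.random.default_rng(1)
h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w]
# Hypothetical vignetting profile: darker toward the corners.
r2 = ((yy - h / 2) ** 2 + (xx - w / 2) ** 2) / (h / 2) ** 2
vignette = 1.0 - 0.3 * r2
pedestal = 100.0  # assumed dark-flat level (bias + dark current)

def shoot_flat():
    """Simulate one raw flat: pedestal + vignetted illumination + noise."""
    return pedestal + 30000.0 * vignette + rng.normal(0, 20, (h, w))

# Steps 1-2: calibrate each flat with the master dark flat, then stack.
master_dark_flat = np.full((h, w), pedestal)
master_flat = np.mean(
    [shoot_flat() - master_dark_flat for _ in range(16)], axis=0
)

# Step 3: calibrate one raw flat (subtract dark flat, divide by master flat,
# normalised so the mean level is preserved).
single = (shoot_flat() - master_dark_flat) / (master_flat / master_flat.mean())

# Step 4: the master flat still shows the gradient; the calibrated flat doesn't.
print(master_flat.max() / master_flat.min())  # strong vignetting remains
print(single.max() / single.min())            # nearly flat
```

If the calibrated single flat still shows structure in a real run, the master dark flat (or the flats themselves) is the thing to suspect.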
  9. Very nice, Göran. And a very short build time. 👍 Btw, for those who are not pixel peepers, two essential details. No shoes allowed inside, and a shoehorn next to the entrance. 😋
  10. I guess you mean this? https://agenaastro.com/zwo-canon-eos-lens-short-adapter-asi1600-efw.html
  11. There is a very simple test you can do to see how much real information there is in an image, such as your L master. Make a copy of the image. Stretch that copy using histogram transform (pixinsight) or levels (ps) by bringing in the white point slider so far that you start showing details in the background. This will clip stars and the main galaxy, but it will reveal anything in the background. You may need to also bring in the black point, but do so without clipping. If you see the remains of dust motes, or a fixed pattern caused by your camera, you've gone too far. But you should definitely see the small satellite galaxy Holmberg IX, and maybe also Arp's loop before this. It may be easier to see those details if you first invert the image, in which case of course, you move in the black point slider. This test will help you once you start working on the original image. You can keep it as a reference to do side by side comparisons.
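The white-point test described above is just an aggressive linear levels adjustment. A minimal sketch (the numbers are illustrative, not from any real image): pulling the white point far in rescales the faint background across the full display range, while stars and the galaxy core clip to white.

```python
import numpy as np

def reveal_background(img, white=0.05, black=0.0):
    """Crude levels adjustment: bring the white point in to `white` so faint
    background detail becomes visible. Bright pixels clip to 1.0."""
    out = (img - black) / (white - black)
    return np.clip(out, 0.0, 1.0)

# Toy linear image: sky at 0.01, a faint feature at 0.02, a star at 0.9.
img = np.array([[0.01, 0.02],
                [0.90, 0.01]])
stretched = reveal_background(img, white=0.05)
print(stretched)  # faint feature now clearly brighter than the sky
```

This is only for inspection, as the post says: the clipped copy is a reference for side by side comparisons, not something to keep in the processing chain.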
  12. Great image. Not necessarily. Just process the L to the best of your ability and blend it into the colour (lrgb combination). If you want more L, you can combine the r, g, and b masters into a synthetic luminance. Then combine that with the captured L master. It may improve the overall quality of the luminance, but I'm not convinced that you need it.
  13. Just a matter of access. I could get a router cheap at the time. A switch would also do.
  14. That would cost you one usb port. I opted for an extra router, to which I connect the sbc's (two actually) and my laptop. Since I have my setup a bit away from my house, and too many obstructions, I found wifi too slow and error prone. When I used dongles, I could lose my wifi connection. So I got myself a router that I set up with its own little network. I wire the sbc's and laptop to it. No internet, but that isn't needed during imaging; I platesolve offline. Ansvr (astrometry.net server) on windows, or built-in astrometry on the rock64.

      The reason that I got two sbc's is that it turned out I was one of the first to use a 64 bit Rock64 with INDI, and it took a while to get everything working. For some reason, having two zwo cameras on one sbc didn't work. My imaging camera and my guide camera are both ASI. But distributing INDI is easy, and gave me redundancy. PHD wasn't available in 64 bit on Linux, but Patrick Chevalley (who maintains phd on linux) added the arm64 port to the repository.

      When you use free, open software, you have to accept that it may not be fully developed, and that you may have to figure out things as you go along. But it's great to be part of that community and help in its development, even if only through testing and sharing experiences. A few weeks ago I moved from Ubuntu 16.04 to 18.04, and installed everything from scratch (Ubuntu, mate, INDI, ekos/kstars, phd, and astrometry). Unlike the very first time, it was easy this time, and everything worked immediately. Also a sign that the technology is becoming mature.
  15. Rock64 from Pine64.org. It has more memory than Raspberry Pi (1, 2 or 4 GB), usb3, support for sd cards up to at least 128 GB, but no built in wifi.
  16. Me too. But I do prefer phd over the built in guider. 2 programs going while I do my thing. I have had kstars on my windows laptop crash, but I could relate that to how long that computer had been running since last start. If I reboot my laptop before I start kstars, it doesn't crash. Probably it has something to do with windows memory management or cache usage. Coming season I plan to run everything from my sbc, including plate solving. I'll only use windows rdp on my laptop.
  17. The last time you were able to complain about Starlink, I think he means. I haven't kept track, it won't be dark enough for another two months.
  18. I see it as a sign that INDI is becoming a mature alternative to ASCOM. And even with several commercial products coming to market, you still have the option to buy a Raspberry Pi and install free alternatives, such as ekos/kstars, astroberry, CdC and CCDCiel, etc.
  19. Very similar software, but different hardware. The extras are in the HAT that Atik developed. Probably also custom drivers to control that hardware. Not necessarily so. Zwo have the same concept, but locked out non-zwo cameras, afaik. So it depends on how open the Atik software is.
  20. If you examine the specs and images of this product, you'll discover that Atik put some thought and engineering into it. 12 V power input, so no need for 5V, plus 12 V outputs to power stuff. ST4 guide port, built in focuser interface. A Raspberry Pi with a custom HAT, based on StellarMate, and codeveloped with Ikarus (read, Jasem Mutlaq). All that should be worth the higher price. Imo, it is competitively priced when compared to similar products. The real question is, will it support other cameras, or only Atik?
  21. I'd LOVE to see it, but hate to lug it.
  22. Very nice image. But as already noted, a bit overstretched. In LRGB imaging I usually apply a very mild arcsinh stretch, followed by a masked stretch to the RGB data, as per Barry Wilson's recipe: https://barrywilson.smugmug.com/PixInsight-Tutorials/Boosting-star-colour-Repaired-HSV-Separation The luminance image needs to be held down in contrast. Areas with too much brightness will result in washed out colours. Barry Wilson has a good tutorial on this as well: https://barrywilson.smugmug.com/PixInsight-Tutorials (LRGB workflow) I sometimes use a shortcut: After processing the L image, I decrease the white point with the curves tool (curvestransformation), down to below 90%. This will bring back colour in the stars and the brighter areas of the main target.
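For reference, a mild arcsinh stretch like the one mentioned above has a simple closed form. This is one common normalised variant (my assumption; PixInsight's ArcsinhStretch tool has more controls than this): it lifts faint signal strongly while compressing highlights, and because the same factor is applied to every channel it preserves colour ratios better than a per-channel histogram stretch.

```python
import numpy as np

def arcsinh_stretch(img, strength=5.0):
    """Normalised arcsinh stretch on linear data in [0, 1].
    Maps 0 -> 0 and 1 -> 1; faint values are boosted, highlights compressed.
    `strength` is a hypothetical tuning parameter (higher = harder stretch)."""
    return np.arcsinh(strength * img) / np.arcsinh(strength)

x = np.array([0.0, 0.01, 0.1, 1.0])
print(arcsinh_stretch(x))  # faint values lifted, endpoints unchanged
```

For a "very mild" stretch as in the post, a small strength value (low single digits) keeps the result close to linear.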
  23. Kappa sigma pixel rejection generally does a good job if you increase the rejected fraction. PixInsight has large scale pixel rejection that works extremely well.
  24. I just process the lum with all the bells and whistles. Then, before lrgb combination, I apply a curves transformation with the white point pulled down to about 0.85, but otherwise linear. The L to be used in lrgb should have values similar to the L that you can extract from the rgb image before lrgb combination. The difference between the two is sharpness and control of local contrast, noise reduction, etc.
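The white-point pull-down above is just a linear rescale of the luminance. A minimal sketch (reading "white point pulled down to 0.85" as the output white point, so the curve is out = 0.85 * in; that interpretation is my assumption):

```python
import numpy as np

def soften_luminance(lum, white=0.85):
    """Linear curve with the output white point lowered to `white` (~0.85
    as in the post). Nothing reaches 1.0, so bright areas keep headroom
    for colour after LRGB combination."""
    return np.clip(white * lum, 0.0, 1.0)

lum = np.array([0.0, 0.5, 1.0])
print(soften_luminance(lum))
```

Since the curve is linear, relative contrast within the luminance is untouched; only the absolute brightness ceiling drops, which is what prevents washed-out star colour.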
  25. Is your dither working as it should? If you create a movie from your unregistered but calibrated subs, you should see the stars moving in roughly the same pattern as your dither movements. If they still move in a linear pattern, even if that pattern consists of forward and backward steps, your dither doesn't work. (In PixInsight you'd use the Blink tool for that.)