Everything posted by wimvb

  1. Me 2. Just not about buying property in Sweden. Been there, done that (a few times). Including a cottage near a lake.
  2. If you live in my general area of Sweden, you can watch them 90 - 95% of the time, without missing any clear spells.
  3. That's hardly a skill where I live. If I "predict" it will be cloudy, I'm right 90 - 95% of the time. That's better than most professional weather sites. 😉
  4. As of today: snow shoveler. I just cleared 3 inches of snow off my obsy roof.
  5. As close to neutral as possible without "bending over backwards"
  6. As @newbie alert already noted, if your flats are not working, you need to check that they are correctly exposed. Flats should be mid gray, with the histogram in the middle of the display on the camera (assuming you use a DSLR). Before you apply flats in your calibration process, they themselves must be calibrated. This is usually done with bias exposures. If you haven't done so already, add bias frames to the calibration and stacking.
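
     A minimal sketch of that calibration step in numpy/astropy, assuming a set of bias and flat FITS files (the file names here are hypothetical):

         import numpy as np
         from astropy.io import fits

         # Hypothetical file names; substitute your own bias and flat subs.
         bias_files = ["bias_01.fits", "bias_02.fits", "bias_03.fits"]
         flat_files = ["flat_01.fits", "flat_02.fits", "flat_03.fits"]

         # Master bias: per-pixel average of all bias frames.
         master_bias = np.mean(
             [fits.getdata(f).astype(np.float32) for f in bias_files], axis=0)

         # Calibrate each flat by subtracting the master bias, then average.
         flats = [fits.getdata(f).astype(np.float32) - master_bias
                  for f in flat_files]
         master_flat = np.mean(flats, axis=0)

         # Normalise to a mean of 1.0 so dividing lights by it preserves levels.
         master_flat /= np.mean(master_flat)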
  7. Cloud watcher. On a more serious note, since I have a background in electronics and have done some construction, it was the astronomy stuff that was new.
  8. We're having an interesting discussion on the Swedish astronomy forum about imaging north of the Arctic Circle. An astrophotographer (David Björkén) who lives in Jokkmokk made a video which he shares on YouTube. I thought that this might interest you guys & girls "down south" as well. Enjoy
  9. Very unlikely. The image rotation information is not used by the calibration software, only by image viewers. To see if flats work, calibrate one flat sub with the master flat. The result should be a fully corrected, uniform, mid gray image. For OSC, there may be a colour bias.
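
     A sketch of that check, under the same assumptions as the sketch above (numpy/astropy, hypothetical file names):

         import numpy as np
         from astropy.io import fits

         flat_sub = fits.getdata("flat_01.fits").astype(np.float32)
         master_flat = fits.getdata("master_flat.fits").astype(np.float32)

         corrected = flat_sub / master_flat

         # A working master flat leaves a nearly uniform frame: vignetting
         # and dust shadows gone, relative spread down to a few percent.
         print("mean level :", corrected.mean())
         print("uniformity :", corrected.std() / corrected.mean())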
  10. You got some great advice re the RPi400 already. I would like to chime in on getting a second Raspberry Pi 4 (without the keyboard), or even just a Raspberry Pi 3B+, dedicated to your astro gear. Run Astroberry/Kstars on that. With the latest versions of Kstars, you don't even need PHD anymore; I get the same or better guiding results with the native guiding tool as a friend of mine gets with PHD2, and his scope is just 2 m from mine in the same obsy. PHD has polar alignment tools, but the PA routine in Kstars works just as well, imo. I would connect the Raspberry Pi to the home network with an ethernet cable, as this is more reliable than wifi. You then run Windows Remote Desktop (or VNC, whichever you prefer) on your home computer (laptop, Mac). Run Kstars on the RPi, not on the home computer that connects to it, because if you lose the network connection, the RPi can continue as long as everything runs locally. If you run Kstars on the home computer instead, the session will crash when you lose the connection. After a night's imaging, just transfer the image files to your home computer for processing.
  11. Sorry, can't help you there. But I'm glad you got that issue resolved.
  12. The second image has a clarity that the first image lacks. But in the second image, you've flattened the background to the extent that you lose faint fuzzies. To the right of M108 there are a few fuzzy patches in the first image that are galaxies at a distance of 1.8 Gly (!), and near the bright star at "two o'clock" there is a larger fuzzy galaxy at 600 Mly. In the second image, you are about to lose that one. This image is definitely worth a revisit, imo.
  13. Great catch. At 2.5 hrs there's still a fair amount of noise and the image looks a bit soft. How were your sky conditions?
  14. If you image mostly emission nebulae with an OSC camera, such as a DSLR, then either a UHC filter or a dual band filter, which only has transmission near Ha and Oiii with a large gap in between, will be best. If you image reflection nebulae, star clusters or galaxies, an IDAS LP suppression filter is probably best. The problem with LP suppression filters is always that you want to exclude as much light pollution as possible while maintaining as much of the full spectrum as possible. Unfortunately you can't have both, and how effective a filter is can depend very much on your local sky conditions. Here are a few suggestions at least:
      https://www.firstlightoptics.com/light-pollution-reduction-imaging/optolong-l-enhance-narrowband-deep-sky-imaging-filter.html
      https://www.firstlightoptics.com/light-pollution-reduction-imaging/idas-d2-light-pollution-suppression-filter.html
      https://www.firstlightoptics.com/light-pollution-reduction-imaging/idas-d1-light-pollution-suppression-filter.html
  15. The problem with light pollution isn't so much the increased background level, which can be processed out, as the noise that is left behind. The noise associated with any light source varies as the square root of the light intensity. The best ways to decrease noise are to take more exposures, or to block the light before it reaches the sensor. In a Bortle 7 or 8 location, I would definitely use some sort of filter on an OSC camera, or go mono.
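
      A quick numerical illustration of that square-root behaviour (Python; the photon counts are made-up numbers, chosen so that the sky dominates, as it does under heavy light pollution):

          import numpy as np

          rng = np.random.default_rng(0)
          signal = 100.0  # target photons per sub (made-up number)
          sky = 900.0     # light pollution photons per sub (made-up number)

          for n_subs in (1, 4, 16, 64):
              # Photon arrival is Poisson distributed; stacking averages n subs.
              stacked = rng.poisson(signal + sky, size=(n_subs, 100_000)).mean(axis=0)
              print(f"{n_subs:3d} subs -> SNR ~ {signal / stacked.std():.1f}")

      The noise grows with the square root of the sky level but only shrinks with the square root of the number of subs, which is why blocking the light before it reaches the sensor beats trying to stack your way out of it.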
  16. They've done that for far too long now. The problem is that they're off by half a country. Göran has clear nights, while the rest of Sweden is covered in clouds.
  17. It could be a diffraction streak. Any straight line or edge in front of the scope can cause this. Even a washing line for example.
  18. SharpCap needs a minimum 1 degree field of view, I believe. With the 72ED that should not be a problem, but if you also want to use it with long fl telescopes, you should be aware of this. I don't use SharpCap myself, but a friend who tried to use it with a 1150 mm fl telescope and KAF8300 sensor had trouble doing a polar alignment.
  19. The motor will allow vibration-free focusing, but if you want automated focusing you should be aware of the backlash these motors have. I built an Arduino-based control unit for such a motor once. It uses the Moonlite protocol and is compatible with INDI.
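
      One common software workaround for that backlash, as a generic sketch (this is not the Moonlite protocol itself; move_to() and the step count are placeholders for your own controller):

          # Always approach the target from the same direction, overshooting
          # first when needed, so gear slack is taken up before the last move.
          BACKLASH_STEPS = 120  # must be measured for your own motor/gearing

          def move_with_compensation(current, target, move_to):
              if target < current:
                  # Overshoot below the target so the final approach is
                  # always in the increasing direction.
                  move_to(target - BACKLASH_STEPS)
              move_to(target)
              return target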
  20. My last image from 2020. Actually, 2020 wasn't as bad as 2019 and 2018, counting the number of images I posted on astrobin (with estimated total integration time):
      2020: 10 images (95.6 hrs)
      2019: 7 images (38.1 hrs)
      2018: 7 images (27.2 hrs)
  21. Do you need to use the indoor computer if you also use the 3rd computer for remote access? If the imaging and control software (drivers, ASCOM, SGP) is installed on the mini-pc, and this has a wired connection to your home network (router), you only need RDP on a computer for remote access. To control your gear while away from home, you need a public IP address and an open port on your router. You can then control your obsy from any device that has internet access, without the need for an intermediate indoor computer. I've done this installation for a friend who has his gear in my obsy. Some time ago, he initiated a sequence while travelling on a bus: he used his mobile phone as a hotspot and his Mac laptop with RDP to control his telescope. ASCOM and SGP are installed on a mini-pc in the obsy. The advantage of having SGP on the obsy computer is that if you lose network access, any sequence will continue to run, whereas if you have SGP on the remote computer, a lost connection will crash a sequence. This has happened to me on more than one occasion in the past (INDI and Ekos/Kstars). Just a thought.
  22. As a reference, images from my ASI294MM at 4.63 um pixel size take about 1.5 s to download from camera to the Rock64 (a Raspberry Pi alternative) that I use for imaging. For gear control and imaging (no processing, of course), you shouldn't need much more. Download times of more than a few seconds with these cameras seem excessive. Otoh, I wouldn't be surprised if you have driver issues. QHY may make good cameras, but I have avoided them in the past because of software/driver issues, and so far I'm not convinced they've solved those. Just my € 0.02
  23. That is odd. I assume you also posted this problem on the PI forum. Did you get a response there? Unfortunately, I can't help you, as I haven't updated my version for quite a while.
  24. If you use no pixel rejection, all pixels in the stack are used to calculate the average signal, which will be your integrated image. As long as the individual pixels behave nicely (e.g. follow a Gaussian distribution), the integrated image will have the lowest noise possible. This is because if you reject pixels, the stack is made up of a lower number of samples (for each pixel), and the average of fewer samples will show a larger variation (more noise). But if there are outliers, exclusion of those pixels will result in less variation, i.e. lower noise, and the noise estimates, such as SNR, should go up. Ideally, those outliers would be caught during calibration (hot and cold pixels). But nothing is ideal, and pixel rejection is needed to refine the calibration, as well as to remove satellite trails, cosmic ray hits, etc. I think you can safely use the SNR estimate, as you will be comparing numbers for different rejection settings. But also look at the rejection maps (especially high rejection). If you start to see details from your main target in those, your rejection is too aggressive.
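
      A bare-bones sketch of what kappa-sigma rejection does per pixel (numpy; the kappa value and function name are illustrative, not PixInsight's actual implementation):

          import numpy as np

          def sigma_clip_stack(stack, kappa=3.0):
              """stack: (n_subs, height, width) array of registered subs."""
              mean = stack.mean(axis=0)
              std = stack.std(axis=0)
              keep = np.abs(stack - mean) <= kappa * std  # reject outliers
              integrated = np.where(keep, stack, 0.0).sum(axis=0) / keep.sum(axis=0)
              # Fraction of subs rejected at each pixel: if your target's
              # structure shows up in this map, the rejection is too aggressive.
              rejection_map = 1.0 - keep.mean(axis=0)
              return integrated, rejection_map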
  25. The book I linked to will clarify this, and much more. Can the camera save images in raw (unprocessed and uncompressed) format? This is important if you want to get decent results. Again, the reason for this is explained in the book.