


Community Reputation

1,155 Excellent


About wimvb

  • Rank
    Brown Dwarf
  • Birthday 29/07/63

Profile Information

  • Location
    Sweden (59.47° North)

  1. As long as the Pi and the desktop client are on the same network, you don't need internet, just your local network. With INDI installed, you eventually don't even need the remote desktop anymore: the Pi's INDI server will be talking directly to the INDI client on your laptop. I found that installing INDI and PHD on a Raspberry Pi under Ubuntu Mate was even easier than under Raspbian. BTW, in my experience it is wise to have a few spare micro SD cards; you can change the RPi's configuration by just popping in another card. A short while back, my PHD configuration was acting weird. I just changed the SD card and switched over to Lin_guider, which I knew would work. The same applies should the RPi break down: just get another RPi, load the SD card, and you're good to go. Try to do that with a laptop.
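The server/client split described above is easy to see at the protocol level: INDI clients talk plain XML to the server over TCP (default port 7624), and the opening handshake is a `getProperties` message. A minimal sketch, assuming a Pi reachable as `raspberrypi.local` (substitute your own hostname; in practice you would use Ekos or a client library rather than raw sockets):

```python
# Minimal sketch of how an INDI client first contacts the Pi's indiserver.
# INDI messages are plain XML over TCP (default port 7624); sending
# <getProperties/> asks the server to announce every device property.
# The hostname below is an assumption -- use your Pi's actual address.
import socket

def get_properties_message(version="1.7"):
    """Build the initial INDI handshake message."""
    return f'<getProperties version="{version}"/>\n'

def query_indi_server(host="raspberrypi.local", port=7624, timeout=5.0):
    """Connect to indiserver and return the first chunk of its XML reply."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(get_properties_message().encode("ascii"))
        return sock.recv(65536).decode("ascii", errors="replace")
```

This is only to illustrate why the remote desktop becomes unnecessary: the laptop-side client exchanges these messages directly with the Pi.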
  2. Great image, with excellent detail in the nebula. Thanks for sharing.
  3. A short update, not too far off topic. Raspberry Pi with INDI is a great way to set up a remote imaging rig, even if "remote" means from the garden to the living room. The RPi is the server which connects to the hardware, and a laptop can act as the client (the control computer). Until recently, the client software (Ekos/Kstars) had to be on a Linux machine, even if that machine was running in a virtual machine on a Windows computer. But now Ekos/Kstars is also available for Windows: I downloaded and played with it, and as far as I can tell, it works. Mount control and CCD/camera control work as under Linux. The one thing that doesn't work yet is the offline plate solving tool, but it is possible to use the online tool.
  4. If you want to get into AP, you'll need to invest in software. I chose PixInsight, because it's basically a one-stop shop. Even if you go the PS or StarTools route, there are some processes in PixInsight that are hard to replace. As for the final result, that will always depend more on the person than on the tool. Regarding hardware, you can either add autoguiding, or a sensitive, low-noise camera such as the ASI1600.
  5. Looks really good. The background looks kind of red, but that may very well be caused by weak Ha. Isn't that contradictory to the workflow? (I.e., you turn off noise reduction that would reduce fixed pattern noise.) Or are they two different things? I made hot pixel maps in PixInsight by binarising a master dark: I clipped the low signal such that only bright hot pixels were left. It needed some experimenting to get right.
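The binarise-a-master-dark idea above can be sketched with numpy. The threshold is exactly the part that "needed some experimenting"; a sigma-based cut above the dark's median is one common choice, and the `n_sigma` value here is an illustrative assumption, not the post author's setting:

```python
# Sketch of a hot pixel map: binarise a master dark so that only the
# bright hot pixels survive. Threshold = median + n_sigma * robust std;
# the n_sigma value is an illustrative assumption.
import numpy as np

def hot_pixel_map(master_dark, n_sigma=10.0):
    """Return a boolean map: True where a pixel sits far above the
    dark frame's typical level."""
    med = np.median(master_dark)
    # Robust spread estimate: median absolute deviation scaled to sigma.
    mad = np.median(np.abs(master_dark - med))
    sigma = 1.4826 * mad
    return master_dark > med + n_sigma * sigma
```

The resulting boolean map can then be used as a defect list or mask during calibration.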
  6. I thought that was an ESPRIT 100ED you got. Or did you get another one? (Which would explain the clouds. )
  7. Wow, those tadpoles really stand out. Great. If you can do that in a 2 hour gap, you must have a permanent setup, I guess. It usually takes me an hour at least to set up my rig and align. Thanks for sharing.
  8. I agree, the best image processing tip I have ever received was to collect more data, and increase exposure time. Good luck
  9. PixInsight: that makes it easier. Try this: make two duplicates of your unstretched image after DBE and colour calibration. On one of the copies, use masked stretch with a preview that contains only background as target. Decrease the target background to 0.1 and increase the clipping fraction to 0.01 or even slightly higher. Compare this to a histogram transformation that you do on the other copy of your image. "Over the next week or so": isn't that overly optimistic? Good luck
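Both stretches being compared above are built on PixInsight's midtones transfer function (MTF); masked stretch essentially applies it iteratively through a mask so bright pixels are stretched less. A minimal sketch of the standard MTF formula (this is the textbook function, not PixInsight's actual source):

```python
# Midtones transfer function used by HistogramTransformation and, via a
# mask, by MaskedStretch. Maps 0 -> 0, m -> 0.5, 1 -> 1 for normalised
# pixel values and a midtones balance m in (0, 1).
import numpy as np

def mtf(x, m):
    """Apply the midtones transfer function with midtones balance m."""
    x = np.clip(np.asarray(x, dtype=float), 0.0, 1.0)
    return ((m - 1.0) * x) / ((2.0 * m - 1.0) * x - m)
```

Lowering the target background in masked stretch corresponds to choosing a smaller effective `m` for the unmasked (background) pixels.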
  10. I agree. Some of the images posted here make you wonder why NASA would want to spend M$ on sending telescopes into space. On the other hand, there's less cloud up there. Cheers,
  11. Nice result, despite the mishaps. I think that with some gentle processing, you can lift the nebula a bit more. What software do you use for processing? Both PixInsight and Photoshop can do masked stretching, which stretches the weaker nebulosity more than the stars. There was an article by Blair MacDonald about this in a previous issue (April 2012) of JRASC (the Journal of the Royal Astronomical Society of Canada). Regarding your mishaps, I had a similar experience last night. Nothing seemed to work, and Ekos/Kstars managed to park my scope right into the mount on two occasions. To top it off, when I set my camera to shoot darks before I went to bed, I forgot to set the correct ISO. This morning I had twenty 10-minute darks at ISO 1600 instead of ISO 200. During the day today I sorted out the parking issues, but clouds and snow stopped me from testing tonight. Hope you get some clear nights soon to collect more data on this target.
  12. You definitely got more detail in the second version. I think that in the first, you had clipped the background. There are probably better noise reduction methods in PS, but if all else fails, you can try to reduce the colour noise by masking the stars and galaxies, and dialing down colour saturation. This should neutralise the background, while keeping colour in the galaxies. BTW, didn't dig up your original post, but how many subs at which camera settings went into making this image?
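The colour-noise trick in the post above (mask the stars and galaxies, then desaturate the rest) can be sketched with numpy. The Rec.709 luminance weights and the blending scheme are common conventions, not a specific PS recipe:

```python
# Reduce background chroma noise: pull unmasked (background) pixels
# toward their own luminance, leaving masked objects untouched.
import numpy as np

def desaturate_background(rgb, object_mask, amount=1.0):
    """rgb: HxWx3 floats in [0,1]; object_mask: HxW bool, True on
    stars/galaxies. Returns an image with background colour removed."""
    lum = rgb @ np.array([0.2126, 0.7152, 0.0722])   # Rec.709 luminance
    gray = np.repeat(lum[..., None], 3, axis=-1)
    # Blend toward gray only where the mask is False (background).
    w = amount * (~object_mask)[..., None]
    return (1 - w) * rgb + w * gray
```

With `amount` below 1.0 the background keeps a little colour instead of going fully neutral.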
  13. Making the nebula a more intense red can certainly be done with careful masking and colour adjustment. But you'd have to be careful to exclude the stars, as you would otherwise intensify the red halo that some of them already have. Also remember that the intense red you see in nebula images results from adding narrowband Ha to RGB data. RGB images, even from modified DSLR cameras, don't always have this appearance. Personally, I wouldn't deviate too much from the original colour balance. But in the end it all comes down to personal taste.
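The "adding narrowband Ha to RGB" step mentioned above is, in its simplest form, a weighted blend of the Ha frame into the red channel. Real HaRGB combinations (e.g. via PixelMath in PixInsight) are more elaborate; the weight here is an illustrative assumption:

```python
# Simplest HaRGB combination: mix the Ha frame into the red channel.
import numpy as np

def blend_ha_into_red(rgb, ha, weight=0.5):
    """rgb: HxWx3, ha: HxW, all floats in [0,1]. Returns a copy whose
    red channel is a weighted mix of R and Ha."""
    out = rgb.copy()
    out[..., 0] = np.clip((1 - weight) * rgb[..., 0] + weight * ha, 0, 1)
    return out
```

This is also why pure RGB data rarely matches the deep red of published nebula images: without the Ha term, the red channel only carries broadband signal.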
  14. Nice "snap". Had you been into AP, you would've had to calibrate, stack, remove the gradient, colour calibrate, stretch and use HDR techniques to "handle the core of Luna". In the end, the image would probably have ended up black and white only. Sometimes, simple is better. Thanks for sharing.
  15. I've done the same with a DSLR at ISO 400 and barely got usable subs at 10 minutes exposure time, due to light pollution and noise. The histogram on the camera display peaked at 1/3 from the left side, but after background extraction in PixInsight, I had mostly noise left. At my dark site, I can easily double the exposure time and still have a dark background. Sky conditions determine how deep you can go. In my image of the Pleiades, I got faint dust at 10 min exposure time, as compared to almost no result at my light-polluted site. That's why I'm saying that there is no golden rule for (single sub) exposure time: it depends too much on setup and local conditions. So, @RayD, keep experimenting. As you get more experienced, you'll develop a feel/knowledge for what works best for you. Good luck
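The "histogram peaked at 1/3 from the left" check above can be automated once the raw data is on a computer. A small sketch, assuming a 14-bit sensor (typical for DSLR raws; adjust `full_scale` for your camera):

```python
# Report where a sub's histogram peaks, as a fraction of full scale.
# full_scale = 2**14 - 1 assumes 14-bit DSLR raw data.
import numpy as np

def histogram_peak_fraction(sub, full_scale=2**14 - 1, bins=256):
    """sub: array of raw pixel values. Returns peak position in [0, 1]."""
    counts, edges = np.histogram(sub, bins=bins, range=(0, full_scale))
    i = int(np.argmax(counts))
    peak = 0.5 * (edges[i] + edges[i + 1])   # centre of the tallest bin
    return peak / full_scale
```

A value near 0.33 matches the common "peak at one third from the left" rule of thumb, though, as the post says, the right exposure still depends on sky conditions.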