Everything posted by wimvb

  1. Alejandro Tombolini has worked examples at pixinsight.com.ar that show how to use the script. I use it after DBE and background neutralization, but before colour calibration, since it will affect the white balance. Then I follow up with deconvolution of the stars only, before stretching. If you masked-stretch the V channel, you end up with plenty of colour in the stars.
  2. The HSV repair script. It's under Scripts -> Utilities. You can combine the channels in the linear stage, or (masked) stretch the V channel before combining. Star reduction (MorphologicalTransformation -> morphological selection) with a contour mask can also bring out more star colour. (There's a small Python sketch of the V-stretch idea after this list.)
  3. Been there, done that. Nowadays my camera is always on RAW, even when used as a family camera. (Which it is less nowadays, as mobile phone cameras are so much easier.) Nice pic, btw. I'm not sure that calibration frames make sense when shooting JPEG, since the calibration process needs per-pixel information, which is lost in the conversion. Flats may still work, removing vignetting. Bias and darks may actually deteriorate the lights by adding noise. There's only one way to find out ...
  4. Since you're testing PI, here's a tip from a dark lord: PI has a noise evaluation script, and you can run image statistics (Process -> Image -> Statistics), which you can use to evaluate the three stacked images. The difference between the images should be in the background noise and in any hot pixels. You should also examine the stacks for any "walking noise" or streaks left over after stacking. If you want to test stacking in PI, I suggest you use the BatchPreprocessing script, since you're only on the trial version. The feature that should win you over to the dark (dark) side is DBE. Crop the image, apply DBE, and see the magic force. (A rough, non-PI sketch of the statistics comparison follows after this list.)
  5. I agree with Neil, that's a good start. The easiest way to proceed from here is to expose more frames, download Deep Sky Stacker to combine them, and use whatever image processing software you have to increase the contrast. Good luck, and have fun.
  6. That would be a great challenge, if it were not for one tiny detail: the mount used. Non-autoguiders range from poor students/kids using an alt-az or EQ3 mount, to semi-professionals with mounts that don't need guiding. The challenge would be unfair. But who am I to stop people from taking it?
  7. Great, Neil. Much better than my attempt at NGC 2903. What camera settings did you use, and how many subs?
  8. I know that seeing can be bad, but this bad? I guess you meant "no flats". Nice catch, anyway. Cheers.
  9. Nice catch. Ha nebulae are difficult with an unmodded camera; the results depend very much on camera make and model. With this target you did very well. Thanks for sharing.
  10. I don't think the horizontal lines are from dew. Have a look at a single sub and see if the dark lines emanate from bright stars. I have an issue with dark lines like this when stars become overexposed. http://wimvberlo.blogspot.se/2017/01/correcting-dark-lines-in-dslr-astro.html?m=1
  11. I have found that a longer dew shield can be better than a dew heater. My guide scope dew shield extends about 30 cm in front of the lens. Haven't had dew or ice yet, despite not using a dew heater. Otoh, I have had dew and ice when using a heater alone.
  12. Nice result. You'll notice that being able to do 3-minute subs will vastly improve your image quality. Unfortunately, DSS is infamous for sucking the colour out of images. Sometimes it helps to blur a copy of the image and turn the saturation all the way up, then use this as the colour information in an LRGB combination (there's a sketch of this trick after the list). Thanks for sharing.
  13. Done that, been there. (Never got the t-shirt.) Made a dew heater for my ST80 guide scope, but it didn't really solve the dew problem until I also made a dew shield, which extends 30 cm or more in front of the scope lens.
  14. Dunno, one could argue that Ha emits red, so it should be red. On the other hand, the nebula will probably also emit some Hb, which is blue; that would pull the colour towards magenta. Your Rosette and Veil have a more traditional look. I would leave it the way it is, even if it isn't the way it "should be". I like it.
  15. Impressive. Never seen such a red pelican.
  16. It's possible to start the indiserver at boot. That's how I do it. No need to remember the command line. I believe I got it from here http://www.indilib.org/forum/general/1145-auto-starting-indiserver.html
  17. Great catch. As you already noted, more data will improve the result. If you can increase the exposure time to 1 minute and catch 60 good frames, I think you can have a great image. It also seems to me that you are at the limit of what GIMP can do. You can probably improve the image in Photoshop or PixInsight. Thanks for sharing.
  18. The first time I boot an RPi, I use an HDMI cable attached to a TV. Once WiFi is set up, I use Windows Remote Desktop to access the RPi desktop. To connect in command-line mode, I use PuTTY. On some Linux variants, I had to start the X server to be able to access the desktop remotely. I can never remember the exact command, and have needed the help of my dear friend G. Oogle on more than one occasion. The X server can be started from SSH (PuTTY), btw. Hope this clarifies some of it. PS: G. just provided me with this link http://www.raspberrypiblog.com/2012/10/how-to-setup-remote-desktop-from.html?m=1
  19. As long as the Pi and the desktop client are on the same network, you don't need the internet, just your local network. With INDI installed, you eventually won't even need the remote desktop anymore: the Pi's INDI server will be talking directly to the INDI client on your laptop (see the small client sketch after this list). I found that installing INDI and PHD on a Raspberry Pi under Ubuntu MATE was even easier than under Raspbian. BTW, in my experience it is wise to have a few spare micro SD cards. You can change the RPi configuration by just popping in another card. A short while back, my PHD configuration was acting weird. I just changed the SD card and switched over to Lin_guider, which I knew would work. The same applies should the RPi break down: just get another RPi, load the SD card, and you're good to go. Try to do that with a laptop.
  20. A short update, not too far off topic. A Raspberry Pi with INDI is a great way to set up a remote imaging rig, even if "remote" means from the garden to the living room. The RPi is the server that connects to the hardware, and a laptop can act as the client (the control computer). Until recently, the client software (Ekos/KStars) had to be on a Linux machine, even if that machine was running as a virtual machine on a Windows computer. But now Ekos/KStars is also available for the Windows operating system: https://edu.kde.org/kstars/ https://edu.kde.org/kstars/#download I downloaded it and played with it, and as far as I can tell, it works. Mount control and CCD/camera control are the same as under Linux. The one thing that doesn't work yet is the offline astrometry.net plate solving tool, but it is possible to use the online tool (nova.astrometry.net); a scripted example of the online solver follows after this list.
  21. Wow, those tadpoles really stand out. Great. If you can do that in a 2 hour gap, you must have a permanent setup, I guess. It usually takes me an hour at least to set up my rig and align. Thanks for sharing.
  22. I don't think it was PHD, but rather me. As it turned out, PHD was already in an Ubuntu repository, so installing it was much easier than in Raspbian. No need to check dependencies, just add the PPA. Now I need to learn Ekos/KStars. Sound familiar?
  23. Snowing here, after yesterday's short cold spell. (Why does this site lack an emoticon for snow?) Enough opportunity to tinker, but never enough to test what you tinkered with. BTW, as you may have read from a recent post of mine, I managed to get a Raspberry Pi working with Ubuntu MATE, INDI and PHD2. For some reason I had less luck with Raspbian + INDI + PHD2. Maybe because I installed INDI after PHD2?
  24. ?? Can you elaborate? The center spot of my mirror came off during cleaning, and I replaced it as per numerous articles on the interweb. Was this not right? What's the cause of the difference?
  25. The primary isn't too difficult to remove. The whole cell is held in place with just 3-4 screws. To center the center spot, it helps to have a transparency with a circle the size of the primary. For collimation I use a self-made cap and a Barlow with my laser. It does require a center spot on the primary, however. If you google "barlowed laser collimator", you should find a few articles explaining the method. Good luck.
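
Regarding the HSV post above: here's a minimal Python sketch of the general idea of stretching only the V channel. This is not PixInsight's HSV repair script; the file name and the stretch strength are placeholders. The point is that hue and saturation are left untouched, so star colour survives the stretch.

```python
# Minimal sketch of stretching only V (not the PixInsight HSV repair script).
# Hue and saturation are left alone, so star colour is preserved.
# File name and stretch strength are placeholders.
import numpy as np
from skimage import io, color

rgb = io.imread("stack_rgb.tif").astype(np.float64)
rgb /= rgb.max()                                  # normalise to 0..1

hsv = color.rgb2hsv(rgb)                          # split into H, S, V
v = hsv[..., 2]
hsv[..., 2] = np.arcsinh(50.0 * v) / np.arcsinh(50.0)   # stretch V only

stretched = color.hsv2rgb(hsv)                    # recombine; colour ratios kept
io.imsave("stack_rgb_stretched.tif", (stretched * 65535).astype(np.uint16))
```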
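Regarding the statistics tip for comparing the three stacks: the sketch below is not PixInsight's noise evaluation script, just a rough stand-in using astropy's sigma-clipped statistics to compare background level and noise of a few stacked FITS files. The file names are placeholders.

```python
# Rough stand-in for comparing stacks by background statistics
# (not PixInsight's NoiseEvaluation script). Sigma clipping rejects
# stars, so mean/median/std reflect the background and its noise.
# File names are placeholders.
from astropy.io import fits
from astropy.stats import sigma_clipped_stats

for name in ["stack_bias_darks_flats.fits",
             "stack_flats_only.fits",
             "stack_no_calibration.fits"]:
    data = fits.getdata(name).astype(float)
    mean, median, std = sigma_clipped_stats(data, sigma=3.0)
    print(f"{name}: background mean={mean:.2f}, "
          f"median={median:.2f}, noise (std)={std:.2f}")
```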
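Regarding the DSS colour trick: a sketch of the blur-and-saturate idea in Python, assuming a 16-bit TIFF and a placeholder file name; the blur sigma and saturation factor are just starting values to tune by eye. The blurred, over-saturated copy supplies the colour, while the original keeps the luminance and detail, similar in spirit to an LRGB combination.

```python
# Sketch of the blur-and-saturate trick: blurred, over-saturated copy
# supplies colour; the original keeps luminance/detail.
# File name, blur sigma and saturation factor are placeholders to tune by eye.
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage import io, color

rgb = io.imread("dss_stack.tif").astype(np.float64)
rgb /= rgb.max()

blurred = gaussian_filter(rgb, sigma=(5, 5, 0))               # blur spatially only
hsv_colour = color.rgb2hsv(blurred)
hsv_colour[..., 1] = np.clip(hsv_colour[..., 1] * 3.0, 0, 1)  # boost saturation

hsv_out = hsv_colour.copy()
hsv_out[..., 2] = color.rgb2hsv(rgb)[..., 2]                  # keep original luminance
result = color.hsv2rgb(hsv_out)
io.imsave("dss_stack_recoloured.tif", (result * 65535).astype(np.uint16))
```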
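Regarding the INDI server on the Pi talking directly to a client: purely as an illustration, and assuming the pyindi-client Python bindings (the callback API differs between versions) and a hypothetical host name, a bare-bones client could look like the sketch below. Ekos/KStars does all of this for you; the point is only that the Pi's indiserver is a plain network server your laptop talks to.

```python
# Bare-bones INDI client sketch (assumes the classic pyindi-client bindings;
# host name is a placeholder, callback names vary between versions).
# The no-op callbacks are required by these bindings.
import time
import PyIndi

class ListingClient(PyIndi.BaseClient):
    def newDevice(self, d):
        print("Server announced device:", d.getDeviceName())
    def newProperty(self, p): pass
    def removeProperty(self, p): pass
    def newBLOB(self, bp): pass
    def newSwitch(self, svp): pass
    def newNumber(self, nvp): pass
    def newText(self, tvp): pass
    def newLight(self, lvp): pass
    def newMessage(self, d, m): pass
    def serverConnected(self):
        print("Connected to indiserver")
    def serverDisconnected(self, code):
        print("Disconnected from indiserver")

client = ListingClient()
client.setServer("raspberrypi.local", 7624)   # default INDI port is 7624
if client.connectServer():
    time.sleep(2)                             # let the server report its devices
    for dev in client.getDevices():
        print("Device:", dev.getDeviceName())
    client.disconnectServer()
else:
    print("Could not reach the INDI server")
```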
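Regarding the online plate solver: one scripted way to use nova.astrometry.net outside KStars/Ekos is astroquery's astrometry_net module. The sketch assumes you have an API key from the site; the file name is a placeholder.

```python
# Sketch of online plate solving against nova.astrometry.net via astroquery
# (separate from the Ekos solver; API key and file name are placeholders).
from astroquery.astrometry_net import AstrometryNet

ast = AstrometryNet()
ast.api_key = "YOUR_NOVA_ASTROMETRY_NET_KEY"   # from your nova.astrometry.net account

# Upload the image and wait for the WCS solution (can take a few minutes).
wcs_header = ast.solve_from_image("light_0001.fits")

if wcs_header:
    print("Solved. Image centre (deg): RA =", wcs_header["CRVAL1"],
          " Dec =", wcs_header["CRVAL2"])
else:
    print("Solve failed or timed out.")
```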