
Everything posted by wimvb

  1. The beauty of this scope is that all critical distances are inside the tube. This means that once you have it collimated, you can just pop in the camera and you're set. If you only use the camera without the OAG/filter wheel, you may want to add a spacer so you don't need to pull out the extension tube or the draw tube; these can cause flexure. With my ASI, OAG and filter wheel, focus is only 10-15 mm from all the way in, and it seems stable. But I put a piece of aluminium tape on the extension tube to make the focuser more stable.
  2. Great image. I agree with Vlad, don't try to remove the halos.
  3. Only 5 (!) subjects for me last year. All with the SW MN190 and ZWO ASI174MM-Cool. Not always as much data as I would like: ngc 2683, ngc 4565, Arp 217 (ngc 3310), M106, M63.
  4. I took the liberty of downloading your original L image and your final colour image. I then combined the two in PixInsight. Never mind the colours, this kind of combination should be done with original data, and not with lossy jpegs. It seems to me that your data has more to offer than what is in your final image. Even with just 3 hours of L, you have more of the jets of M82, and even some of the IFN that is surrounding these galaxies. There's also more structure in your original M82, which you should be able to keep with careful stretching and hdr processing.
  5. Nice! You're getting there. The extra data really lifts the image. I think you can safely darken the background a little to give the galaxies a little more punch. Also, when doing noise reduction (which, btw, you may not need with 8 hrs of lum data), consider using a mask that has the background at 50%. This will leave some noise, but at the same time also keep the natural variation as a texture. What is your sky darkness? At 23 hrs captured, you may have IFN in the background.
  6. It's easier if you show a few pictures with what you have.
  7. The T-ring should have a bayonet connector that fits on the body of the Canon, without any lenses attached. The other side of the ring screws into the field flattener of your scope. The T-ring should look like this https://www.firstlightoptics.com/adapters/skywatcher-dslr-m48-ring-adapter.html
  8. The problem with imaging under light polluted skies (including moonlight) is that you need a very long integration time to get the noise down to an acceptable level.
  9. No, unfortunately not, but information about how to make one is in the thread. The code is on github. https://github.com/wberlo/AutoDither Using the dither box presumes that you have a mount with an ST4 port and a snap port (camera port). Most Skywatcher mounts have these nowadays.
  10. It doesn't have to be random. Before I started using guiding, I dithered manually. Worked absolutely fine. @Pankaj: here's my recipe. Set the slew rate to 1x sidereal. After image 1, press the dec+ button for about as many seconds as your pixel scale (minimum of 1 second). This will give you a dither distance of 15 pixels; let's call this one dither step. Take image 2, then press ra+ for one dither step. Take image 3, then press dec- for one dither step. Repeat the same after image 4 (dec-). Take image 5, then press ra- for one dither step. Repeat after image 6 (ra-). After images 7, 8, 9 each, dither in dec+. After images 10, 11, 12 each, dither in ra+. After images 13, 14, 15, 16 each, dither one step in dec-. After images 17, 18, 19, 20 each, one step in ra-. Etc. This will give you an outward spiraling dither pattern. Been there, done that, used a cheat sheet, and was never bothered by walking noise again. This will even work if you have a (moderate) backlash in dec. If you get tired of doing this manually, and you're not afraid of some diy, here's the automated version
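      For illustration only, here is a minimal C++ sketch that prints the spiral schedule above (this is not the AutoDither firmware; the cut-off of 20 images and the variable names are just an example):

        // Print the outward-spiral dither schedule: the direction cycles
        // dec+ -> ra+ -> dec- -> ra-, and the run length grows 1, 1, 2, 2, 3, 3, ...
        #include <cstdio>

        int main() {
            const char* dirs[4] = {"dec+", "ra+", "dec-", "ra-"};
            const int totalImages = 20;   // arbitrary example cut-off
            int image = 1;                // sub just completed
            int runLength = 1;            // subs per direction at the current spiral radius
            int dirIndex = 0;             // start with dec+

            while (image <= totalImages) {
                // two consecutive directions share the same run length (1, 1, 2, 2, ...)
                for (int leg = 0; leg < 2 && image <= totalImages; ++leg) {
                    for (int i = 0; i < runLength && image <= totalImages; ++i, ++image) {
                        std::printf("after image %2d: one dither step %s\n", image, dirs[dirIndex]);
                    }
                    dirIndex = (dirIndex + 1) % 4;
                }
                ++runLength;
            }
            return 0;
        }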
  11. Very smooth. I like it.
  12. Yes. There's so much marketing hype in filters that it's best to (learn to) read the spectral curves before buying.
  13. AFAIK, moon filters (neutral density filters) are used to image/look at the moon. As Olly wrote, they dim the light to a manageable level, so it doesn't blind you. The only filters that work in moonlight are deep red filters when used at 90 degrees to the moon. As with sunlight, moonlight is scattered in the atmosphere, and the sky will be blue at 90 degrees to the moon. A red filter will block this blue scattered light and create more contrast. This is the same as using deep red filters in b/w daytime photography, and the reason why Ha imaging works during a full moon. The filter that you have blocks light between 550 and 600 nm (if it's similar to Baader's https://www.baader-planetarium.com/en/baader-neodymium-(moon-and-skyglow)-filter.html) and is basically a light pollution filter that blocks the light from sodium and mercury lamps.
  14. Is this really necessary? INDI is distributed, i.e. an N x M solution with any number of servers and any number of clients. I haven't tested this, but I wouldn't be surprised if only one rpi is needed. Also, check this link https://indilib.org/develop/developer-manual/100-3rd-party-drivers.html
  15. Ok. PixInsight can do plate solving of course, but not autofocus. It seems that development of the INDI modules in PI has stalled.
  16. ... and that's where it performs best. I never liked that PC-control-through-handset solution of Skywatcher's. Glad it all worked out well.
  17. Strange. Could it be thrown off by the overall brightness? How does it perform when it's darker?
  18. Do you run Ekos on the Pi, or on a laptop/pc? Whatever you use, you can control the other camera from PixInsight. Just connect to the remote Pi, and use the ccd process. It's fairly straightforward.
  19. Recently, Göran @gorann and I had a private discussion about this galaxy and the faint structures surrounding it, and he reminded me of this thread. Having nothing better to do atm, I decided to reprocess the data, using a few new methods and tricks I picked up in PixInsight. So here's my latest, and maybe even final, attempt. As before, I created a synthetic luminance from the RGB frames, since the original luminance has calibration artefacts (shifted dust bunnies). I processed the RGB data for maximum colour punch, and the L data for maximum detail, while trying to preserve the objects in the background. I also managed to pull out a few Ha regions in the galaxy. The data can take a fair amount of deconvolution, as well as stretching. I hope I didn't overcook it too much. Also, note the very faint smudge at the bottom of the image, towards the left. This is a small galaxy that goes by the name SDSS J124324.90+322854.2, poor thing. It is at about the same distance as ngc 4631 (some 40 Mly according to SED), but I haven't been able to find any data relating them to each other. There is, however, supposed to be a faint tidal stream between ngc 4631 (the Whale) and ngc 4656 (the Hockey Stick). Unfortunately, this image doesn't show that level of detail. https://www.researchgate.net/figure/Final-ROSA-L-image-of-the-NGC-4631-environs-The-image-is-approximately-81-81-in-sky_fig2_267338882 Edit: This composite image shows the size of the tidal structure surrounding ngc 4631. There is a lot going on in the background, but the data doesn't show any sign of the extension between ngc 4631 and ngc 4656.
  20. Vcc of the sensors is connected to 3.3V on the esp board. Gnd can of course be connected to any one of the ground pins; easiest is the pin directly next to 3.3V. The I2C pins are: SCL is pin D5 and SDA is pin D4. Both these pins are on the same side of the board as 3.3V. Since distances are short, these pins can be directly connected to the corresponding pins on the BME board and the MLX board. Those pins are marked with SCL, SDA, Vcc and Gnd. There's no need for extra resistors or capacitors as the distance to the esp board is only a few inches. I connected the MLX board with short leads, because this device needs to be glued in a hole in the enclosure. The BME board can be soldered onto the V-board (strip board). The layout was very much like the picture I posted in my original post, with the BME just above the esp, and the MLX connected with leads. I sealed the enclosure with silicone, otherwise I could have opened it and taken a picture. But maybe this helps.
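      To make that wiring easier to test, here is a minimal sketch, assuming an ESP8266 board (NodeMCU / D1 mini pin names) and the Adafruit BME280 and MLX90614 Arduino libraries; it is only an illustration of the connections above, not the firmware I actually run:

        // Read the BME280 (ambient T / RH / pressure) and MLX90614 (IR temperature)
        // over I2C on an ESP8266, wired as described above: SDA = D4, SCL = D5, 3.3V, Gnd.
        #include <Wire.h>
        #include <Adafruit_Sensor.h>
        #include <Adafruit_BME280.h>
        #include <Adafruit_MLX90614.h>

        Adafruit_BME280 bme;                          // ambient sensor
        Adafruit_MLX90614 mlx = Adafruit_MLX90614();  // IR sensor

        void setup() {
          Serial.begin(115200);
          Wire.begin(D4, D5);                         // SDA, SCL as wired above

          if (!bme.begin(0x76)) {                     // many BME280 breakouts use 0x76, some 0x77
            Serial.println("BME280 not found, check wiring/address");
          }
          if (!mlx.begin()) {
            Serial.println("MLX90614 not found, check wiring");
          }
        }

        void loop() {
          Serial.print("ambient: ");
          Serial.print(bme.readTemperature());
          Serial.print(" C, RH: ");
          Serial.print(bme.readHumidity());
          Serial.print(" %, IR object temp: ");
          Serial.print(mlx.readObjectTempC());
          Serial.println(" C");
          delay(5000);
        }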
  21. Thanks, but there's no need to go through that trouble for me. What fl do you intend to use?
  22. Very nice, and it does look rather faint, indeed. If luminance gives you larger stars, maybe your subs are slightly overexposed. I use 120 s subs for L, but 240 s for rgb. That is with a cmos camera. Just a thought.
  23. You can always resample the image to get any number of pixels you like, but it won't get you any more detail. The only resampling that can get you slightly more detail is drizzle integration. You'll need a fairly large number of subs that are (somewhat) undersampled. It comes at a cost of reduced signal to noise ratio. (There ain't no such thing as a free lunch. Certainly not in astrophotography.)
  24. The easiest way to get rid of those magenta stars: invert the image (Process > IntensityTransformations > Invert), apply SCNR (Process > NoiseReduction > SCNR; after inversion the magenta shows up as green, which SCNR removes), then invert again.
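      Just to show why this works: assuming SCNR's average neutral protection method with amount 1, green is capped at the mean of red and blue, so running it on the inverted image effectively raises green to that mean once you invert back, which is exactly what neutralises a magenta cast. A small stand-alone C++ illustration of the per-pixel arithmetic (normalised RGB; this is not PixInsight code):

        // Per-pixel effect of the invert -> SCNR (average neutral) -> invert trick.
        // Magenta pixels (green below the red/blue mean) get green lifted to that mean;
        // neutral or green-dominant pixels are left untouched.
        #include <algorithm>
        #include <cstdio>

        struct RGB { double r, g, b; };

        RGB removeMagenta(RGB p) {
            RGB inv{1.0 - p.r, 1.0 - p.g, 1.0 - p.b};       // invert
            inv.g = std::min(inv.g, (inv.r + inv.b) / 2.0);  // SCNR, average neutral, amount 1
            return {1.0 - inv.r, 1.0 - inv.g, 1.0 - inv.b};  // invert back
        }

        int main() {
            RGB magentaStar{0.90, 0.55, 0.85};               // green deficit -> magenta halo
            RGB fixed = removeMagenta(magentaStar);
            std::printf("before: %.2f %.2f %.2f\n", magentaStar.r, magentaStar.g, magentaStar.b);
            std::printf("after : %.2f %.2f %.2f\n", fixed.r, fixed.g, fixed.b);
            return 0;
        }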