Everything posted by wimvb

  1. I agree, either a light leak or a reflection. Put the lens cap / telescope cover on and take a few images: one with the suspected filter slot and one with each of the two neighbouring slots. Do you have a manual or an electronic filter wheel?
  2. Better to get to the cause of the problem: get a Mak-Newt 😁 (that’s what I did)
  3. Btw, the new astro gadget to have: https://www.astrobin.com/fnu3de/?nc=all If you're handy with a hacksaw and a hand drill, you can probably make one for a Newtonian.
  4. A bit different from the Hubble image https://www.nasa.gov/feature/goddard/2017/messier-74
  5. As its name implies, it is the weighting part of WBPP that makes it preferable to "manual" pre-processing. But you can also use SubframeSelector to create the weights. And if you use LRGB processing, you only need to do that for luminance; RGB is much more forgiving.
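     To illustrate the idea (this is just a sketch of per-pixel weighted averaging, not WBPP's actual algorithm), here is a minimal Python example with made-up frames and weights:

        import numpy as np

        # Three registered subframes (tiny 2x2 stand-ins for real images)
        frames = np.array([
            [[0.10, 0.12], [0.11, 0.10]],
            [[0.09, 0.11], [0.10, 0.09]],
            [[0.14, 0.16], [0.15, 0.14]],  # poorer frame: worse FWHM/SNR, lower weight
        ])

        # Hypothetical quality weights, e.g. as produced by SubframeSelector
        weights = np.array([1.0, 0.9, 0.4])

        # Weighted average per pixel: sum(w_i * frame_i) / sum(w_i)
        stacked = np.average(frames, axis=0, weights=weights)
        print(stacked)

     The poorer frame still contributes signal, but drags the result around far less than in a plain average.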
  6. What computer do you use? I recently upgraded mine to one with an Intel i9 processor, 32 GB RAM and a 1 TB SSD, running Windows 11 Pro. It made a huge difference, but WBPP still takes a while to run.
  7. Fyi, today's APOD is a composite of Hubble, Subaru and JWST data, covering the UV to IR spectral range. https://apod.nasa.gov/apod/ap220718.html
  8. $T is the image that you apply PixelMath to. It will be replaced. In this case, it is the starless, enhanced image. Copy1 and copy2 don't need to be stretched as much as the starless, enhanced image; copy1 will therefore have smaller stars without halos. You can also create copy1 while the image is still linear and use a colour-preserving stretch on it, such as arcsinh stretch. Then create copy2 from copy1 and use starnet on that. It's all a matter of experimentation.
  9. I don't use star masks generated by starnet. Instead I first copy the image with stars (copy1), make the original starless, and copy that again (copy2). I process the original and add the stars back in with $T + copy1 - copy2. This tends to give me slightly better stars in the recombined image than if I use the star mask from starnet.
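     In numpy terms, the recombination is plain pixel arithmetic. A minimal sketch with stand-in arrays (not PixInsight code; the clipping is my addition to keep values in range):

        import numpy as np

        rng = np.random.default_rng(0)
        copy1 = rng.uniform(0.0, 1.0, (4, 4))        # stretched image with stars
        copy2 = np.clip(copy1 - 0.2, 0.0, 1.0)       # stand-in for its starless version
        processed = np.clip(copy2 * 1.3, 0.0, 1.0)   # stand-in for the enhanced starless image ($T)

        stars_only = copy1 - copy2                   # the difference isolates the stars
        recombined = np.clip(processed + stars_only, 0.0, 1.0)  # $T + copy1 - copy2
        print(recombined)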
  10. You get what you pay for.
  11. One more month to go until astro darkness returns. Between tinkering with my gear and working around the house, I also rework old data. These three galaxies form the group Mathessian 266. There is a very faint tidal structure between them, as well as equally faint IFN. To do this target justice, I really need more data, at least 10-20 hours more (this is "just" 13 hours' worth of data). If the weather cooperates, I will redo this project from scratch with all new data. Taken with my ASI294MM on the SkyWatcher 190MN in October 2021.
  12. I found an article on ResearchGate that says the substrates are either ZnSe (zinc selenide) or CdTe (cadmium telluride), depending on which wavelengths they need to cover. I don't know if that's what they ended up using.
  13. Most likely elliptical. M87, which hosts the black hole that was imaged by the Event Horizon Telescope, is a nearby example of this type of galaxy.
  14. Nope, the lenses themselves are made from this semiconductor. Germanium has about 50% transparency for wavelengths longer than 2 um. As a bit of trivia, germanium is also valued as a material for amplifier transistors, particularly in electric guitar amplifiers, according to a guitar-playing former colleague.
  15. A DSLR lens is a much simpler case, because a well-known test pattern can be used to determine the lens's characteristics, so the lens itself is well characterised. In the case of gravitational lenses, the lens is unknown. It's like looking at a landscape image and trying to determine both what aberrations the lens has and what the landscape really should look like.
  16. In order to do that, you would need to know the exact mass distribution in the lensing galaxies and the exact positions of the lensed galaxies. You could actually invert the problem: figure out the distribution of matter in the lensing galaxy, assuming that the lensed galaxy is an ordinary spiral. This could then be used to study the foreground galaxies. It's similar to determining the mass of, say, Jupiter by studying the orbits of its moons. With lensing galaxies you would probably need to run simulations, but it could tell a lot about the mass distribution and black holes in those galaxies. Studying only one such scenario wouldn't give much, but if you do this for several gravitational lenses, you'd build a knowledge base that would allow further refinement. I have no idea if scientists are already doing this.
  17. Near infrared would be doable. There are a few things to consider (see the quick calculation after this list):
      1. The transmittance of the glass in your telescope. Glass can block IR; from about 2.5-3 um it can be opaque. In IR cameras, germanium is used as a lens material. A reflector doesn't have this problem. Also, chromatic aberration can be a serious problem, since refractors aren't designed for IR wavelengths. All in all, a reflector is a safer bet.
      2. The reflectance of the mirror coating. For wavelengths longer than 500 nm (green), gold has higher reflectance than aluminium. As we all know, gold doesn't reflect blue light, but it does reflect IR very well.
      3. The camera needs to be sensitive to IR. Longer wavelength IR is heat radiation, so you need to cool the camera a lot, and the camera shouldn't have IR blocking glass, of course. This is the main problem: the IR sensitivity of silicon CMOS is poor. One could try persuading ZWO to make a camera with a FLIR sensor.
      4. The atmosphere blocks some IR wavelengths; IR light up to 5 um wavelength is ok.
      All in all, NIR astrophotography up to 1 um is doable, but longer wavelengths will be a challenge.
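     A quick back-of-the-envelope for point 3, using Wien's displacement law (lambda_peak = b / T, with b about 2898 um*K), to show why everything around an uncooled sensor glows in the long-wave IR:

        # Wien's displacement law: lambda_peak = b / T
        b = 2898.0  # um * K

        for T in (300.0, 195.0, 77.0):  # room temperature, dry ice, liquid nitrogen
            print(f"T = {T:5.0f} K  ->  thermal emission peaks at {b / T:4.1f} um")

     At room temperature the peak sits near 10 um, right in the band you'd be trying to image.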
  18. Most likely not. Catalogues are based on deep sky surveys, and these are done with much lower resolution, both from space telescopes and ground based telescopes. The fov of the jwst is much too small for use as a survey telescope. No doubt that the images that the jwst delivers will be used to catalogue the galaxies in them, but that will only cover "grains of sand" spread out over the celestial sphere. The "nearby" galaxies that act as gravitational lenses in the jwst deep field are at the edge of what amateur telescopes can image today. But at the fl that we use, these galaxies are only a few pixels across, and even these are not always identified in catalogues.
  19. Fun facts: I just checked some technical data on the jwst. fl 131.4 m, D 6.5 m (f-ratio of 20! That's one slow looking glass). Mirrors coated with 0.1 um of gold, about 48 grams in total. That should be the cheapest part of the whole telescope. Apparently the various instruments have a fov of about 1-2 arc minutes, the angular span of a "grain of sand at arm's length". To me, that makes the deep field image even more impressive.
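     Both numbers are easy to sanity-check. A back-of-the-envelope in Python, assuming a collecting area of roughly 25 m^2 for the segmented primary and a gold density of 19,300 kg/m^3:

        focal_length = 131.4  # m
        diameter = 6.5        # m
        print(f"f-ratio: {focal_length / diameter:.1f}")   # ~20.2

        area = 25.0           # m^2, approximate collecting area of the primary
        thickness = 0.1e-6    # m, the 0.1 um coating
        rho_gold = 19300.0    # kg/m^3
        print(f"gold mass: {area * thickness * rho_gold * 1000:.0f} g")  # ~48 g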
  20. Diffraction patterns are always at right angles to the edge causing the diffraction, and emanate from the (image of the) light source.
  21. Gold isn't the problem. The atmosphere and cooling are. Our best chance at JWST-like images is when Deep Sky West or iTelescope set up shop on the far side of the moon.
  22. Here's my attempt at your data. It is a very nice image of this small planetary. Imo, the image is over exposed, but not for the obvious reasons. The nebula is quite bright and the brightest stars in the image are close to over exposure. This in itself is not a problem, but when you do a background correction in DBE and Background Neutralization, you get false colour in the star cores. With a bright target such as M57, there is a risk that the nebula is also affected by this.
      Mathematically, what is happening is this (worked out in the snippet at the end of this post). The core of a bright star is all white in the image (in PI colour coordinates: (1, 1, 1)). This includes any light pollution. The background, due to light pollution, is a bit green (0.05, 0.1, 0.05). Background Neutralization tries to correct the background to, say, (0.01, 0.01, 0.01) by subtracting the difference: (0.04, 0.09, 0.04). The star is affected by this and gets a core colour of (0.96, 0.91, 0.96). It loses green and takes on a magenta cast. That's what you see on the left side in the above before/after image. The outer edge/halo of the star is not affected, so it keeps its natural colour. Had the star been exposed below saturation, the light pollution would have been added on top, the uncorrected star would have looked green before Background Neutralization, and after BN it would have its correct colour.
      If your images suffer from LP, you need to make sure that the main target (i.e. M57) does not become too bright in the single exposures. Over exposed stars can be corrected with the HSV Repair script in PI, but if the main target has pixel values above the correction threshold (default 0.5), those pixels will also be "corrected" and can end up with a wrong colour.
      Btw, stacking will not make a target brighter or lead to saturation; pixel values are averaged. However, the noise will be less, and PI then applies a stronger stretch in the Screen Transfer Function. This is all good, because the only purpose of the STF is to show what is in the image, especially in the darker parts where you'd want to see any defects or gradients. The default STF was never intended to be used as a permanent stretch anyway. If you want a more pleasing view during processing, you have to adjust the STF manually.
      I rotated the image so that north is up. The very brightest stars and M57 suffer from reflection halos. This is strongest for the star on the right. The halo around M57 is not extended nebulosity, because it is of the wrong colour: the extended nebulosity is red (Ha), while the halo is green in the image.
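     The star-core arithmetic above as a small numpy sketch (the values are the illustrative ones from this post, not real calibration numbers):

        import numpy as np

        saturated_star = np.array([1.00, 1.00, 1.00])    # clipped core, LP baked in
        unsaturated_star = np.array([0.60, 0.65, 0.60])  # hypothetical star below saturation
        lp_background = np.array([0.05, 0.10, 0.05])     # greenish light pollution
        target_bg = np.array([0.01, 0.01, 0.01])

        correction = lp_background - target_bg           # (0.04, 0.09, 0.04)

        print(saturated_star - correction)    # [0.96 0.91 0.96] -> magenta-tinted core
        print(unsaturated_star - correction)  # green LP excess removed, colour restored

     The saturated core loses green it never really had, while the unsaturated star just has the LP contribution taken back out.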
  23. 600 s at unity gain seems high to me. To compare: I shoot L at 0 gain and 180 s @ f/5.3 for all targets, and RGB at 240 s (a bit short, I could extend to 300 s) at the same settings. For 7 nm Ha I use 240 s at gain 200. All with the ASI294MM. Also, I don't think you need different exposure times for different targets, unless you are imaging the Orion nebula or possibly the Andromeda galaxy. CMOS cameras have enough dynamic range, and you get the final results in post processing. What will determine exposure time most is the level of light pollution. If you have vastly different light pollution in different directions, you may need different exposure times. But even then, you can get away with either an average time or simply avoiding the worst LP.
  24. Imo, it's better to find the cause of the oscillations and try to fix it. High frequency oscillations are usually caused by the gears closest to the motor. I had a 10 s oscillation that was caused by a loose belt; tightening the belt fixed it. Guiding can remove slow tracking errors, but not fast ones.