Everything posted by wimvb

  1. I agree; now it's demoted to IFO. From mysterious UFO to common DHL.
  2. I haven't used the Eagle Core, but from the PLL site it looks like this is a lot simpler than ASIAIR. There is no plate solving, for example. If you want to control cameras other than ZWO and DSLR, have a look at Stellarmate or plain old Ekos/KStars on a Raspberry Pi. This does what Eagle Core does, up to complete observatory automation, including weather monitoring. And it (Ekos) is free.
  3. Geometrically, an OAG configuration is similar to the secondary in a Newtonian. Just that the prism is a small section of that virtual "secondary", as in the drawing. But shouldn't the prism be at 45 degrees then, and the guide sensor shifted? Or am I completely off here?
  4. The data is ok, and results in a colourful image. It's just that to get there I have to push the colours more than I'd like. And lrgb combination results in a muddy look. Dark nebulae are hard.
  5. This is proving to be a tricky one; so far, all I have is a muddy mess. I tried to keep the colour saturation in the dust down. In the rgb data there's little evidence that the dust surrounding the reflection nebula must be red, even though it is often presented that way. But so far, this path doesn't lead to anything presentable.
  6. That may very well be the STF. Apply the same STF settings to both images. A better way to evaluate noise is either the script under Image Analysis, or the standard deviation in image statistics.
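To make the standard-deviation comparison concrete: a quick, rough way to compare background noise between two frames is the standard deviation of a (presumably star-free) patch. A minimal NumPy sketch on synthetic data; the patch size and corner location are arbitrary assumptions, not a PixInsight recipe:

```python
import numpy as np

def noise_estimate(img, box=64):
    """Estimate background noise as the standard deviation of a
    (presumably star-free) corner patch of a 2D image array."""
    patch = img[:box, :box]
    return patch.std()

# Two synthetic "images" with different noise levels (illustrative only)
rng = np.random.default_rng(0)
img_a = 0.1 + rng.normal(0.0, 0.01, (256, 256))  # less noisy
img_b = 0.1 + rng.normal(0.0, 0.03, (256, 256))  # noisier

print(noise_estimate(img_a) < noise_estimate(img_b))  # → True
```

The same STF caveat from the post applies: compare numbers, not screen stretches.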
  7. Strength should be threshold, my bad. Here's an image I'm currently working on where I used this recipe. This is an extreme crop, so you can see the individual pixels. Chroma noise reduction was done AFTER colour calibration. As you can see, the colour is much more neutral after noise reduction, but the image looks as noisy as before. This is intensity or lightness noise. There is still a weak greenish large-scale structure left after noise reduction. Also note that stars were not affected. These are the settings I used for this image. I got away with less noise reduction in this image, but since I didn't use all 8 layers, there was still some structure left afterwards.
  8. That's why I use the "manual" workflow:
       • dark integration
       • flat dark integration
       • flat calibration
       • light calibration
       • light registration
       • light integration
     All with these settings (will load in workspace 1 in PI): RGBIntegrationProcess.xpsm
     But, you can set the exposure tolerance in BPP to zero and load the flat darks into the darks section. PI should then group them correctly.
  9. Won't this calibrate your darks with Flat Darks? PixInsight groups darks according to exposure time (these groups are called "bins" in BPP). It will then use darks that best match lights and flats to calibrate light frames and maybe even flat frames.
  10. I assume that you've seen the diagram on the ZWO site about how to connect the OAG, main camera and guide camera. ZWO generally provide all the spacers you need with their cameras and OAG. You can focus your guide camera this way: target a bright star with your main camera and focus with a Bahtinov mask. Leave the mask in place and move the mount so that the same star is in the guide camera's FOV. Then adjust focus for the guide camera.
  11. You can skip calibration masters or frames, and PI will only warn you. But you have to deselect the use of a master calibration file in BPP. Btw, "the Book" was written before the days of WBPP. Therefore: no mention of it. Maybe in a future edition.
  12. I created a synthetic luminance from the rgb data, and will check if that contains the same amount of detail.
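A synthetic luminance is, at its simplest, a weighted sum of the registered colour channels. A minimal NumPy sketch; the equal weights are a simplifying assumption (weighting each channel by its noise estimate, as PixInsight can, generally gives a cleaner result):

```python
import numpy as np

def synthetic_luminance(r, g, b, weights=(1/3, 1/3, 1/3)):
    """Combine registered R, G and B frames into a synthetic L frame.
    Equal weights are an assumption; SNR-based weights work better."""
    wr, wg, wb = weights
    return wr * r + wg * g + wb * b

# Three flat dummy channels combine to their average:
r = np.full((4, 4), 0.3)
g = np.full((4, 4), 0.6)
b = np.full((4, 4), 0.9)
print(synthetic_luminance(r, g, b)[0, 0])  # ~0.6
```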
  13. Do you mean you expose L twice as long as RGB? From some of your images, I gather that for RGB you use 600 s @ bin 2x2, and for L, 1200 s @ bin 1x1. That seems a massive exposure time, but is of course dependent on your gear and local conditions. To compare, I have a friend who uses 500 s with a Moravian G2 8300, an f/5 scope (8") and a sky darkness of 20.5, for his L and for RGB, all at bin 1x1. In this image, you used shorter times. How did that compare to your usual settings?
  14. So, it's not so much about data capture as it is about processing that data. In that case, I'd not change the exposure time. Although, if each of the colour channels shows enough details, you should probably shorten the exposure time for luminance (if it's the same as your colour exposure time). L captures all the colours at once, and there's more risk of bloated stars. You should be able to at least halve your L exposure time, and double the number of exposures.
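The halve-the-exposure, double-the-count trade-off can be checked with back-of-the-envelope shot-noise arithmetic: in the sky-limited regime (read noise negligible), the stacked SNR depends only on total integration time. A sketch with made-up illustrative rates; the e-/s values are assumptions, not measurements:

```python
import math

def snr_sky_limited(sub_exposure_s, n_subs, signal_rate=1.0, sky_rate=10.0):
    """Shot-noise-limited SNR of a stack: signal scales with total
    integration time, noise with its square root. Read noise ignored."""
    t_total = sub_exposure_s * n_subs
    return signal_rate * t_total / math.sqrt((signal_rate + sky_rate) * t_total)

# 10 x 1200 s vs 20 x 600 s: same total time, same stacked SNR
print(snr_sky_limited(1200, 10), snr_sky_limited(600, 20))
```

With non-negligible read noise the shorter subs lose a little, which is why the sky-limited assumption matters.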
  15. In Aladin, I have activated Simbad, and use the bullseye on the left-hand side. I use my own, but am not 100% sure the output is very accurate.
  16. Indeed, your luminance data has flaws. The central star shows rings in its core, even in the unstretched stage. The brighter stars in and near the reflection nebula have dark rings. I would expect this from a deconvolved image where the settings weren't optimised, but you wrote that no processing has been done to the luminance. I will try to repair the core, but not the ringing, and include the L if it adds to the image. You have some very nice colour data that can take a brutal beating, even in the power tool called PixInsight. (Attached images: central star ringing; artefacts, most notably in the lower left quadrant.) Btw, you can test if the artefacts in the L data are a result of pixel rejection during stacking: just integrate the registered L subs without high range/level pixel rejection. This will leave satellite trails and cosmic rays, but it's a good way to evaluate the stacking process. You can also run the subs through SubframeSelector and remove those with excessive FWHM values.
  17. Gee, I always thought "Dawson" was a completely harmless, ordinary, run of the mill kind of name. (How's that for selective reading?) Disclaimer: proper emoji implied.
  18. Yes, that page is a summary page of astrometry and photometry data on this object (a galaxy). No need to go this far, imo. The basic object page (click on the Simbad icon in the link you provided, then on the identifier in the first row of the table), has all the data you could possibly need, and then some. http://simbad.u-strasbg.fr/simbad/sim-id?Ident=%4010533343&Name=2MASX J15110350%2b1003419&submit=submit This page has coordinates (in several systems/units), distance data (redshift, z), and magnitude data (fluxes). Further down it will list children (eg single galaxies of a group, or supernovae in a galaxy, etc), siblings, and parents (eg group to which a galaxy belongs). I rarely dive deeper into the data than that. If you do, Simbad, Vizier and Aladin have documentation explaining features. http://simbad.u-strasbg.fr/simbad/
  19. Correct. Drizzle is almost never needed for amateur images. Only if the stars in your subs look very "blocky", ie are undersampled, can drizzle be of use. And then only if the shape of stars bothers you. In all other instances, drizzle does virtually nothing to improve resolution. But it does increase noise and image size.
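Whether drizzle is even relevant can be judged from the image scale. The standard formula is scale = 206.265 × pixel size (µm) / focal length (mm), and sampling much coarser than about half the seeing FWHM is a rough sign of undersampling. A sketch; the example pixel size, focal length and seeing are assumed values:

```python
def pixel_scale_arcsec(pixel_size_um, focal_length_mm):
    """Image scale in arcsec/pixel: 206.265 * pixel size / focal length."""
    return 206.265 * pixel_size_um / focal_length_mm

def is_undersampled(scale_arcsec, seeing_fwhm_arcsec):
    """Rough Nyquist-style check: sampling coarser than about half the
    seeing FWHM suggests blocky stars, the one case where drizzle helps."""
    return scale_arcsec > seeing_fwhm_arcsec / 2

# Example: 3.76 um pixels on a 400 mm focal length under 2" seeing
scale = pixel_scale_arcsec(3.76, 400)  # ~1.94 arcsec/pixel
print(scale, is_undersampled(scale, 2.0))
```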
  20. DeBayering should be done AFTER calibration and BEFORE image registration. As for imaging over several nights: do you need to take the camera off? In order to get repeatable results, align the sensor with the RA or Dec direction. Take a 30 s exposure. About 5 seconds in, start slewing in RA+ only, at 1x sidereal speed. This will give star trails. Examine them and rotate your camera "into" the trails. Repeat until the trails are parallel to the longer edge of the sensor. This only takes a few minutes to do.
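The slew-during-exposure trick above produces a trail of predictable length: at 1x sidereal (about 15.04 arcsec/s of drift) you get roughly 15 arcsec of trail per second of slewing. A quick sketch; the 2 arcsec/pixel image scale is an assumed example:

```python
def trail_length_px(slew_time_s, pixel_scale_arcsec, sidereal_rate=15.04):
    """Length in pixels of the star trail made by slewing at 1x sidereal
    during an exposure. 15.04 arcsec/s is the sidereal rate."""
    return sidereal_rate * slew_time_s / pixel_scale_arcsec

# ~25 s of trailing at an assumed 2 arcsec/pixel scale:
print(round(trail_length_px(25, 2.0)))  # → 188 pixels
```

A couple of hundred pixels is plenty to judge the angle of the trail against the sensor edge.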
  21. Physics & math. My students are aged 12 - 15.
  22. No pressure, of course. Just before grading "season", and end of term activities. (Swedish schools never closed down because of Covid-19.)