Everything posted by wimvb

  1. The easiest way to test a configuration (swap drives, ssd vs hdd, etc) is to run a benchmark test. It's under Scripts > Benchmarks. I just did a benchmark test on my laptop (Acer AMD A8, 8 GB RAM) and the bottleneck for me is the swap time. My guess is that this is the most common bottleneck, not CPU speed. Btw, to free disk space, have a look at orphaned swap files. They are in C:/Users/NNN/AppData/local/Temp, unless another location was specified in Global preferences.
  2. There have been reports lately, on this forum and the ZWO user forum, of ZWO filter wheels not being mechanically precise. If you move from one filter to another and back, the filter won't return to its exact position. This is, in my opinion, totally unacceptable, but ZWO seem to find it ok. I have also suspected this to be the case with my (ZWO) filter wheel, because I've had trouble calibrating out dust bunnies lately. I'm not saying you should avoid ZWO filter wheels, just do your homework before you give any manufacturer your hard-earned money.
  3. Have you optimized the settings in Global Preferences (under the Edit menu)? Here you can set the number of processors that PI uses, and the swap directories.
  4. Worse than that. The glass isn't that expensive. It's the thin layers of (most likely) quartz and toothpaste whitener on that glass, only a few micrometers thick, that cost. Good narrowband filters are made by depositing alternating layers of silicon dioxide (quartz) and titanium dioxide (also used to make toothpaste and paint look white) or zirconium oxide (a ceramic material used in dental caps).
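To make the layer-thickness point concrete, here is a small Python sketch. Each layer in such a dielectric interference stack is typically a quarter wavelength of optical thickness, t = λ/(4n). The wavelength and refractive indices below are illustrative textbook values, not a real filter design.

```python
# Rough sketch of quarter-wave layer thickness in a dielectric
# interference filter (SiO2/TiO2 stacks as mentioned above).

def quarter_wave_thickness_nm(wavelength_nm, refractive_index):
    """Physical thickness of a quarter-wave optical layer: t = lambda / (4 n)."""
    return wavelength_nm / (4.0 * refractive_index)

# H-alpha line at 656.3 nm (illustrative values for n)
t_sio2 = quarter_wave_thickness_nm(656.3, 1.46)  # silicon dioxide, n ~ 1.46
t_tio2 = quarter_wave_thickness_nm(656.3, 2.4)   # titanium dioxide, n ~ 2.4

print(round(t_sio2, 1), round(t_tio2, 1))  # roughly 112 nm and 68 nm
```

So each layer really is only on the order of 100 nm, and dozens of them must be deposited with nanometre precision, which is where the cost comes from.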
  5. "There ain't no such thing as a free lunch." The information that is used to "create" detail is no longer available for noise reduction. If the drizzled image were of the same quality as the undrizzled one, you could resample it and increase the signal-to-noise ratio (the "free lunch"). PixInsight creates the dark and flat masters and stores them in a directory called "masters". If the exposure time of darks and dark flats differs, PixInsight will put them in different groups (called "bins"). The preprocessing script has additional features for handling grouped data.
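The "free lunch" point can be illustrated with a small numpy sketch: averaging 2x2 blocks of uncorrelated noise halves the standard deviation, i.e. doubles the signal-to-noise ratio. Drizzle spreads the same information over more pixels instead, so that resampling gain is no longer available. (A sketch only; in real stacks the noise is partly correlated, which reduces the gain.)

```python
import numpy as np

rng = np.random.default_rng(0)
noise = rng.normal(0.0, 1.0, size=(512, 512))  # pure uncorrelated noise

# 2x2 software binning: average each 2x2 block of pixels.
binned = noise.reshape(256, 2, 256, 2).mean(axis=(1, 3))

# The binned image's noise std is ~half of the original:
# averaging 4 independent samples divides sigma by sqrt(4) = 2.
print(noise.std(), binned.std())
```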
  6. Astrodons are probably best. Good filters are expensive, because they're so hard to make right. The important characteristic of a filter isn't necessarily what it lets through, but rather what it blocks. That is, how much light outside the transmission band doesn't make it through the filter. Good filters are very efficient at blocking unwanted light. They are also very hard and can withstand a fair amount of abuse. But they are expensive. A good place to investigate various filters is probably astrobin.com. Look at what results others achieve with filter brands that are of interest.
  7. Why would you invest in an APO for guiding? An ST80 would do just as well. Long focal length reflectors generally need off-axis guiders, but otoh, people have successfully used finder guiders with their Quattro. If you want a cheap solution, you could get a 60 mm finder scope with a finder-C adapter and either a QHY5 or ASI120 camera. The camera can then be used with any other guiding solution. Even if you invest in an APO imaging scope, you will still need a guiding solution for that. An APO as guidescope doesn't save you anything.
  8. That's an excellent NA neb. Regarding removal of the halos, imho there really is only one long-term solution: a Tak deserves more than cheap ZWO filters. Consider investing in filters with good antireflective coatings. A short-term solution is to create a mask and erode the halos. Have a look here, a bit down the list: http://pixinsight.com.ar/en/processing-examples.html
  9. +1 for Making Every Photon Count. Your Canon probably isn't modified for Ha imaging, so don't expect too much regarding emission nebulae. You can image star clusters, galaxies and reflection nebulae with your camera. Because your scope is reasonably fast (f/5), there will be misshapen stars (coma) in the corners. A (second hand) coma corrector is quite cheap, e.g. the Baader MPCC or Sky-Watcher's coma corrector. The NEQ6 is a very decent mount, and combined with the right scope, you should be able to enjoy it for a long time. But the 300p telescope is like a sail on any mount, and as @Waldemar wrote, you will need to take care where you put it. For imaging, you can, in due course, invest in a more suitable telescope. For visual use, you should rotate the scope in its rings, so that the focuser is at a right angle to the counterweight shaft. Polar alignment isn't super critical. Start by putting Polaris in the center of the polar scope. Then use the polar alignment routine in the handset to improve alignment. A two-star alignment should then put any DSO within the field of view of a low power eyepiece.
  10. You can't. Any change in altitude setting implies doing a new polar alignment. But when you achieve balance, you either leave the setup as is after redoing polar alignment, or you mark the dovetail bar and counterweight shaft for next time.
  11. I think that the reason for this is that your Ha images are underexposed. The clip filter will filter out any light other than the wanted signal (Ha), while the RGB image also contains sky glow or light pollution. The RGB image also has more amplification than the Ha image, because you used a higher ISO value. Your RGB image is therefore a lot brighter than the Ha image, and you have to stretch the Ha more to reveal the data. This causes the noise to show up as well. Once you process the RGB image and apply background neutralisation (aligning the colours in a background sample) and white point balancing, you'll see that your RGB image may actually be noisier than your Ha image.
  12. I live so far north that for accurately balancing the scope in RA, I first have to lower the mount's latitude setting. In the extreme case of an astronomer on the North Pole, RA balancing wouldn't work at all. So, where are you observing/imaging from? Second, for a Newtonian, the balance point will shift with RA/DEC, depending on how the focuser is oriented with respect to the mount. Most imagers will have the focuser pointing in the same direction as the counterweight shaft. For observing, you'd probably want to turn the scope in its rings until the focuser is at right angles to the cw shaft.
  13. Platesolving on a Stellarmate replaces any star alignment, except polar alignment. You just point the scope in your planetarium program (the Stellarmate "client" app) at the dso you want to image. The scope will then slew to what it thinks is the correct position. Depending on how you've set it up, it will be close or a mile and a half off. Stellarmate will take images and correct the scope's position until the star field in the images matches the star field in the planetarium program, within a certain accuracy. For Stellarmate, the default accuracy is 30 arcseconds. Depending on your scope focal length, mount pointing accuracy, and sensor size, you may want to adjust that value. For widefield you increase the tolerance, and for long focal length on an accurate mount, you decrease it somewhat. The first time you tell Stellarmate to point at a target, it may take a few corrections to center it. But the more points/targets you add to the Stellarmate pointing model, the more accurate it becomes.
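To get a feel for what a 30 arcsecond tolerance means for your setup, you can convert it to pixels with the standard image-scale formula, scale ≈ 206.265 × pixel size (µm) / focal length (mm). The pixel size and focal length below are hypothetical examples, not values from the post.

```python
def pixel_scale_arcsec(pixel_size_um, focal_length_mm):
    """Image scale in arcseconds per pixel (small-angle approximation)."""
    return 206.265 * pixel_size_um / focal_length_mm

# Hypothetical setup: 3.8 um pixels on a 600 mm focal length scope
scale = pixel_scale_arcsec(3.8, 600)   # ~1.3 arcsec/pixel
tolerance_px = 30.0 / scale            # the default 30" tolerance in pixels

print(round(scale, 2), round(tolerance_px, 1))
```

On this hypothetical setup a 30" tolerance is roughly 23 pixels, which is why you'd tighten it for long focal lengths and small sensors.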
  14. Very nice. The extra data makes all the difference.
  15. Very nice catch. I think that you can increase contrast a bit, to make it more "dramatic". Just don't lose the structure in the dark nebula ("Gulf of Mexico" region).
  16. That graph is a bit misleading. Dark current doubles every 6 - 7 degrees of temperature increase. This means that lowering the temperature does have an effect. Otoh, you don't want to run the cooler at 100 %, because you don't have any control then. I would run it at about 75 % max. Well, that shows how wrong I can be. I looked at the product page of the 071 and compared to my 174. The gain curve for my 174 seems steeper than for the 071, and unity gain is at a higher gain setting. Unity gain for the ASI071 is at setting 90, while for mine it is at 189. Also, dynamic range is a lot lower for my camera than for the 071. At 100 gain steps above unity, I could never use exposure times of minutes.
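The doubling rule is easy to write as a formula. A Python sketch, assuming a doubling temperature of 6.5 °C (the mid-point of the 6 - 7 °C range quoted above):

```python
def dark_current_ratio(delta_t_c, doubling_temp_c=6.5):
    """Relative change in dark current for a temperature change of delta_t_c
    degrees Celsius, given that it doubles every doubling_temp_c degrees."""
    return 2.0 ** (delta_t_c / doubling_temp_c)

# Cooling by 20 degrees (e.g. from 0 C down to -20 C):
print(dark_current_ratio(-20.0))  # ~0.12, i.e. roughly an 8x reduction
```

So every extra 6 - 7 degrees of cooling halves the dark current again, which is why moderate cooling already buys you most of the benefit.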
  17. Congratulations. Cloud cover included in the order? Re gain settings, I would start with unity gain and exposures of about 2-4 minutes. Keep the temperature moderate (-20 C). For narrowband (if you do this), high gain (200 - 300) and maybe 300 s exposures. Then experiment from there. If for some reason guiding doesn't work, use high gain even for ordinary colour, and exposures of 10 - 30 s. Just keep the total integration time constant. Cooled cmos works best with a large number of subs. Also, astrobin is a good reference if you look for images taken with a similar setup. Have fun.
  18. That looks great, indeed. One question: if you mapped the filters as HOO, how did you get the intense blue?
  19. Stacking software will first align the images (star alignment) and rotate those that need it. Just load them all at once.
  20. Do you need to drizzle? The only time you get an improvement with drizzling is when your images are undersampled. Otherwise you just get large files.
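A rough way to check whether you are undersampled is to compare the image scale with the seeing: you want about two pixels across the seeing FWHM (a Nyquist-style rule of thumb). A Python sketch with illustrative numbers, not values from the post:

```python
def is_undersampled(pixel_size_um, focal_length_mm, seeing_fwhm_arcsec=3.0):
    """True if the image scale exceeds half the seeing FWHM, i.e. fewer
    than ~2 pixels span a star's FWHM (rough Nyquist criterion)."""
    # Image scale in arcsec/pixel (small-angle approximation)
    scale = 206.265 * pixel_size_um / focal_length_mm
    return scale > seeing_fwhm_arcsec / 2.0

print(is_undersampled(3.8, 200))   # short focal length widefield: undersampled
print(is_undersampled(3.8, 1000))  # longer focal length: not undersampled
```

Only in the first case would drizzling recover detail; in the second you would just get large files, as noted above.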
  21. Just stack the darks. I've included process icons in a later post of that thread. Also check out the preprocessing tutorial on lightvortexastronomy.com
  22. CMOS cameras generally don't like long exposure times, because of possible amp glow, for example. During RGB imaging you collect more photons per second than during narrowband imaging. In order not to have too long exposure times for narrowband, you can increase the gain. Just remember that the total integration time needs to be the same: more short exposures instead of fewer long exposures.
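Keeping the total integration time constant is simple arithmetic; a small sketch (the 120-minute total is an arbitrary example):

```python
def sub_count(total_integration_min, sub_exposure_s):
    """Number of subs needed to reach a given total integration time."""
    return round(total_integration_min * 60 / sub_exposure_s)

print(sub_count(120, 300))  # 24 subs of 300 s
print(sub_count(120, 60))   # 120 subs of 60 s at higher gain, same total
```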
  23. Load these process icons into your workspace and apply FlatCreation.xpsm. Use DarkIntegration and load your dark flats. Save the master dark as DarkFlatMaster.xisf. Then use FlatCalibration: load the flats and the master dark you just created. Finally, use FlatIntegration and load the calibrated flats. Save the result as FlatMaster.xisf. It's basically the workflow from LightVortexAstronomy: https://www.lightvortexastronomy.com/tutorial-pre-processing-calibrating-and-stacking-images-in-pixinsight.html See sections 2 and 3.
  24. You can also use the calibration process to calibrate the flats and make a master flat. 1. Integrate dark flats into master 2. Use this master to calibrate flats 3. Integrate flats to create master flat
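The arithmetic behind those three steps can be sketched in numpy (synthetic data; PixInsight's ImageIntegration and ImageCalibration add pixel rejection and normalization on top of this):

```python
import numpy as np

# Synthetic stand-ins for real frames: 10 dark flats and 10 flats
rng = np.random.default_rng(1)
dark_flats = rng.normal(100, 5, size=(10, 64, 64))    # offset + dark signal only
flats = 5000 + rng.normal(100, 5, size=(10, 64, 64))  # illumination + same offset

# 1. Integrate dark flats into a master (median combine)
master_dark_flat = np.median(dark_flats, axis=0)

# 2. Calibrate each flat by subtracting the master dark flat
calibrated_flats = flats - master_dark_flat

# 3. Integrate the calibrated flats into a master flat,
#    normalized to a mean of 1 so it can divide the lights
master_flat = np.median(calibrated_flats, axis=0)
master_flat /= master_flat.mean()
```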
  25. Here are updated process icons. Just before stretching (the histogram transformations starting with process 18) you can do a masked stretch, where you lower the Target background from 0.125 to 0.014, the number of iterations from 100 to 15, and the background clipping from the default 0.0005 to 0.0001 - 0.0002. msacco_m31_noasinh.xpsm Good luck!