Everything posted by vlaiv

  1. Tim, I just came back from a mini session in my basement - I received my artificial star and went to assess the quality of the Samyang 85mm F/1.4 lens with my ASI1600. I used SharpCap to capture images, just to compare star shapes in the center of the frame and at the edge at different aperture settings - F/1.4, F/2, F/2.8 and F/4. I planned to capture Lum, R, G, B, narrowband Ha and ~540nm Baader solar continuum (good for assessing optical quality near the best visual wavelength - nice for refractors / achromats), but only managed to do Lum tonight. In any case - I noticed that SharpCap behaves the same way: I needed to change the exposure length to compensate for the aperture stop, and used the FWHM measurement to find best focus at each setting. Changing the exposure length had an immediate effect on the measured FWHM value, although it should make no difference. This of course has to do with the noise floor, as a longer exposure will improve SNR. So yes, SharpCap (at least version 2.9, which I have installed on my laptop) behaves like that - the measured FWHM depends on the SNR of the star: the higher the SNR, the larger the FWHM.
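The SNR dependence above can be sketched with a toy model: if an FWHM routine only "sees" the part of the star profile that clears the noise floor, a brighter (higher SNR) star keeps more of its faint wings above that floor and so measures wider. This is purely illustrative - not SharpCap's actual algorithm - and all numbers are assumed:

```python
import math

def measured_fwhm(peak, read_noise, sigma_px=2.0, threshold_sigma=3.0):
    """Width of the star profile that clears the detection threshold.

    Toy model of a threshold-limited width measurement: a Gaussian star
    I(r) = peak * exp(-r^2 / (2*sigma_px^2)) is only 'seen' down to
    threshold_sigma times the noise floor; solve I(r) = threshold for r.
    Not SharpCap's actual algorithm - purely illustrative.
    """
    threshold = threshold_sigma * read_noise
    if peak <= threshold:
        return 0.0  # star never clears the noise floor
    r = sigma_px * math.sqrt(2.0 * math.log(peak / threshold))
    return 2.0 * r

# Same star shape, different SNR: the brighter star measures "wider".
print(measured_fwhm(peak=100, read_noise=5))   # low SNR -> narrower
print(measured_fwhm(peak=1000, read_noise=5))  # high SNR -> wider
```

The same effect appears whether the threshold comes from an explicit detection cut or from a Gaussian fit whose background estimate swallows the faint wings.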
  2. I think that DSS is not that good at determining FWHM values of stars. What happens here is that the Gaussian shape fitting depends on the estimated noise floor, and a binned image will have a lower noise floor. Perhaps try using AstroImageJ to measure FWHM? I found that it is much more consistent. Another important point: what are your subs like, since we are talking about an OSC camera? Is bin x2 in the drivers producing a monochromatic image? It is generally advisable not to bin CMOS subs in the drivers, but rather to bin your subs later in software. An OSC camera needs to be debayered first, and regular interpolation debayering followed by binning will not produce proper binning results - it will lower resolution, but the SNR improvement will not be as expected. This is because interpolation has already used the average of surrounding pixels and created correlation between them. Correlated pixels, when binned, won't produce the expected SNR improvement that truly random noise samples would.
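The binning point can be demonstrated numerically: summing 2x2 blocks of independent noise improves SNR by a factor of 2 (noise grows 2x while signal grows 4x), but pixels correlated by interpolation gain less. A minimal sketch - the 2x2 averaging is only a crude stand-in for interpolation debayering:

```python
import numpy as np

rng = np.random.default_rng(0)

def bin2x2(img):
    """Software-bin a mono image 2x2 by summing non-overlapping blocks."""
    h, w = (img.shape[0] // 2) * 2, (img.shape[1] // 2) * 2
    return img[:h, :w].reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

# Independent noise: summing 4 pixels grows noise 2x but signal 4x -> SNR x2.
noise = rng.normal(0.0, 1.0, (400, 400))
print(bin2x2(noise).std() / noise.std())  # ~2.0

# Crude stand-in for interpolation debayering: each pixel is the average of a
# 2x2 neighbourhood, so neighbouring pixels share samples (are correlated).
correlated = (noise[:-1, :-1] + noise[1:, :-1] + noise[:-1, 1:] + noise[1:, 1:]) / 4.0
print(bin2x2(correlated).std() / correlated.std())  # ~3.0, not 2.0
```

Since the signal still scales by 4x in both cases, the correlated data only gains a factor of ~4/3 in SNR instead of the expected 2.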
  3. These are just checksum files that you can use to verify the integrity / validity of the installer.
  4. https://www.firstlightoptics.com/skywatcher-mounts/skywatcher-heq5-pro-synscan.html Pretty much the same as my experience vs https://www.firstlightoptics.com/skywatcher-mounts/skywatcher-eq5-pro-synscan-goto.html Mind you, FLO lists this mount as EQ5, not NEQ5 as most other retailers do. For example, Astroshop.eu has it as EQ5 in the product name but uses the term NEQ5 in the product description. In any case, Skywatcher made quite a mess by putting that N in front of what everyone knows as the EQ5. In the end - the HEQ5 is a much better mount for AP, so you should look at that one.
  5. There is no difference between the HEQ5 and HEQ5 Pro - there is only one version of the mount. Maybe you were thinking of the NEQ5 - which is just an EQ5 class mount (I had to go and double-check the thread title and your first post - it is so easy to confuse the two, HEQ5 and NEQ5). In any case, an EQ5 mount is going to have less weight capacity, and indeed it is maxed out at 9-10kg, but for imaging I would still use the same rule of thumb - go with about 2/3 of total capacity. I think people load the EQ5 with 8kg at most for imaging, but I'm not really sure.
  6. The HEQ5 has a load capacity of about 15kg, but you don't want to load it that much. About 2/3 of that is the recommended maximum for AP, and I tend to agree. I pushed my HEQ5 up to almost 15kg, and while the mount copes with that much weight, it is not stable enough for AP. Sorry about the low quality image - long exposure, a bit of hand shake. This is a Skywatcher dob OTA - 8" F/6 scope, weighing about 11-12kg with a 60mm guide scope and camera, tube rings and imaging camera. To balance that, three 5kg counterweights are needed. I now image with an 8" RC scope that weighs less than 10kg; I also use an OAG and a 1kg counterweight to balance the OTA in DEC. Overall I think I'm at 11kg, and that is about as high as I would comfortably go with the HEQ5 for serious imaging. Here is another image from that period - RC8" + ST102 mounted side by side. Again 12-13kg+, but a much more stable configuration because the OTAs are not long. Managed to balance it with 10kg of counterweights - but pushed to the edge of the CW shaft.
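The counterweight arithmetic above is just a torque balance about the RA axis. A sketch with an assumed moment arm for the OTA (the 0.25 m figure is illustrative, not a measured value):

```python
# Torque balance about the RA axis: ota_mass * ota_arm = cw_mass * cw_arm.
# The OTA moment arm is an illustrative assumption, not a measured value.
ota_mass_kg = 12.0   # OTA + guide scope + cameras, as in the post
ota_arm_m = 0.25     # assumed distance from RA axis to OTA centre of mass
cw_mass_kg = 3 * 5.0 # three 5 kg counterweights

cw_arm_m = ota_mass_kg * ota_arm_m / cw_mass_kg
print(f"counterweights balance at {cw_arm_m * 100:.0f} cm down the shaft")
```

A longer OTA pushes its centre of mass (and hence the required counterweight position) further out, which is why the side-by-side setup balanced more easily despite similar total weight.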
  7. I think it is based on FreeBSD, so Unix underneath, but I'm not 100% sure. In fact, here is the wiki article on XNU (the macOS kernel): https://en.wikipedia.org/wiki/XNU It looks like it is some sort of hybrid kernel, with parts of BSD and later FreeBSD inside.
  8. On Mac? Does Mac have APT package manager?
  9. I'm just happy and I feel the need to share ... Got everything working and I'm very pleased with how it works: - RPI is in AP mode and accepting phone and AzGTI connections - Camera is happily connected to RPI and takes images via the web interface - Mount connects and slews via the web interface. IndigoSky has very basic imaging capability, but I think it will be all that I need for the time being. Awaiting a 30mm guide scope from FLO (ordered about a month ago, but still not in stock?) and then I'll also be testing guiding. I know this is the Ubuntu thread, but I strongly believe all of this could, with a bit of effort, also be done on Ubuntu Server for RPI. Not much to it really - the server is installed and configured, wifi is configured in AP mode - all very simple settings. I even gave the AzGTI a reserved address on DHCP (dnsmasq) and saved that address in the Indigo server conf (auto discovery is not working well for some reason).
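For reference, a DHCP reservation in dnsmasq is a one-liner. Something along these lines - the interface, address range and MAC address are placeholders, not the actual values used:

```
# /etc/dnsmasq.conf (fragment) - all values are placeholders
interface=wlan0
dhcp-range=192.168.4.10,192.168.4.100,12h
# give the AZ-GTi a fixed address so the INDIGO server conf can point at it
dhcp-host=aa:bb:cc:dd:ee:ff,192.168.4.20
```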
  10. Hot pixels should not change day to day. They are due to manufacturing artifacts and current leakage in the sensor substrate (the material the sensor is built from, where the pathways for current are printed). There is no perfect insulator, and every insulator will "fail" at some point - current leaks into a particular pixel, and that makes it a hot pixel. Since there are physical stresses on the substrate from both cooling and electrical forces, hot pixels can change over the lifetime of a camera, but they don't do it overnight. It is, however, advisable to renew your master darks once or twice a year because of this change in the physical properties of the sensor "material". As for the answer to the original question: 2. stack as individual subs - after their respective calibration (each set with its own calibration frames, if they are different).
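The "calibrate each set with its own frames, then stack everything" advice can be sketched like this - toy numbers: each session has a different dark level, and per-set dark subtraction removes it before the combined stack:

```python
import numpy as np

def calibrate(lights, master_dark):
    """Dark-subtract each light frame with its matching master dark."""
    return [light - master_dark for light in lights]

# Toy numbers: two sessions with different dark current levels.
# True sky signal is 100 ADU in both; dark adds 10 ADU one night, 20 the other.
lights_night1 = [np.full((2, 2), 110.0) for _ in range(3)]
lights_night2 = [np.full((2, 2), 120.0) for _ in range(3)]
dark_night1 = np.full((2, 2), 10.0)
dark_night2 = np.full((2, 2), 20.0)

# Calibrate each set with its own master dark, then stack all subs together.
calibrated = calibrate(lights_night1, dark_night1) + calibrate(lights_night2, dark_night2)
stack = np.mean(calibrated, axis=0)
print(stack)  # uniform 100.0 - dark signal removed per set before stacking
```

Subtracting one shared master dark instead would leave a residual offset in whichever set it didn't match.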
  11. For some reason I missed that, but managed to find it in git repo.
  12. I have an RPI4 with 2GB of RAM. I think that is more than enough for light desktop usage - at least it should be. LXQT uses something like 300-400MB of RAM, so there is at least 1.5GB left. I still have not figured out the problem with SDDM and startup. For some reason it won't start on boot if the graphical target is selected as the default target. It starts in test mode, and XServer starts from the command line. I installed XRDP and remote desktop sessions work normally. Now, I'm about to test the INDIGOSKY distribution. The first problem (there is always a little something with Linux, isn't there?) that I'm facing is that I can't log in over ssh because I don't know the credentials for this preinstalled image. The Indigo Sky page says it is a default Raspbian / Raspberry Pi OS installation with the Indigo server installed, but pi/raspberry is not working on ssh for some reason.
  13. I fancy the idea of the AZ-EQ Avant mount as a lightweight, versatile travel mount. It can operate in both Az and EQ mode, and there is a small tracking motor that can be useful for planetary. The only drawback that I can see is the small payload - around 5kg. Would it hold a 4" short refractor at about 4kg? One thing is sure - it would be at the edge of the mount's capability, if it holds it at all. Another option might be something like this: https://www.teleskop-express.de/shop/product_info.php/info/p8256_Giro-Ercole-Mini-Altazimuth-Mount-for-Telescopes-up-to-9-kg.html + a nice sturdy photographic tripod - not something I know enough about to recommend, but I did hear the name Manfrotto mentioned often.
  14. Just installed Ubuntu Server 20.04 & lubuntu-desktop on my new RPI4. So far it works, although it won't start the SDDM greeter for some reason - I need to log in in console mode and then startx to get the desktop. I have another SD card with Indigo Sky loaded. Will need to play a bit with that - check how the AzGTI and my ASI cameras work with it. So far rather pleased
  15. Depends on how you look at it - if it works for you and won't cause any issues down the line - then it is fine. If you are strict in how you design your software then yes, it's sort of a code smell.
  16. Ah sorry, that should be private static SerialPort ... (a static class can only have static members)
  17. Try something like this:

      public static class ObservatoryPortProvider
      {
          private static SerialPort ObservatoryComPort = null;

          public static SerialPort GetObservatoryComPort(string SelectedComPort)
          {
              if (ObservatoryComPort == null)
              {
                  ObservatoryComPort = new SerialPort();
                  ObservatoryComPort.PortName = SelectedComPort;
                  ObservatoryComPort.Open();
              }
              return ObservatoryComPort;
          }

          public static void CloseObservatoryComPort()
          {
              if (ObservatoryComPort != null)
              {
                  ObservatoryComPort.Close();
                  ObservatoryComPort = null;
              }
          }
      }

      With the above class you can then do something like:

      MyComPort = ObservatoryPortProvider.GetObservatoryComPort("COM1"); // or however you intend to call it

      It will always return the same instance of the COM port. You can close it once you no longer need it with:

      ObservatoryPortProvider.CloseObservatoryComPort();
  18. I would say that is sensible.
  19. The method you are trying to call is not static / a class method; this means you need an instance of the class in order to call it. Here is what you should do:

      private void ButtonQuerySerialPort_Click(object sender, EventArgs e)
      {
          ObservatorySerialPort myPort = new ObservatorySerialPort();
          ListOfSerialPorts = myPort.GetAvailableComPorts();
      }

      Also, consider making myPort a class field rather than a local variable in one method if you are going to reuse that port.
  20. I'm not going to worry about future collisions now that I've seen what sort of asteroid punching capability we have. We can blast them to bits with 10cm/s at any time
  21. I would not call that touch and go
  22. Because with interference filters, the thickness of the layers is what determines reflection. When light hits at an angle, it effectively travels through a thicker layer. A fast F/ratio means that some of the light hits the filter at a larger angle of incidence, and that light might be reflected instead of passed - effectively creating an aperture stop. This image shows an LPS filter and how its passband shifts towards the blue depending on the angle of incidence of the light ray.
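The blue-shift follows the standard thin-film formula λ(θ) = λ0 · sqrt(1 − sin²θ / n_eff²). A quick sketch - the effective index n_eff ≈ 2 is an assumed typical value; real filters vary:

```python
import math

def shifted_wavelength(lambda0_nm, theta_deg, n_eff=2.0):
    """Centre wavelength of an interference filter at incidence angle theta.

    Standard thin-film blue-shift formula; n_eff ~ 2 is an assumed
    effective index - check the filter's datasheet for the real value.
    """
    s = math.sin(math.radians(theta_deg))
    return lambda0_nm * math.sqrt(1.0 - (s / n_eff) ** 2)

# Marginal ray angle in an F/1.4 beam: atan(1 / (2 * 1.4)) ~ 19.7 degrees.
theta = math.degrees(math.atan(1 / (2 * 1.4)))
# A 656.3 nm Ha line at that angle lands ~9 nm to the blue of the filter centre.
print(f"{shifted_wavelength(656.3, theta):.1f} nm")
```

For a narrowband filter a ~9 nm shift can push the line right out of the passband for the edge-of-aperture rays, which is exactly the effective aperture stop described above.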
  23. That is interesting info. The ASI178mcc is supposed to have a UV/IR cut filter built in, according to ZWO. And I used the Hutech IDAS LPS P2 - which is also supposed to act as a UV/IR cut filter, given its response curve. In fact, it is even somewhat "minus violet", since it does not pass wavelengths below 410-415nm according to the graph. Next time I plan to try it with the Baader Contrast Booster filter - it has a similar response to the IDAS LPS between 500 and 600nm, but cuts into the violet part of the spectrum more aggressively. I bought that one for use with an achromatic refractor, to eliminate both a bit of LP and the violet halo. I should be getting the artificial star tomorrow, so I'll be able to run extensive tests on different aperture settings and filter combinations to see what sort of stars I can expect. I also got a very low profile rotator and quick changer - this will allow for RGB type imaging with the ASI1600. Maybe doing individual channels and focusing each of R, G and B will yield better results. Btw, this is what stars look like at F/1.4: interestingly, all have a well defined "core" but also a significant "skirt". Profile plot of one star in each channel
  24. Btw, I did not give the tech info - the above is one hour of exposure - 60 x 60s from a red zone (SQM around 18.5). Hutech IDAS LPS P2 filter used. Mount is AzGTI. No guiding.
  25. The ASI178mcc is rather harsh on this lens, as it has 2.4um pixels, but the lens does have some chromatic aberration stopped down to F/2. Here is the same image processed in two different ways. One is an RGB compose in Gimp after calibration, stacking and some processing of the linear data in ImageJ (background wipe, simple color balance). The second is an RGB composition in ImageJ. For some reason (I think I know why), it has a rather intense orange color cast, but the luminance layer is much better processed - I used green for luminance here and did an RGB ratio color transfer (I need to rewrite my background wipe to account for color images - as is, I did a "per channel" background wipe, but I need to do it combined to keep the color balance). In any case, the second image is much better looking with respect to star shapes and overall sharpness. The green channel is rather nice. Here is a comparison of the R, G and B channels on a few stars: I also took some images at F/1.4 (fully open) with the intent of making a mosaic and binning it down to a size where these sorts of issues are not objectionable, but everything dewed up after an hour, so most of that data is no good. I'll process it at some point just to assess star shapes when fully open.