Everything posted by Steve in Boulder

  1. By the way, the annotation feature is really nice to use with galaxy groups like NGC 7331.
  2. Jocular will read the exposure time from the FITS header, so Starlight Live must not be recording it.
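A FITS header is just a run of 80-character ASCII "cards", so checking whether the exposure time made it into the file takes only a few lines. Here's a rough stdlib-only sketch (in practice `astropy.io.fits` is the better tool); note that the keyword name varies by capture software, so `EXPTIME` here is an assumption:

```python
# Minimal FITS primary-header scan: 80-byte cards of "KEYWORD = value / comment".
# The keyword default (EXPTIME) is an assumption; some software writes EXPOSURE.

def card(text):
    """Build one 80-byte header card from a string (for demo purposes)."""
    return text.ljust(80).encode("ascii")

def read_exptime(header_bytes, keyword="EXPTIME"):
    """Return the exposure time (seconds) from raw FITS header bytes, or None."""
    for i in range(0, len(header_bytes), 80):
        c = header_bytes[i:i + 80].decode("ascii", errors="replace")
        if c.startswith("END"):
            break
        if c.startswith(keyword.ljust(8) + "="):
            value = c[10:].split("/")[0].strip()  # drop any trailing comment
            return float(value)
    return None

# Placeholder header: two cards plus END.
header = (card("SIMPLE  =                    T")
          + card("EXPTIME =                 10.0 / exposure in s")
          + card("END"))
```

If `read_exptime` comes back `None` on a Starlight Live file, that would confirm the exposure time isn't being recorded.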
  3. Happened to me too when I upgraded to 0.5.6. Per Martin, the problem is a stored sharpening value from a previous version. Edit ‘gui_settings.json’ and change the value of ‘unsharp_amount’ to 0. Worked for me.
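For reference, the relevant fragment of the settings file looks something like this; the other keys vary and this snippet is illustrative rather than a complete `gui_settings.json`:

```json
{
  "unsharp_amount": 0
}
```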
  4. One possible enhancement for chromaticity would be noise reduction/smoothing. Or just move the color binning control to the main view.
  5. Yes, I needed to reshuffle to have any success at all, and I also turned down the number of stars for the first sub. Even then, I was losing some B subs.
  6. You could also use FWHM for automatic sub rejection, perhaps. In lieu of manual sub rejection, that might serve as a "Jocular Lite" for memory-challenged users.
  7. I made my first attempt last night. I used a very crude approach -- comparing the midpoints of the curve within each passband -- and wound up with 12 or 13 seconds for B and 10 for R and G. Here's an example. Not sure it was really worth the (rather lame) effort. Despite the additional integration time, I was having some alignment issues with the B subs. Some thin clouds had moved in and the moon was coming up; not sure if that could be the reason.
  8. By coincidence, I'd just been looking at the color correction material in Berry & Burnell. Instead of reference stars, perhaps there's an alternative, though cruder, approach: determine the altitude from the plate solver and find the relative ratios of atmospheric transmittance for R, G, and B. The QE for each R, G, and B passband can be read from the published graphs for the camera, and the transmittance of the color filters likewise. I intend to start down this path next session by simply adjusting exposure times for each image based on the relative areas under the QE graph for R, G, and B. Just by eyeballing the QE graphs and B&B's chart of atmospheric transmittance (Table 17.3), the QE adjustment will dominate above, say, 45 degrees in altitude.
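The exposure adjustment above amounts to scaling each filter's exposure by the inverse of that channel's effective sensitivity. A quick sketch of the arithmetic, with placeholder numbers rather than measured QE or transmittance values for any real camera or site:

```python
# Sketch: relative exposure per channel so R, G, B collect similar signal.
# The qe_area and transmittance numbers below are illustrative placeholders.

def exposure_ratios(qe_area, transmittance):
    """qe_area: channel -> area under the QE * filter curve (arbitrary units).
    transmittance: channel -> atmospheric transmittance at the target altitude.
    Returns channel -> exposure multiplier relative to the most sensitive channel."""
    sensitivity = {c: qe_area[c] * transmittance[c] for c in qe_area}
    ref = max(sensitivity.values())
    return {c: ref / s for c, s in sensitivity.items()}

# Placeholder values: B is the least sensitive channel, so it gets the
# longest relative exposure -- consistent with needing ~12-13 s for B
# against 10 s for R and G.
ratios = exposure_ratios(
    qe_area={"R": 0.80, "G": 0.90, "B": 0.65},
    transmittance={"R": 0.85, "G": 0.80, "B": 0.70},
)
```

At lower altitudes the transmittance term grows for B much faster than for R, which is where the Table 17.3 correction would start to dominate.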
  9. I notice in retrospect that Berry & Burnell (p. 488) mention the use of longer integration periods for L than for RGB. As they discuss (p. 478), this works because "the human eye and brain readily perceive noise and errors in luminance, but are relatively insensitive to noise and errors in chrominance."
  10. The performance improvement for my ASI178MM (3096 x 2080 pixels) is very noticeable. I previously would bin 2 to avoid lags, but no need to now. Jocular is a pleasure to use with the 12 MB bin 1 files. Thanks for all the work on this, Martin!
  11. A bit of a senior moment, there, sorry to say! Let's try again with the same DSO instead of a different one.
  12. Yes, the noise reduction from color binning occurred to me when thinking about this approach. I tend to like a punchy (high-contrast) image, thus the hyperstretch (for brightness) and bg slider at maximum (i.e. the - mark). Here's the first one with asinh stretch but otherwise similar settings.
  13. I did a little experiment to see if it's reasonable to reduce the total time spent on RGB subs when using LRGB mode, the theory being that L gives the most detail and RGB is there for flavoring. Here are three views of M51 (please bear with my choices for stretching, sharpening, noise reduction, and saturation, even if not to your taste!). They seem pretty close to me. Maybe this wouldn't work as well for other sorts of targets.
  14. I don't know how LAB color space affects this, or the addition of the L subs, but as a user I think I'd want to be able to specify what color (in RGB terms) each narrowband should appear as. So if I wanted to see Ha as red and OIII as green, so I could easily tell the signals apart, that's what I'd get.
  15. This may be, and likely is, a stupid idea, but could Jocular simply read a palette color from its configuration file for each narrowband? Then use combinations of narrowband filters as I suggested above. That is, stack L + Ha subs, then switch to L + OIII and continue stacking, with each narrowband sub assigned its configured palette color. That could implement a multi-narrowband feature with minimal effort.
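To make the idea concrete, here's a sketch of what "read a palette color per narrowband from config" could look like. The file layout, key names, and default colors are all hypothetical, not Jocular's actual settings:

```python
# Sketch: per-filter palette colors read from a JSON config, used to tint
# mono narrowband intensities before stacking. Everything here (file format,
# filter keys, default colors) is an illustrative assumption.
import json

DEFAULT_PALETTE = {
    "Ha": (1.0, 0.0, 0.0),    # hypothetical default: Ha shown as red
    "OIII": (0.0, 1.0, 0.0),  # hypothetical default: OIII shown as green
    "SII": (1.0, 1.0, 0.0),
}

def load_palette(path=None):
    """Return filter -> (r, g, b), with user overrides from a JSON file."""
    palette = dict(DEFAULT_PALETTE)
    if path:
        with open(path) as f:
            palette.update({k: tuple(v) for k, v in json.load(f).items()})
    return palette

def tint(mono_intensity, filter_name, palette):
    """Map a 0-1 mono intensity to an RGB triple using the filter's color."""
    r, g, b = palette[filter_name]
    return (mono_intensity * r, mono_intensity * g, mono_intensity * b)
```

Each incoming narrowband sub would then contribute its tinted RGB to the stack, regardless of which filter combination was active when it arrived.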
  16. Ah, I was presuming there would be a non-grayscale color assigned to the narrowband. Otherwise one could just stack L and Ha subs as mono.
  17. Forgot one! Does Jocular expect a naming convention for the narrowband subs, à la the RGB subs?
  18. Martin, I plan to start experimenting with the narrowband features soon. A couple questions: How is the palette determined? Is there something to configure in the JSON file? Can one do, for example, an L + Ha stack, pause the stack, switch to L + OIII, and resume stacking (perhaps just adding OIII subs)?
  19. Well, four images by three users is certainly enough testing, right?😎
  20. Very nice results from sharpening plus noise reduction, in both instances.
  21. I’ll soon find out what the ASIAir does, though I can use a script to rename them and/or revise the FITS header appropriately for Jocular.
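The renaming script could be as simple as the sketch below: copy the ASIAir captures into the watched folder with the filter name embedded in the file name. Both the `*.fit*` source pattern and the `<filter>_<name>.fits` target convention are assumptions for illustration, not Jocular's documented convention:

```python
# Sketch: copy capture files into a watched folder, prefixing the filter name.
# The naming convention used here is a guess, not Jocular's actual one.
import shutil
from pathlib import Path

def rename_for_watch(src_dir, watch_dir, filter_name):
    """Copy every FITS file in src_dir into watch_dir, prefixed with the filter."""
    watch = Path(watch_dir)
    watch.mkdir(parents=True, exist_ok=True)
    copied = []
    for f in sorted(Path(src_dir).glob("*.fit*")):
        dest = watch / f"{filter_name}_{f.name}"
        shutil.copy2(f, dest)  # preserves timestamps, which stacking may care about
        copied.append(dest.name)
    return copied
```

Rewriting the FITS header FILTER keyword instead would be the cleaner route if Jocular reads it, but that needs a proper FITS library rather than a rename.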
  22. Martin, does LRGB mode work with the watched folder, and if so is there a naming convention for the files?
  23. When I try to connect to ASI cameras, I get "Cannot find ASI library in resources." The Jocular resources directory just has .dlls, true type fonts, and JSON files.
  24. How should I install the ASI Camera 2 Mac library to work with this? I believe this is the library that is in the ASI Camera SDK, is that correct?
  25. I don't have anything useful or intelligent to say on that -- I've only used the ASIAir for imaging so far, and it hides offset settings. The Live view seems to work well with the calibration frames I take using the ASIAir, but I don't know what offsets are used.