
Martin Meredith

Everything posted by Martin Meredith

  1. Hi Andrew You might get more joy posting this in the EEVA section as what you describe is what many do (though not me, as I have a cabled connection to my scope). You don't mention possible camera options but it is worth thinking about this at an early stage as it will impact on what drivers and live capture software you can use. The choice for Macs is more limited than for Windows. cheers Martin
  2. Not monthly, but split into Winter-Summer (separate volumes), is the Night Sky Observer's Guide (NSOG). Since objects that culminate higher than say 30 degrees are typically observable for several months, having the seasonal split is reasonably effective. Another great book (for many other reasons) is Garfinkle's Star Hopping. This one is organised in terms of monthly sky tours. NSOG covers more objects while Garfinkle covers a lot of interesting star-lore. Martin
  3. Thanks for posting this image. It's one of the few Messiers I've not yet observed (not entirely sure why, as it gets to a healthy 24 degrees here in Northern Spain). And now I'm intrigued to find out the back-story of the young stars.... I don't have an image to share, but I do have a fake image I created from the Gaia data (down to mag 21) which clearly shows the presence of blue-ish stars (as well as a few anomalous colours that I take full responsibility for!). Martin
  4. Luminosity plus narrowband is now integrated into Jocular as a separate mode. You would collect L + Ha, say (or L + OIII, or whatever) and engage the L+ mode which maps the narrowband data to a separate layer. The saturation control then manages the amount of blending. I want to do some testing myself as I don't have any narrowband + L FITs to demonstrate it with (most of my narrowband EAA took place before I made a habit of saving FITs), but if anyone does wish to test I'm happy to send the updated code. This gif that I made years ago with SLL shows the kind of effect that will be possible. In this case it is an on-off animation but in the tool the blend is gradual. Martin
  5. Agreed, seeing the first of the complete set of LRGB is always exciting. My approach depends on what I'm viewing. If I expect to be looking at objects in colour (typically GC, OC or PNs) then I'll do something like 4 each of RGB then L. If I'm looking at a galaxy or group of galaxies I'll typically start off in L and if the galaxy looks like it might be interesting in colour, I'll add RGB as an afterthought. You can add in any order so topping up poor subs is easy enough. What is quite surprising to me is how little colour is needed to produce 'decent' colourised images (particularly for bright objects like open clusters). Faint PNs like your Abells do need lots of colour though. What actually happens in my case is that if I select LRGB, the software will collect data in the order BGRL. The rationale is that in the case of multiple filters, they are best ordered by increasing transmissivity so that when the threshold for star detection is computed on the first sub, it is more or less guaranteed to be sufficient for star detection on subsequent filters. If you find yourself having star detection issues I suggest starting with B. This is going to be particularly important if you are planning to combine narrowband with L -- start with Ha or OIII. On the narrowband front, I hope to implement special processing fairly soon for this scenario. The idea is that if the software detects L + 1 narrowband filter, it will map the narrowband to, say, red, and overlay it on the L, with the saturation slider controlling the blending. I still need to think about the best approach when it detects multiple narrowband filters. Watch this space! The C filter is useful for extracting more photons relative to L, particularly in the IR part of the spectrum where galaxies are emitting quite strongly. One day I will do a proper test to see what difference it makes. Martin
  6. Lovely captures Bill. Now that we are definitely not in summer I really must get back to PNs... As you're finding, this is one area where colour helps a lot in detecting these elusive objects in the first place -- though some of the Abells are so faint nothing much is going to help except more exposure time. Colour also shows up any problem subs nicely... It's interesting to switch off hot pixel removal and see how ugly the result is. Work-flow wise, are you starting with RGB then adding L, or starting in L and then adding RGB (then more L perhaps)? Martin
  7. Here's a shockingly bad shot with a handheld phone just a few moments ago, but it serves to show how close they are. The visual view thru the eyepiece (TV 8-24 zoom) + Borg 77mm frac was superb & unforgettable.
  8. Good to see some shots of rarely-viewed open clusters. Kudos for keeping at it in blustery conditions. Were you guiding your guidescope with a different guider? Martin
  9. Nicely caught Bill. It was clear even without the reference image, although the colour in the image from the other Bill makes me want to get the old RGB filters out for this one. Your Lodestar + reducer setup looks to be working well too. Martin
  10. Lovely collection of objects. Triangulum has some decent sights. I observed the NGCs this time last year, and they fit comfortably into a single field of the Lodestar. They're also listed as members of the VV catalogue, VV 1034 and 1035. This is a field where a noisy stretch is worth it in my opinion to bring out more detail (including just a hint of some rather far-flung arms for the face-on spiral NGC 974). I was forced to check whether these are real or imagined and I find them present on the SDSS image (below), which also shows just what a beauty this galaxy is. VV 1034 is NGC 970, the smaller galaxy (pair?) mid-way along the right hand edge of the equatorial triangle at the top. It's great to see some UGCs too. In EEVA mode these galaxies can show some fine detail. I really like your UGC 2023 -- its intriguing shape makes me want to find out more. The only one I've observed is UGC 1281, but I have it filed under its Flat Galaxy Catalogue entry (FGC 195). Martin
  11. Kudos for being out at 4.30am! I've a shot of this from last year (end of March!) -- doesn't add anything except slightly lower resolution (which you can see in terms of the close pair mid-left). I'm assuming that it's a star occluding the core of the larger galaxy and not the core itself? It looks very stellar. Martin
  12. Yes, you can figure out how big the measurement error is, but the measurement error is for 1s subs. In the absence of a model for what is going on, this doesn't tell us anything about measurement error for 10s subs (which is what we want, I assume)? That is, your 'additional statistical data' is for 1s subs, not for 10s subs. In this specific case we do have such a model (photon statistics) so we can generalise to 10s subs. But since we have such a model anyway, we don't gain any information from taking 1s subs. (Repeating again the caveat that of course having 1s subs allows us to do outlier removal.) Consider counting raindrops falling into the famous bucket. You measure every 1s and get a series of values. You average them and look at the standard deviation and work out the measurement error for 1s measurements of rain falling into a bucket. If you take a single 10s measurement, sure, you don't get a measurement error for 1s intervals, but why would you want to if you are taking a 10s measurement? If you want to know its measurement error, you take multiple 10s measurements. It's 1s apples versus 10s oranges.
  13. When you chop a sequence into N parts and measure standard deviation, you are measuring the standard deviation of a short unit (ie you are measuring statistics of short units) You are not measuring the standard deviation of the longer unit. Knowing about the standard deviation of a bunch of 1s subs does not provide any additional information concerning the equivalent 10s sub. It tells you how reliable the mean of 1s subs is. So to answer your question of how do you know how good is your measurement: yes, you need to take multiple exposures. But multiple exposures of the same length. Simply chopping it up only tells you how good your measurement of the shorter length is. Now in practice, if there are external factors that are likely to affect the value of the final measurement, such as clouds, satellite trails etc, of course it is worth breaking the measurement up and then rejecting outliers, hence my caveat above. But if there are no such external factors operating you are not gaining any information by splitting the longer sequence up.
  14. Hi I have to disagree with the consensus here. Unless you are planning to detect/remove outliers from a sequence of estimates, I don't think there is any statistical reason to prefer 20 measurements to one measurement, assuming the measurements are taken in one long sequence. That is, the statistics are the same. Essentially you are collecting the same photons whether they are collected in one long burst or N shorter bursts. Any benefit from the fact that you have 20 measurements and can compute a standard deviation is, I think, illusory. After all, why not take 20000 x 1ms measurements if having more measurements is beneficial? Concerning the difference in magnitude estimates based on stacking and averaging, bear in mind that if you take the average of many magnitude estimates, depending on your software, you are taking the average of logarithmic-like quantities so they should be converted to linear units prior to averaging, then converted back to magnitudes. If you are computing the magnitude based on stacking, then (most likely) the stacking software is operating on linear quantities already before doing the final conversion to magnitude step. cheers Martin
  15. As Cosmic Geoff says, the tricky bit is finding the right field, especially since Pluto is passing through a star-rich portion of the sky. Here's an animation of 3 shots on successive days from 2015: These were 4 x 15s subs, but Pluto is actually quite bright and easy to catch with short exposures (it is detectable in 1s exposures with an 8" f4 scope), so if you were to happen upon the right part of the sky you could take a brief exposure and expect to see it, but only after a lot of searching! Good luck Martin
  16. Hi Tony Mike explained it well. Just to add that the B6 at the end is the spectral type of the primary. Stellar 'infolines' give a variety of information, the meaning of which you can find in this document: technical.pdf It really is a beautiful star field around the comet and that line of stars really stands out. cheers Martin
  17. I agree with all of the above, and I actually vastly prefer the unprocessed version 🙂 Martin
  18. I can't really comment on visual. In EEVA a UV filter is sometimes employed to reduce star bloat for blue stars when using an achromat but bloat won't be a problem for a Newtonian reflector. As for the IR side, there is plenty of energy at those wavelengths in DSOs so I wouldn't use one. I have the Baader LRGBC set and I always use the C (clear) in preference to the L (which is restricted to the RGB range ie cuts IR/UV -- there is quite a large difference in response). If you do decide to use filters at any point via a filter wheel or filter slider, it is worth having a parfocal C filter to avoid the problem Mike mentions of having to refocus between changes. I'm interested to see how you get on with your EQ platform for EEVA purposes. I don't know of many who use one but it is something I've considered myself. I imagine it will be necessary to keep exposures very short at the focal length you're going to use unless tracking is absolutely spot on. Martin
  19. Nice. Looks very sensitive. What is the top image? The image name suggests M97 but it doesn't look like the Owl to me. Gain is generally fixed on the Lodestars. Is the Pro meant to have variable gain? Martin
  20. I'm not using cv2 but instead this https://github.com/colour-science/colour-demosaicing/blob/master/colour_demosaicing/examples/examples_bayer.ipynb and it seems to be working fine within Jocular (not yet part of the release though). The tricky part is knowing that the incoming FITs is indeed in need of debayering... there is some inconsistency in the use of FITs header info to signal this 😉 Also considering supporting Canon/Nikon raw format but that will take a little longer. My reluctance is that handling large pixel-count sensors is pretty sluggish for compute-intensive things like live LAB colour, but I have a few ideas on that front that I want to try out. cheers Martin
  21. As far as I'm aware, astropy (which I think you're using here) doesn't do the de-bayering step, does it? I've found a package that I can use for that though -- just needs some time to test it. Martin
  22. Hi Gord Thanks for the Ubuntu information. I think you are the first person to try v0.3 on Linux so it will be interesting to see how it goes. I wonder if the RPi has enough oomph (and memory) to be able to service Jocular, which is fairly compute- and storage-heavy due to the particular niche it occupies i.e. supporting recompute/realign and exposing the entire stack at all times. Jocular doesn't support OSC cameras just yet but I'm hoping to debayer any OSC FITs that land in the monitored directory into 3 separate RGB subs as Tony suggests on the other thread. Colour in Jocular is based on histogram-free LAB colour space which I think most will find simpler to use in an EEVA context than manipulations directly in RGB space. Control of cameras and other peripherals is definitely the least well-developed part of Jocular at the moment but if you can do what you need to via INDI that sounds like a good combination. BTW I'll be working on plate-solving for the next release, mainly for automatic annotation purposes, but I guess with INDI mount control integrated it could be useful. Martin
  23. I'm so glad someone else has been able to try out the LRGB side of Jocular! These are great captures. I love the hint of pink in NGC 2392. Soon I'll add synthetic luminance so that LAB colour is possible even if we only have RGB. Martin
  24. Lovely results. I'm going to aim for that PN when I get a chance. I'm also considering upgrading to an Ultrastar in the new year so I'm keen to see how much different the Pro makes. Martin
  25. Mesmerising! I know almost nothing about variable stars but I wonder if there are any really fast variations (though not so fast as the Crab pulsar) that could be visualised using this approach? Likewise, occultations by things other than our moon. Motion of the moons of Jupiter? How many pixels/minute? (displaying my ignorance here!) Martin
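The L + narrowband blending mode described in posts 4 and 5 above (narrowband data mapped to a separate layer, with the saturation control governing the blend) can be sketched roughly as follows. This is a minimal NumPy illustration, not Jocular's actual code; the function name and the linear blending rule are my own assumptions.

```python
import numpy as np

def blend_l_nb(L, NB, saturation=0.5):
    """Blend a luminance image with a narrowband layer mapped to red.

    L, NB : 2-D arrays scaled to [0, 1]
    saturation : 0 gives the plain greyscale L; 1 gives the full overlay.
    """
    rgb = np.stack([L, L, L], axis=-1)  # start from the greyscale luminance
    # add the narrowband signal into the red channel, weighted by saturation
    rgb[..., 0] = (1 - saturation) * L + saturation * np.clip(L + NB, 0.0, 1.0)
    return rgb
```

With saturation at 0 the result is the plain L image; sliding it towards 1 fades the Ha (or OIII) signal in, much like the on-off gif mentioned above but as a gradual blend.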
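The point in posts 12 and 13 above about 1s versus 10s measurements can be checked numerically under photon (Poisson) statistics. The sketch below (the photon rate and trial count are made-up numbers) shows that the sum of ten 1s measurements has the same mean and spread as a single 10s measurement, i.e. chopping the exposure up gains no information about the 10s total.

```python
import numpy as np

rng = np.random.default_rng(42)
rate = 100.0        # hypothetical photon rate per second
n_trials = 20_000

# one 10 s measurement per trial...
single = rng.poisson(rate * 10, size=n_trials)

# ...versus the sum of ten 1 s measurements per trial
chopped = rng.poisson(rate, size=(n_trials, 10)).sum(axis=1)

# both are Poisson with mean 1000, so means and spreads agree
print(single.mean(), chopped.mean())   # both close to 1000
print(single.std(), chopped.std())     # both close to sqrt(1000), about 31.6
```

What the 1s series does buy you, as the posts note, is the ability to reject outliers (clouds, satellite trails) before summing.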
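The remark in post 14 above about averaging magnitude estimates deserves a worked example: magnitudes are logarithmic, so averaging them directly is not the same as averaging the underlying fluxes. A small sketch with hypothetical per-sub estimates:

```python
import math

def mag_to_flux(m):
    """Convert a magnitude to a relative (linear) flux."""
    return 10 ** (-0.4 * m)

def flux_to_mag(f):
    """Convert a relative flux back to a magnitude."""
    return -2.5 * math.log10(f)

mags = [12.0, 12.5, 13.0]   # hypothetical per-sub magnitude estimates

# naive: average the logarithmic quantities directly
naive = sum(mags) / len(mags)                          # 12.5

# correct: convert to linear flux, average, convert back
mean_flux = sum(mag_to_flux(m) for m in mags) / len(mags)
correct = flux_to_mag(mean_flux)                       # about 12.42
```

The flux-based average comes out brighter (about 12.42 here) because the brightest sub dominates the linear mean, which is why a stacking-based estimate can differ from a plain average of per-sub magnitudes.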
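On the debayering discussion in posts 20-22 above: for anyone curious what the step actually does, here is a deliberately simple 'superpixel' debayer for an RGGB mosaic in plain NumPy. It is not the bilinear method from the colour-demosaicing package linked above, just a sketch of the idea: each 2x2 Bayer cell collapses to one RGB pixel at half resolution.

```python
import numpy as np

def superpixel_debayer(cfa):
    """Turn an RGGB Bayer mosaic into a half-resolution RGB image.

    cfa : 2-D array with even dimensions, RGGB pattern assumed:
          R G   -> one output RGB pixel per 2x2 cell
          G B
    """
    r  = cfa[0::2, 0::2]          # top-left of each cell (red)
    g1 = cfa[0::2, 1::2]          # top-right (green)
    g2 = cfa[1::2, 0::2]          # bottom-left (green)
    b  = cfa[1::2, 1::2]          # bottom-right (blue)
    return np.stack([r, (g1 + g2) / 2.0, b], axis=-1)
```

Proper demosaicing (bilinear, Malvar and friends) interpolates to full resolution instead, which is what the colour-demosaicing package provides; the tricky part, as noted above, remains detecting from the FITs header that debayering is needed at all.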
