
Martin Meredith

Everything posted by Martin Meredith

  1. EEVA is great for Uranus, Neptune & Pluto, and the minor planets/asteroids, and for picking up the moons of the planets too, but not really for getting high resolution views of the brighter planets, where I believe the typical approach is to take 100s or 1000s of very short exposures and automatically select some fraction of them. It is that aspect that makes bright planetary EEVA problematic in the sense of getting near-live views, though I imagine it is possible in principle. I don't know of any software that does it near-live. StarlightLive is not designed for bright planetary work. Stacking in a typical EEVA setting is based on stars, whereas for the bright planets it is based on features. (I don't do this kind of observing so perhaps I am out of date here.) Regarding the Lodestar X2, it is a great camera and the only one I've used for nearly the last 7 years, and I find it generally excellent for deep sky EEVA so long as you pair it with a scope that delivers a reasonable arcsec/pixel. I operate at 2.11"/pix but ideally you would go a little lower. Martin
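The "take 1000s of short exposures and keep the best fraction" approach (lucky imaging) can be sketched in a few lines. This is purely illustrative Python, not code from any EEVA package; the sharpness metric (variance of a discrete Laplacian) is one common choice, and all function names here are my own.

```python
import numpy as np

def sharpness(frame):
    """Score a frame by the variance of a discrete Laplacian: higher = sharper."""
    lap = (-4 * frame[1:-1, 1:-1]
           + frame[:-2, 1:-1] + frame[2:, 1:-1]
           + frame[1:-1, :-2] + frame[1:-1, 2:])
    return float(lap.var())

def select_best(frames, keep_fraction=0.1):
    """Keep the sharpest fraction of a list of 2D frames."""
    scores = [sharpness(f) for f in frames]
    n_keep = max(1, int(len(frames) * keep_fraction))
    order = np.argsort(scores)[::-1][:n_keep]  # best first
    return [frames[i] for i in order]
```

The selected frames would then be aligned on features and stacked, which is where the real complexity of bright planetary work lies.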
  2. Under a bright moon last night I had a look at some open clusters in the general area of Auriga and Perseus. Auriga is home to Messiers 36, 37 and 38 and I visited them, along with some NGCs, but for me the most interesting objects of the evening were from 'lesser' catalogues. Later I'll post some of the King clusters, but I'll start with Stock 8, which lies about 1.5 degrees due South of M38. This is only the second Stock I've observed. I was forced to observe Stock 24 in Cassiopeia a few weeks ago because it is also Berkeley 3, and I have been 'after' the Berkeleys for a while.... Stock 8 lies in the Perseus Arm of our galaxy. Although the catalogue I'm using suggests the cluster is 18' in diameter, more recent studies place it at around 6', which ties up with what I can see here. The yellow-orange star is naked eye Phi Aur (mag 5.1) at around 450 light years away. The white star is also just about naked eye at mag 5.9, so this cluster ought to be very easy to locate. It isn't too clear perhaps in this shot, and is washed out by the moon, but the cluster is surrounded by nebulosity (from which it presumably formed). This is an HII region known as IC 417 (Sh 2-234). I imagine this would look really interesting in a deeper exposure. Apparently Stock 8 is an extremely young cluster -- only 3 million years. It was fascinating to read a little in [1] of the large scale structure around Stock 8. Apparently, there is evidence for a giant shell of hot gas whose high expansion velocity implies that it was the result of a supernova explosion. This in turn triggered the formation of many stars in the region. Stock 8 is surrounded by 33 early-type massive stars that may have been formed in this way. Stock 8 is moderately massive, with a total of up to 600 solar masses. Fig 1 in [1] shows a striking false colour composite which makes the H-alpha component of the shell very clear.
One of the jewels of this field (but not part of the cluster as far as I can tell) is the carbon star OP Aur, shining 'redly' below and to the right of Stock 8. This is a Mira-type variable with a maximum of 12.7 but which gets 3 magnitudes fainter in a cycle of about 500 days. It is so red that the jump when my red subs started coming in was immediate. In fact it doesn't even show up in the blue filter at all. The panel shows individual 15s subs in each filter together with an RGB (not LRGB) composite (width ~ 1'). This cluster doesn't need the long exposure I gave it. I was hoping to pull out more of the nebulosity but will revisit on a moonless night. cheers Martin [1] https://arxiv.org/pdf/1612.00697.pdf
  3. I've come to the conclusion it is always worth doing EEVA if the skies are clear. I tend to avoid galaxies and fainter PNs in such conditions but I can see that when they contain a recent SN that's another matter. It's interesting to see your comparison with last year's image to see what is being lost. In the case of NGC 4414 you're still pulling out a lot of interesting stuff in spite of the moon. Open clusters are absolutely fine with quite bright moonlight I find, even in colour. I was out last night and looked at quite a few, mainly within the same quadrant of the sky as the moon (which was so high anyway it hardly made much difference which way I pointed the scope). I was surprised also to find that some patches of nebulosity came through quite well. I'll post a few pics later. Martin
  4. Just a note that the latest version (0.4.2) includes experimental support for OSC camera debayering and binning. Thanks to AKB for requesting and testing these features. I will update the documentation in due course, but if you want to use it right now: to configure debayering, go to the Watcher panel in configuration and choose the Bayer matrix (one of GBRG, RGGB, BGGR or GRBG), or leave it on mono if you don't use a colour sensor. Note that I don't currently support non-RGB schemes (e.g. the complementary CMYG matrix). Binning can be selected on the same panel. It is important to understand what will be done to the FITs that your capture program drops into the watched directory:
1. Debayering will produce 3 new FITs, one for each of RGB, and these will get incorporated into the Jocular ecosystem. This means they get placed in a directory under captures/session/object e.g. captures/21_03_23/Messier_42, so you can find them and Jocular can easily reload them in the future. Your original colour FITs will be copied to a subdirectory called 'originals' e.g. captures/21_03_23/Messier_42/originals, so you still have access to them in case you want to use them with other applications.
2. Binning will similarly produce new FITs (just one, not three!), and they get assimilated in the same way, with the original being kept. It would have been possible to apply binning each and every time you reload, i.e. not produce new FITs, but I took this design decision to speed up subsequent reloads.
On this last point, I want to stress again that Jocular only really works in a comfortable way for relatively small pixel counts (*). I would say it is happy enough up to 1 to 2 Mpixels. That's why, if you have a large colour or indeed mono sensor, I would encourage you to use 2x2 or even 3x3 binning to get the most out of the tool.
Of course, you could bin on camera (and lose the data forever!), or bin in your capture program, or use ROI, but if you want to preserve the full sensor size and benefit from some of Jocular's functionality at the same time, one approach is to get Jocular to do the binning and store your unbinned originals at the same time. Cheers Martin
(*) Jocular occupies a different niche than other EEVA applications in that it supports mid-stack 'changes of mind' (forgot to use darks? click darks and click reprocess; want to see if median stack combination will get rid of that satellite trail? click median and it's done -- that kind of thing). Above all, no wasted photons restarting the stack! This is compute and memory intensive for large pixel count sensors. The same goes for things like gradient removal or LAB colour combination, which are real time for small sensors, meaning you can interact directly with the image without going through the intermediate stage of histograms.
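To make the two operations above concrete, here is a minimal sketch of 'superpixel' debayering and 2x2 software binning on a 2D sensor array, assuming an RGGB matrix. This is my own illustration of the general technique, not Jocular's code (Jocular's debayering may use a different algorithm altogether).

```python
import numpy as np

def debayer_superpixel(raw):
    """Split an RGGB mosaic into half-resolution R, G, B planes."""
    r = raw[0::2, 0::2]
    g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2.0  # average the two greens
    b = raw[1::2, 1::2]
    return r, g, b

def bin2x2(im):
    """Sum each 2x2 block, reducing the pixel count by a factor of 4."""
    h = (im.shape[0] // 2) * 2   # trim odd rows/columns if present
    w = (im.shape[1] // 2) * 2
    im = im[:h, :w]
    return im[0::2, 0::2] + im[0::2, 1::2] + im[1::2, 0::2] + im[1::2, 1::2]
```

Superpixel debayering halves the resolution but needs no interpolation, which is one reason it suits small-pixel-count workflows like this.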
  5. All I can say is that the GUI toolkit I use runs on Linux, Windows, OS X, Android, iOS, and Raspberry Pi. I have no experience at all with iOS so I've no idea if it will work. A priori I doubt it. I don't have an iOS device but the code is on GitHub if someone with iOS experience wants to give it a whirl...
  6. AD Most things seem to be controllable via Windows but my interest was seeing what was available to run directly from my MacBook without Parallels etc. I couldn't connect to the mount via Stellarium nor could I see any kind of hand controller panel, but the direct Mac->mount (AzEQ6 in my case) connection worked more or less out of the box with Ekos. That is handy for me as I'll be using Ekos on the RPi too, at least to start with. That's not to say it's an easy program to use -- I have the feeling it tries to cater for every possible option and in doing so makes it hard to find the simple stuff, at least at first -- but hats off to the developers, it does at least work for us Mac users! Martin
  7. I suddenly remembered I had Ekos here on the Mac and have just this second got it connected and slewing! Thanks Martin
  8. Thanks Grant. I have from time to time looked at EQMac and noted its brokenness. The last update seems to be 2016 so I'm not certain whether it is being developed still. I imagine there must be alternatives but provision for Macs isn't always the best... Martin
  9. [I posted this in mounts but maybe it is better here in software] I have the Lynx Astro cable which I plan to use to connect my RPi to my Skywatcher AZ-EQ6 mount and control it from Linux -- for which I assume there will be no problems. Meanwhile, as another option, I'm wondering if it is possible to control the mount directly via Stellarium running on the Mac. I have it all configured (I think) on the Stellarium side, but the command to slew doesn't do anything as yet, presumably because the mount is not in an aligned state? Since the physical hand controller is unplugged to make way for the Lynx cable, I'm going to need a software controller to carry out the alignment (unless I'm missing something -- entirely possible!). So to my question: what is a suitable program to control the SynScan mount from a Mac such that I can align it (and then later, I hope, use Stellarium to slew)? Thanks in advance Martin
  10. Waiting for your image to load 🙂 Open source + Python means someone has almost certainly done it already, and better than I could have done.
  11. I'm not sure if this should be in mounts or software... I have the Lynx Astro cable which I plan to use to connect my RPi to my Skywatcher AZ-EQ6 mount and control it from Linux -- for which I assume there will be no problems. Meanwhile, as another option, I'm wondering if it is possible to control the mount directly via Stellarium running on the Mac. I have it all configured (I think) on the Stellarium side, but the command to slew doesn't do anything. Certainly it seems to be possible from reading the Stellarium documentation, but perhaps the mount is not in a ready state as it needs to be aligned before slews are accepted? If so, with the physical hand controller unplugged I imagine I am going to need a software controller. Does anyone know of such a thing for the Mac? Anyone using Mac/SynScan successfully? Advice gleaned from the wider web is patchy and inconclusive... Thanks in advance Martin
  12. Ah, those deliberate extra spikes... I have a misaligned secondary and this being EEVA I haven't done anything about it in the last 6.5 years since I bought the Quattro. That accounts for one set. I've no idea where the others come from. I would say they are second harmonics of the first set somehow. Perhaps it is time to disassemble everything and clean the mirror at the same time. Martin
  13. This is the result of dropping the OSC image you sent me into the watched folder. There is a user-configurable Bayer pattern which is used to split 'alien' (i.e. non-Jocular-created) subs into RGB. The LRGB module then creates a synthetic luminance automatically to deliver the veritable joys of LAB-colourspace manipulation. Of course, real-time LAB work is rather painfully slow as you might imagine (move slider, wait 2 seconds, ...). But in principle it is all there now. I think the solution is to bin large images that result from debayering, as you suggest (or get the cores to do some work for a change). There are a few different debayering algorithms available of which I think I chose the slowest for this example, though that is just a once-only cost. I can make the choice of algorithm also user-configurable. I hope this does not result in a sudden rush of OSC camera users 😉 My only OSC camera is the Lodestar C and that uses a complementary CMYG arrangement....
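The synthetic luminance step mentioned above is, in essence, just a weighted sum of the debayered planes. The sketch below uses the standard Rec.709 weights, which is my assumption for illustration; it is not necessarily the weighting Jocular applies.

```python
import numpy as np

def synthetic_luminance(r, g, b):
    """Combine equal-sized R, G, B planes into a luminance channel.

    The Rec.709 weights used here are a common convention and an
    assumption on my part, not confirmed to be Jocular's choice.
    """
    return 0.2126 * r + 0.7152 * g + 0.0722 * b
```

With L in hand, the chrominance can then be manipulated independently in LAB space, which is what makes the "synthetic L" trick useful for OSC data.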
  14. The angular scale needs a bit more work so it changes to more sensible units for those with larger sensors... Glad you like the annotation. I assume you've downloaded all the deep catalogues? Posting here is fine by me! Martin
  15. The frame size around the image is for snapshots so won't affect this. I've just explored switching off the processing I'm doing and it occasionally causes star extraction artefacts that interfere with platesolving (*), so I need to find a better solution. (*) The processing I'm doing attempts to remove edges that can appear when the subs move between alignments. These edges can cause things to look like (bright) stars, which are then partly selected in preference to the real stars, causing occasional platesolving failures. I'm filling things in with a local estimate of the image background with variance controlled by an estimate of the noise distribution, but it clearly isn't doing the right thing for your images. I'll find a different solution which doesn't involve messing about with the edges. You might think it is just a case of not selecting stars that are too near to an edge, but my images at least can move quite a bit, and also I need all the stars I can get for small FOVs 🙂
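The background-fill idea in the footnote can be sketched as follows: estimate the background level with a median, estimate the noise scale robustly from the MAD, and fill a border strip with draws from that distribution. This is an illustration of the idea only; the function name and fixed strip width are hypothetical, not Jocular's implementation.

```python
import numpy as np

def fill_edges(im, width=8, rng=None):
    """Replace a border strip with samples from the estimated background.

    Background level = median of the interior; noise scale = MAD-based
    sigma estimate. Illustrative sketch, not Jocular's actual code.
    """
    rng = rng or np.random.default_rng()
    out = im.copy()
    interior = im[width:-width, width:-width]
    bg = np.median(interior)
    sigma = 1.4826 * np.median(np.abs(interior - bg))  # MAD -> sigma
    mask = np.ones(im.shape, dtype=bool)
    mask[width:-width, width:-width] = False            # border only
    out[mask] = rng.normal(bg, sigma, mask.sum())
    return out
```

As the post notes, if the fill statistics are slightly off, the filled strip can itself masquerade as structure, which is presumably what the star extractor was tripping over.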
  16. Looks like something I added to mitigate artefacts in star extraction is not quite working as intended for your subs. Give me a while and I'll switch off the new processing...
  17. Yes, you can use that command at any time. What it does is update a file called .jocular in your home directory which simply contains the path to the data directory. BTW If you still want separate paths for the watched folder I could sort something out now that I have the mechanism in place. I'll give the debayering/super-pixel some more thought. It is all quite new to me but yes, I have some libraries that seem to do the job. I just need to sort out the user-interaction part, to keep things simple. As you might remember we looked at the FITs headers and there was no clear/consistent convention as to whether a FITs needed debayering, and if so, with what matrix. One way would be for the user to insist on debayering everything and specify the matrix. I believe this is what other software requires. Martin
  18. Thanks Rob. If you're using Windows then you'll probably run into the same issues with native capture that others are having. As I'm trying to keep the code 100% platform-neutral (internally too ie no 'if Windows do this else do that'), I'm using a library for handling USB that is meant to work with OSX, Windows and Linux, but I'm having difficulties with the Windows side -- something related to permissions. As I don't have access to a Windows machine myself it is hard to work out what is going on. Annoyingly, for the 1 hour I did have access to a Windows machine it was working, so there is an existence proof, but it has proved hard to replicate! If native capture turns out to be hard to get working, I have a plan B, which is actually turning rapidly into plan A. I've recently acquired a RPi running Astroberry and I want to be able to talk to it from Jocular, but I am determined to remain platform-neutral. The possibility I'm considering is writing a simple(?!) server running on the Pi that converts ALPACA-styled requests coming from Jocular into INDI. This then should also work on Windows using the ALPACA-ASCOM bridge. I don't intend to go the whole hog though: I'd just support a simple subset of commands for the main camera, filterwheel and mount. How difficult could that possibly be?! Martin
  19. Yes, it ought to. The first time you start it up you need to tell it where your data folder is: jocular --datadir <your path> (It's a long story why this is a command line option and not configured within the program itself, but it has to do with the ugly filebrowser component that was available in the toolkit I'm using...) Debayering didn't make it into this version as I got carried away with annotation, but I can look into it again. The issue as I recall was being able to detect automatically whether a FITs needed to be debayered (and how) or not, but I could transfer that responsibility to the user... Now the code is under proper version control I will be able to release updates much more frequently so I will see if this is something that can be done soonish. cheers Martin
  20. Fascinating objects. I can add Arps 145 and 146, plus my take on Arp 148. These come from the last few years. Ignore some of the labelling of seeing etc; these come from a rogue version of Jocular. Arp 145 consists of a very clear doughnut ring (PGC 9060; type S0), and the ingressor (?) UGC 1840, type S0-a. The interesting thing about these, observationally, is the presence of several foreground stars and the possibility that the fuzzier 'star-like' things are knots or a galactic nucleus in the case of the spherical object. The ring itself is not uniform, seemingly mottled in its Northern sector. There is an intriguing tack-shaped vertically-oriented bit of fuzz just to the NW too. I probably mentioned before that I have a great fondness for this field as it is the one I used when testing the live stacking alignment algorithm. I couldn't get it to work for ages until I worked out I had the x and y coordinates reversed... Arp 146 in Cetus is a tiny smoke ring, about a third of the apparent size of Arp 145 but similar in many respects, including the non-uniform ring presumably left by the passage of the other galaxy. This is 1.1 billion light years away while Arp 145 is about 262 million LYs distant, so the actual sizes of these collisional pairs must be quite similar. Further away still is a z=2.1 redshift quasar which at mag 19.0 is quite bright. Finally, here is my take on Arp 148 -- certainly nothing added relative to Mike's as far as the ring pair is concerned but always fascinating to see the torpedo-like ingressor. Interestingly, this must be of a similar actual size to the earlier two Arps, given its apparent size and its distance of around 500 MLyrs. All I can add are a couple of very distant quasars. The z=3.0 one corresponds to a light travel time of 9.5-11.5 billion years according to my 'go to' redshift calculator: http://astro.ucla.edu/~wright/CosmoCalc.html cheers Martin
  21. Jocular v0.4 is now available. The main change is support for deep browsing/annotation of your captures. Installation is also greatly simplified 🙂 You can find details here: https://transpy.eu.pythonanywhere.com/jocular/index.html Please let me know of any issues. Martin
  22. Hi Andy It is already on GitHub but maybe Google hasn't picked it up yet. I'll advertise the link when I've completed the next update (this weekend). Martin
  23. Hi Steve Starlight Live (SLL; formerly Lodestar Live) is a separate application (developed by Paul Shears of SGL) that can be used to do EEVA-like things with the SX range of cameras. SLL is excellent for EEVA and is how I got into this game in 2014, being very easy to use (and because it worked on a Mac!). I would recommend you start out with SLL with your Ultrastar, actually. I developed Jocular because I wanted further capabilities to support my EEVA sessions, but was very much inspired by the simplicity of SLL. Jocular extends the capabilities of SLL in the following ways: automatic management & reloading of previous captures; observation planning; gradient-removal and noise reduction; greater set of stretch functions and histogram-free image manipulation; complete control of the stack; object and session logging; automatically applied calibration library; auto-flats; annotated snapshots; live LRGB and L-Ha combination; local platesolving and very deep annotation (this is in the latest version). On top of that it is open source and 100% platform-neutral, so anyone can join in on GitHub and take it forward. SLL and Jocular can work together, and that is how I started and how many prolific users still use Jocular. SLL is used for capture and is configured to save FITs automatically to a 'watched' folder that Jocular then automatically loads from. I added native support for the Lodestar and the SX filter wheel in the last version and that is the way I use it now (ie without SLL). There are some issues that we are trying to sort out getting the SX cameras supported natively on Windows (it will work, just a question of the details). My intention is to support the Ultrastar natively too (this part is easy and is being held up because of the Windows issue). 
If you or anyone does want to try out Jocular, please wait for a couple of days (no more) because I'll be releasing v0.4 and the installation procedure has been greatly simplified 🙂: users just need to ensure they have Python and then do a 'pip install jocular'. cheers Martin
  24. Radek suggested doing the following, which worked for me:
      sudo systemctl stop gpsd.service
      sudo systemctl stop gpsd.socket
      sudo dpkg --configure -a
      sudo systemctl restart gpsd.service
      sudo systemctl restart gpsd.socket