
Jocular: A tool for astronomical observing with a camera

I'm pleased to announce the first release of Jocular, an open-source, cross-platform application for doing EEVA-like things.

I'd like to acknowledge the sterling work done by SGL members Bill S, London_David, AKB and RobertI for testing the tool and providing some great suggestions. Thank you! 

[Screenshot of the Jocular interface]

 

Features

  • observation planning via an extensive DSO database
  • DSO/session/equipment logging
  • management of incoming observations
  • reloading of earlier observations
  • live stacking
  • many stretch functions
  • several stack combination methods
  • inverse (‘negative’) view
  • automatic black point estimation (a rough sketch of the idea follows this list)
  • gradient removal
  • bad pixel removal & mapping
  • full access to the stack and its constituent subs, including editing and animation
  • support for constructing and automatic deployment of a calibration library (darks/flats/bias)
  • annotated snapshot captures
  • save as FITS
  • interoperability with other tools through open formats
  • a recompute philosophy throughout, which enables all decisions to be reversed on the spot
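
To give a flavour of the kind of processing these features involve, here is a minimal NumPy sketch of percentile-based black point estimation plus a simple asinh-style stretch. It is illustrative only (not Jocular's actual code); the function names and the 25th-percentile choice are just assumptions for the example.

# Minimal sketch: percentile-based black point estimation plus an
# asinh-style stretch. Illustrative only - not Jocular's implementation.
import numpy as np

def auto_black_point(img, percentile=25):
    # most pixels in a typical EEVA sub are background, so a low
    # percentile of the intensity distribution gives a workable black point
    return np.percentile(img, percentile)

def asinh_stretch(img, black, strength=50):
    # subtract the background, clip negatives, normalise, then apply a
    # non-linear stretch to lift faint detail without blowing out the bright end
    x = np.clip(img - black, 0, None)
    x = x / (x.max() + 1e-9)
    return np.arcsinh(strength * x) / np.arcsinh(strength)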

Important caveats 

  1. Jocular requires an external capture engine; at present it supports mono only, via FITS files with under 2M pixels (a quick way to check a file is sketched just after this list). 
  2. Jocular is a source-code distribution and requires a Python 3 system plus the installation of a few dependencies (first time only).
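
To make caveat 1 concrete, here is a quick, purely illustrative check (using astropy; the file name is hypothetical) that a FITS file is mono and under 2M pixels before pointing a capture engine's output at Jocular.

# Purely illustrative check that a FITS file meets the current
# constraints (mono, under 2M pixels). Requires astropy; the file
# name below is hypothetical.
from astropy.io import fits

def suitable_for_jocular(path, max_pixels=2_000_000):
    with fits.open(path) as hdul:
        data = hdul[0].data
    return data is not None and data.ndim == 2 and data.size <= max_pixels

print(suitable_for_jocular('my_sub.fits'))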

Downloading

Getting help

The best way to get help in the first instance is to PM me and I will summarise any issues of general import on this thread.

Videos

Search for 'martin meredith jocular' on YouTube (the interface has changed a little since those videos were made, but not much).

 


Thanks Robert! Just hope it works for anyone wanting to use it. As I mentioned in the guide, I am very open to suggestions for new features and happy to work with people trying to get it going on different systems such as Linux where it has yet to be tested.

BTW I just replaced the user guide with a new version, since I noticed the screenshots were of poor quality. So anyone who has downloaded it already ought to replace it with the new version (the written content is the same).

Martin


27 minutes ago, Martin Meredith said:

...the written content is the same

Very good it is, too.  Terrific work on the documentation, as well as the application itself!


Just read the instructions for this and it looks like a very exciting development.

I look forward to trying it in the future.

Unfortunately, for now all my cameras are OSC, so I won't be able to use it right now.

But well done on bringing innovation to this field of astronomy :)


  • 4 weeks later...

Very nice, Martin. I looked at a couple of the informative YouTube videos and the helpful user guide you put together. Great. Can't wait until you have incorporated scripting to run Nebulosity or some other software for capture - I will definitely try it out with my mono Lodestar X2 at that point!

In the meantime, I may try seeing if I can get Nebulosity to manually control my LX2 for captures and use that to populate  the capture folder.  However, even that may have to wait for me, as I have all my astro equipment packed up at the moment.


Thanks Errol! I'm nearly ready to return to Nebulosity scripting. I had it working for an earlier version so it shouldn't be too much effort. Just finishing off colour at present and I definitely want to have 1-shot scripted captures with the filter wheel in time for open cluster season.

Martin


This looks great! Unfortunately I only have a color camera and it's 6MP so I can't try it out, but this looks like a fantastic piece of software! Looking forward to seeing where the development goes!


  • 3 weeks later...

Martin, I was wondering if you had any thoughts on whether utilizing NINA as the capture engine for Jocular would be a possibility. As NINA is open source and they support a lot of cameras, I thought this could have some potential benefits. However, I believe it is Windows only.


Hi Errol

Thanks for the suggestion. I had a look but I think it would be quite hard (but probably not impossible) to do that (different language/OS). I think going down the Nebulosity scripting line will be the first step to having captures under the control of Jocular. Actually, when I first started developing Jocular, I thought lack of capture functionality would be an issue (for me), but having a separate capture engine running on a separate virtual screen and switching between that and Jocular is quite comfortable (almost like they're part of the same program!). The reason for wanting to integrate a capture engine is to handle more complex captures in a single click (auto-flats, dark library construction, RGB, LRGB, SHO etc). Hoping that will be in the next release. The internals for RGB, LRGB etc are all there now so it is just a matter of getting the time to do the Neb integration as smoothly as possible (there are a few wrinkles like the lack of scriptable abort capture to sort out first).

cheers

Martin

 

 


  • 7 months later...
  • 3 months later...

Jocular v0.2

Here is a minor update to Jocular along with a new 3-step minimalist installation procedure for MacOS, Windows or Linux. These steps only need to be followed the first time you install Jocular, so if you already have Jocular on your system, just pick up the jocv002.zip below, unzip it anywhere and you're ready to go. 

 

Main changes

1. Left and right arrow keys scroll through the subs

2. Up-arrow selects/deselects the current sub (so it doesn't contribute to the live stack)

3. Down-arrow toggles between sub view and stack view

4. There is now an automatic black point option that simplifies setting the background level

 

Instructions

Step 1. Install Python. If you already have Python 3 and pip on your system, you can skip this step. Otherwise, visit https://docs.conda.io/en/latest/miniconda.html and install miniconda for your operating system (choose the Python 3.7 version). Follow the installation instructions given there (this could take a few minutes). Once complete, check that all is fine by opening a terminal window (command window in Windows) and typing the following to see the version number (3.7.6 in my case):

   python --version 

Step 2. Download the latest Jocular release here: jocv002.zip (210k, yes k!) and unzip it anywhere you like. Navigate to that directory (jocv002) in the terminal window, then change to the jocular directory by typing

   cd jocular

Step 3. Install the required packages

Mac users

   pip install -r requirements.txt

Windows users

   pip install -r requirements_windows.txt

When that has finished (a few mins on a slow link) you can launch Jocular by typing

   python jocular.py

 

There are instructions in the user guide for adding the DSO database and the example capture of Arp 145 to practise on.

Please send me a direct message in case of any issues and I will update this thread. I am not aware of any Linux users to date, so it might be necessary to create a Linux-specific requirements file.

I am still working on a LAB-colour version for later in the year.

cheers

Martin


  • 3 weeks later...

Thanks for developing and sharing this Martin - great job.

I was looking for something like this (to do live stacking from an INDI imager) and thinking I would have to write it myself (in Python).

Haven't tried it live under the sky, but it's looking good with acquired FITS files.

Cheers,
Callum

 


  • 5 months later...

Hi

Wormix and I are (I hope) going to solve this via PM, but in case anyone else is installing the earlier version of Jocular, please use a version of Python prior to 3.8, since Kivy (the GUI library I'm using) doesn't work with v3.8. Rather than use miniconda, I now recommend going to python.org for the download and choosing, say, Python v3.7.9. This issue will be resolved at some point when Kivy releases v2.0 (it is fixed in the Kivy 2.0 release candidate).

v0.3 of Jocular is also out for beta testing should anyone wish to try it out. Just send me a PM. Here's a draft of the manual: manualv3.pdf. The first page lists the features. It is reasonably stable for OSX (retina and non-retina screens) but there are still a few issues related to native capture on Windows to sort out.

Martin


1 hour ago, Martin Meredith said:

...please use a version of Python prior to 3.8, since Kivy (the GUI library I'm using) doesn't work with v3.8.

v0.3 of Jocular is also out for beta testing should anyone wish to try it out. Just send me a PM. Here's a draft of the manual: manualv3.pdf.

Can confirm it's now sorted after I installed in a v3.7.9 virtual environment.
 

v0.3 looks interesting - the first picture suggests there's now support for OSC and image capture?

 

Could you ping it across to me and I’ll give it a try on my Mac


Now, if only these skies would clear...

 

Cheers


Glad to hear that! Will send it in a moment. V0.3 supports native image capture for the Lodestar and I plan to add support for the Ultrastar very soon (days away, but it needs to be tested as I don't have access to that camera). It also controls the SX filter wheel (but no others). No, there is no OSC support, but it does support mono + filters (LRGB, manipulated in LAB space).

Martin


  • 4 weeks later...

Jocular v0.3 is now available. Jocular does 3 interlinked things: (1) live EEVA capture/processing; (2) automatic organisation of captures to allow reloading/reprocessing of previous captures; and (3) simple but extensible DSO planning with export of observing lists etc.

This is a substantive release with considerable simplifications and quite a bit more functionality, including live LRGB processing and native support for the Lodestar and SX electronic filter wheel, arbitrary user DSO catalogues, interface customisation and easier calibration.

Feel free to have an explore and do let me know (by PM or this thread) of any issues.

Jocular is an open-source distribution that works on OSX, Windows and Linux so long as you have a Python installation. Development has been done on a Mac with and without retina screens. Unlike in the past, when I suggested using miniconda, I'm now recommending getting Python directly from python.org and choosing a pre-3.8 release (3.7.9 is fine). This is because the GUI toolkit I'm using (Kivy) does not support Python 3.8 at the moment. If you already have Python 3.8 or higher please PM me so I can advise on how to get the script to ignore 3.8 during installation.
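
For anyone wondering what 'getting the script to ignore 3.8' amounts to, the guard is only a couple of lines. The snippet below is just an illustration of the idea, not the actual installation script.

# Illustrative only - not the actual install script.
import sys

if sys.version_info >= (3, 8):
    sys.exit('Jocular currently needs Python < 3.8 (Kivy limitation); '
             'found %d.%d.%d' % sys.version_info[:3])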

The attached zip contains the distribution. You can separately download a couple of example captures, one mono, one LRGB, to play around with. There is also a document describing how to import Vizier and other catalogues.

I've provided a bash script for OSX and Linux users, and a batch script for Windows users. These install Jocular with a single command once you've got Python. Just download the zip, unzip it anywhere, open a command window, navigate to the directory where you unzipped it, and type ./joc in the command window (OSX/Linux) or run the joc.bat file (Windows). The first time round this will check that you have a Python system available, then it carries out the required dependency installations in a virtual environment (basically, this keeps the code hermetically sealed from anything else you may have in the Python ecosystem), then starts Jocular (the whole thing takes a minute or so). Subsequently, you will launch Jocular with the same command and it will be up and running in a second or two. There is a README that repeats this information.
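
For the curious, the gist of what the launcher does can be written in a few lines of Python. This is only a sketch of the steps described above (first-run venv creation, dependency install, then launch), not the contents of the actual joc/joc.bat scripts, and the requirements file names are taken from the v0.2 instructions earlier in the thread.

# Sketch of the launcher's steps, not the actual joc / joc.bat scripts:
# create a virtual environment on first run, install the requirements
# into it, then start Jocular.
import os, subprocess, sys

venv = 'venv'
bindir = os.path.join(venv, 'Scripts' if os.name == 'nt' else 'bin')
if not os.path.isdir(venv):
    subprocess.run([sys.executable, '-m', 'venv', venv], check=True)
    reqs = 'requirements_windows.txt' if os.name == 'nt' else 'requirements.txt'
    subprocess.run([os.path.join(bindir, 'pip'), 'install', '-r', reqs], check=True)
subprocess.run([os.path.join(bindir, 'python'), 'jocular.py'])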

If you wish to use Jocular for native capture on Windows, you will need to ensure that you have libusb installed. This can be done most easily using a small executable from https://zadig.akeo.ie which is just used once to do the installation. Although I don't have access to a Windows machine in general, I did manage to install it on one and so have an existence proof that native capture will work, but I would be interested to hear of others' experiences as testers have had issues trying to get the USB side working.  

The main limitations are (1) very limited native support, but you can still dump FITS into a watched folder, and you can now dump master calibration files too; and (2) restriction to mono cameras, but there is support for mono + filtered colour using LAB space manipulations. There is a toolkit-related issue I'm aware of when using an auxiliary screen, where the scaling doesn't work, but I believe this will be fixed in the Kivy release candidate, so it is a temporary issue. I'm also going to produce a separate window that can be used for outreach pretty soon.
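
As an aside, 'dumping FITS into a watched folder' needs nothing fancy on the capture side: anything that writes FITS files into the folder Jocular monitors will do. If your capture program saves elsewhere, a trivial poller like the sketch below can bridge the two (both paths are placeholders, not Jocular's actual locations).

# Hypothetical bridge: copy new FITS files from a capture program's
# output directory into the folder Jocular watches. Both paths are
# placeholders - point them at your own directories.
import glob, os, shutil, time

SOURCE = '/path/to/capture/output'
WATCHED = '/path/to/jocular/watched'

seen = set()
while True:
    for f in glob.glob(os.path.join(SOURCE, '*.fit*')):
        if f not in seen:
            shutil.copy(f, WATCHED)
            seen.add(f)
    time.sleep(1)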

I'd like to say particular thanks to SGL members Bill S, Mike JW, Callump, AKB, London David, Wormix and CatBurglar for testing, encouragement, suggestions, bits of code, ideas, and generally being patient while I sorted out various issues. Also thanks to Terry Platt from Starlight Xpress for some pointers on coding for the Lodestar & filterwheel.

I'm sure there will be some teething problems, but nothing that can't be fixed. I will produce some additional resources (videos etc) in the coming weeks. Please check further down the thread for any updates to the distribution.

Good luck!

Martin

 

Jocular (1M) Updated 26 Nov 2020: jocular.zip

User manual (1.7M): manualv3.pdf  

Description of how to import external catalogues (1.9M) jocular_imports.pdf

Example images (27.7M) (unzip and copy into captures directory and restart Jocular)  examples.zip

 

 


  • 3 weeks later...
On 10/11/2020 at 17:05, Martin Meredith said:

Jocular v0.3 is now available. Jocular does 3 interlinked things: (1) live EEVA capture/processing; (2) automatic organisation of captures to allow reloading/reprocessing of previous captures; and (3) simple but extensible DSO planning with export of observing lists etc. ...

Hi Martin - just wanted to let you know I'm excited to discover Jocular, as a Python programmer working mostly on Raspberry Pi platforms and Ubuntu desktops, developing astronomy applications for telescope and camera control. It's great to see a cross-platform tool for EAA, since I'm currently building a 24" F3.3 scope that will be run with RPi/OnStep and will be a dedicated EAA scope for use in RASC outreach programs. I'm actively developing Python apps on the RPi using INDI, so I'm really interested in both scope control and camera control (I use a ZWO ASI224MC at the moment) on an INDI platform for EAA use.

I'll be installing the code on Ubuntu VMs and Raspberry Pi's and will let you know how things work. 

I note that on Ubuntu 20.04 you first need to run

   apt install libudev-dev

or the hidapi install fails with a missing library.


Hi Gord

Thanks for the Ubuntu information. I think you are the first person to try v0.3 on Linux so it will be interesting to see how it goes. I wonder if the RPi has enough oomph (and memory) to be able to service Jocular, which is fairly compute and storage heavy due to the particular niche it occupies i.e. supporting recompute/realign and exposing the entire stack at all times. 

Jocular doesn't support OSC cameras just yet, but I'm hoping to debayer any OSC FITS that land in the monitored directory into 3 separate RGB subs, as Tony suggests on the other thread. Colour in Jocular is based on histogram-free LAB colour space, which I think most will find simpler to use in an EEVA context than manipulations directly in RGB space.
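
To illustrate the kind of thing involved, a crude superpixel debayer (assuming an RGGB Bayer pattern) looks roughly like the sketch below. It is an illustration only, not the code that will end up in Jocular.

# Crude superpixel debayer of a raw OSC frame into R, G and B planes,
# assuming an RGGB Bayer pattern. A sketch only - not Jocular's code.
import numpy as np

def superpixel_debayer(raw):
    r  = raw[0::2, 0::2].astype(np.float32)
    g1 = raw[0::2, 1::2].astype(np.float32)
    g2 = raw[1::2, 0::2].astype(np.float32)
    b  = raw[1::2, 1::2].astype(np.float32)
    # each returned plane is half the linear resolution of the raw frame
    return r, (g1 + g2) / 2, b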

Control of cameras and other peripherals is definitely the least-well developed point in Jocular at the moment but if you can do what you need to via INDI that sounds like a good combination.

BTW I'll be working on plate-solving for the next release, mainly for automatic annotation purposes, but I guess with INDI mount control integrated it could be useful.

Martin


I'll have a poke around in it today on Ubuntu. I don't have a Pi 4 (the latest, higher-performance CPU) here at the office but I'll bring one in to see how it works. Of course, I can also just use a Pi as an INDI server for imaging, solving and mount control remotely, put the image files on a network share, and run Jocular on something beefier.

If you're interested, I wrote some code to control my OnStep-equipped 16" F4.5 Dob, including plate solving, in Python with the INDI wrapper - see below. Excuse the one long massive program; I usually prototype that way, then break it up when I refactor! I'm fairly new to Python, although I've been in the software development business for 40 years.

https://github.com/gordtulloch/PiINDIControlPad

 


5 hours ago, Martin Meredith said:

Jocular doesn't support OSC cameras just yet, but I'm hoping to debayer any OSC FITS that land in the monitored directory into 3 separate RGB subs, as Tony suggests on the other thread. Colour in Jocular is based on histogram-free LAB colour space, which I think most will find simpler to use in an EEVA context than manipulations directly in RGB space.

Pretty easy to create RGB subs from an OSC camera, so I'd agree that's the way to do it. I do that for a photometry pipeline I'm working on for a DSLR, as I need each channel separate.

from astropy.io import fits

def splitrgb(fits_file):
    # split a 3-plane (R, G, B) FITS image into three single-channel files
    hdu_list = fits.open(fits_file)
    image_data = hdu_list[0].data

    image_r = image_data[0, :, :]
    image_g = image_data[1, :, :]
    image_b = image_data[2, :, :]

    # copy the original header for each channel so the capture metadata is preserved
    red = fits.PrimaryHDU(data=image_r, header=hdu_list[0].header.copy())
    red.header.set('COLORSPC', 'R', 'PCL: Color space')
    red.writeto('tmp/red.fits', overwrite=True)

    green = fits.PrimaryHDU(data=image_g, header=hdu_list[0].header.copy())
    green.header.set('COLORSPC', 'G', 'PCL: Color space')
    green.writeto('tmp/green.fits', overwrite=True)

    blue = fits.PrimaryHDU(data=image_b, header=hdu_list[0].header.copy())
    blue.header.set('COLORSPC', 'B', 'PCL: Color space')
    blue.writeto('tmp/blue.fits', overwrite=True)

    hdu_list.close()
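
Usage is then just a one-liner (the file name is made up, and the tmp directory needs to exist first):

splitrgb('osc_capture_0001.fits')   # writes tmp/red.fits, tmp/green.fits and tmp/blue.fits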

 

