
Lodestar Live Version 0.8 (Beta) Download


Paul81


Here is version 0.8 (beta) of Lodestar Live. It mainly contains some minor tweaks and a fix for the zoom issue in 0.7. I am still working on live view stacking, so keep an eye out for a future update!

Thank you for everyone's continuing interest and feedback; there have been some really good future feature suggestions made. For the time being I am taking notes, as I am fully concentrated on live stacking, so I won't be adding anything major until that is completed.

There should be a readme file in the ZIP containing installation details - it is as simple as unzipping the file and double-clicking, though! (Windows users might need to install the Visual Studio 2010 x86 redistributables.)

OSX: LodestarLive.app.0.8.osx.140414.zip

Windows: LodestarLive.0.8.win32.140414.zip

From the change log:

- Dark frame acquisition now shows the stacked dark frame rather than individual dark sub-exposures.
- When restoring a master dark from FITS, the master dark is displayed once loaded.
- Corrected an issue where resetting the dark or image exposure did not reset the file count in the exported file name.
- Fixed a minor mathematical error in the FWHM calculation.
- SIMD optimisations for RGB display processing.
- The state of the 'Modify All' checkbox on the Display Processing tab now persists between programme executions.
- Fixed the zoom selection bug. The user can now drag out a zoom window in any direction.
- If the target name has been set, it is appended to the raw exported FITS image file name. Any illegal filename characters are automatically replaced with a full stop.
- If the telescope field in the target information is blank, the telescope details are not added to the image information text (optionally) added to the exported processed image.
I have seen a few comments on the processing side of things - if you want to have a 'practice' during the day, you can load previously captured raw FITS exposures to play with by starting the application from a terminal prompt and specifying the argument '-load-image <FITS FILE>' for mono exposures (Lodestar-M) or '-load-image-cygm <FITS FILE>' for colour exposures (Lodestar-C). Note that this only loads the image into the display processor, so it does not subtract darks or anything like that. Also note I only use this to test the display processor side of things rather than it being a fully fledged feature (I see offline processing as more of a job for applications like PixInsight or Nebulosity).
I look forward to seeing people's pictures, and happy live deep sky viewing!

Thank you for the kind comments!  :laugh:

I thought I would post some progress with regards to the live view stacking. I have been working away at all the stages / algorithms that go into stacking the individual exposures as they come in and creating a live stack which is updated and displayed with each new image. So what you will see (in the next version) is that the target you are looking at will grow in detail the longer you observe it.

I have been working on the code in an isolated test harness and have been using data sets I have previously captured whilst video observing (I also have data sets kindly donated by Martin to further test the robustness of the algorithms).

So cutting to the chase before some of the detailed stuff:

This is a single 30s exposure of M51 (so this is currently the kind of image you will see using LodestarLive):

[image attachment]

This is what it will 'grow' into after observing it for 2 mins 30 sec (i.e. 5x 30s) using the application's SUM stacking mode:

[image attachment]

This is a single 30s exposure of the Whale Galaxy:

[image attachment]

This is after 2 min 30s of observing (actually only 2 mins of data, as one of the 5x 30s exposures has been auto-rejected on quality grounds) using SUM mode:

[image attachment]

And here is what it looks like after the same period but using the MEAN mode:

[image attachment]

I will point out that the live stacking mode is still under heavy development, but I wanted to whet the appetite as it were. The above images have no dark subtraction applied to them (this currently breaks my star detection algorithm), so they are actually noisier than what I would expect once the work is completed. Also, to clarify, these results use my own code and algorithms - the stacked images are not from any of the popular processing programs - and although they have been composed offline, they are the kind of result that would have been produced live once the work is complete.

So, a little more technical detail. As each exposure comes in, the stars in the image are detected and matched to the key frame image. This is then used to compute a transformation matrix to register the new image with the key frame. A key frame is generated on the first exposure, and then again after a number of subsequent exposures. The number of exposures between key frames is selectable by the user. The idea here is that the reference image is refreshed every so often to account for things like field rotation, so that the transformations are kept less extreme. The new image is then combined with the existing live view stack.
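
As an illustration of the registration step described above (not Lodestar Live's actual code), here is a minimal numpy sketch that fits a 2x3 affine transform to already-matched star centroids by least squares and reports the back-projection error that can later be used as a quality check. The function and variable names are purely illustrative.

```python
# Illustrative only: estimate an affine transform from matched star centroids
# and report the back-projection (residual) error of the fit.
import numpy as np

def estimate_affine(src_xy, dst_xy):
    """src_xy, dst_xy: (N, 2) arrays of matched star positions, N >= 3."""
    n = src_xy.shape[0]
    # Design matrix for x' = a*x + b*y + c and y' = d*x + e*y + f
    A = np.zeros((2 * n, 6))
    A[0::2, 0:2] = src_xy
    A[0::2, 2] = 1.0
    A[1::2, 3:5] = src_xy
    A[1::2, 5] = 1.0
    b = dst_xy.reshape(-1)
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    M = params.reshape(2, 3)                       # 2x3 affine matrix
    # Back-projection error: RMS distance between transformed src and dst
    src_h = np.hstack([src_xy, np.ones((n, 1))])
    resid = src_h @ M.T - dst_xy
    rms = np.sqrt((resid ** 2).sum(axis=1).mean())
    return M, rms

# Example: three matched stars shifted by (+2, -1) pixels between frames
src = np.array([[10.0, 12.0], [50.0, 80.0], [120.0, 40.0]])
M, err = estimate_affine(src, src + np.array([2.0, -1.0]))
```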

Initially the app will support the following stacking modes (which you choose via the UI):

SUM

Combines exposures by summing pixel values between each successive exposure. This will quickly saturate the 16-bit range for bright targets, but for the faint stuff it will help bring out the detail. Great for when you only want to spend a short time observing a target. This mode also has the effect of fattening the histogram, which yields better detail on faint targets.

MEAN

Combines exposures by calculating the mean pixel value (a rolling mean which is updated as each exposure comes in). The mean is affected by outliers, but it is likely to be good for bright targets and, again, when you only want to spend a short time observing the target.

MEDIAN

Combines exposures by calculating the median pixel value (a rolling median which is updated as each exposure comes in). The median suffers less from outliers (e.g. satellites, planes, random hot pixels) but needs longer to get a good result, so you will need to observe for at least 10-15 exposures' worth.
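
Taken together, the three modes above amount to simple per-pixel arithmetic. As a rough illustration only (not the app's implementation), a numpy sketch might look like the following; the class name, the incremental mean update and the fixed-depth median buffer are assumptions made for the example.

```python
# Rough per-pixel combine sketch for the SUM / MEAN / MEDIAN modes (illustrative).
import numpy as np

class LiveStack:
    def __init__(self, mode="sum", median_depth=15):
        self.mode = mode
        self.count = 0
        self.acc = None        # running sum or running mean
        self.frames = []       # recent frames, used by the median mode
        self.median_depth = median_depth

    def add(self, frame):
        """frame: a registered exposure as a 2-D array; returns the live stack."""
        frame = frame.astype(np.float32)
        self.count += 1
        if self.mode == "sum":
            # Summing fattens the histogram but can saturate a 16-bit range quickly
            self.acc = frame if self.acc is None else self.acc + frame
            return np.clip(self.acc, 0, 65535)
        if self.mode == "mean":
            # Rolling mean: mean += (x - mean) / n
            self.acc = frame.copy() if self.acc is None else self.acc + (frame - self.acc) / self.count
            return self.acc
        if self.mode == "median":
            # Median over a window of recent frames: robust to satellites,
            # planes and random hot pixels, but needs more sub-exposures
            self.frames.append(frame)
            if len(self.frames) > self.median_depth:
                self.frames.pop(0)
            return np.median(np.stack(self.frames), axis=0)
        raise ValueError(self.mode)
```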

I have then implemented a hybrid of the above:

SUM + MEAN

A number of exposures are summed (the user sets how many) and this 'super sub' is then mean combined. This may not prove much use (I am interested to see), as the mean is less tolerant of outliers, which summing can exaggerate.

SUM + MEDIAN

Same as above, but median combined. I think this one will really bring out detail in faint objects, but you will have to be in it for the long haul - I would expect probably 20-40 exposures to get the best out of it (so maybe 10-20 mins, which realistically is how long I spend visually observing a target...).

The hybrid modes will have the benefit of summing (i.e. fattening the histogram) but should remove more of the noise. I'll probably add a sigma reject into the stacking, but I want to get a fully working version out there first before adding extra features like that.
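
As a rough sketch of how the two hybrid modes above could be organised (again an illustration, not the app's code): sum groups of exposures into a 'super sub', then mean- or median-combine the super subs. The group size and names are illustrative.

```python
# Illustrative hybrid combine: sum groups of N exposures into 'super subs',
# then mean- or median-combine those super subs.
import numpy as np

def hybrid_stack(frames, group_size=5, combine="median"):
    """frames: iterable of registered 2-D float arrays."""
    super_subs, current, n = [], None, 0
    for f in frames:
        current = f.astype(np.float64) if current is None else current + f
        n += 1
        if n == group_size:
            super_subs.append(current)      # one summed 'super sub'
            current, n = None, 0
    if not super_subs:
        return None
    stack = np.stack(super_subs)
    return stack.mean(axis=0) if combine == "mean" else np.median(stack, axis=0)
```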

Bad frames will automatically be rejected (by looking at the star quality, the number of stars and the back-projection error from the transformation estimate). There will be an option to turn that on / off. There will also be an 'undo' button, so if it stacks something and for some reason the result has degraded the live view, it will roll the live view back to what it was before. You will also be able to turn the live stacking off (so it will behave as it does currently), and I will also allow you to apply a variety of noise filters to the displayed output.
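
A toy version of the kind of automatic rejection described above might look like the following. The specific metrics and thresholds (minimum star count, maximum FWHM, maximum back-projection RMS) are assumptions for illustration, not Lodestar Live's actual criteria.

```python
# Toy frame-quality gate; the thresholds here are illustrative assumptions.
def accept_frame(num_stars, mean_fwhm, backprojection_rms,
                 min_stars=8, max_fwhm=6.0, max_rms=1.5):
    """Return True if the new exposure should be added to the live stack."""
    if num_stars < min_stars:           # too few stars: cloud, dew or lost tracking
        return False
    if mean_fwhm > max_fwhm:            # bloated stars: poor focus or seeing
        return False
    if backprojection_rms > max_rms:    # registration fit is unreliable
        return False
    return True
```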

Exciting stuff, but there is much more to do. My star extraction currently needs more work, as it fails when the image has been dark subtracted and also fails on other data sets I was testing (there is a part of it I knew was weak, so I am currently working on that bit; I think the majority of it is fine). The star correspondence (i.e. matching stars between frames) seems to work very well, as does the transformation estimation, so I think they are OK. The stack mode maths is pretty simple, so I think that is OK. The image transformation is currently bilinear, but I want to move to bicubic for better quality, and I need to sort a couple of other bits out (notice the black 'border' down the side of the M51 image - that will be resolved). Once all the data sets I have are working in the test harness, I then need to integrate it into the main app code and add the user controls etc., so I suspect it will be a while before this is available.
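
On the bilinear-versus-bicubic point, the difference is only the interpolation kernel used when resampling the registered frame. A hedged example using OpenCV (which Lodestar Live may or may not use) is shown below; the function and argument names are illustrative.

```python
# Resample a frame with the estimated 2x3 affine transform `M`, comparing
# bilinear and bicubic interpolation (OpenCV used purely for illustration).
import cv2

def warp_to_keyframe(frame, M, out_shape):
    h, w = out_shape
    bilinear = cv2.warpAffine(frame, M, (w, h), flags=cv2.INTER_LINEAR)
    bicubic = cv2.warpAffine(frame, M, (w, h), flags=cv2.INTER_CUBIC)
    return bilinear, bicubic
```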

Will keep the forum up to date with progress!


Good stuff :)

Two things as I note you're using mono images:

* I'd advise you to think about debayering at this point. The reason is that each star will have a distinct FWHM value for each colour in the filter. As the star moves over the Bayer matrix you'll find the FWHM values change depending on the colour.

* You'll need to subtract the darks prior to debayering; if you're doing flats, these will need applying ahead of debayering too (a minimal sketch of this calibration order follows below).
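
A minimal sketch of that calibration order, assuming a raw (still mosaiced) frame plus a master dark and flat, with `debayer` standing in for whatever demosaic routine is used:

```python
# Dark and flat are applied to the raw mosaiced data first; demosaic comes last.
import numpy as np

def calibrate_raw(raw, master_dark, master_flat, debayer):
    cal = raw.astype(np.float32) - master_dark       # 1. subtract the master dark
    cal /= master_flat / master_flat.mean()          # 2. flat-field correct
    return debayer(cal)                              # 3. demosaic last
```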

Did you use filtered FFT convergence or a static feature-based algorithm?

I use an FFT-based mechanism - it can recover both translation and rotation. Although susceptible to non-linear atmospheric disturbance, it's possible to use optic flow filtering to remove the portions of atmospheric churn. This works nicely for solar, for example.
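
For readers unfamiliar with the FFT approach, a minimal phase-correlation sketch (translation only; recovering rotation as well would need an extra step such as a log-polar resampling) is given below. It illustrates the general technique, not the poster's actual pipeline.

```python
# Minimal phase-correlation sketch: recover the integer pixel shift between
# two same-sized frames from the phase of their cross-power spectrum.
import numpy as np

def phase_correlate(ref, img):
    cross = np.fft.fft2(ref) * np.conj(np.fft.fft2(img))
    cross /= np.abs(cross) + 1e-12                 # keep phase only
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past the halfway point correspond to negative shifts
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return dy, dx
```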

Btw - if you create a vector space at the same time as doing the star analysis, you can create a PSF deconvolution mesh. Then apply a PSF deconvolution on the fly to each pixel, creating a deconvolved image. The addition to this is that you can do the same for guider images, then apply the integrated PSF deconvolution mesh to the long-term exposures. Software Active Optics :)
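
One simple way to experiment with the deconvolution idea is classic Richardson-Lucy with a single measured PSF; a spatially varying 'mesh' would apply different kernels per image region. The sketch below is an illustration only, not the poster's method.

```python
# Classic Richardson-Lucy deconvolution with a single PSF (illustrative).
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, iterations=20):
    image = image.astype(np.float64)
    psf = psf / psf.sum()                          # normalise the PSF
    psf_mirror = psf[::-1, ::-1]
    estimate = np.full_like(image, image.mean())
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / (blurred + 1e-12)
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate
```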


SkyGibbon, on 26 Apr 2014 - 01:20 AM, said:

Sorry I forgot to mention the pics were done with a SW ED80, 5 - 60 second (unguided) stack for each, adjusted in PS. Shot from a white zone.

You have captured and post-processed some nice "images". I thought that Lodestar Live allows one to adjust their images on the fly in mere seconds for "observing"? I ask this because you mentioned that you forward your images into PS for further adjusting, and this then becomes imaging and not video observing. I am bringing this up because it now seems that not only has the Cloudy Nights video forum converted into a CCD imaging group, which CN already has, but now SGL is turning into an imaging group too, which defeats the purpose of originally starting this group. I would like to see some images posted that use only Lodestar Live for adjusting, the way you would presently see them displayed on your computer, not images that have been later post-processed using CCD processing software.

Chris



Yes, you can adjust on the fly. He was talking about adding stacking in the next release, which is why I posted these.

There are gamma, contrast, brightness and exposure settings in Lodestar Live that can be adjusted on the fly. These pics pretty much represent what I saw; I only saved them darker for stacking purposes.



Oh okay, so you're saying that you only opened the images in Photoshop and stacked them using layering and nothing else whatsoever? I think when stacking is included with Lodestar Live it will be a fantastic way to observe and share. I really am hoping someone like yourself, or anyone else using a Lodestar colour or mono CCD camera, can broadcast this on NSN using Lodestar Live. If they are what you said and look this good, it is a real shame that you owners cannot share these very nice views with the rest of us who just like to watch live deep sky astronomy shows and share some good astronomy chat. It is not hard at all to broadcast on NSN; it is actually free to sign up, and all you need is another free program called ManyCam to display your image, so there really is no excuse for not sharing this awesome new software that Paul generously created, along with a Lodestar, with the rest of the world. One main bonus of video observing is public outreach and teaching, so come on guys, let's get together and share.

Chris A

Astrogate



I do have an account there. That was my first go at it, and I will definitely get it going live as soon as my weather gets better. It's been one day good, six days bad this spring.


SkyGibbon, on 27 Apr 2014 - 1:52 PM, said:

I do have an account there. That was my first go at it, and I will definitely get it going live as soon as my weather gets better. It's been one day good, six days bad this spring.

Okay, that is great, and please let me know if you need any help getting started on NSN. I just posted to Paul what is needed (all free) and how to do it. Please see the post here:

http://stargazerslounge.com/topic/215199-centaurus-a-omega-centauri-and-a-few-others/#entry2306448

Cheers,

Chris A

Astrogate


Paul,

Have you thought about super-resolution (drizzle)? You'd get this (almost) for free.

The alignment mesh would be at a higher resolution; then simply interpolate to create a 2x-sized image. After continuing to stack you should see additional detail appearing in the stacked image.

Nick
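
A crude illustration of the super-resolution idea Nick describes: warp each registered frame onto a 2x finer grid using the alignment shifts and accumulate. Real drizzle also shrinks the input pixels; this sketch, with its assumed OpenCV call and illustrative names, just upsamples and adds.

```python
# Warp one registered frame onto a 'scale'-times finer grid and add it into a
# running accumulator. `dy`, `dx` are sub-pixel shifts from alignment; `acc`
# is a pre-allocated float32 array of shape (h * scale, w * scale).
import numpy as np
import cv2

def accumulate_upsampled(acc, frame, dy, dx, scale=2):
    h, w = frame.shape
    M = np.float32([[scale, 0, scale * dx],        # output = scale * (input + shift)
                    [0, scale, scale * dy]])
    up = cv2.warpAffine(frame.astype(np.float32), M,
                        (w * scale, h * scale), flags=cv2.INTER_LINEAR)
    acc += up
    return acc
```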



Hi Nick,

It's a good idea with super-resolution, but I probably won't think about trying something like that until I at least have a solid live stacking version tried, tested and well used out in the field. I have played with super-resolution in the past to enhance the detection performance of something beyond what its optics should in theory allow, and had mixed results. Having said that, it was never truly finished off, so it no doubt had bugs and other issues that affected the outcome!

Paul


I see in another thread that DoctorD got confused about what he was looking at because his finder was badly aligned.

Wouldn't it be great if you could integrate LodestarLive with the plate solving engine from Astrometry.net?


I also agree, it would be a useful add-in. This wouldn't be on any short-term roadmap, but I certainly won't rule it out in the future (probably as an optional plug-in).

I have been quite busy with the live stacking; I have a number of data sets which are proving very useful, and things are starting to come together nicely in the standalone test harness. I'll post some updates at the weekend.


The straightforward implementation of that would be to take the key frame and pass it to a separate thread with astrometry.net (or a local astrometry.net instance if there's enough CPU power, perhaps dropping the priority in favour of the real-time stacking), then allow the stacking / viewing etc. to occur whilst it's doing the analysis. The subsequent frames aren't going to differ much in location, so once the solve is complete, annotate on an overlay.
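
A sketch of that arrangement in Python, where `plate_solve` is a placeholder for whatever astrometry.net call ends up being used (it is not a real binding) and `overlay.annotate` is likewise hypothetical:

```python
# Run the plate solve on a background worker so stacking and display carry on,
# then annotate an overlay when it completes. The solver call is a placeholder.
from concurrent.futures import ThreadPoolExecutor

solver_pool = ThreadPoolExecutor(max_workers=1)

def plate_solve(key_frame):
    # Hypothetical: hand the key frame to a local astrometry.net instance
    # (or the web service) and return the resulting WCS solution.
    raise NotImplementedError

def on_new_key_frame(key_frame, overlay):
    future = solver_pool.submit(plate_solve, key_frame)
    future.add_done_callback(lambda f: overlay.annotate(f.result()))
```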


Looks great Paul - will this work with the Lodestar X2 Autoguider?

Yep, it sure will - if you check out some of nytecam's recent posts, he is using the X2. So far the extra sensitivity definitely shows; I am looking forward to seeing what it can do coupled with the live stacking mode!


Another quick update on the progress of live stacking! Quite a few improvements have gone in and things are starting to come together quite well now, with most test cases running with pretty good results (thanks to Paul and Martin for kindly sending me test data, which has been massively helpful). I still have a few issues to sort out, so it is not quite done yet. I have been working in a standalone test harness rather than the main code body, so once the current lines of work are complete I need to tidy the code, integrate it into the main code and add all the UI elements. Although the stacking code works in full colour, the test harness only outputs monochrome FITS files for simplicity (which I then load into Lodestar Live and apply display processing to - that is what is posted here).

To clarify, although the stacked versions of these images have been processed offline (by my own code), this is what you will see live in Lodestar Live, after the total continuous observing time indicated, once the work is complete.

M1 Crab Nebula raw 30s image:

[image attachment]

M1 Crab Nebula after 210 seconds of observing:

[image attachment]

IC434 Horsehead Nebula Raw 30s image:

[image attachment]

IC434 Horsehead Nebula after 210 seconds of observing:

[image attachment]

NGC4490 / NGC4485 raw 30s image:

[image attachment]

NGC4490 / NGC4485 after 10 mins of observing:

[image attachment]


Oh, I forgot to add that the NGC 4490 / NGC 4485 image had increasing amounts of interference from the moon over the 10 min observing period - conditions were not ideal! I was quite pleased when I saw the result!


Excellent results, Paul. What you are doing is wonderful and well appreciated. I have an X2 on order after seeing the results that you and Nytecam are getting. Will the stacking feature be included in the Mac version? As a Mac user, I would also like to express my appreciation to you in providing Lodestar Live for Mac. I live in Hawaii so your work is spreading around the world. Thanks for all of your efforts. I can't wait to try out the Lodestar and Lodestar Live.


Thank you for your kind comments! I am actually quite humbled that my little app has made it from the UK all around the world, from Hawaii to Australia!  :grin: I am just pleased other folks can get some use from it and enjoy the night skies.

I was in Hawaii a couple of years ago for my honeymoon, and naturally visited Mauna Kea. That was an evening I will never ever forget, the sunset at the summit was stunning and the night sky was something else. The thing that sticks in my mind the most is the crystal clear Milky Way, and seeing it all the way to the galactic core. I need to see that again!!

I can definitely confirm live stacking will be in the Mac version. All the code is cross-platform compatible, and in fact my primary development platform is the Mac. The lack of Mac-based software was one of the reasons I started on the project. That, and although there is software out there to do what Lodestar Live does, it's all too complicated. Astronomy is my hobby; I want to enjoy it and not endlessly be fiddling with multiple software packages.

Hope your camera arrives soon, looking forward to seeing those Hawaiian skies again!  :laugh:
