
Lodestar Live Display Processing Videos


Paul81


Hi All,

As there are a lot of new users and chat on the forum about display processing and colour in Lodestar Live, I thought I would make a few short videos on the subject.

Apologies for the extremely amateur production; I didn't even write a script, so there are probably more 'errs' and 'erms' than I would like. In these videos I have used my testing mode to take user FITS files (captured using Lodestar Live) and feed them into the program as a simulated camera. Although these are only quick videos, hopefully they give a sufficient introduction to the concepts, which can then be applied when out observing.

First up is a video on colour balance, in which I am using a flat field (provided by HiloDon) as a demonstration:

http://youtu.be/a3fMi6xBmaE

Second is a short demonstration of processing a live stack of M1 (provided by DoctorD):

http://youtu.be/GuBrMcLPPXA

Third, here is a video showing some of the forthcoming changes in V0.11 - I hope these changes will make things more intuitive:

http://youtu.be/bwQvU3Zo_Pg

Even though the controls are designed to be as simple as possible, it does take a little practice. Don't forget that if you export your raw FITS images in your next LL observing session, you can load one in using the '-load-image' and '-load-image-cygm' command line arguments and practise display processing - see my previous post http://stargazerslounge.com/topic/213955-lodestar-live-version-08-beta-download/?p=2291414.

Thank you all again for your kind words and support for LL. I am hoping to get some time soon to finish off V0.11 and get it on the forum for use. In the meantime I look forward to seeing and reading about the results of folks observing sessions, and it is great to see a wide community of people getting into this type of observing!  :grin:

Cheers

Paul


Hi Paul,

Really great tutorials.  Thanks for all of your efforts.  I realize now that I didn't really know what I was doing.  Looks like 0.11 will work a lot better.  Can't wait to try it.

Just one question: if I have a FITS file, how do I open it in LL? I tried opening the file by selecting 'Open with' LL, but only the program comes up, not the image.

Thanks,

Don


Hi Paul,

The videos are very helpful - thank you! Is it possible to download them from somewhere, so that one can watch them offline?

Is it possible to stack multiple FITS files in test mode? I have several hours of FITS files from the other night, and it would be nice to play with them now that the clouds have moved in. I would also be happy to send you some FITS files, if that would help; just let me know where to send them.

Thanks again,

--Dom


Hi Paul

Thanks for the videos - looks like I've been using Lodestar Live correctly!!

V0.11 looks like it will make things even easier, giving more time to enjoy the images.

That M1 data was one of the first things I captured when I got my Lodestar-C, so I'm looking forward to viewing it with live stacking.

Clear skies

Paul 


Hi Paul or Don,

I didn't have any FITS files handy, so I downloaded one of the flats Don had created to try out the command line loading you mention:

open -a LodestarLive.app --args -load-image-cygm Image_Flat.Field_2014.8.24_11.11.54_00056.fit

This DOES open LodestarLive, and seems to try to load the file (I put it in the same folder for simplicity), but then I get a string of errors:

[bool FITS::FITSReader::load(const QString&)] Operation m_kFitsFile.open (QIODevice::ReadOnly) FAILED

[bool Data::Image::loadFITS(const QString&)] Operation kFitsFile.load (FILENAME) FAILED

Is it something I've done wrong with the command line in Terminal, or with the FITS file, or... ?

Cheers,

- Greg A



Just a guess, but I suggest you use the complete path to the file... hope it helps.

Martin


Hi Paul and All,

After my attempts yesterday (see http://stargazerslounge.com/topic/225395-first-light-with-lodestar-x2c-and-llive/ Post #8), I did more thinking about how one could increase colour saturation. The conclusion I reached is that it would be difficult, if not impossible, in RGB or in any other system that doesn't have an explicit luminosity component.

The reason is that colour saturation is a variable that is (or should be) independent of luminosity, so one needs a way to keep luminosity constant while increasing saturation. In RGB, increasing the brightness of any of the colour components automatically also increases luminosity.

The best way I could come up with involves temporarily converting into the cylindrical HSL coordinate system. In this system the angular coordinate, measured around the circumference of the circle/cylinder, is hue; the radial component is saturation; and the axial component is luminance.

The plan of action would be as follows.

1. Convert every pixel's RGB values into HSL coordinates. (There is a formula giving one-to-one correspondence.)

2. Increase the radial "saturation" coordinate of every pixel by the desired multiplier (ideally, implemented with a slider).

3. Convert the new HSL coordinates back into RGB.

I believe the additional computational burden should be manageable for the relatively low number of pixels of the 1/2" sensor.

If not, then we could think about working out the formula that does the same thing directly in RGB coordinates. I am not sure, though, whether this offers any real saving in processing time, since all three coordinates of every pixel would still need to be recalculated. As an example, a not fully saturated green pixel also has non-zero red and blue components; increasing saturation means increasing the green component while, at the same time, decreasing the red and blue components so that the overall luminosity of the pixel remains constant.
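For anyone who wants to play with the idea, the three steps above can be sketched for a single pixel using Python's standard-library colorsys module (this is purely illustrative - it is not how LL is implemented, and colorsys uses the order hue, lightness, saturation):

```python
import colorsys

def boost_saturation(rgb, factor):
    """Scale a pixel's saturation by `factor`, keeping hue and
    luminosity (lightness) unchanged. `rgb` is a tuple of floats
    in [0, 1]; saturation is clamped so it stays in [0, 1]."""
    h, l, s = colorsys.rgb_to_hls(*rgb)   # 1. convert RGB -> HSL
    s = min(1.0, s * factor)              # 2. scale the saturation coordinate
    return colorsys.hls_to_rgb(h, l, s)   # 3. convert back HSL -> RGB

# A partially saturated green pixel: boosting saturation raises the
# green component and lowers red/blue, while luminosity stays put.
pixel = (0.4, 0.6, 0.4)
print(boost_saturation(pixel, 2.0))  # roughly (0.3, 0.7, 0.3)
```

Applying this per pixel with the factor driven by a slider is exactly the plan described above; the red and blue components fall as the green rises, so the lightness coordinate is preserved.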

Cheers!

--Dom


Hi Dom and All,

I, too, applied Paul's instructions to last night's viewing. I was pleased with the process and most of the results, but I agree that I couldn't get the color saturation up. The only way I could was to open the image file in a photo processing program and tweak it. Having said that, I am still pleased with what I see on the screen and with the simplicity of the system and process. I am still amazed to see the Hicksons and the Horsehead in a matter of seconds from the comfort of my viewing room. I'm still working on getting the color balance right, but here are a few captures from the last few nights using Paul's instructions. All the images can be seen at this link:

http://stargazerslounge.com/gallery/album/3358-new-lodestar-x2c-images/

post-36930-0-80996900-1411093995.jpg

post-36930-0-11537400-1411094022.jpg

Here's M27 with increased saturation post processed.

post-36930-0-81693900-1411094111.jpg

Please let me know what you think and if you have any suggestions. I feel I've made some progress, but have a lot to learn. Dom, I think Paul discussed doing something with the RGB system, and hopefully he will respond to your post.

Don


Nice shots Don!

Even on the unprocessed M27 you got more teal than I did with twice the exposure and several stops faster optics. It may simply be that the atmosphere here in Seattle is filtering out some of those colors. It has clouded over here now, so we may not see any more photos from Seattle until July 4. Next week I'll go back to New England and will try from there.

LL is a very nice product and opens up entirely new horizons for the Lodestar. My input is user feedback, not criticism.

--Dom



Thanks, Dom.

I don't think your input was criticism. I'm not knowledgeable in the things you were discussing about RGB, and it may be something that Paul is aware of and working on. I'm sure he appreciates the input.

I think you're right on the sky conditions. When we have clear skies, it is really clear. We can see the Milky Way in all its glory. Best of luck in New England.

Don


Glad to see the videos have proven helpful. I was also interested in feedback on the new controls in V0.11, as it's always a worry changing 'established' features and controls.

Nice set of images there Don - that M27 is really nice and would easily stand out in the deep sky astrophotography forum!  :laugh:

Dom - all user feedback is really appreciated. Astronomy is a great hobby with a smashing community, and it has always been my desire to grow LL with the community, as collectively we have better ideas on useful features than I can come up with on my own. Your suggestion with regards to saturation is totally correct and indeed what I am working on (I have implemented something similar in a product for work in the past). The actual mathematical part is the easy part to add into the code; the user interface is trickier, as I want to keep it simple and not have too many extra tabs, while still keeping the app quite compact - this was an original design goal, as a lot of people use netbooks or laptops with small screens, so I didn't want a bloated UI. I then have to optimise the algorithm in SIMD assembler to keep the speed up (I need to fix some existing V0.11 code in this respect).

My main issue at the moment is time - the day job is very full-on at the moment, which is frustrating as I really want to get the next wave of features in. Will get there eventually though!  :grin:


Thanks for the kind words, Paul. I patiently await your next release but am very happy with the present version. Your videos were really helpful and I appreciate your dedication to the astronomy community. BTW, my wife overheard the video when I was watching it and loves your accent.

Take care and thanks for all your efforts.

Don


Archived

This topic is now archived and is closed to further replies.
