
Atacamallama

PHD guiding "simulator"


I have been testing PHD and my mount by using the simulator option in the camera choice.

My question is: is this a full simulator that will also give commands to the mount? I am asking because when I try to connect up to the mount and start guiding (simulated of course) it tells me after a while that RA calibration failed because the star did not move enough. So either there is a problem with the connection to the mount, or the simulator does not actually try to command the mount.

TIA for any input.

FWIW I'm using a Vixen Sphinx mount and have tried both the "on board" mount mode through the autoguider port, and the ASCOM option.

ok, but "on the night" you are able to guide, right?

Sorry, didn't make myself clear. Yes, it does guide perfectly on the night.


What would be cool would be the ability to play back some of the video in a loop. So, for argument's sake, you could play back a clip of the same stars you were tracking the previous night, fine-tune the settings in PHD/EQMOD, and compare the results with the live session the night (or nights) before.

Obviously this would depend on the disk space in the PC, as the video files could be quite large, but it would be the ideal way to simulate and tune the setup IMO.


Hmmm,

guiding on a star, is like... guiding on a star.....

The SGE provides a star "travelling" at sidereal rate across the screen... your guiding system has to work to keep it centred.
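For a sense of scale, the drift such an emulator has to present: an untracked star crosses the field at the sidereal rate, about 15.04 arcseconds per second of time. A minimal sketch, with the pixel scale an assumed example value (not SGE's actual figure):

```python
SIDEREAL_ARCSEC_PER_SEC = 15.041  # apparent sidereal rate of the sky

def drift_px_per_sec(arcsec_per_px=2.0):
    """How fast the emulated star crosses the guide frame.

    arcsec_per_px is an illustrative guide-camera pixel scale,
    not a value taken from SGE.
    """
    return SIDEREAL_ARCSEC_PER_SEC / arcsec_per_px
```

At an assumed 2 arcsec/pixel the star drifts roughly 7.5 pixels every second, which is why the guiding system has to work continuously to keep it centred.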

If you have access to a laptop you could try the SGE program

SGE Star Guiding Emulator

I find it great to help sort things out before the clouds clear ;-)

That's an interesting piece of software, I haven't seen that before. I think I'll have a play with that.

That's an interesting piece of software, I haven't seen that before. I think I'll have a play with that.

Yes, but it's not very practical, especially as the goal is to get ideal polar alignment first before trying to guide... and if you have the scope permanently mounted in an observatory, or don't have the luxury of a second computer, then it's a bit pointless.

I still think the idea of replaying captured footage in PHD would be a better way

I still think the idea of replaying captured footage in PHD would be a better way

That's a very good idea. It would help you see what was going wrong and how to improve the guiding.

Mark


Having said that...most of the issues/ concerns/ headaches/ pain we hear about on the forum about guiding, boil down to finding the "right" settings for PHD for the guide camera/ scope arrangement. Wouldn't it be nice to have all that sorted out before you spend the time "for real" under the stars....

Having said that...most of the issues/ concerns/ headaches/ pain we hear about on the forum about guiding, boil down to finding the "right" settings for PHD for the guide camera/ scope arrangement. Wouldn't it be nice to have all that sorted out before you spend the time "for real" under the stars....

My point exactly.

I know there is an option to select "simulator" for both camera and scope, but having real footage would allow you to compare the results from a previous live session, thus tuning it to your own setup.


OK.

Which "footage" from PHD would you record? The screen, the graph, and/or the settings summary screen?


I was thinking of the footage from the camera... it would need to be compressed in some format so the HDD wouldn't fill up with large files. It would be cool to have the option to check a box on the menu to record footage for, say, 5 min, 10 min, etc. Then, when selecting the simulator, you'd have the option to choose "from recording" and play back the captured video, allowing you to start tracking on it as if in real time, and watch the graph to see how the changes to settings have performed.
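The disk-space concern is easy to put rough numbers on. A back-of-envelope estimate, where every figure (frame rate, resolution, bit depth, compression ratio) is purely illustrative:

```python
def recording_size_mb(minutes, fps=10, width=640, height=480,
                      bytes_per_px=1, compression_ratio=5):
    """Estimate the on-disk size of recorded guide-camera footage.

    All default figures are illustrative guesses: a typical mono
    guide cam at 640x480, 8-bit, 10 fps, with 5:1 compression.
    """
    raw_bytes = minutes * 60 * fps * width * height * bytes_per_px
    return raw_bytes / compression_ratio / (1024 * 1024)
```

With these assumed numbers, ten minutes of footage comes to roughly 350 MB even after compression, so the format chosen would matter a great deal.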


If I understand you correctly....

Without PHD, just make a "recording" of a star - moving across the guide camera FOV due to the tracking errors...

Then open this recording under PHD and use the settings in PHD to "lock on"

These would then be your "optimum" PHD settings for future runs.

Yes/ No?

Having said that...most of the issues/ concerns/ headaches/ pain we hear about on the forum about guiding, boil down to finding the "right" settings for PHD for the guide camera/ scope arrangement. Wouldn't it be nice to have all that sorted out before you spend the time "for real" under the stars....

In my experience most guiding issues/concerns/headaches and pain boil down to flexure, too much faith in sub-pixel guiding (i.e. not guiding at a sufficient resolution), and introducing lags into the control loop by using too long a guide exposure.

Unless you're going to be imaging the same object every night, I'm not at all convinced it is practical to determine a single set of "optimal" PHD parameters. The problem is that guiding is a closed-loop control system, and as such the control parameters all interact at some level. Also, because PHD only measures star movement in terms of pixels, a recalibration is required each time you change target declination. The recalibration naturally adjusts parameters internal to PHD that affect the corrections being applied, but this in turn may also affect the responsiveness of the parameters available. So, for example, the overall effect of a minimum pixel movement or hysteresis setting may well vary at different declinations.
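The interaction between pixel thresholds and declination can be illustrated with a toy correction step. This is a minimal sketch of the general idea, not PHD's actual algorithm, and every name and number in it is hypothetical:

```python
import math

def ra_correction(error_px, prev_smoothed, dec_deg,
                  arcsec_per_px=2.0, min_move_px=0.15, hysteresis=0.1):
    """Toy RA correction step (illustrative only, not PHD's code).

    Blends the current pixel error with the previous smoothed error
    (hysteresis), ignores movements below a minimum-move threshold,
    then converts pixels back to sky motion. Because an RA error of X
    arcsec moves the star only X*cos(dec) pixels, the conversion
    divides by cos(dec): the same pixel thresholds behave differently
    at different declinations.
    Returns (correction_arcsec, new_smoothed_error).
    """
    smoothed = (1 - hysteresis) * error_px + hysteresis * prev_smoothed
    if abs(smoothed) < min_move_px:
        return 0.0, smoothed  # below threshold: issue no guide pulse
    correction = smoothed * arcsec_per_px / math.cos(math.radians(dec_deg))
    return correction, smoothed
```

With these assumed numbers, a one-pixel error calls for twice the RA correction at dec 60° that it does at the celestial equator, so settings tuned at one declination won't transfer directly to another.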

Chris.

I was thinking of the footage from the camera... it would need to be compressed in some format so the HDD wouldn't fill up with large files. It would be cool to have the option to check a box on the menu to record footage for, say, 5 min, 10 min, etc. Then, when selecting the simulator, you'd have the option to choose "from recording" and play back the captured video, allowing you to start tracking on it as if in real time, and watch the graph to see how the changes to settings have performed.

I don't understand this. If you play back footage from a previous session, then the star movements in that footage are only appropriate for the parameters in place at the time. If you change parameters during "playback" you are changing the mount's movement, but the footage itself isn't going to respond accordingly, is it?

Chris.


I agree with Chris..

SGE is the way to go if you want to try something like this...

Edited by Psychobilly


I did knock up a rather rough guiding simulator for use in EQMOD development. Essentially it just displays a star image on a black screen that moves around randomly (several images are used to vary intensity and shape/focus). The app also reads the mount's RA/DEC motor positions to further adjust the image position. I then use a standard webcam aimed at the screen to provide a feed into PHD, which is set to pulse-guide the EQMOD simulator. Of course, the purpose of this was only to test out EQMOD, not to optimally tune PHD/EQMOD using real data.
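The position update at the heart of such a test bed might look like the sketch below. Every constant here (pixels per motor step, jitter, screen centre) is an invented placeholder, not a value from the actual EQMOD tool:

```python
import random

def star_screen_position(ra_steps, dec_steps, home_ra, home_dec,
                         px_per_step=0.02, jitter_px=1.5,
                         centre=(320, 240)):
    """Where to draw the simulated star each frame (sketch only).

    The star sits at screen centre, offset by the mount's motor
    movement since 'home' (so guide pulses visibly shift it) plus
    Gaussian jitter standing in for random motion/seeing. All
    constants are illustrative placeholders.
    """
    x = centre[0] + (ra_steps - home_ra) * px_per_step + random.gauss(0, jitter_px)
    y = centre[1] + (dec_steps - home_dec) * px_per_step + random.gauss(0, jitter_px)
    return x, y
```

Pointing a webcam at a screen redrawing the star at this position closes the loop: PHD sees the jitter, issues pulses, the motor counts change, and the star moves back toward centre.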

I suppose with some effort I might be able to make this test-bed simulator into something a little more useful. I'm thinking what we would need is an unguided PHD log (so PE, drift and seeing) as an input to modify the displayed star position. The "trick" will be calibrating the displayed screen image to reproduce the correct movement in the guide image.
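Feeding an unguided log into such a simulator might start with a parser like this. The three-column `time,dx,dy` format is an assumption for illustration only; a real PHD log has its own layout and would need its own parser:

```python
def load_drift_track(path):
    """Parse a hypothetical unguided log into (time, dx, dy) samples.

    Assumes a simple comma-separated 'time,dx,dy' line format with
    '#' comments -- an invented stand-in for a real PHD log. The
    returned samples could then drive the displayed star position
    frame by frame, replaying the recorded PE, drift and seeing.
    """
    track = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith('#'):
                continue
            t, dx, dy = (float(v) for v in line.split(','))
            track.append((t, dx, dy))
    return track
```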

Chris.


Chris, thanks for your input, most of which went right over my head, apart from the obvious that I clearly missed.

It would be no good using the recorded footage whilst guiding... I now see that. So what's needed would be to record the star field through the same guidescope while the mount is in normal tracking mode, and then be able to load this into PHD and run it with the scope in the approximate position for the intended target. You could then at least tune the settings in PHD in the daytime so that the calibration settings and graph are such that you could start guiding almost straight away under live conditions at night.

As far as I can tell there is no way of playing back an AVI/MPEG video within PHD. I'm still finding out what EQMOD can do. Maybe offering guiding from within the application, rather than having a secondary application such as PHD, would be the ideal way forward, and if so, the option to use a video recording for daytime testing could be included?

