About AlexK

  • Community Reputation: 20 (Excellent)
  • Followers: 1
  • Interests: Astronomy, Android programming, scuba diving, windsurfing
  • Location: California, USA
  1. I hope this isn't off-topic yet. From the formula I provided (AFOV × f = FOV × F) you can derive that AFOV = FOV × (F/f) = FOV × magnification. So you don't need to know the focal lengths of the objective and eyepiece; the magnification is enough. If the app is unaware of zooms, just make the numbers up: e.g. your 15× = 15/1 = 300/20 = 225/15; any such combination will have the same effect on the eyepiece FOV rings drawn on the star chart. To determine the true FOV (if it's not provided in the documentation, or to find it more precisely), the best method is timing a star's transit across the field: γ = 15.04″/s × β × cos δ, where β is the measured transit time in seconds, δ the star's declination, and γ the FOV you are trying to find. For convenience, just observe a star travelling a → b horizontally across the field (near the meridian).
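Both relations in this post are easy to check numerically. A minimal Python sketch (the function names and the 120 s / 15× example numbers are mine, for illustration only):

```python
import math

SIDEREAL_RATE = 15.041  # sky drift in arcsec per second of time

def eyepiece_afov(true_fov_deg: float, magnification: float) -> float:
    """AFOV = FOV * (F/f) = FOV * magnification, from AFOV * f = FOV * F."""
    return true_fov_deg * magnification

def true_fov_from_transit(transit_s: float, dec_deg: float) -> float:
    """True FOV (degrees) from a timed star transit across the field diameter.

    A star at declination dec drifts at 15.041 * cos(dec) arcsec per second,
    so FOV = transit_time * 15.041 * cos(dec). A star near the celestial
    equator crossing the field horizontally gives the best accuracy.
    """
    arcsec = transit_s * SIDEREAL_RATE * math.cos(math.radians(dec_deg))
    return arcsec / 3600.0

# Example: a 120 s transit of an equatorial star gives ~0.50 deg of true FOV;
# at 15x that corresponds to a ~7.5 deg apparent field ring.
fov = true_fov_from_transit(120.0, 0.0)
print(round(fov, 3), round(eyepiece_afov(fov, 15.0), 2))  # prints: 0.501 7.52
```

Note that the declination term matters: timing a star far from the equator without the cos δ correction overestimates the field.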
  2. Or, also in DSO Planner, but now with virtual 12" binoculars: it's easy to fake the AFOV circle of the binoculars knowing the equation EpAFOV × EpF = ScopeFOV × ScopeF (or AFOV × f = FOV × F). To the OP: I had no idea it was possible to sketch so nicely with white ink! By the way, I experimented with sketching on my Galaxy Note 3 with its Wacom S Pen some years ago, though only virtually. The CN folks said it came out quite interesting:
  3. Wow! I didn't know that. I'll study the StarSense tech. Thank you for the tip!
  4. Wait a sec, but the PoleMaster (QHY and Celestron) has a USB connection too; it also has to be plugged into a real computer to work. The handset is just a microcontroller and doesn't have enough computing power to plate-solve images. And that's logical: otherwise the price of a mount with it would be bumped up by the price of a decent smartphone.
  5. We have so many starry new-moon nights in California that, after just two cloudy ones in a row, I borrowed my son's Gear VR and bought a 3D VR planetarium app. It has its own dark-adaptation-safe campfire! Add to that a little robotic telescope farm in Chile...
  6. Enjoy! By the way, we are readying the new version, which will have even more control over custom databases!
  7. I guess I'm the best person to troubleshoot that with you at the moment. I just tried it on my Note 4, and everything seems to be working fine. Here is my Object Selection screen; the top portion is all you may need to tweak. See if something is different on yours. Most importantly, make sure the Object Types list has all of them checked. I believe you need only the "Custom" option checked, as that's the custom objects database, but having them all checked does no harm (and it takes just a single tap at the top of the Object Types list). Also keep in mind that for stellar objects (which have no a/b dimensions in the database, as all DSOs do) the Blackwell visibility model doesn't work well, so I would probably change the Filter type (under "Primary search parameters") from "Visibility..." to "Max magnitude...". For me, though, it works as-is even when selecting 15k variables (the maximum allowed). To test that it's working, constrain the search, for example by raising the minimum altitude to 80 degrees (I get 2898 stars over 4 hours with my z12 scope). Hope that helps.
  8. Ahh! I missed that turn. Then, sure thing, this is probably the best solution.
  9. That ASPA is a much more viable idea, indeed: https://www.celestron.com/pages/all-star-polar-alignment# Thank you for sharing! Though it's not helpful for the OP.
  10. Oh! Wait a sec. How did you import your list into DSO Planner? If you used the Observation List import feature, then the objects will never change with visibility: the Observation List is simply what you have selected to observe via one of several selection methods (import is one of the five available; the Object Selector is another). To use the automatic "Object Selector" facility of the app (the only method that filters objects by their visibility score), you need to import your objects as a User Database. Only then do they become full-fledged objects visible to the "Object Selector" on a par with the integrated objects.
  11. First of all, make sure the object layers feature is disabled. It's enabled by default to avoid confusion (so the app looks like an ordinary star chart): in the Star Chart, open the menu and uncheck the two checkboxes under "Star, object and image layers" / "Object layers". I know, that's confusing; I need to persuade my colleague to disable it by default. Then make sure you have set up and selected your telescope and eyepiece. The default, I think, is an 8", which is powerful enough to show your objects (which are rather easy, IIRC) even under heavy light pollution. Finally, don't look for changes on the chart, as you may miss them; see whether the selection changes in the Selection List (the total number of selected objects is shown at the bottom). Also keep in mind that objects are removed only when they fall below the visibility threshold you have set, so instead of disappearing from the list they may just degrade their visibility score and remain listed (the visibility score is shown per eyepiece in the Object Details screen). Try setting the score to 5 in the selector while experimenting.
  12. I'm just looking at the image of the new mount provided. The iPolar is nothing more than a factory-integrated PoleMaster: there is no pole-finding eyepiece, just USB ports on that end, and there are no other possible openings for a camera not looking at the pole. Website? Polaris itself is a very small object, so yes, it can be obscured, but the polar region must still be visible in its vicinity. The FOV of the PoleMaster camera is 10 degrees, so the pole could be no more than 5 degrees below the edge of the roof (a palm's width, which is nothing); otherwise there are no stars visible for the camera to plate-solve. Yes, it is possible to make a wider-FOV camera for that, but that would degrade the alignment accuracy, which is already only about 5 arcmin (the stated 30 arcsec maximum is just the camera's pixel pitch).
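The geometry argument here is easy to sanity-check. A hedged sketch (the 1200 px sensor width is my assumption, chosen only because it reproduces the quoted 30 arcsec pixel pitch; the real PoleMaster sensor may differ):

```python
def pixel_scale_arcsec(fov_deg: float, sensor_px: int) -> float:
    """Plate scale per pixel for a camera spanning fov_deg across sensor_px pixels."""
    return fov_deg * 3600.0 / sensor_px

def pole_region_visible(pole_depth_below_edge_deg: float, cam_fov_deg: float) -> bool:
    """With the camera centred on the pole, an obstruction can hide the pole by
    at most half the camera FOV before no reference stars remain to plate-solve."""
    return pole_depth_below_edge_deg < cam_fov_deg / 2.0

print(pixel_scale_arcsec(10.0, 1200))   # 30.0 arcsec/px, matching the stated pitch
print(pole_region_visible(4.0, 10.0))   # True:  pole 4 deg behind the roof edge is OK
print(pole_region_visible(6.0, 10.0))   # False: more than half the FOV is obscured
```

This also shows the trade-off mentioned above: widening the FOV with the same sensor raises the arcsec-per-pixel figure and so degrades the achievable alignment accuracy.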
  13. That's nonsense. The iPolar is just a cheap camera at the end of a small telescope in the polar axis, so it cannot polar-align without a clear view of the pole. Actually, there is a simple computational way to do it, but implementing it would mean the iOptron CEM40 should be priced not at a pathetic $2,000 but at $4,000, since it would require two additional motors and gear trains to drive the polar axis for "hassle-free" alignment without a direct view of Polaris... Even the cheapest GoTo system, even an alt-az one, can adjust its motion model to compensate for poor (or no) polar alignment, but in most cases that's insufficient for AP due to field rotation and the overall tracking accuracy provided.
  14. The star-drift polar alignment method has been in astronomers' toolboxes since ancient times. If you don't have a laptop for it, just do it manually (see this guide , for example). Tedious, yes, but with the artificial-TNP method I described earlier it's needed only once; after that, the polar scope can be fooled well enough for your typical backyard AP subs.
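For the manual run, the polar-axis error can be estimated from the measured declination drift with a small-angle formula. A sketch, assuming the usual drift-method geometry (a star on the meridian near the celestial equator for the azimuth error; a star low in the east or west for the altitude error); the example numbers are made up:

```python
import math

SIDEREAL_RATE_ARCMIN_PER_MIN = 15.041  # sky rotation, arcmin per minute of time

def polar_error_arcmin(drift_arcsec: float, elapsed_min: float) -> float:
    """Estimate polar-axis misalignment from a drift-alignment run.

    Small-angle approximation: the declination drift accumulated over time t
    satisfies drift = error * omega * t (all angles in radians), so
    error = drift / (omega * t). This works out to the familiar
    ~3.81 * (arcsec of drift per minute) rule of thumb.
    """
    omega_t_rad = math.radians(SIDEREAL_RATE_ARCMIN_PER_MIN / 60.0 * elapsed_min)
    return (drift_arcsec / 60.0) / omega_t_rad

# 10 arcsec of Dec drift over a 5-minute run -> ~7.6 arcmin of azimuth error
print(round(polar_error_arcmin(10.0, 5.0), 1))  # prints: 7.6
```

Running it twice (meridian star, then an eastern/western one) gives the two correction angles to dial into the mount, which is exactly what the guided drift procedure iterates on.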
  15. Yep, your question has already been answered above: stars are focused with the front thread, the cross with the EP thread. Yes, I've dealt with many illuminated eyepieces in finders, and with telescope and microscope EPs (though nowadays I prefer a Telrad for pointing my Dob and rarely use the optical finder; by the way, my current finder has no illumination, as it's not needed in the grey/dark light-pollution zones I exclusively observe from, where the sky background is perceived as bright enough against pitch-black crosshairs). That's a known problem, especially in high-power eyepieces.
The reason illumination makes the crosshairs fuzzy is the mechanical construction of illuminated crosshairs. Usually it's a piece of glass or plastic with the reticle lines carved in from one side (usually the eye side) to some depth. The light shines in from the side and reflects off the walls of these deep "canyons" into your eye. The walls have non-zero height, so parts of them sit significantly away from the focal plane, creating "ghosts" susceptible to parallax. It's hard to fight that, unless you accept the flaw and work around it: visually find the brightness at which you can see a straight edge of the "canyon" to use for focusing and aligning, and later for pointing. Some "canyons" have a centre ridge; some are only fully focusable on their rims.
By the way, non-illuminated reticles are just painted on the glass, so they are almost immune to the problem. They are also a good indicator of your dark adaptation: if you can't see them against the sky, you are not well dark-adapted yet, so it's better to address that first. In the worst case, when you can't dark-adapt well enough due to light pollution, use the finger trick: put a bare finger in front of the finder's objective end, and the AFOV background will be lit by it brightly enough thanks to the light pollution (just make sure the finger is exposed to the surroundings, not inside the finder), revealing the crosshairs' shadow.
Also, you can try to find the best reticle focusing spot by minimising parallax instead of trying to focus on the crosshair features. Just remember about the SA: do that with focused stars near the centre of the crosshairs. Finally, at any rate, I wouldn't worry too much about the finder's accuracy as long as you can reliably get the target into the main EP with it.