All Activity

  Past hour
  2. Just a quick progress report on the results of a first experiment with 'potentially-live' LRGB using LAB colour manipulations. Still some way to go, and the whole process needs a lot of fine-tuning. The reason I'm using LAB is that it allows easy separation of luminance and chrominance and seems to do the job of preserving hues regardless of stretch etc., which isn't easy to achieve in RGB space. (I've never used Photoshop, but I believe it does everything in LAB internally?)

The L component comes from the subs captured with a clear filter, plus synthetic luminance from the subs collected with RGB filters (there will be various options on whether to use synthetic luminance and how to combine it with real luminance, but I wanted to test the full monty). All the black/white point setting and stretching is then applied to this 'total' L component. The RGB channels are individually stacked, then gradient removal is applied, the background subtracted, and the channels combined using a colour-balancing approach based on equalising the brightness of a bright feature in each channel. This is a poor man's G2V technique for the moment but is said to work well 95% of the time. The RGB channels are then transformed to LAB space, and the L channel is thrown away and replaced by the total L component (the one which has been stretched). That delivers LRGB.

But the interesting bit is to then change the chroma saturation using simple manipulations of the A and B components. This is what I'm showing below. Obviously one wouldn't necessarily go so far in saturation normally, but it is interesting to see it at work. One thing you can see in the high-saturation case is the appearance of rings on the brighter stars. The centres were probably saturated in the original subs, but in any case the effect can be ameliorated by smoothing the A and B components (like binning, but preserving the original image dimensions). This is done in the lower-right image.

What I haven't yet done, but think is needed, is to align the different colour channels separately, as one can see the rings are somewhat offset from the centres. The main reason for doing this pilot is to see if it can be done in real time, as there is a lot of computation just to do something simple like alter saturation. It looks like for Lodestar-sized images the manipulations can be done in under 200 ms on my 2015 MacBook Air, so that is usable. Obviously, things like stretching will also slow down when using LAB mode because of the need to convert between colour spaces constantly.

The entire process is automatic: all the program needs to know is the filter used for each sub. Of course, the user will be able to control things like saturation and configure other aspects, but if EEVA users are going to use mono + filters then it is important that this can be done automatically, otherwise it starts to become processing rather than observing.

As always, comments and suggestions welcome -- this is all new to me, although I believe it is pretty standard in astrophotography. I'd be particularly interested to find more LAB resources, or to hear from APers on where the rings are coming from and how to deal with them (within the EEVA canon of techniques!). Cheers, Martin
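For anyone curious about the LAB mechanics, here is a minimal, self-contained numpy sketch of the chroma-scaling step described above. It assumes sRGB input in [0, 1] and a D65 white point; the function names are my own, not the program's:

```python
import numpy as np

# sRGB <-> CIELAB conversion (D65 white) plus chroma scaling of the A/B
# channels. A self-contained sketch of the saturation step, not the
# program's actual colour-space code.

_M = np.array([[0.4124564, 0.3575761, 0.1804375],   # linear RGB -> XYZ
               [0.2126729, 0.7151522, 0.0721750],
               [0.0193339, 0.1191920, 0.9503041]])
_WHITE = np.array([0.95047, 1.0, 1.08883])          # D65 reference white
_D = 6.0 / 29.0

def rgb2lab(rgb):
    # undo the sRGB gamma, go to XYZ, then apply the CIELAB transfer function
    lin = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)
    xyz = lin @ _M.T / _WHITE
    f = np.where(xyz > _D ** 3, np.cbrt(xyz), xyz / (3 * _D * _D) + 4.0 / 29.0)
    L = 116.0 * f[..., 1] - 16.0
    a = 500.0 * (f[..., 0] - f[..., 1])
    b = 200.0 * (f[..., 1] - f[..., 2])
    return np.stack([L, a, b], axis=-1)

def lab2rgb(lab):
    fy = (lab[..., 0] + 16.0) / 116.0
    f = np.stack([fy + lab[..., 1] / 500.0, fy, fy - lab[..., 2] / 200.0], axis=-1)
    xyz = np.where(f > _D, f ** 3, 3 * _D * _D * (f - 4.0 / 29.0)) * _WHITE
    lin = np.clip(xyz @ np.linalg.inv(_M).T, 0.0, 1.0)  # clip out-of-gamut values
    return np.where(lin <= 0.0031308, 12.92 * lin, 1.055 * lin ** (1 / 2.4) - 0.055)

def adjust_saturation(rgb, factor):
    """Scale the chrominance (A and B) by `factor`, leaving L untouched."""
    lab = rgb2lab(rgb)
    lab[..., 1:] *= factor
    return lab2rgb(lab)
```

The ring-softening step mentioned above would be a blur (e.g. a small box filter) applied to `lab[..., 1:]` before the scaling, which smooths chroma without touching luminance.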
  3. I think I shall go for the simplest approach to this and try a sealed chamber for the camera again. I can use a 3D printed plastic adapter ring for the lens to thermally insulate it from the cold camera body and hopefully this may let the lens warm up above dew point. I've concluded that a standard axial fan should not fail from damp air as the motor part will be warmed by the current flowing through the motor coils and the air RH will probably be above dew point anyway. A standard CPU cooler on the hot side of the Peltier TEC may produce enough cooling whilst the warmed air can be fed up into the dome. Clearly this will be less efficient than water cooling but may be sufficient. It's worth a try. If the 60x65mm CPU cooler is not enough I have a heat pipe type with larger fan and cooling fins.
  4. Upload the image to http://nova.astrometry.net/upload and it will come back with solved coordinates and an annotated image.
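For scripting rather than the web page, nova.astrometry.net also exposes a JSON API (you get an API key from your account profile). A rough sketch using the third-party `requests` library; the helper names and the polling detail are mine, so treat this as an outline rather than a recipe:

```python
import json
import requests  # assumed HTTP client; `pip install requests`

API = "http://nova.astrometry.net/api"

def login_payload(api_key):
    # the API expects a form field 'request-json' whose value is a JSON string
    return {"request-json": json.dumps({"apikey": api_key})}

def solve_image(path, api_key):
    """Log in, upload an image, and return the submission id to poll later."""
    session = requests.post(API + "/login", data=login_payload(api_key)).json()["session"]
    with open(path, "rb") as f:
        reply = requests.post(
            API + "/upload",
            data={"request-json": json.dumps({"session": session})},
            files={"file": f},
        ).json()
    # poll /api/submissions/<subid> until the job finishes and coordinates appear
    return reply["subid"]
```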
  5. "HK$10M of Astrological Equipment"
  6. Did it help to get that (those) off your chest, Stu?? Dave
  7. I'm interested in replacing my Vixen Polarie with one of these - like the no-batteries concept, plus it just seems easier to use. But has anybody got any feedback yet on actually using one, please? Graham
  8. Cheers Olly. I'm not sure I ever remember knowing about this rule of thumb, but it's super helpful now that I do.
  9. The "Greenhouse Effect" of the transparent dome etc. I would always hate being observed, while observing. I do sometimes "covet" a (flat top) roof observatory... Despite the known / specific objections to such things?
  10. Not sure I would want to be the first one in space with this vehicle, wouldn't mind a ride on a Soyuz though.. Alan
  11. There was an article in the Sky At Night magazine a few years ago; try that as a starting point. It's available here. Radar meteor observing doesn't produce radio images of the sky, just detection of meteor trails using radio signals reflected from the ionised gas left by the meteor. It has the advantage that cloud and daylight do not affect it, so as long as the transmitter is working, astronomy can be carried out 24/7/365 irrespective of the weather. The main software used is Spectrum Lab, and the configuration files are also available from the above link. I'm far from an expert on this as I've only just got my system up and running.
  12. It looks like Homer Simpson's car design, i.e. designed by someone without a clue!
  13. This is why they test. Hopefully it's not down to a SuperDraco design problem and is related to something more easily fixable such as plumbing. SpaceX have shown that they can respond to problems and come up with solutions far more quickly than "old" space. Up to this test they were well ahead of Boeing's effort, so I hope that they recover quickly and manned flights take place in 2019.
  14. The SpaceX Dragon Capsule "Anomaly" etc. No one likes to gloat (schadenfreude?) on this one! On the face of it it doesn't seem to be "Good News"? There isn't a LOT of information on this... The tantalising (and disturbing) footage of... an explosion? https://news.sky.com/story/spacex-explosion-dragon-capsule-ends-up-in-flames-after-launchpad-test-11700557 Whatever: Reminded of Messrs. White, Grissom & Chaffee (Apollo 1) Space Travel is never "easy"... is always (potentially) dangerous etc.
  15. I don't use this feature, but AstroArt can annotate images......
  16. UDP does not normally use the same port to send and receive data. Synscan App Pro sends on 11880 but listens on whatever port it creates as a client, and this is the port that the remote UDP server/client must respond on. Plus, the UDP message must contain the whole message - no fragmenting allowed.

Quotes from the dev manual(s): "6. Wi-Fi Connection The same protocol runs on the SynScan Wi-Fi dongle or mount with built-in Wi-Fi module. The Wi-Fi dongle/module runs a UDP server and listen to UDP port 11880 to accept commands from host. The command must be sent in a single UDP package; the response is also included in a single package. When the Wi-Fi dongle/module works in access point mount, its IP address is If it runs in station mode, the router that it links to allocates its IP address." and "6. Useful Resources Sample Code: https://code.google.com/archive/p/skywatcher/ Documents: http://www.skywatcher.com/download/manual/application-development/"

The SW AZ GoTo mount seems to be a problem (I have 2) depending on the board version (I think mine are v2.9 and v3.5) - my original prototype worked on one but not the other, though I am told that putting the resistor in, as you have done, corrects the "problem" - I just never tried it as I had moved on. As for your aims below, why not just use INDI (or, to be more precise, Gphoto2) on a RPi? A RPi Zero W will run Gphoto2 plus mount control via a normal serial adapter. It does assume a Gphoto2-supported camera!
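The quoted behaviour - one command per datagram, with the reply sent back to whatever ephemeral port the client happened to bind - can be sketched with the stdlib socket module. The `:e1` command (inquire motor board version, channel 1) and the access-point address are my examples, taken from the published Skywatcher motor-controller command set, not from the post above:

```python
import socket

def synscan_udp(ip, command, port=11880, timeout=2.0):
    """Send one SynScan command in a single UDP datagram and read the reply.

    The dongle replies to the ephemeral port this socket was given, so the
    same socket must be used for both sending and receiving.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(timeout)
        s.sendto(command, (ip, port))
        reply, _ = s.recvfrom(1024)  # whole reply arrives in one datagram
        return reply

def parse_reply(reply):
    # Motor-controller replies start with '=' on success and '!' on error
    if reply.startswith(b"="):
        return reply[1:].rstrip(b"\r")
    raise RuntimeError("mount error reply: %r" % reply)

# e.g. parse_reply(synscan_udp("192.168.4.1", b":e1\r"))  # example AP-mode address
```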
  Today
  18. Earlier this year for me, February I think. I wasn't expecting to see all four moons so that was an extra bonus. I still haven't seen Saturn yet due to neighbour's trees but I have calculated using Star Walk app that by this time next month it should be clearly in sight, can't wait! Deb
  19. James - that sounds very interesting - can you give more details of what you are up to, software used, pics of hardware installed etc. please? I looked into radio astronomy recently, but quickly realised that to get more than huge fuzzy blobs of not very much you need way more than a small dish! I cite the black hole photograph and what it took to get that - an amazing feat with a spectacular result. It would be fascinating to be able to start to do something along those lines, on a very small scale of course. The huge advantage is that cloud doesn't matter... I visited the Astrophysics Department at Oxford University last year and saw the two 2m dishes that they have installed on the department roof in action and still wasn't very impressed with results, from an aesthetic point of view. To start making any kind of detailed images with radio, I fear that a square kilometre array, or approaching that, is needed!
  20. Me too. The PDS is designed for use with cameras, so I second the suggestion if you might want to do astrophotography in the future.
  21. My imaging journey started with a 130 reflector on an EQ2. When the scope was touched it would wobble for a number of seconds, making it pretty awful for visual. Imaging was better as there was no need to touch the scope and it did ok once the wobbles settled down. However, there are a number of other shortcomings with the EQ2 that restrict its use for imaging. The 130 reflector now sits on an EQ5 - very stable for both visual and imaging.
  22. The EQ2 will hold a 50mm short-tube refractor and that's about it, I'm afraid. I think many beginners have been put off the hobby by that mount.
  23. I wondered if it was power related, so I tried a jump start pack and a thicker, lower resistance cable with locking ring at the mount end. Still had the image trying to do a swift exit left on one exposure.
  24. ROFL! I used to have an EQ2 clone and it was wobbly with a 70mm lightweight refractor. I recommend an EQ-5 or AZ-4.
  25. I think they sell it in B&Q. Dave
  26. EQ6 extension pillar for my AZ-EQ5 mount. The EQ5 pillar was taller than I wanted, so I went for the EQ6 one. And yes, that is an EQ6 tripod with 2” legs. I’m using a Geoptik adaptor.
