
NickK
Members
Posts: 3,804

Everything posted by NickK

  1. I looked at the XU4 option. The main reason I went for the C2 was that it's passively cooled, with little chance of being thermally throttled. That means no fan vibrations when the unit is mounted on the mount, and the CPU won't get throttled and cause tracking/processing failures. The C2 runs Ubuntu and handles alignment, astrometry, star tracking, autofocus and capture with little issue in CPU performance. All of these embedded chips have a 32-bit memory bus with 2GB of shared graphics/main memory, and the drivers aren't 100% optimal in most cases, so the resulting performance differences aren't that large. The XU4 does have 8 cores, and the big cores have 4x the cache, but then Linux has to cope with scheduling correctly for real-time work such as tracking. As most people seem to use these boards for media playback, that isn't at the forefront of development.
  2. I'm an ODroid C2 user. No fan, running Ubuntu. Using the eMMC card is far, far faster. Be careful with the USB ports - check how much current is available. ODroid have had a couple of designs where the USB headers were supplied with less current than the 500mA maximum for each port. The only issue I had with my C2 was that the HDMI seemed to have a dry joint, but as I run headless anyway that's no biggie. I have compiled my own fully 64-bit version of INDI, KStars, a local Astrometry.net and all the drivers. I use a 128GB SSD to store the 30 gigs of indices and any snapshot images.
  3. I have an ODroid C2 64-bit INDI custom compilation/install - it's possible, but as the others have said, you'll find it less hassle picking up one of the astro-packaged distros. Like you, I have done development for .. too long, but now my time not spent sorting out big problems is more valuable! Astro time!!
  4. I think I will have to do an update.. and recompile everything as I have a bleeding edge install (at the time) - full 64bit on the C2 ODroid ARM. Determined to make more of an effort to get some sky time this winter.
  5. I'm thinking .. imagine an Alt-Az mount with one or more scopes with cameras. On a 20-minute image, the sky will rotate and leave arcs on the camera. However, if the image is de-rotated in software using a 100:1 sub-grid, what problems could you get? The arcs would vary in size, but given that you know the time and the shift, it should be possible to recover the image by transform stacking. The only issue is over-saturated stars arcing and obliterating the signal over the entire arc - however, if you've got to that point then there's a bigger issue of exposure control. The reverse stacking could also be used to reduce noise further. The main issue I can see is that as the atmosphere wobbles, the arc will change - a bit like the thickness of a pen stroke changing as the nib is pushed down harder or softer. You could even take a short sub between the long exposures - although this would warm up the camera, it would allow registration if the algorithm isn't very advanced.
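To get a feel for how big those arcs are, here's a minimal sketch of the standard Alt-Az field-rotation approximation; the function name and the example site/target values are my own illustration, not anything from the posts above:

```python
import numpy as np

def field_rotation_deg(lat_deg, alt_deg, az_deg, exposure_s):
    """Approximate field rotation accumulated over one exposure on an
    Alt-Az mount.  Standard approximation:
        rate (deg/hr) = 15.04 * cos(lat) * cos(az) / cos(alt)
    The sign only indicates direction; the rate blows up near the zenith."""
    lat, alt, az = np.radians([lat_deg, alt_deg, az_deg])
    rate_deg_per_hr = 15.04 * np.cos(lat) * np.cos(az) / np.cos(alt)
    return rate_deg_per_hr * exposure_s / 3600.0

# A 20-minute sub, target due south at 45 deg altitude, from latitude 51N:
rot = field_rotation_deg(51.0, 45.0, 180.0, 20 * 60)  # a few degrees of arc
```

Even a few degrees of rotation over 20 minutes smears outer-field stars across many pixels, which is why the de-rotation would need a fine sub-grid.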
  6. Not much progress since the last update - the job has been OTT busy :/ I have just watched an interesting lecture on deep learning for image processing. The interesting bit is that the model they use for identification looks similar to what I'm doing here, i.e. using convolution filters and then a 3D output of correlations for the neural net. The difference is that I'm using an FFT over the full image, while they're using linear sub-images that are down-sized. In essence they're doing a correlation and then working out whether the pattern of correlations from multiple feature filters = man, dog or raspberry, based on the net recognising the location and object input patterns it's trained on. The job has a (legal) mandatory 2-week vacation, so as part of that I get to tune out - and one plan is to have a think about this in further detail.
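The full-image FFT approach mentioned above can be sketched in a few lines: one FFT pass correlates a feature filter against every position at once, instead of sliding a window. The array sizes and the toy "feature" are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.normal(size=(64, 64))

feature = np.zeros((64, 64))
feature[30:34, 30:34] = 1.0  # a small feature filter, zero-padded to full size

# FFT-based correlation over the whole image in one step; the conjugate
# turns convolution into correlation:
corr = np.real(np.fft.ifft2(np.fft.fft2(image) * np.conj(np.fft.fft2(feature))))

# corr[0, 0] is the zero-shift correlation, i.e. the plain dot product
# of image and filter; the other entries are every circular shift.
```

This is mathematically the same sliding dot product the sub-image approach computes, just evaluated for all offsets simultaneously.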
  7. ... repeating myself in a new post. Interesting - the main issue is that steppers make it very hard to measure the current without an inline sense at the rate required, so the current-limiting driver is a good solution. What options for step movement? Typically the 10:1 requirement for accuracy means you're only attempting to get a smooth movement, but it's still not perfectly smooth and may vary in smoothness over the rotation.
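For context on why microstepped motion is "still not smooth": an ideal driver feeds the two coils sine/cosine currents, but real drivers quantise those levels. A minimal sketch of the ideal waveform (function name and defaults are my own illustration):

```python
import math

def microstep_currents(step_index, microsteps=16, i_max=1.0):
    """Ideal sine/cosine coil currents for a two-phase stepper.
    One electrical cycle = 4 full steps = 4 * microsteps positions.
    Real drivers quantise these levels (and motor detent torque adds
    error), which is why the motion varies in smoothness over a turn."""
    theta = (step_index % (4 * microsteps)) * (math.pi / 2) / microsteps
    return i_max * math.sin(theta), i_max * math.cos(theta)

# Sweeping one full step: coil A ramps 0 -> i_max while coil B ramps down.
ramp = [microstep_currents(i) for i in range(17)]
```

The 10:1 rule then just says the residual step ripple needs to be an order of magnitude below your tracking tolerance.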
  8. One of the INDI stepper drivers uses it.
  9. One issue we've had with green heavy-duty bags over the mount is that magpies and jackdaws seem to peck holes in them. They also become brittle over time - we're on our second. I have an inner tubular garden bag with wire hoops, a heavy-duty bin bag that guarantees waterproofing, and then the garden bag over the top.
  10. I found yet another negative aspect - any noise in the PSF becomes, at stacking, a valid signal. I have an idea to remove the noise from the PSF (as I do with the main image), which should improve things massively. Once I'm comfortable that the processes are good - after the noise - I may start being a bit more rigorous and optimise (possibly GPU it).
  11. So... I've been playing more. The centre is a 3D plot of the input area, showing the input with stars - including a saturated star (the large sloped star). The output looks like it loses detail, but that's down to scale - so the right-hand plots show the rebuilt saturated stars after deconvolution: one at full scale, and one autostretched and lower-clipped (you can see the base of the PSF as a weird circle) - simply because at full scale the saturated stars really dwarf the normal signal level! So the output needs double precision! I'm currently using TIFF but I think I may need to switch to FITS! Improvements: there is still a little noise being added by the PSF (the little peaks near the larger stars), which is next on the hit list!
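The double-precision point is easy to demonstrate: a rebuilt saturated-star peak sits many orders of magnitude above the faint signal, and float32 silently drops the faint end when the two interact. A minimal sketch (the specific magnitudes are illustrative, not from the actual data):

```python
import numpy as np

peak, faint = 1.0e9, 1.0e-3  # rebuilt star peak vs faint background signal

# float32 has ~7 significant digits, so the faint term vanishes entirely:
lossy = (np.float32(peak) + np.float32(faint)) - np.float32(peak)  # -> 0.0

# float64 (~16 significant digits) keeps it:
kept = (np.float64(peak) + np.float64(faint)) - np.float64(peak)   # ~1e-3

# FITS stores 64-bit floats natively (BITPIX = -64), which is one reason
# to prefer it over integer-oriented TIFF workflows for this output.
```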
  12. The fun is that everything is still moving, and it's all about having a cone of error that you reduce iteratively - curve fitting. You can plate solve multiple locations; the time to move, sample and then plate solve needs to be built into the calculation. The reason people pick locations near the horizon and zenith is to give the largest movement - thus, in theory, the best accuracy.
  13. NickK

    SGLXII Chit Chat

    Well got a good hour of clear gaps - great for push-to or binos. Andromeda through the binos
  14. NickK

    SGLXII Chit Chat

Bright patch.. almost sunny!... actually - SUN!
  15. NickK

    SGLXII Chit Chat

    Accuweather says 1% chance of rain.. it's raining. I'd prefer 99% chance and it being clear!
  16. NickK

    SGLXII Food Timings

    Minor boo boo on my part - put 1x hog roast, not 2x.. normally there's plenty left over so if I could pay for the additional ticket .. otherwise Sandrine gets the hog roast!
  17. NickK

    SGLXII Chit Chat

Looking at the weather, we should see some wind move the cloud from 11pm onwards.. but gaps rather than clear skies... then again, Lucksall has its own microclimate.
  18. NickK

    SGLXII Chit Chat

I did manage to forget a 13A power extension.. so we have in-tent power but nothing for my scope - will sort that out tomorrow.
  19. NickK

    SGLXII Chit Chat

    Seems warmish but it's looking decidedly cloudy (popping head out of tent).
  20. NickK

    SGLXII Chit Chat

    Was touch and go this morning with cold (now a cough)... but we're going to downsize as much as possible and hopefully get there at about 3 ish.
  21. What are you using for guider? OAG or another scope?
  22. NickK

    SGLXII Chit Chat

    Got a cold. Dreading the packing. However I know if I don't go I'll kick myself!
  23. There is a list somewhere of Mac software. The fast-paced release schedule from Apple makes support without dedicated development difficult and time-consuming (read: dedicated = expense = charged through purchases to the users). David's EQMac works nicely, and OpenPHD2 does too. It's possible to use Parallels/VirtualBox etc. to run ASCOM, with an efficiency hit. I switched to linux - using an embedded ODroid C2 with KStars/Ekos/INDI. I still develop on the Mac (image processing mainly), but I've burnt out attempting to change the world - given time constraints, I'd rather use the kit and take photos than code and support.
  24. Found another, slightly larger bug.. in my logic - phase correlation is phase strength, so scale is irrelevant. Hence if you scale a PSF to estimate, it will simply tell you the same thing - all scales have the same correlation. Now, if you do an (x,z)*(y,z) correlation at the centre point, that will give you the scale correlation without needing to estimate using iterations.. #start coding... The (x,y) correlation will give you the pin-point of the PSF to cover with the saturation but ignores scale (which is why this method is useful), but for incremental step-based iterative estimation it's not the complete solution. The (x,z)(y,z) correlation should then take the 3D in slices to estimate the scale (rather than attempting to do a full 3D FFT), and thus should be faster.
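The "scale is irrelevant" observation can be checked directly: phase correlation normalises the cross-power spectrum so only phase survives, which nails the (x,y) shift but throws away amplitude scale. A minimal numpy sketch (array sizes and the helper name are my own illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
a = rng.normal(size=(32, 32))
b = np.roll(a, (3, 5), axis=(0, 1))  # 'a' shifted by (3, 5)

def phase_corr(x, y):
    """Phase correlation: normalise the cross-power spectrum so that
    only phase survives - amplitude scale cancels out completely."""
    R = np.fft.fft2(x) * np.conj(np.fft.fft2(y))
    return np.real(np.fft.ifft2(R / (np.abs(R) + 1e-12)))

p1 = phase_corr(b, a)
p2 = phase_corr(10.0 * b, a)  # amplitude-scaled input: identical surface
shift = np.unravel_index(np.argmax(p1), p1.shape)  # -> (3, 5)
```

The peak location recovers the shift exactly, while scaling one input by 10 leaves the correlation surface unchanged - which is why iterating over PSF scales through this path tells you nothing, and the scale has to come from a separate correlation along the intensity axis as described above.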