Everything posted by NickK

  1. I think I will have to do an update and recompile everything, as I have a bleeding-edge install (at the time): full 64-bit on the ODroid C2 ARM. Determined to make more of an effort to get some sky time this winter.
  2. Not much progress since the last update; the job has been OTT busy :/ I have just watched an interesting lecture on deep learning for image processing. The interesting bit is that the model they use for identification looks similar to what I'm doing here, i.e. using convolution filters and then a 3D output of correlations for the neural net. The difference is that I'm using an FFT over the full image, whereas they use linear sub-images that are down-sized. In essence they do a correlation and then work out whether the pattern of correlations from multiple feature filters = man, dog or raspberry, based on the net recognising the location and object input patterns it's trained on. The job has a (legal) mandatory 2-week vacation, so as part of that I get to tune out, and one plan is to think about this in further detail.
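The full-image FFT correlation mentioned above can be sketched roughly like this: a feature template is zero-padded to the image size and correlated in one frequency-domain multiply, instead of sliding sub-windows. This is a minimal illustration, not the actual pipeline; the function name and sizes are my own.

```python
import numpy as np

def fft_correlate(image, template):
    """Cross-correlate a feature template against a full image via FFT.

    Equivalent to sliding the template over every position, but done in
    one multiply in the frequency domain (full-image FFT approach, as
    opposed to the down-sized sub-images the CNN lecture used)."""
    # Zero-pad the template to the image size so the spectra align
    padded = np.zeros_like(image, dtype=np.float64)
    padded[:template.shape[0], :template.shape[1]] = template
    F = np.fft.fft2(image)
    T = np.fft.fft2(padded)
    # Multiply by the conjugate: correlation, not convolution
    return np.real(np.fft.ifft2(F * np.conj(T)))
```

The peak of the returned surface sits at the offset where the template best matches, which is the "3D output of correlation" that a net could then classify.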
  3. I found yet another negative aspect, which is that any noise in the PSF becomes a valid signal at stacking. I have an idea to remove the noise from the PSF (as I do with the main image), which should improve things massively. Now that I'm comfortable the processes are good, after the noise I may start being a bit more rigorous and optimise (possibly GPU it).
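To see why PSF noise matters, here is one standard FFT-domain deconvolution (a Wiener filter), not the author's exact pipeline: the PSF spectrum appears in the denominator, so any noise in the PSF is amplified straight into the output. All names and the regularisation constant `k` are illustrative.

```python
import numpy as np

def wiener_deconvolve(image, psf, k=1e-3):
    """Wiener-style FFT deconvolution (a sketch, not the actual code).

    `k` damps frequencies where the PSF carries little energy, which is
    exactly where noise in the PSF would otherwise blow up the result."""
    # Pad the PSF to the image size at the top-left corner
    psf_pad = np.zeros_like(image, dtype=np.float64)
    psf_pad[:psf.shape[0], :psf.shape[1]] = psf
    H = np.fft.fft2(psf_pad)
    F = np.fft.fft2(image)
    # Regularised inverse filter: conj(H) / (|H|^2 + k)
    return np.real(np.fft.ifft2(F * np.conj(H) / (np.abs(H) ** 2 + k)))
```

With a clean PSF and small `k` this recovers the original almost exactly; a noisy PSF corrupts `H` and the "recovered" detail with it.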
  4. So... I've been playing more. Centre is a 3D plot of the input area, showing the input with stars, including a saturated star (the large sloped one). The output looks like it loses detail, but that's down to scale: the right plot shows the rebuilt saturated stars after deconvolution, and the other shows an autostretched, lower-clipped version (you can see the base of the PSF as a weird circle), simply because on a full-scale plot the saturated stars really dwarf the normal signal level! So the output needs double precision! I'm currently using TIFF but I think I may need to switch to FITS! As for improvements, there is still a little noise being added by the PSF (the little peaks near the larger stars), which is next on the hit list!
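The double-precision point is easy to demonstrate: when a rebuilt saturated peak dwarfs the background, single precision simply swallows the faint signal. The values below are illustrative, not from the actual data.

```python
import numpy as np

# A saturated star peak next to a faint background signal (illustrative
# values, not from the data above).
peak = np.float32(1e8)
faint = np.float32(1.0)

# float32 carries ~7 significant digits, so the faint signal vanishes
# when added to the bright peak...
assert peak + faint == peak

# ...while float64 (~16 digits) keeps it, which is why the output needs
# double precision, and a file format that can store it.
assert np.float64(1e8) + np.float64(1.0) > np.float64(1e8)
```

FITS can store 64-bit floats natively, which is the attraction over the usual TIFF pipelines here.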
  5. What are you using for a guider? OAG or another scope?
  6. Found another, slightly larger bug in my logic: phase correlation measures phase strength, so scale is irrelevant. Hence if you scale a PSF to estimate it, it will simply tell you the same thing, because all scales have the same correlation. Now, if you do an (x,z)*(y,z) correlation at the centre point, that will give you the scale correlation without needing to estimate iteratively. #start coding... An (x,y) correlation will give you the pin-point location of the PSF to cover with the saturation, but it ignores scale (which is why this method is useful); for incremental, step-based iterative estimation, though, it's not the complete solution. An (x,z),(y,z) correlation should then take the 3D data in slices to estimate the scale, rather than attempting a full 3D FFT, and thus should be faster.
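The scale-invariance bug falls straight out of the maths: phase correlation normalises the cross-power spectrum to unit magnitude, so only phase (i.e. shift) survives and any amplitude scaling of the inputs is discarded. A minimal sketch (my own names, not the project's code):

```python
import numpy as np

def phase_correlate(a, b):
    """Phase correlation of two images.

    The cross-power spectrum is normalised to unit magnitude, so only
    the phase difference (the shift) remains; scaling either input has
    no effect on the result, which is exactly the bug described above."""
    A = np.fft.fft2(a)
    B = np.fft.fft2(b)
    R = A * np.conj(B)
    R /= np.abs(R) + 1e-12   # keep phase only, discard amplitude
    return np.real(np.fft.ifft2(R))
```

The peak pin-points the (x,y) offset, but multiplying one input by any constant leaves the surface unchanged, so iterating over PSF scales tells you nothing new.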
  7. So I've had a play over the last couple of nights, including leaving the laptop running (30 mins per frame! CPUs are slow!), then used PI drizzle stack and autostretch. The PNG loses it a little; the PI 64-bit version is a little more subtle on the noise square (I've yet to sort out the noise level adjustment, so this is with noise). The two galaxies have appeared with some additional shape (in PI, and a little above). I've also optimised this a little (hence only 30 minutes!), but there's still more to go in terms of algorithm and optimisation.
  8. C2 with both Ekos and INDI on the same embedded board running bleeding edge. INDI is the driver and control process, not the display/analysis/guiding of images. Ekos does the capture and control, and this seems fine; however, the FITS viewing is likewise slow. An example of file-system slowness can be seen when using astrometry.net: with annotated images the process takes minutes, yet without annotation, returning just the RA/DEC results etc., it takes 14s with the entire 32GB set of indices in use. Astrometry is a separate project to INDI and Ekos, so the only shared components are open-source libraries (cfitsio and a few others) along with the operating system. I have pointed out the efficiency aspect of INDI handling large images before (i.e. the architecture makes it inefficient), however this would point to it being more the FITS handling and the underlying OS filing-system performance.
  9. Here lies the problem: I don't want anything unplugged. I shouldn't need anything unplugged. This is user expectations vs developer (driver) laziness. It should automagically just work.
  10. What I have noted is that Linux is just as susceptible to problems with USB-serial devices in use. For example, with an Arduino and serial issues, if you disconnect in INDI then unplug and replug the Arduino's USB, the driver simply creates another device node: /dev/ttyACM0 -> /dev/ttyACM1 -> /dev/ttyACM2 ... and eventually the device link fails, requiring a reboot (as it is a kernel-space driver). It also randomly stops working. I never had a problem like that on OSX or Windows. OSX seems very reliable, but that is likely down to the CP210x driver itself on the host system; it's not INDI's and not Arduino's fault.
  11. The INDI folks have a few things they're looking at, from my understanding:
      * lighter capture functionality
      * INDI management through web interfaces, rather than always needing a big application
      Just remember that there are some companies, such as CloudMakers, that are creating additional components that are quasi-commercial but outside of the INDI tree.
  12. This is the problem: indi-asicam uses binaries provided by the manufacturer for specific architectures, so those like my ARMv8 64-bit will fail to build or have no pre-built version in the repository.
  13. I developed a driver for ATIK on OSX and also rewrote the TheSkyX plugin for ATIK on OSX for Software Bisque/ATIK. They both have different approaches and challenges; however, for me one thing with the separate user-level drivers in apps is that if one component fails, it's faster to restart just that component than to effectively shut down the guider and the capture, then restart and re-align. That's less of a problem if you have more predictable clear skies, compared to the cloud-ridden and AP-limiting UK climate! INDI's development came out of observatory use, globally, in clearer locations. Now, with adoption by more "amateur" users, new requirements appear.
  14. My only concern with Ekos is that, because it seems bound to KStars, if KStars fails (due to a bug) it takes out the guiding/tracking/scheduling component of the system, so you could be left stranded 5 minutes into an 8-hour unattended session. However, with other apps it's almost the same, and the growing number of users means the system gets heavily tested, hence I'll accept that risk.
  15. One thing that is apparent in doing this method: you need good tracking! I was going to add some more commentary to the shot/post above but didn't have time. With this method, the guider tracking method that is normally used is "centroid mass": the system looks for the brightest, largest mass and then works out the centre point of that mass. This works if your target isn't subject to atmospheric convolution. Most will then use a 2-second or so tracking shot to guide on. I know people would suggest that utilising the star shapes in the final long exposure is the same, i.e. the sum is the same as taking the guider images; however, from what I've seen it's more of a simplistic approximation that does not recover the fainter detail. The final image loses some possible signal into the noise due to the approximation. Is this a bad thing? No, it just misses what is there. So if your tracking is so good you don't need guiding, then this method becomes even more valuable in that its results are more reliable. In the shot above, ignoring the mismatch of noise floor that causes the boxes, the integration confirms the problems that "integration" using star matching actually has with wobbling guider stars. I'm starting to lean more towards guider-less operation and mount accuracy being the ultimate way to really recover detail in images. With my recent interest in automation and stepper-motor work for the focuser, I'm starting to think that making a direct-drive mount may be the way to get this, then using this method to wring out the final detail possible for the scope's size. I can see why professional astronomers use a laser to provide a false star in this sense, but that is not an option, and I think this method with the direct drive is possibly the way to go.
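The "centroid mass" guiding described above reduces to an intensity-weighted centre of mass over the guide-star image. A minimal sketch (function name and grid sizes are my own):

```python
import numpy as np

def centroid(image):
    """Intensity-weighted centre of mass of a guide-star image: the
    'centroid mass' tracking method. Returns (y, x) in pixel units,
    with sub-pixel precision from the weighting."""
    total = image.sum()
    ys, xs = np.indices(image.shape)
    return (ys * image).sum() / total, (xs * image).sum() / total
```

The weakness is exactly as stated: atmospheric convolution moves the apparent mass around between 2-second guide frames, so the centroid wobbles even when the mount does not.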
  16. The ODroid C2 thread I've pointed at uses Ubuntu MATE 16.04 64-bit for ARM. If you use Intel/AMD then there is more support. The pre-built ISO for the VM is probably a good way to test the move into the world of Linux; Oracle's VirtualBox will work nicely if your machine has enough memory/clout. Once you have that working you can switch over as fast as you want. The main things on Linux are:
      * distribution - the collection of kernel and packages that are bundled together to make the version
      * kernel - the underlying OS version; this can be important for drivers etc. If something says "recompile the kernel" then find a different path (it's convoluted and ongoing support is on your own head)
      * desktop manager - some have different requirements, others have links into applications that mean things don't always work as they should if funky features are used
      * package management - this is *the* kicker for most; each distribution has its own "tool", just to make it awkward
      * architecture - this is the CPU target (type, and 32/64-bit for most); stick with Intel/AMD and you'll have a lot of support
      The cleverer you attempt to get, the more you'll find you're managing Linux rather than doing any stargazing. Linux can sometimes be a hard lesson in "it works for me..."
  17. I have an NEQ6 + EQMOD and the INDI driver works perfectly. It even supports GPS as a positional tool.
  18. https://stargazerslounge.com/topic/268824-automation-project/ Control, local astrometry.net-based polar alignment, and mount alignment/targeting, all on a credit-card-sized Linux system. You could always use one system for capture and then a second system (whichever is more preferable) for processing.
  19. I figured I'd had enough of playing with a single frame and attempting to pick details out of the output, so I ran it overnight (CPU-based code is a little slow); the i7 finished at 03:40 this morning. This is autostretched in PI, and I know why the boxes are there (that's the easier part to remove). I did 37 images, then registered in PI and did a simple integration. The fun is this bit: galaxies that don't appear very well in the normal integration.
  20. I've started playing with this again late last week on the train, with some decent progress in terms of the noise-level matching. I may have a look at the sigma stretching that the objective FITS library does (I now use raw), because it seems to provide a good stretching mechanism for human-viewed images. I'm also hoping that, with the automation/focusing projects almost done, once the job settles a little I'll have a chance to capture more data.
  21. So I've been sorting out a new noise-identification system in addition to the deconvolution.
      Left - the raw 383L image (autostretched in PI).
      Centre left - autostretched deconvolution output prior to the noise system; note that some 'stars' are actually the noise 'hot pixels' in the original.
      Centre right - autostretched new output from the deconv with the new noise-identification system in place. (On close analysis these are "hot" in the sense of a normalised value of 0.0684 vs the background 0.0054-0.0061 range. I identify these as suspect by their standing 10% above the surrounding pixels; if suspect, I then check whether it's a close match to the guider point spread function. If not, then it's definitely noise.)
      Right - Aladin, non-autostretched.
      So the image on the centre right has significantly less noise (and thus fewer false positives). I'll tune this some more, as at the moment it's done on the input, and to be honest I prefer the noisy input, as signal is hidden in there. Still to go are re-enabling the saturation scaling and a second deconvolution step, which, now the noise is gone/heavily reduced, should be far better. This is one of the galaxies I'm after in the noise: you can see the stars in the dim background (e.g. bottom left of the pixelated view) as well as the swirl of the galaxy (just in case you're wondering).
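The first stage of that noise test (flagging pixels that stand 10% above their surroundings) can be sketched as below. This is only the "suspect" step; the real pipeline then compares each suspect to the guider PSF, and only a poor match is confirmed as noise. The function name, neighbourhood choice, and edge handling are my own assumptions.

```python
import numpy as np

def suspect_hot_pixels(image, margin=0.10):
    """Flag pixels standing more than `margin` (10% by default) above
    the mean of their 8 neighbours: hot-pixel suspects, per the test
    described above. Wrapped edge rows/columns are ignored."""
    # Mean of the 8 neighbours via shifted copies of the image
    shifts = [(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)
              if (dy, dx) != (0, 0)]
    neigh = np.mean([np.roll(np.roll(image, dy, axis=0), dx, axis=1)
                     for dy, dx in shifts], axis=0)
    mask = image > neigh * (1.0 + margin)
    # np.roll wraps around, so discard the border pixels
    mask[0, :] = mask[-1, :] = mask[:, 0] = mask[:, -1] = False
    return mask
```

With the numbers from the post, a 0.0684 pixel on a ~0.006 background is flagged immediately, while smooth gradients pass untouched.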
  22. Well, I had a slow day today and ended up waking up at 7:30. Whilst the Mrs slept, I decided to geek out a bit. I've made the system not rebuild the saturated star at the moment; this is because I find the estimation is a little out, and it then compresses the range for the existing background noise. However, it does still scale the deconvolution (hence the two brightest saturated stars having a box, although this too is quickly being removed with some noise processing on the PSF). Here's the output on the left, inverted and flipped (to align with the Aladin image on the right). There's a slight difference in rotation between the two images: the Aladin image is aligned vertically during their observations; mine is as I observed it.
  23. Just an example of the current working output, but the system is pulling some interesting information out. Left - the raw FITS image autostretched in PI; middle - the output of the processing, autostretched in PI; right - a screenshot of the location in Aladin. You'll note a bit of mirror flipping; just look at the double star to orientate. Click the image for the native version (otherwise it may be hard to see the detail). I've still got some work to go (which will remove the squares for the estimates and a few other things). Just goes to show: in your noise, there is still signal...
  24. Starting to pull together: the system is now far faster (orders of magnitude) with some additional optimisation, and I wanted to show the star-saturation system working. On the left is the output (note the sharp peak, as you'd expect) and on the right the raw image, showing the flat top of a saturated star. I'm continuing to work on this and I'm starting to see some good results.
