
Posts posted by NickK

  1. Just starting to write a full image version:

    Here's the guider star as a psf:

    [screenshot: guide-star PSF]

     

    Just don't attempt mesh() with a long exposure 🤣 ....zzzz

    Also, there's a library called ATLAS that can be compiled on your system - if Octave finds it, it will supposedly use it. Essentially ATLAS provides multi-core/threaded support for maths functions.

     

  2. So if we take a single-pixel-width line, we can convolve it with a simple point spread function that simulates our atmosphere:

    [screenshot: simple point spread function]

     

    Then add these convolved stars into our picture at positions 23, 400 and 850, then add random noise of up to 20% to the image to simulate the camera noise. I've also added 20% noise to the psf above, so we have a noisy guider psf star:

    [screenshot: noisy image containing the three convolved stars]
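The setup described above can be sketched in a few lines. A minimal 1D version, assuming a Gaussian PSF and uniform noise (the post specifies neither):

```python
import numpy as np

rng = np.random.default_rng(42)

# Gaussian kernel standing in for the atmospheric PSF (61 samples wide)
x = np.arange(-30, 31)
psf = np.exp(-x**2 / (2 * 5.0**2))

# Three unit-height stars at the positions used in the post
image = np.zeros(1024)
image[[23, 400, 850]] = 1.0

# Blur the stars with the PSF, then add up to 20% uniform noise to both the
# picture and the guider PSF
blurred = np.convolve(image, psf, mode="same")
noisy = blurred + 0.2 * rng.random(blurred.size)
noisy_psf = psf + 0.2 * rng.random(psf.size)
```

Since `psf` peaks at 1, each isolated star in `blurred` also peaks at 1, so the 20% noise amplitude is relative to the star peaks, as in the post.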

    Next we use phase correlation to detect the psf within the picture. The output here is the correlation, where we notice three peaks over 90% correlated - and these are at positions 23, 400 and 850!

    [screenshot: correlation output showing three peaks]
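The detection step can be sketched as FFT correlation against the PSF template. This uses the matched-filter form (true phase correlation additionally whitens the cross-power spectrum, which is much more sensitive to noise on faint data); the 0.8 threshold is my choice, while the post quotes peaks over 90%:

```python
import numpy as np

rng = np.random.default_rng(1)

# Rebuild the noisy 1D scene: three PSF-blurred stars plus up to 20% noise
x = np.arange(-30, 31)
psf = np.exp(-x**2 / (2 * 5.0**2))
scene = np.zeros(1024)
scene[[23, 400, 850]] = 1.0
scene = np.convolve(scene, psf, mode="same") + 0.2 * rng.random(1024)

# Pad the PSF template so its centre sits at index 0; FFT cross-correlation
# then puts each peak directly on a star position
template = np.zeros(1024)
template[:31] = psf[30:]   # centre and right half of the kernel
template[-30:] = psf[:30]  # left half wraps around

corr = np.real(np.fft.ifft(np.fft.fft(scene) * np.conj(np.fft.fft(template))))
corr /= corr.max()

# Keep local maxima above the threshold; these land on (or within a pixel of)
# the star positions
peaks = [
    p for p in range(scene.size)
    if corr[p] > 0.8 and corr[p] == corr[max(0, p - 5):p + 6].max()
]
```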

    Let's take the noisy image, extract the 60-pixel psf sub-images from the noisy large image and stack them; we see the noise level of the averaged PSF reduce:

    [screenshot: averaged PSF from stacked sub-images]

    The more stars we have (ie the more PSF sub-images we have), the more we can average out and remove the noise. You'll notice the noise is dropping - it's now about 10%.
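The stacking step is plain averaging of the sub-images. A sketch with 16 noisy 60-pixel cutouts (the count and the Gaussian profile are assumptions), showing the residual noise fall roughly as 1/sqrt(N):

```python
import numpy as np

rng = np.random.default_rng(2)

# One clean 60-pixel PSF profile, and several noisy cutouts of it, as if
# extracted around each detected star in the long exposure
x = np.arange(60)
clean = np.exp(-(x - 30) ** 2 / (2 * 5.0**2))

cutouts = [clean + 0.2 * rng.random(60) for _ in range(16)]

# Stack (average) the cutouts; averaging N cutouts shrinks the random part of
# the noise by about sqrt(N)
stacked = np.mean(cutouts, axis=0)

single_err = (cutouts[0] - clean).std()
stacked_err = (stacked - clean).std()
```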

    This process then allows us to use the stacked sub-images as a new averaged psf (here's where the x,y variation means our error grows). Alternatively, we can simply map out the variation over x,y based on the image stars, so we can build a function(x,y) that describes the atmosphere. We can also use the new psf (plural, if we want x,y variants) to ask "how much of this pixel is noise?" by looking at the correlation strength: the lower the correlation, the higher the probability of noise.

    We can then use a reconstruction filter to process the noisy image using the psf and also deconvolve the picture!

     

    The above is a simplistic example due to the speed of Octave (it's single-threaded).

     

  3. 1 hour ago, Science562h said:

    I wrote a thesis on the Quantum Corrections Theory in cosmology, not quantum computing ... but, what are you trying to make, a graviton detector or camera?

    I was thinking about whether this could be used to clean up images statistically, although they're of more use in that harder research! Most commercial QC work is focused on quantum-inspired algorithms.

    I spent the last 6 months founding, building and driving one of the world's largest banks' quantum computing forum (financial, data science and security). I don't have the post-doc maths/statistics required - the forum had members with post-docs in probability, statistics, spin, semiconductor quantum ducting, quantum and particle physics.

  4. I have been away.. playing with the idea of building a DAC for my Mac mini with headphones from scratch (an R2R ladder DAC, for the audio peeps). The fun is that I spent a week solidly researching the maths behind filtering, which is extremely closely related to images. Interpolation, FIR/IIR and (de)convolution are all seen as filters, for example.

    This morning, with a head full of Octave sinc filters, it occurred to me that my previous deconvolution technique, detailed earlier in the thread, isn't far from a possibly new noise-removal method for astro images.

    In audio filtering the FFT is used because sound is a cyclic sine-wave pattern, and that fact means noise, in the form of random sample noise, can be removed relatively easily.

    If we want to remove CCD noise, then we look for patterns in the image signal (ie the equivalent of the sinusoidal cycles in audio). It occurred to me that every bit of light travels through the atmosphere with a blurring point spread function. This represents a repeating pattern.. a cycle of point spread functions (ignoring turbulence variation over x,y for the time being). That means we can identify signals that have travelled through the atmosphere and those that have not (noise from the CCD).

    That got me thinking - we could make a system that uses the atmospheric blurring in the long exposure to reduce noise. The PSF noise from a guide star, or from the image itself, can be reduced by stacking the stars we've identified in the long exposure - this can then be used repeatedly to reduce noise in the psf itself. (There is a psf error here too, based on x,y turbulence, but let's ignore that - we'll just average it.)

    So using this lower-noise psf, we can re-run the phase correlation against the original image and use the correlation strengths and the psf as a reconstruction filter. Hey presto - a new reduced-noise image.

    As part of the reconstruction filter, it could also deconvolve.

    I'm going to investigate this using Jupyter Notebook with Octave (Python sucks) - using maths, using 1D slice inputs from a real image, and then doing the same for a full 2D image. Given Octave is single-threaded and memory-based, it may run out of memory - so I may have to code it up in C/C++ with FFTW.
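One standard reconstruction filter of the kind described is the Wiener filter, H*/(|H|² + k). A 1D sketch (the PSF width, noise level and regularisation constant k are all illustrative assumptions, not values from the thread):

```python
import numpy as np

rng = np.random.default_rng(3)

# Blur a 1D "image" with a known PSF and add noise, then recover it with a
# Wiener reconstruction filter
n = 256
x = np.arange(n)
truth = np.zeros(n)
truth[[60, 130, 200]] = 1.0

h = np.exp(-((x - n // 2) ** 2) / (2 * 4.0**2))
h = np.roll(h, -n // 2)  # centre the PSF at index 0
h /= h.sum()

H = np.fft.fft(h)
blurred = np.real(np.fft.ifft(np.fft.fft(truth) * H))
noisy = blurred + rng.normal(0, 0.001, n)

k = 1e-3  # regularisation constant (assumed, not tuned)
W = np.conj(H) / (np.abs(H) ** 2 + k)
restored = np.real(np.fft.ifft(np.fft.fft(noisy) * W))
```

The constant k trades noise amplification against sharpness: too small and the division blows up the noise at frequencies where |H| is tiny; too large and the stars stay blurred.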

  5. On 03/09/2018 at 12:21, jbrazio said:

    Sorry for the late reply but for some reason I'm not getting e-mail notification of the subscribed topics.

    @NickK On one hand, powering down a stepper driver will make you lose accuracy exactly as you described; on the other hand, leaving the driver in holding torque will make the motor heat up. My main concern is not the driver itself but the motor: the 28BYJ-48 running at 12V gets pretty toasty. As this is not a one-size-fits-all situation, I leave it up to the user to choose by toggling MOTOR1_SLEEP_WHEN_IDLE. However there is a catch: you need full accuracy during an auto-focus run, i.e. when you're moving on the focal axis, taking measurements and then moving back to the best position; you don't really need that much accuracy between runs, i.e. you auto focus now and then auto focus again in one hour. For this reason, even if you have MOTOR1_SLEEP_WHEN_IDLE enabled, there is an undocumented flag that defines a timeout after the last move before the motor's power is cut; by default this is set to 5 seconds. This should be set to accommodate the time it takes for your setup to take a frame and analyse it before moving to the next position. This way your motor will be powered off during most of the session but still keep accuracy when it is really required.

    If the changes required to make the DRV8825 work are as trivial as increasing the delay after a sleep instruction then I suggest the following:
    https://github.com/jbrazio/ardufocus/compare/patch/drv8825

    @NickK could you please comment ? Thanks.

    @swaxolez in fact there are two versions of the 28BYJ-48; the ULN2003 driver board is the same for both models. You must power the driver board with either 5V or 12V according to the motor type you have; however, you should not power a 5V motor with 12V.

    @jbrazio the motor heat here is a factor of voltage and power. This is why I use a current-chopper controller (the DRV8825). It chops the current at a very high rate, resulting in very little heating of the coils. The stepper is stone cold even after hours of movement at 12V, under load, on a 3.8V motor.

    Could you add support for two limit switches? Open until the focuser approaches its limit at either end, then closed, stopping the focuser unless a command to move away from the limit is received. In my original focuser I made the controller cycle limit-to-limit on startup/reset so that it knew the range and scale available.

    For multi-night runs, the software may need to be checked to see if it allows a re-calibration step at the start of the session. The focuser could then re-check the limits.

  6. Deconvolution is an interesting problem - typically a PSF is created and used to deconvolve the large image.

    It seems that this has moved on quite a bit over the last few years. Plenty of tensor or GPU versions, but this one stands out:

    https://www.hindawi.com/journals/jcse/2008/530803/

    The simulation and parameterisation used for simulated annealing can also be run on a quantum annealer (given the embedding and the constraints of mapping the algorithm to the topology).

    Essentially, by using annealing to find the lowest energy point (ie the best deconvolved image), the same could easily be run both on digital annealers (ie Fujitsu's Digital Annealer) and on the likes of D-Wave etc (see previous points on constraints). The result is a system that could estimate the PSF from an image and deconvolve it without consuming vast amounts of computational time.
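As a toy illustration of the annealing idea (not the linked paper's algorithm): estimate a single PSF parameter, the Gaussian width, by annealing the residual energy between a model star and an observed one. All of the numbers here are invented for the sketch:

```python
import math
import random

import numpy as np

rng = np.random.default_rng(4)
random.seed(4)

# Observed: a single star blurred by a Gaussian PSF of unknown width
true_sigma = 3.0
x = np.arange(-20, 21)

def blur_profile(sigma):
    p = np.exp(-x**2 / (2 * sigma**2))
    return p / p.sum()

observed = blur_profile(true_sigma) + rng.normal(0, 0.001, x.size)

# Energy = squared residual between the model PSF and the observation
def energy(sigma):
    return float(np.sum((blur_profile(sigma) - observed) ** 2))

# Plain simulated annealing over the single parameter sigma: accept uphill
# moves with probability exp(-dE/T), cooling T geometrically
sigma, temp = 8.0, 1e-2
best = sigma
for _ in range(2000):
    cand = abs(sigma + random.gauss(0, 0.3)) + 1e-3
    dE = energy(cand) - energy(sigma)
    if dE < 0 or random.random() < math.exp(-dE / temp):
        sigma = cand
        if energy(sigma) < energy(best):
            best = sigma
    temp *= 0.995
```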

    Would love to hear from the academics :D

  7. For the record - it appears R and Octave (a MATLAB clone) are single-threaded. The libraries that implement functions may be multi-threaded internally if they support it (unlikely for most), and Octave provides ndpar_arrayfun() and para_arrayfun() for parallelisation with n-dimensional arrays. In Python you still have to manage the parallel processes operating on the data yourself, as you would in C++ or Objective-C++.
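That "manage it yourself" parallelism can be sketched in Python by splitting an image into row chunks across an executor (the function names are mine). NumPy's FFT does its work in C and largely releases the GIL, so even a thread pool overlaps usefully; for pure-Python work you would swap in a ProcessPoolExecutor:

```python
from concurrent.futures import ThreadPoolExecutor

import numpy as np

# Each worker FFT-filters one row chunk independently; the chunk is the unit
# of parallelism you have to manage yourself
def smooth_chunk(chunk):
    f = np.fft.rfft(chunk, axis=1)
    f[:, 20:] = 0  # crude low-pass, for illustration only
    return np.fft.irfft(f, n=chunk.shape[1], axis=1)

def smooth_parallel(image, workers=4):
    chunks = np.array_split(image, workers, axis=0)
    with ThreadPoolExecutor(max_workers=workers) as ex:
        return np.vstack(list(ex.map(smooth_chunk, chunks)))
```

Because each row is processed independently, the parallel result is identical to running `smooth_chunk` over the whole image in one go.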

     

  8. 47 minutes ago, han59 said:

    I'm surprised you need more than single precision. In my program I define the arrays as single precision on purpose, since memory requirements quickly increase if you have a few images in memory.

    If you're looking for a new challenge, develop and document a de-mosaic routine for astro OSC images - especially when stars are saturated in the centre and the stars are small, HFD/FWHM value 2 to 3 pixels.

     

    For me it is just simple statistics, average value, standard deviation....

     

    The issues come when the psf for an extended, saturated star is deconvolved, as you're then forced to either rescale or expand the range. I want to keep the detail.

    On the OSC, it's probably better to run a video with a constant movement across the matrix. The frames and offsets from the movement should give a better image, akin to super-resolution, filling in the gaps of missing colour data.

    I could use the average value after removing the outliers. It may be faster; however, you then have information that is not being used.
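The "average value removing the outliers" idea is usually implemented as a sigma-clipped mean; a minimal sketch:

```python
import numpy as np

# Sigma-clipped mean: iteratively drop samples more than k standard deviations
# from the current mean, then average what survives
def clipped_mean(values, k=3.0, iters=3):
    v = np.asarray(values, dtype=float)
    for _ in range(iters):
        m, s = v.mean(), v.std()
        keep = np.abs(v - m) <= k * s
        if keep.all():
            break
        v = v[keep]
    return v.mean()
```

A single hot pixel among a hundred normal ones is rejected on the first pass, so the mean stays at the background level instead of being dragged upward.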

  9. 18 minutes ago, BabyPepper said:

    Pick up a Basler camera - they support Python right out of the box and include the SDK.

     

    I love image processing and would love to see what you add to the field. I recently discovered a different way to process solar images, so I know that there are tons of tricks waiting to be unlocked - especially at the camera data output, before it hits the SSD.

     

    The reality of the situation is that the technology is already here to process the video signal instantly, on the fly :) - the neural network thing, paired with an Nvidia GPU for parallel processing. This can delete the blurry pixels instantly, to record and rebuild only the best data. It's powerful enough and smart enough to stack on the fly as well.

     

     

    Yes, it's possible to use the GPU to inspect the noise. You don't need a neural network per se - you could simply use a kernel to detect a high value vs local values. Better still (and a method I've used in the original code for this) is to use the guider image to detect whether the noise is a remote object by phase correlation: if there's no correlation, then it doesn't look like light that has come through the atmosphere.

    This was from 5 years ago, using the GPU in the laptop with FFT phase correlation to align the image (you'll note the image stays reasonably steady but the white borders move):

     

    However this is post-processing, as the GPU in most laptops/cards doesn't have the required floating-point range. I could use an eGPU with the mini (the original plan), but at the moment this will do.
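The FFT alignment described above boils down to phase correlation between frames; a minimal 2D sketch (the frame size and the whitening epsilon are my choices):

```python
import numpy as np

# Estimate the (dy, dx) shift between two frames from the peak of the
# phase-correlation surface: whiten the cross-power spectrum, inverse FFT,
# and read off where the peak lands
def phase_correlate(ref, frame):
    R = np.fft.fft2(ref) * np.conj(np.fft.fft2(frame))
    R /= np.abs(R) + 1e-12
    surface = np.real(np.fft.ifft2(R))
    dy, dx = np.unravel_index(surface.argmax(), surface.shape)
    h, w = surface.shape
    # Map wrap-around peaks to signed shifts
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)
```

The returned pair is the shift to apply to `frame` to register it onto `ref`, e.g. `np.roll(frame, phase_correlate(ref, frame), axis=(0, 1))`.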

     

    Oh - if you like solar processing, there was an Nvidia deconvolution that used swarms to create the PSF to help improve the image.

     

     

  10. 25 minutes ago, han59 said:

    I'm sorry for you losing your job.

    Python is interpreted. For speed a compiled solution would be better. But if you're comfortable with Python, at least it is multi-platform. What are the ideas?

    Han

     

    The annoying piece is that I keep having to re-write the code - first Grand Central, then OpenCL, then C++ because OpenCL became unsupported, and Metal is now the in-thing; however I need wider precision and Metal is single-precision floating point. So it's either write for Octave (a free MATLAB clone) or Python.

    I understand Python is interpreted, as is Octave, however it seems more scalable with a move to arrays with pre-coded libraries.

    The old code did phase correlation with the PSF from the guide star in the Z axis to rebuild saturated stars. This works well but expands the dynamic range (hence needing larger range).

    I did realtime GPU-based alignment and stacking too, but this is more about processing offline, concentrating on the maths rather than re-writing based on Apple's whims.

    The reason I've picked something Linux-friendly (Python) is that I have an ODroid 4-core ARM that runs INDI + KStars and performs plate solving all onboard.

  11. I've been away for too long :) As it happens I've been made redundant again.. but not before spending 6 months playing with Python and quantum computing as part of work.

    So that got me thinking .. and playing. I've started looking at using Python with astropy (astropy.org), which can load FITS images. Coupled with Jupyter notebook it seems to work.

    As my MBP finally died, I've now switched to a Mac Mini i7 12core with 32GB ram which seems happy as Larry. So my intent is to move the GPU/c++ code I had for processing images into python and continue working on it whilst looking for a new job. I also have some new ideas for processing.

     

  12. The issue with compute on USB is that the data needs to travel both ways - once as part of processing and once for reading back the results.

    An eGPU with compute, where the monitor is connected to the GPU, would be more effective, as the data then only needs to traverse the slower PCIe/ThunderThingy bus once.

    Data (in CPU memory) -> GPU (stored in GPU memory) -> GPU processing -> GPU results stored in memory -> GPU render to screen -> SCREEN.

     

  13. 15 hours ago, stash_old said:

     drv8825 (same as A4988)

     

    Careful - the drv8825 takes more voltage, less current. I use a 3.8V 650mA stepper with a 100:1 gearbox but a 12V supply. Set correctly, the 8825 chops the current automatically, so you get no heat in the stepper and the 12V step overcomes back-EMF.

  14. Hi,

    Just had a look at the code whilst I have a biscuit and drink break.

    I see you're using SLEEP pin by default:

    • power to the H-bridges is cut, so if the load requires current rather than friction to hold, the focuser could move - especially with momentum generated by speed and mass.
    • on re-enabling after a sleep, although the driver doesn't reset its microstep count, the stepper motor itself may step to the next winding (ie 1/2 or full step)
    • there's a 1ms delay before the DRV8825 is ready to take any instructions such as a STEP operation; there's only a 250us delay, using operations in the call in step(), before attempting the step in the 4988 driver.

    In the end the DRV8825 doesn't get that hot, even when microstepping, so it may be better to default to not sleeping.

    It should be an easy exercise to create the 8825 driver from the 4988 with a subclass that overrides a delay method (or a little refactoring to make a DRV base class for both drivers).

     

  15. 4 hours ago, jbrazio said:

    If your suggestion is when changing from 1/1 to 1/4 micro stepping the counter should be updated i.e. multiply by 4 then that's a really insightful suggestion which I will implement ! :-)

    Ahhh.. this is the twist :) I didn't get to make it accelerate or finalise it - life got in the way.

    For example - stepper values only sync up at specific steps in the cycle. At 1/32 microstepping, only at step 17 does the step sync with 1x stepping. So to switch you would need to 1/32-microstep to step 17 before switching to 1/16 etc (which is now step 9) - this also impacts the reset position:

    // on reset the step position homed depends on microstep mode
    #define RESET_HOME_STEP_POSITION_1 1
    #define RESET_HOME_STEP_POSITION_2 2
    #define RESET_HOME_STEP_POSITION_4 3
    #define RESET_HOME_STEP_POSITION_8 5
    #define RESET_HOME_STEP_POSITION_16 9
    #define RESET_HOME_STEP_POSITION_32 17

    Now imagine switching between microstepping modes like an automatic gearbox in a car, depending on the velocity of the focuser. With the stepper knowing it has 400K microsteps to do, it may be better to sync up to 1x stepping, operate in 32-step leaps at 1x speed, then slow down to 1/32 again for the final set of steps.

    Yes - I use microsteps for counting; however the step count is a max of 0x0000-0xFFFF, or 65K steps. When you have a 100:1 gearbox and 1/32 stepping on a 1.8-degree step, 2.8 turns of the focuser results in about 1.8 million steps limit to limit :D An extreme example.. however 65K for a full range isn't good.
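The sync-point table and the limit-to-limit arithmetic above can be captured in two small helpers (the names are mine). The pattern in the #define table is n/2 + 1, and the exact product with these numbers is about 1.79 million microsteps:

```python
# First position after home where 1/n microstepping lines up with a full
# step, matching the #define table: 1, 2, 3, 5, 9, 17 for 1x .. 1/32
def reset_home_position(microstep):
    return microstep // 2 + 1

# Total microsteps limit-to-limit: gearbox ratio x microstep divisor x
# full steps per motor turn x focuser turns
def total_microsteps(gear_ratio, microstep, step_angle_deg, focuser_turns):
    steps_per_motor_turn = 360.0 / step_angle_deg
    return gear_ratio * microstep * steps_per_motor_turn * focuser_turns
```

With the figures from the post (100:1 gearbox, 1/32 stepping, 1.8-degree steps, 2.8 turns) this gives 100 x 32 x 200 x 2.8 = 1,792,000 - far beyond a 16-bit counter.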

    Lastly, I was researching an 'automatic' way of detecting the range of the focuser. It attempts to find each limit switch and then calculates the full range of counts. That way it will not attempt a move that would hit a limit - it already knows the limit would be hit before it accepts the move.

     

    I agree the MF protocol isn't great - many of the protocols in the industry weren't designed but evolved..

  16. Interestingly, the DRV8834 only does 2.5-10.8V max. The reason I went with the specific stepper and the DRV8825 (sorry, saying 2285 was my mistake!) was that it handles 12-14V, which means you get good back-EMF resistance and it's easily powered from the car battery/PSU that's powering the rest of the hardware (cameras, filter wheels etc).

    "1.9 us; they can be as short as 1 us when using the A4988." - this is something to be wary of.

    "DRV8834 only has two pins for setting its microstep mode" the DRV8825 has three pins M0,1,2 like the 4988.

    So basically the DRV8834 is low-voltage, high-current, whereas the 8825 is high-voltage, medium-current.

    You need some voltage headroom to overcome the back-EMF so the step is a solid positive step, rather than failing to step under physical load. The 3.8V stepper is recommended to have 12V to step, but you do need to set the chopper part of the driver to limit the current or the stepper gets really hot :D

     

    The other thing I was looking at is adding IR remote control to INDI for the Apple Mac IR remote :) then you can manually move the focus without touching the scope.

  17. I'd be happy slotting in the driver for the DRV8825 - a current-chopper driver.

    The only thing I couldn't see at the moment in the code is the stepping sync points when microstepping. When you change down through 1, then 1/2, then 1/4 .. 1/32, for example, the point on the stepper where the steps align is specific to the microstepping driver.

    Thread, development and sources: 

     

  18. Ahh, I've already done just this! I have an INDI-based moonlite controller (ODroid C2) --USB--> Arduino Uno -> DRV8825 -> 3.8V 650mA stepper motor with 100:1 gearbox

    I added a few better commands to the moonlite focuser protocol.

    Torque on this is stupidly high - hence the need for a couple of stop switches. However, it means I can hang a heavy set of cameras off the focuser. The Pentax only has a 1:1 focuser - hence the need for both high torque and accuracy. The system can also step in 1/32.

    Quite happy to slot into this - I can share the code if you'd like. It's not entirely finished (ie it won't shift between stepping speeds) but it will detect limit stops. The issue here is how to signify to INDI that a limit stop has been reached. These are some of the shortcomings of the moonlite focuser protocol.

     

  19. Well, I'm getting bored .. with the cloudy winter nights.. so I think I may pick this up again :D I need some maths in my life, and sometimes it's good to take a step back and view things with fresh eyes.

    I may also be tempted to look at using AI on the residual noise layer for detecting whether a pixel is noise or not. At this pixel-value level this is critical - the focus is not whether this pixel is noise, but how much of the pixel value is noise. So I will start researching AI noise estimation and see if there's any mileage. Additionally I will start looking at some performance improvements. I don't think it's worth hitting the GPU yet, but there are definitely some algorithmic speed-ups that could be done.

  20. I had an APC UPS for a home file server a while ago - a very good idea, considering the housemate pulled the power to plug in the vacuum..

    Shutdown handling: the APC I had used a serial connection to alert the computer that the power (a) was on battery, and (b) was reaching a critical level, to trigger a graceful shutdown of the server. Not many astro systems can invoke a park process, close the roof and shut down the PC etc automatically.

    Noise added to the output: as the 12V systems we use are sensitive to noise, finding the right inverter that doesn't add noise into the AC-DC -> battery -> DC-AC -> PSU -> 12V chain feeding a camera, for example, is going to be worth the investment of time to look for good options.

    My (unqualified) thinking is that having a 12V battery, charged by a charger, feeding the 12V equipment is like having a capacitor across a motor, in that it should reduce noise - especially with PWM-style charging that limits/chops the current to shape the delivery.

    Lastly - deep-cycle leisure batteries would be best, although new cars have AGM types for the stop/start functionality.

    On a car battery.. I have a Bosch S5 100Ah that I have used for astro. You can get two days of astro out of it; however on the second day the noise level on the camera, caused by the deeper discharge dropping the 12V line, increased enough to be visible.
