
oaCapture 0.4.0 release


JamesF


As focus is often about steepness (contrast), you could use a histogram of the steepness rather than a single average value. Then, as the image comes into focus, the histogram mass moves towards the steeper end of the scale. You can then score on the histogram characteristics, which makes things easier when you have large objects and noise.

I wonder if you couldn't then just use the total of all of the gradient values as the focus score?  The more "right-handed" the histogram becomes, the greater the contrast changes in the image and the higher the total of all of the gradients?
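Something along those lines, perhaps (just a sketch, assuming an 8-bit mono frame and a crude pixel-difference gradient rather than whatever filter oaCapture ends up using):

    #include <algorithm>
    #include <array>
    #include <cstdint>
    #include <cstdlib>

    // Build a 256-bin histogram of gradient magnitudes for an 8-bit mono frame.
    // The "gradient" here is just the larger of the horizontal and vertical
    // pixel differences -- a stand-in for a proper edge filter.
    std::array<uint32_t, 256> gradientHistogram(const uint8_t* img, int w, int h)
    {
        std::array<uint32_t, 256> hist{};
        for (int y = 0; y < h - 1; y++) {
            for (int x = 0; x < w - 1; x++) {
                int p  = img[y * w + x];
                int dx = std::abs(img[y * w + x + 1] - p);
                int dy = std::abs(img[(y + 1) * w + x] - p);
                hist[std::max(dx, dy)]++;
            }
        }
        return hist;
    }

    // The crude score suggested above: the total of all the gradient values.
    uint64_t gradientSum(const std::array<uint32_t, 256>& hist)
    {
        uint64_t sum = 0;
        for (int g = 0; g < 256; g++)
            sum += static_cast<uint64_t>(g) * hist[g];
        return sum;
    }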

James



Aha!  I can stop the capture without a problem when there is no limit in place, but if there's a limit on the number of frames or time then I can't.  That's definitely a bug.

James

Hi James, have you had a chance to look into this yet?

Thanks

Dave


I've decided the sum of the values of the gradients can't work as a focus score, so I need to come up with another idea.

The idea that the histogram of gradient values should be biased towards the right (and the more to the right the better) seems useful as a measure, but the question is how one recognises that.  Of course it would be possible just to draw the histogram and have the user do the work themselves, but it's far easier for them if you present a number or a line and say "when it gets bigger you have better focus, when it gets smaller focus is getting poorer".

For the 8-bit case, where G is the gradient value and n(G) is the number of pixels with that value, sum( G² * n(G) ) over all values of G looks like a plausible method for scoring, but I don't fancy doing that for a 16-bit image with 65000+ possible values for G.
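(For completeness, that weighting is only a few lines given a 256-bin histogram of gradient values, something like the sketch below; the histogram itself is assumed to exist already and this isn't oaCapture code.)

    #include <array>
    #include <cstdint>

    // Weight each histogram bin by the square of its gradient value, so steep
    // edges dominate the score: sum over G of G^2 * n(G).
    uint64_t squaredGradientScore(const std::array<uint32_t, 256>& hist)
    {
        uint64_t score = 0;
        for (int g = 0; g < 256; g++)
            score += static_cast<uint64_t>(g) * g * hist[g];
        return score;
    }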

More thought and research required :)

James


I thought I'd compare the Sobel and Scharr convolutions just to see how different the results are.  The first here is Sobel, the second is Scharr.  All camera settings, lighting etc. were the same.

Scharr seems to do a very nice job of picking out the edges, but really doesn't cope well with the noise.  Sobel seems to do rather better with that at the expense of poorer edge detection.
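For reference, these are the standard horizontal-gradient kernels being compared (the vertical ones are just the transposes); nothing oaCapture-specific here:

    // Standard 3x3 horizontal-gradient kernels; the vertical kernels are the
    // transposes. Scharr's heavier centre weights pick out edges more strongly
    // but also amplify single-pixel noise more than Sobel does.
    static const int sobelX[3][3] = {
        { -1, 0, 1 },
        { -2, 0, 2 },
        { -1, 0, 1 }
    };

    static const int scharrX[3][3] = {
        {  -3, 0,  3 },
        { -10, 0, 10 },
        {  -3, 0,  3 }
    };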

And yes, I am sporting a rather natty mohican today :D

sobel.png

scharr.png

James


I've decided the sum of the values of the gradients can't work as a focus score, so I need to come up with another idea.

The idea that the histogram of gradient values should be biased towards the right (and the more to the right the better) seems useful as a measure, but the question is how one recognises that.  Of course it would be possible just to draw the histogram and have the user do the work themselves, but it's far easier for them if you present a number or a line and say "when it gets bigger you have better focus, when it gets smaller focus is getting poorer".

For the 8-bit case, where G is the gradient value and n(G) is the number of pixels with that value, sum( G² * n(G) ) over all values of G looks like a plausible method for scoring, but I don't fancy doing that for a 16-bit image with 65000+ possible values for G.

More thought and research required :)

Yup. The characteristics are a little dependent on your actuation system. What is the resolution of the focusing system? There's backlash and other system variables to allow for too.

Also, image blur is something you have to be careful of when focusing. One option is to detect the blur, which takes you into FFT phase correlation and looking at the 2D phase correlation output. If the centroid mass is a sharp peak then there is only one copy of the image; if on the other hand the centroid mass shows as a streak or zigzag (i.e. not a single peak) then you've got motion blur. The complication is that phase correlation has multiple solutions, as FFTs will often give positive and negative matches.
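Very roughly, the check looks like this (a sketch using OpenCV purely to illustrate the idea, not anything oaCapture itself does): take two consecutive frames, form the normalised cross-power spectrum, invert it, and see how concentrated the peak is.

    #include <opencv2/core.hpp>
    #include <vector>

    // Phase correlation of two consecutive same-size, single-channel frames.
    // A single sharp peak in the returned surface suggests one clean copy of
    // the image; a smeared or zig-zag peak suggests motion blur.
    cv::Mat phaseCorrelationSurface(const cv::Mat& a, const cv::Mat& b)
    {
        cv::Mat fa, fb;
        a.convertTo(fa, CV_32F);
        b.convertTo(fb, CV_32F);

        cv::Mat Fa, Fb;
        cv::dft(fa, Fa, cv::DFT_COMPLEX_OUTPUT);
        cv::dft(fb, Fb, cv::DFT_COMPLEX_OUTPUT);

        // Cross-power spectrum Fa * conj(Fb), normalised to unit magnitude.
        cv::Mat cross;
        cv::mulSpectrums(Fa, Fb, cross, 0, true);

        std::vector<cv::Mat> planes(2);
        cv::split(cross, planes);
        cv::Mat mag;
        cv::magnitude(planes[0], planes[1], mag);
        mag += cv::Scalar(1e-9);                 // avoid division by zero
        cv::divide(planes[0], mag, planes[0]);
        cv::divide(planes[1], mag, planes[1]);
        cv::merge(planes, cross);

        // Back to the spatial domain: the correlation surface to inspect.
        cv::Mat corr;
        cv::idft(cross, corr, cv::DFT_REAL_OUTPUT | cv::DFT_SCALE);
        return corr;
    }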

At least you're only battling the design of the algorithm.. and not the GPU drivers too ;)


Also, image blur is something you have to be careful of when focusing. One option is to detect the blur, which takes you into FFT phase correlation and looking at the 2D phase correlation output. If the centroid mass is a sharp peak then there is only one copy of the image; if on the other hand the centroid mass shows as a streak or zigzag (i.e. not a single peak) then you've got motion blur. The complication is that phase correlation has multiple solutions, as FFTs will often give positive and negative matches.

I think if you're getting consistent motion blur in a planetary image capture you probably need to pack up for the night :)

James


I've decided the sum of the values of the gradients can't work as a focus score, so I need to come up with another idea.

That was quick!

The idea that the histogram of gradient values should be biased towards the right (and the more to the right the better) seems useful as a measure, but the question is how one recognises that.  Of course it would be possible just to draw the histogram and have the user do the work themselves, but it's far easier for them if you present a number or a line and say "when it gets bigger you have better focus, when it gets smaller focus is getting poorer".

I don't think you need to bother with a 65536-bin histogram if you have 16-bit pixels. I'm sure you'd still have more than enough data if you just used the upper 8 bits of each pixel (reducing it down to 8-bit pixels and a 256-bin histogram), just for the focusing routine.
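i.e. just shift each 16-bit value down to its top 8 bits before binning, along these lines (a sketch, not oaCapture code):

    #include <array>
    #include <cstddef>
    #include <cstdint>

    // Bin 16-bit gradient values into a 256-bin histogram by keeping only the
    // upper 8 bits of each value -- plenty of resolution for a focus metric.
    std::array<uint32_t, 256> histogram16To256(const uint16_t* grad, size_t count)
    {
        std::array<uint32_t, 256> hist{};
        for (size_t i = 0; i < count; i++)
            hist[grad[i] >> 8]++;
        return hist;
    }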

btw, coming up with a suitable algorithm usually takes more than a day ;)

A good place to start would be to get that data (the histogram) on screen! Getting a good visualisation of the available data always makes things easier: sit and study it with varying image inputs and edge gradients. If you see a pattern that corresponds to what you're trying to detect, then think about how you yourself are able to see that pattern. In the case of the histogram of the Sobel output, that will be to do with seeing a shift from the lower (left) bins to the upper (right) bins.


A next step you could try, James, is simply to assign a weight to each of your 256 histogram bins. An example might be a nonlinear curve, say the square of the bin index (which might be a bit extreme): bin 0 = 0, up to bin 255 = 65025. Then multiply each bin's count by its weight and add the results together to get your single-value 'sliding scale'.


A next step you could try, James, is simply to assign a weight to each of your 256 histogram bins. An example might be a nonlinear curve, say the square of the bin index (which might be a bit extreme): bin 0 = 0, up to bin 255 = 65025. Then multiply each bin's count by its weight and add the results together to get your single-value 'sliding scale'.

That's what I was looking at with the sum( G² * n(G) ) thing that I posted about earlier :)

However, I've had thoughts overnight and now think the idea of the histogram simply being skewed to the right is wrong.  What I now think should happen is that the histogram will develop a big hole in the middle, because when the image is in focus there should also be a large number of black/dark (low-gradient) pixels.

So in fact a better measure might actually be sum( ( G - mean )² * n(G) ), though perhaps there should be a square root in there somewhere too so the measure is a little more linear.
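Roughly this, in other words (a sketch over a 256-bin gradient histogram, with the square root included; hypothetical code, not what's in oaCapture):

    #include <array>
    #include <cstdint>
    #include <cmath>

    // Spread of the gradient histogram about its mean:
    // sqrt( sum over G of (G - mean)^2 * n(G) / total pixels ),
    // i.e. a standard deviation of the gradient values.
    double gradientSpreadScore(const std::array<uint32_t, 256>& hist)
    {
        uint64_t count = 0, total = 0;
        for (int g = 0; g < 256; g++) {
            count += hist[g];
            total += static_cast<uint64_t>(g) * hist[g];
        }
        if (count == 0)
            return 0.0;

        double mean = static_cast<double>(total) / count;
        double variance = 0.0;
        for (int g = 0; g < 256; g++)
            variance += (g - mean) * (g - mean) * hist[g];

        return std::sqrt(variance / count);
    }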

James


However, the histogram is not of the pixel values but of the steepness. So as your image comes into focus you'll get lots of flat and lots of steep, IF there's a sharp-contrast object (i.e. stars or the edge of the planet). If the scope has enough focal length to get into the planet and you want to focus on the details within the disc, then the resulting behaviour will be slightly different. In the second scenario Sobel may cause issues itself, as the edges of planetary/solar features are softer.


The first step, Nick, is to get James into the habit of displaying his available data to himself in a way that allows him to start to visualise what it is he's trying to do, and to let him discover and experience the pitfalls and advantages as he goes ;)

Sure, you could solve it for him, but that's not going to help him through the process of learning the pros and cons of whatever method he experiments with.


Yes Cath :)  The first rule of coding should probably be "Check that your assumptions are valid" :D

What's surprising about this particular problem is that there's a lot of information on the web about how to do edge detection and what methods work well for contrast detection, but little I can find discussing methods of turning your detected edges into some sort of focus score.  Given that many cameras are claimed to use passive (that is, looking at the image on the sensor) contrast detection as an autofocus mechanism I'd have thought I'd be able to find more information. Even if it's the sort of thing the camera manufacturers keep to themselves, the mere fact that they consider it valuable enough to do that suggests that other people would be interested in researching it.

Of course it's equally possible that there is lots of stuff out there and I just don't understand it :D

James


No need to vanish Nick ;)

One thing I've found over the years is just how unhelpful Google's search results are becoming. A few years back I could do a search for a particular problem like this and up would pop some helpful info/sites, but that doesn't seem to be the case so much these days :(


Well, after fiddling about with things for a bit I was coming to the conclusion that the standard deviation was the best measure I could find.  The other ideas I had might have worked, but weren't really sufficiently sensitive in terms of how much they changed between "in focus" and "badly out of focus".

However, I've just discovered that a common focus-scoring algorithm is to calculate the sum of the squares of the gradient at each point in the image.  The examples I found of this algorithm, however, only computed the gradient vertically and in one direction (i.e. only down the image, or something like that).  As the Sobel and Scharr methods attempt to calculate a figure for the gradient to all adjacent pixels, it seemed better to sum the squares of the Sobel or Scharr outputs instead.  I tried this and it does seem to work nicely, and it's more sensitive than the standard deviation.
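In practice that amounts to something like the sketch below: run the Sobel kernels over the frame and total the squared gradient values (8-bit mono case shown; this isn't the actual oaCapture implementation).

    #include <cstdint>

    // Sum of squared Sobel gradient magnitudes for an 8-bit mono frame.
    // Larger values should correspond to sharper focus.
    double sobelSquaredGradientScore(const uint8_t* img, int w, int h)
    {
        double score = 0.0;
        for (int y = 1; y < h - 1; y++) {
            for (int x = 1; x < w - 1; x++) {
                const uint8_t* p = img + y * w + x;
                int gx = -p[-w - 1] + p[-w + 1]
                         - 2 * p[-1] + 2 * p[1]
                         - p[w - 1]  + p[w + 1];
                int gy = -p[-w - 1] - 2 * p[-w] - p[-w + 1]
                         + p[w - 1] + 2 * p[w]  + p[w + 1];
                score += static_cast<double>(gx) * gx
                       + static_cast<double>(gy) * gy;
            }
        }
        return score;
    }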

James


I tried this application a while ago... really promising, but I had a few problems with my QHY5L-II.

I gave it another look yesterday and it was a bit better, but I'm still not really able to use it.

Basically I get loads of dropped frames, and sometimes the USB connection seems to hang and I have to disconnect the camera.

Any hints on what to look at to fix this?


I do test the QHY5L-II mono and haven't found problems with it myself, but where hardware is concerned that seems to be irrelevant :)

What OS distribution/release are you using, and is it the mono or colour camera?

James


Hello! It's the mono camera, on (x)Ubuntu 14.10.

It's an Atom netbook, so maybe it's a bit slow, I know, but I get full speed (30fps) on Windows, while I barely get 4fps in oaCapture...

Thanks


  • 2 weeks later...

Hello! It's the mono camera, on (x)Ubuntu 14.10.

It's an Atom netbook, so maybe it's a bit slow, I know, but I get full speed (30fps) on Windows, while I barely get 4fps in oaCapture...

Thanks

I've been experimenting with this.  I did find a bug in the code which I have now fixed.  The bug was such that it didn't show itself in my testing (I only found it by code inspection), but it may have bitten you.

Having fixed that problem, at full resolution I can still only get about 5fps on my Aspire 1 (Atom-based and running Ubuntu 14.10), whereas on my desktop machine (an i5 running Mint 17), by experimenting with the USB Traffic setting, I can get 28 or 29 fps.

If I change the frame size to 640x480 then I can get about 30fps on the Aspire 1 or 93fps on my desktop.

I think perhaps the Atom machine really is just too slow :(

James


Thanks!

Is there any public repository available to try a snapshot?

I have an Aspire 1 too, and I "pimped" it a bit with 2GB of RAM and an SSD... maybe I'll be lucky :p

I've been experimenting with this.  I did find a bug in the code which I have now fixed.  The bug was such that it didn't show itself in my testing (I only found it by code inspection), but it may have bitten you.

 

Having fixed that problem, at full resolution I can still only get about 5fps on my Aspire 1 (Atom-based and running Ubuntu 14.10), whereas on my desktop machine (an i5 running Mint 17), by experimenting with the USB Traffic setting, I can get 28 or 29 fps.

 

If I change the frame size to 640x480 then I can get about 30fps on the Aspire 1 or 93fps on my desktop.

 

I think perhaps the Atom machine really is just too slow :(

 

James


Thanks!

Is there any public repository available to try a snapshot?

I have an Aspire 1 too, and I "pimped" it a bit with 2GB of RAM and an SSD... maybe I'll be lucky :p

Once I have the source a bit more stable it will be going onto GitHub or somewhere similar.  In the meantime I shall be making a new release within the next few days.  I've just got a little testing to do on OS X and then I'm done for this release.

I have an SSD in my Aspire 1.  I've not upgraded the memory though.  I'm sure the lack of memory can't help.  Ubuntu is getting a bit memory hungry these days.

James


