Posts posted by JamesF

  1. Sounds like you are charging ahead with your project James, good stuff.

    It's quite unexpected in a way.  I've spent most of the last year or so writing code for a single project and it's quite hard to bring myself to sit down at a computer and write even more code after finishing each day, but having started this I'm really very much enjoying it.  It's perhaps not the tightest code I've ever written, but I have learnt a fair bit about Qt, video formats, the ffmpeg library, threads and got myself back into a bit of C++ programming, so I really can't complain there.

    To top it all I plugged in my 120MM a moment ago and it "just worked" even without monochrome support.  I couldn't work out why until I discovered that the mono camera driver actually supports RGB24 as well :)  I'm going to force it into mono mode and make that work anyhow, because I'll surely need that for other cameras along the way and it would be nice to keep the output file sizes down where possible.

    James

  2. Ah, well, there's always just one more gotcha :)  The Ut Video codec appears not to want to play if I tell it the file format is BGR24 rather than RGB24.

    Fortunately I believe the ffmpeg swscale library can convert from the former to the latter.  I just have to work out how to use it.

    James
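    A minimal sketch of that conversion using libswscale, assuming a packed 24-bit frame; the buffer and dimension names here are hypothetical, and older ffmpeg releases spelt the pixel formats PIX_FMT_* rather than AV_PIX_FMT_*:

    #include <stdint.h>
    #include <libswscale/swscale.h>

    /* Convert one packed BGR24 frame to packed RGB24.  No scaling is being
     * done, so the choice of scaler flag is irrelevant. */
    static int
    bgr24ToRgb24(const uint8_t *bgr, uint8_t *rgb, int width, int height)
    {
        const uint8_t *srcPlanes[1] = { bgr };
        uint8_t *dstPlanes[1] = { rgb };
        int srcStride[1] = { 3 * width };
        int dstStride[1] = { 3 * width };
        struct SwsContext *ctx;

        ctx = sws_getContext(width, height, AV_PIX_FMT_BGR24,
            width, height, AV_PIX_FMT_RGB24, SWS_POINT, NULL, NULL, NULL);
        if (!ctx) {
            return -1;
        }
        sws_scale(ctx, srcPlanes, srcStride, 0, height, dstPlanes, dstStride);
        sws_freeContext(ctx);
        return 0;
    }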

  3. Well, having put the camera to one side for a few hours to allow me to get on with some work, I've returned to it this evening and, having basically decided that setting the USB bandwidth control variable to 40 is the only option for getting things working, I now have functioning capture with the 120MC :D

    I found something bright red and something bright blue and verified that the blue and red channels did appear to be swapped over.  Fortunately there's a Qt function that does the swap.  Now I just need to sort out writing an AVI file in BGR format rather than RGB.  I don't think that should be too hard as long as the Ut video codec is happy about it.

    I also still need to sort out the binning and there is a 12-bit mode that may not be supported outside the ASCOM driver, but I shall come back to the first after getting the 120MM to work and worry about 12-bit later I think.

    James
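    The Qt function in question is presumably QImage::rgbSwapped(); a minimal sketch, with buffer, width and height standing in for the real frame data:

    #include <QImage>

    // Wrap a packed RGB888 frame (no copy is made by this constructor) and
    // return a copy with the red and blue channels exchanged.
    QImage swapRedBlue(const uchar *buffer, int width, int height)
    {
        QImage frame(buffer, width, height, 3 * width, QImage::Format_RGB888);
        return frame.rgbSwapped();
    }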

  4. Aha!  Set the control to 40 and I get an absolutely rock-solid image at 1280x960.  That's a step forward.  Now I just need to sort out why the red and blue channels appear to be transposed and why the ROI control still isn't working properly.

    Shame I ought to be working really, but this is far more interesting :)

    James

  5. I've seen very similar results from my QHY5L-II in one of the FireCapture betas, though it was solved by resetting the camera - perhaps more to do with the timing issues at a hardware level.

    That does seem consistent with what ZWO are telling me.  I have an inkling that I might sometimes have seen it with the ASI120 in FireCapture too, though not often.

    Either the Point Grey Firefly or QHY cameras are next on my list to get working, so we shall see what happens there.  I need to get monochrome capture working before going too much further.  I've not tried that at all yet.

    James

  6. ZWO are suggesting that it happens if there's insufficient bandwidth on the USB bus and that there's a control for alleviating the problem I can use, but I have no documentation for it at all as yet :(  I've picked the shared library apart a little (without access to the source) and it appears there's a separate execution thread that manages the camera and drops data into a buffer for the caller.  I suspect that there may be synchronisation problems if the USB side can't keep up.  Personally, if I didn't have a new frame ready I'd probably arrange for the user to get the same frame twice rather than give them a damaged frame without flagging some error, but I guess that's a design choice...

    For the moment it might be a case of just tinkering with the controls to see what happens.  I've read the limits for the control from the library which says it should range from 40 to 100, but that the value set in the camera from cold start is 1.  That doesn't bode well.  I'll have to check that the camera default settings are in range and sort them out before starting a capture run I think.

    James
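    A sketch of the "check the default is in range" idea.  The names below (ASICamera2.h, ASIGetControlValue, ASISetControlValue, ASI_BANDWIDTHOVERLOAD) are from the later ASICamera2 SDK and are assumptions here, since the library discussed in this thread had a different interface; the 40-100 range and cold-start value of 1 are from the post above.

    #include "ASICamera2.h"

    /* Force the USB bandwidth control into its advertised range before
     * starting a capture run. */
    static int
    fixBandwidthControl(int cameraId)
    {
        long value;
        ASI_BOOL isAuto;

        if (ASIGetControlValue(cameraId, ASI_BANDWIDTHOVERLOAD, &value,
                &isAuto) != ASI_SUCCESS) {
            return -1;
        }
        if (value < 40 || value > 100) {
            /* The cold-start default of 1 is out of range, so pin it to
             * the minimum that is known to work. */
            if (ASISetControlValue(cameraId, ASI_BANDWIDTHOVERLOAD, 40,
                    ASI_FALSE) != ASI_SUCCESS) {
                return -1;
            }
        }
        return 0;
    }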

  7. Here's an example of the problem, this time from my capture program:

    [attached image: debayer.png]

    I think the fact that this happens both in my capture code and in their demo program, which share no common code above the C/C++ and USB libraries, strongly suggests to me that the likely cause is their library.

    James

  8. Well, I'm still no further forward :(

    I'm actually having my doubts about the reliability of the ASI library at the moment.  I built their demo program, which basically just grabs a frame and throws it on the screen repeatedly.  At times it "jumps" and I see an image that has, say, the left-hand two-thirds apparently not debayered, whilst the right-hand third is debayered, but is actually the part of the image that should be on the left-hand side.  It looks a bit as though the buffers for the debayered and raw images are getting overlapped for some reason, or the data for the image isn't being copied from the correct place.

    James

  9. I now have an inkling that the frame returned by the SDK library is a plane of red, then green, then blue whereas everywhere else I'm using 8 bits red, 8 bits green and 8 bits blue, repeated over and over again.  The image I'm getting at the moment looks like this:

    [attached image: colour-problem.png]

    I'll reorder the data and see if that helps.

    James
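    If that inkling is right, re-interleaving the planes is straightforward; a minimal sketch, with planar, packed, width and height as hypothetical names:

    #include <stddef.h>
    #include <stdint.h>

    /* Convert three consecutive full-frame planes (all red, then all green,
     * then all blue) into packed RGB24 (R, G, B for each pixel in turn). */
    static void
    planarToPackedRGB24(const uint8_t *planar, uint8_t *packed,
        size_t width, size_t height)
    {
        size_t numPixels = width * height;
        const uint8_t *r = planar;
        const uint8_t *g = planar + numPixels;
        const uint8_t *b = planar + numPixels * 2;
        size_t i;

        for (i = 0; i < numPixels; i++) {
            packed[i * 3] = r[i];
            packed[i * 3 + 1] = g[i];
            packed[i * 3 + 2] = b[i];
        }
    }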

  10. A quick tinker over lunchtime and I have the window resize on rescaling problem fixed.

    I think I've also tracked down the problem with changing the ROI though it's going to take a little longer to fix.  I discovered that some V4L2 cameras need partially restarting when the resolution is changed and added functionality to do that.  It appears that's not enough for the ASI camera though.  Looks like I need to completely disconnect from the camera and reinitialise it from scratch otherwise it dumps core in the library's debayering code :(

    I'm not entirely sure that all the frames are being returned correctly either.  There's no demo code for anything other than 8-bit monochrome and next to no documentation, so I may be doing something wrong.  Further investigation required, but it looks like it could be related to debayering too.

    James
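    For what it's worth, the "partial restart" for a V4L2 device amounts to stopping the stream, setting the new format and starting again (real code also has to release and re-map its buffers around this).  A minimal sketch, with fd, width and height hypothetical and the pixel format an assumption:

    #include <string.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>

    /* Stop streaming, apply a new capture resolution, restart streaming. */
    static int
    changeResolution(int fd, unsigned int width, unsigned int height)
    {
        enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        struct v4l2_format fmt;

        if (ioctl(fd, VIDIOC_STREAMOFF, &type) < 0) {
            return -1;
        }

        memset(&fmt, 0, sizeof(fmt));
        fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        fmt.fmt.pix.width = width;
        fmt.fmt.pix.height = height;
        fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_RGB24;  /* assumed format */
        fmt.fmt.pix.field = V4L2_FIELD_NONE;
        if (ioctl(fd, VIDIOC_S_FMT, &fmt) < 0) {
            return -1;
        }

        return ioctl(fd, VIDIOC_STREAMON, &type);
    }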

  11. Very nice :)  I like the UI :)

    Thank you.  I have to admit to pinching some ideas from FireCapture for the UI, but imitation is the sincerest form of flattery :).  My plan is eventually to allow the controls to be either at the top or the side.  I think with a netbook it might fit better on the screen if the controls are alongside rather than above the preview window.

    James

  12. Laydeez an gennelmun! I present to you a screenshot of data captured from my ASI120MC :D

    [attached image: zwasi120.png]

    This is using the 2.1mm lens that was originally supplied with it.

    I still have some way to go with support for this camera, but given that the only documentation is a few comments in the header file for ZWO's library and that I've probably only put five or six hours into integrating it with my existing application I'm feeling rather pleased with myself right now.  I'm actually surprised how relatively easy it's been given that I didn't really look at the ZW API in detail before I started coding the application.  Hopefully that's a reasonable indication that I have pretty much the right design for the interface.

    Of the bits still to do, I particularly need to sort out changing the ROI and look into the exposure times, as the limits for that control returned by the library look quite odd.  I always intended to have a separate pop-up window that exposed all the available camera controls rather than just those that are probably immediately useful for imaging, and I think I might need to bring that forward, as there's a very blue cast to the image that can probably be fixed using the blue and red balance controls.

    I also need to think about how I'm going to hook the binning into resolution changes, but that's lower priority.

    Oh, and there's clearly a bug where the preview pane doesn't get resized when I rescale the image to be smaller, so the scroll bars are left in place.  I shall have to fix that (the image from the camera is shown scaled to 50% in this capture).

    James

  13. It might be useful to look at that one day, but for the purposes of getting the application working in the first place I'm not going to worry about it right now.  I hand the bitmap to Qt and let it decide how best to deal with it and that's been ok thus far.  If necessary it's an area I can look at optimising in the future.

    My main concern regarding performance right now is whether I should have a separate thread for writing the data to disk; a rough sketch of that idea follows below.  I can see some justification for it, but again it's not been a major performance issue so far (and I have been doing some testing on an old Aspire One netbook), so I think time is better spent on getting a usable application rather than tuning something that may not actually need tuning.

    James
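    The usual shape for that writer thread is a producer/consumer queue: the capture thread pushes completed frames and a worker drains them to disk.  A minimal C++ sketch, in which Frame and writeFrameToDisk are hypothetical placeholders for the real frame buffer and AVI-writing code:

    #include <condition_variable>
    #include <cstdint>
    #include <mutex>
    #include <queue>
    #include <thread>
    #include <vector>

    // Hypothetical container for one captured frame.
    struct Frame {
        std::vector<uint8_t> data;
    };

    // Hypothetical stand-in for the real file-writing code.
    static void writeFrameToDisk(const Frame&) { }

    class FrameWriter {
    public:
        FrameWriter() : running(true), worker(&FrameWriter::run, this) {}

        ~FrameWriter()
        {
            {
                std::lock_guard<std::mutex> lock(mtx);
                running = false;
            }
            cond.notify_one();
            worker.join();
        }

        // Called from the capture thread for each completed frame.
        void push(Frame frame)
        {
            {
                std::lock_guard<std::mutex> lock(mtx);
                frames.push(std::move(frame));
            }
            cond.notify_one();
        }

    private:
        // Writer thread: drain the queue, doing the actual write outside the lock.
        void run()
        {
            std::unique_lock<std::mutex> lock(mtx);
            for (;;) {
                cond.wait(lock, [this] { return !running || !frames.empty(); });
                if (frames.empty() && !running) {
                    return;
                }
                Frame frame = std::move(frames.front());
                frames.pop();
                lock.unlock();
                writeFrameToDisk(frame);
                lock.lock();
            }
        }

        std::mutex mtx;
        std::condition_variable cond;
        std::queue<Frame> frames;
        bool running;
        std::thread worker;
    };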

  14. Actually there are one or two tricky problems to solve, though they're more UI issues than application logic ones.

    Some cameras appear to allow for arbitrary ROIs to be set.  The ASI120 allows ROIs, but only from a fixed set of sizes.  Other cameras allow different resolutions to be used, but still show the full field of view (a bit like binning, I guess).  Combining all of those into a tidy yet meaningful presentation in the UI is going to need some thought.

    The camera also supports 2x binning, but only at a single frame size.  That is, you can't set an ROI and use binning on that.  Which means the binning option will have to interact with frame size selection.  I'd not considered the need for that before.  Hardly the end of the world though.

    James

  15. OpenCV is about image processing with SSE-based algorithms.  The GPGPU portion is dire... although nVidia have added lots of CUDA, OpenCL is lacking...

    Yes, but in this case the issue was more that the ASI SDK examples appear to use OpenCV and the Windows DLL does in fact appear to be rolled up with OpenCV somehow.  I was concerned that perhaps the camera control library was somehow using OpenCV for processing the data from the camera and might rely on it for delivering data to the client application, but in fact that doesn't appear to be the case at all for Linux and the user interface appears to be very simple.

    There are unfortunately other niggles with the SDK.  For instance the library appears to be in C, but the header file will only compile in C++.  I've fixed that one fairly trivially.  More awkward to fix is the apparent inability of the SDK to support more than one camera at any one time.

    That's a bridge to cross another day however.  Given the relative simplicity of the interface I just want to crack on and get it working for my ASI120MC.  If I can do that this week (and there's no real reason to believe that shouldn't be possible as long as I have enough spare time) I could potentially be imaging next weekend's triple shadow transit of Jupiter with my own imaging application, which would be exceptionally pleasing.

    James

  16. Been looking at the ASI library whilst I wait for the children to make dinner this evening (under supervision :)

    The OpenCV stuff looks like it might be a bit of a red herring.  The demo appears to use it just for processing and displaying the grabbed image, but the actual library interface looks really simple compared to the Linux V4L2 interface.  The forecast for this evening is pretty good up until midnight however, so I don't think I'm going to get too much coding done this evening :)

    James

  17. Just wanted to say thanks for taking the time to work on this, James. I'll be watching your progress with much interest. :)

    Thank you.  I feel as if I'm making significant progress now.  I've spent a little time today doing "tidying up" work -- fixing hacks in the code that I put in just to get things working, and suchlike -- but when it is next cloudy (tomorrow night, probably :) I shall get on to integrating the ASI interface.  It feels like ages since I had this much fun writing code :D

    James

  18. That's just very poor driver coding by whoever wrote the driver.

    I've been writing camera software for PCs for years now, and nearly every single time there are problems with cameras (which is quite often) it's always down to very poor driver coding, at least when it comes to Windows drivers.  Manufacturers are very poor at writing driver code and have no pride in their work/coding :(

    The Linux SPC900 driver (well, a shared driver for a whole set of Philips-based cameras) is something of a nightmare because the person who originally wrote it provided some of it as a binary because he was under NDA to Philips.  Someone else then reverse-engineered it and there was a bit of a row, people got hacked off, the world moved on and no-one other than astronomers then really had an interest in it.  It's been left in something of a limbo state for perhaps ten years whilst the kernel has been re-worked, internal interfaces have changed and in fact a whole new video device interface was created.  Hence my initial surprise that it worked at all.

    So, it may not be specifically poor coding, but that it's been unsupported for so long and isn't behaving in the way that current kernels require.

    I'd actually be prepared to take it on and update it, but for the fact that kernel drivers (that are intended to go into the distributed source) now appear to have to go through quite a rigorous code review and learning all the whys and wherefores is something I'd really struggle to find time for at the moment (or if I could, I'd spend it doing something else).  Fifteen years ago or more I found some bugs in the driver implementation for a NIC I wanted to use in a Linux box, fixed them, emailed a patch off to Alan Cox (who was "owner" of all things network-related at the time) and within a day received a reply along the lines of "Oh, yeah, I'll put that in for the next release".  Sadly it's not quite like that any more :D

    James

  19. Well, it looks like having any other device plugged into the same USB bus as the camera means the camera has insufficient bandwidth on the bus.  I think for the time being I'm happy to leave that there.  It's clearly not a problem with my code.  Having had a quick google, it looks like the same problem exists with other cameras that use the same driver.  Ironic really, that a USB1.1 device should generate an error due to lack of bandwidth on a USB2 bus :)

    James

  20. I see the driver is using interrupt transfers - the downside is they fail if the requested reserved bandwidth isn't available.

    Check that the hub isn't multiplexed with internal devices.

    Indeed.  There's nothing obvious at the moment, but in a similar vein I did have a mouse plugged in.  Removing the mouse allows the camera to work, but if the mouse is plugged in after the camera is running then the mouse doesn't start up, presumably for the same reason.

    So, there's always the possibility that it's an issue with the mouse.  I shall try a different one later and see what else I can find out about the hardware.  Unfortunately this morning I have the far more exciting task of visiting another potential secondary school for my son.

    James
