Everything posted by JamesF

  1. I've just checked the sources for the IEEE1394 library. There's a list of USB vendor and product IDs for cameras known to support IIDC over USB, which at the moment is restricted to the Firefly MV (colour and mono) and the Chameleon (again colour and mono). I'd guess dropping the appropriate VID/PID in for the Atik GP might work. That might be something someone else has to do, though it wouldn't be hard for me to add the data to the code myself; I'd just have to use a bespoke build of the library, and as I already do that for ffmpeg, libusb and libuvc it wouldn't be a big deal. James
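
     For anyone curious, the list in question is essentially just a table of USB vendor/product ID pairs that the library checks when it scans the bus, so adding a camera is mostly a matter of appending one entry. A rough sketch of the idea only; the structure name and all of the ID values below are placeholders, not the library's actual code:

        #include <stdint.h>

        /* Sketch only: cameras known to speak IIDC over USB, keyed by
         * USB vendor and product ID.  The real table lives in the
         * library sources and the IDs here are illustrative. */
        struct usb_iidc_id {
          uint16_t vid;
          uint16_t pid;
        };

        static const struct usb_iidc_id usb_iidc_cameras[] = {
          { 0x1e10, 0x2000 },   /* Point Grey Firefly MV (illustrative values) */
          { 0x0000, 0x0000 }    /* Atik GP: real VID/PID would go here */
        };
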
  2. Well, for this release I had no plans to support any Atik cameras, so anything's a bonus, really. Perhaps the IEEE1394 library will "just find it" if someone tries. I haven't poked about in the source, so at the moment I have no idea how it finds my Firefly MV, but it certainly does. If I can get that working then I think I'll consider anything else a bonus. James
  3. That's rather handy. I didn't realise there were Atik cameras using the same standard. James
  4. I discovered last night that whilst Point Grey provide their own SDK with all sorts of "proprietary, do not distribute" FUD, their cameras largely seem to implement some sort of machine vision camera standard called either IIDC or DCAM that is related to IEEE1394. That's rather handy because there's an open source camera library that implements the standard and supports not only (the more usual, I think) FireWire-connected cameras, but also some of the USB ones as well. I compiled one of the demos from the source distribution and it grabbed frames from my USB Firefly MV quite happily, so it looks as though for the cost of writing a small amount of code to integrate the library with my API I might end up with support for all of the FireWire and USB-connected Point Grey cameras on Linux and OSX. Need to stay focused, though. Must get the UVC stuff for the TIS cameras sorted first. James
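
     To give a flavour of what the library looks like to use, this is roughly what enumerating a camera and grabbing a frame involves. It's a minimal sketch with all error checking omitted, based on my reading of the demo code rather than anything that exists in oacapture yet:

        #include <stdio.h>
        #include <dc1394/dc1394.h>

        int
        main ( void )
        {
          dc1394_t *ctx = dc1394_new();
          dc1394camera_list_t *list;

          /* finds IIDC cameras whether FireWire- or USB-connected */
          dc1394_camera_enumerate ( ctx, &list );
          if ( list->num == 0 ) { printf ( "no cameras found\n" ); return 1; }

          dc1394camera_t *cam = dc1394_camera_new ( ctx, list->ids[0].guid );
          dc1394_camera_free_list ( list );

          /* pick a video mode the camera actually supports */
          dc1394_video_set_mode ( cam, DC1394_VIDEO_MODE_640x480_MONO8 );
          dc1394_capture_setup ( cam, 4, DC1394_CAPTURE_FLAGS_DEFAULT );
          dc1394_video_set_transmission ( cam, DC1394_ON );

          /* grab a single frame, then hand the buffer back */
          dc1394video_frame_t *frame;
          dc1394_capture_dequeue ( cam, DC1394_CAPTURE_POLICY_WAIT, &frame );
          printf ( "got a %ux%u frame\n", frame->size[0], frame->size[1] );
          dc1394_capture_enqueue ( cam, frame );

          dc1394_video_set_transmission ( cam, DC1394_OFF );
          dc1394_capture_stop ( cam );
          dc1394_camera_free ( cam );
          dc1394_free ( ctx );
          return 0;
        }
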
  5. There won't be too many visible changes in the next release as my main goal is to get something working on OSX and it'll mainly just be bugfixes otherwise, but as part of the OSX port I've now massaged the sources into a state where I have exactly the same code building working binaries on 32-bit and 64-bit Intel Linux, Raspberry Pi and OSX. Basically, from the top of the source tree it is possible to do "$ ./configure && make" and it does the rest. I think the main task remaining is to work out a tidy way of loading the firmware into the QHY cameras on OSX (I can do that manually myself, but apparently the average Apple user can't cope if it doesn't have a point'n'drool user interface). What I'll probably do is just have a bit of code at the start that matches the USB VID and PID values before the firmware is loaded and runs the relevant commands to do the work. I also need to sort out wrapping everything up into an installable package for OSX. I'm probably going to leave one ugly OSX issue present for the time being because I'm not sure how to sort it out yet, which is that the buttons in the UI get rendered at what appears to be twice the desired size on Retina displays when they contain an icon. If I plug in my desktop display then everything looks fine, but there's clearly some sort of scaling issue relating to high pixel density displays somewhere that I haven't been able to run to ground yet. James
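
     The VID/PID matching itself is straightforward with libusb. Something along these lines, where the function name and ID values are placeholders rather than the real QHY ones (before its firmware is loaded the camera enumerates with a different VID/PID to the one it has afterwards):

        #include <libusb-1.0/libusb.h>

        /* Placeholder IDs only: the "raw" VID/PID a QHY camera presents
         * before its firmware has been loaded. */
        #define QHY_RAW_VID  0x1111
        #define QHY_RAW_PID  0x2222

        /* Returns non-zero if an unconfigured camera is on the bus, in
         * which case the firmware loading commands should be run. */
        int
        cameraNeedsFirmware ( void )
        {
          libusb_device **devs;
          struct libusb_device_descriptor desc;
          ssize_t numDevices, i;
          int found = 0;

          libusb_init ( NULL );
          numDevices = libusb_get_device_list ( NULL, &devs );
          for ( i = 0; i < numDevices; i++ ) {
            libusb_get_device_descriptor ( devs[i], &desc );
            if ( desc.idVendor == QHY_RAW_VID && desc.idProduct == QHY_RAW_PID ) {
              found = 1;
              break;
            }
          }
          libusb_free_device_list ( devs, 1 );
          libusb_exit ( NULL );
          return found;
        }
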
  6. (I should point out that I've never actually tried this myself. I am rather assuming it will work out the way I propose.) James
  7. My thinking regarding the C-mount is that it will help get the spacing correct. An EOS-to-C mount adaptor should give the correct spacing to bring the EOS lens to focus 12.5mm (I think) behind the mount. A C-to-CS spacer will take another 5mm off that and bring the EOS lens to focus on the ASI camera sensor. With the EOS-to-T2 converter you'd still need to make sure of the spacing to get the lens mounting flange 46mm (?) in front of the sensor (and woe betide you if it's already more than 46mm). James
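
     For what it's worth, if I have the standard flange distances right (roughly 44mm for EOS EF, 17.5mm for C-mount and 12.5mm for CS-mount; I haven't measured any of these myself) the sums do seem to work out: the EOS-to-C adaptor is about 44 - 17.5 = 26.5mm deep, so the lens comes to focus 17.5mm behind the adaptor's C-mount face, and the 5mm C-to-CS spacer then puts that focus point 12.5mm behind the camera's CS flange, which is exactly where the sensor should be.
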
  8. My camera has a threaded piece that screws into the T2 thread, with a C/CS thread on the inside. The CCTV lens that came with the camera fitted into that. It may not be the same any more. Mine was one of the early ones. James
  9. My ASI120MC has a CS-mount fitting as well, so one of these with a 5mm C to CS fitting might work if you can find one that will ship to the UK: http://www.ebay.com/itm/Canon-EOS-EF-EFs-lens-to-C-Mount-Film-Movie-Bolex-Video-Camera-CCTV-Adapter-Ring-/321292437097 Unless they don't provide the CS mount any more? James
  10. There is a new release: http://stargazerslounge.com/topic/215690-oacapture-005-beta-for-linux-is-released/ James
  11. I think what Mark is doing is comparing the achieved sensitivity (without the lenses and CFA) with the extrapolated optimal sensitivity based on the readings with the CFA and lenses. For example, if the RGB values for a given photosite total 31355 then ideally with no CFA the same photosite should record 31355. Because the microlenses are no longer present, the measured value is considerably lower. Gina, on the other hand, is comparing the actual signal values measured for (a group of four, in this instance) photosites before and after removal of the microlenses and CFA. In this case the readings are better "after" compared with "before". But they're not the same thing, so they can't be compared directly. Both are probably valid ways to look at the data. The difference is probably that with the microlenses and CFA present the resolution is lower even if more photons contribute to the readings for each photosite. (Obviously there's some approximation going on with the CFA example too, because at a red photosite the G and B values never actually happened. They're interpolated from the values recorded at the surrounding pixels.) James
  12. I'll try oacapture out on Distro Astro for the upcoming release, too. Up to now I've not tried it on any Mint release earlier than 15. I would not be surprised if there were niggles with the SPC900 support because it did stop working somewhere between kernel versions 2.6.x and 3.2-ish, but one of my future jobs is to add a user-space driver for the camera as an alternative if the kernel-space one doesn't work. James
  13. I downloaded DistroAstro overnight and installed it in a virtual machine this morning. It looks like it is based on Mint 13 with MATE, with additional bits from Precise. I like MATE and in fact that's one of the reasons I use Mint now, but I would be tempted to use a lighter-weight window manager on an astro distribution I think. I use an Acer Aspire One for testing oacapture and it really isn't happy running Mint/MATE, probably because of the memory footprint. It works fine with Mint/Xfce though. James
  14. I think 14.04 LTS has only just been released, so that may perhaps be a little way off yet. I imagine it will take some time to get 14.04 shaken down and then to port all the necessary bits to it, set up the repos etc. James
  15. Well, having read more on the website it's not entirely clear to me exactly what it's based on, but it does use the same 3.8-series kernels that I have been using for development on Mint 15, and reading between the lines I infer that it's based on Precise, which would make it 12.04 LTS. There are some oddities that have cropped up recently. For instance, I have built oacapture against libtiff5, which is part of Mint 15 but appears not to be present (nor installable as a package) in Ubuntu 12.04, so those sorts of things do need ironing out. I'm actually in the process of installing a couple of machines (32-bit and 64-bit) with a number of recent Linux variants, as well as around fifteen different virtual machines, so I should have a fairly good idea of what the next release works on and what it doesn't. I'll add Distro Astro to the list of VMs to install. Eventually I will get it all built as a package (or several, perhaps) like any other, but it all takes time. James
  16. I think the current Distro Astro is based on one of the Ubuntu LTS releases, but I'm not sure which one. I know there are kernels that don't support the SPC900 very well and I've found issues with the Lifecam and Xbox cameras in some kernels (and made patches for them). I'll see what I can find out. James
  17. Thank you, Alan. I'm really pleased you've found the application so good to use and I think you've done well with that image. It's not often you see a lunar image done with an SPC900. It's a pleasure to have such positive feedback. The next release of oacapture is almost ready and in fact would have been out before now but for some last-minute updates to track changes to the ZWO SDK. As well as support for the QHY5L-II (at least the mono camera, which is all I have) there is a new camera control window that gives access to all the controls for the connected camera (including the "private" functions supported by the SPC900). I would attempt to get everything sorted this weekend, but my wife has decided that I will like my office much more if she decorates it this weekend, so I'm busy dismantling everything and packing it up for a few days. LX mode is on the list of enhancements, but not something I've got my head around yet. I know Robin (he of SharpCap fame) said that LX mode was a real pain to get working. For the following release I think I'm probably going to concentrate on support for OSX and perhaps adding a skeleton for support of filter wheels. Motorised filter wheels are a bit more tricky to handle than cameras as they generally seem to appear as USB-connected serial devices and there's no way for the software to tell for certain what is really on the other end of the serial port. Part of my intention with oacapture was to require minimal input from the user from the point of view of saying "device X is my camera, device Y is my filter wheel and device Z is the mount" (for instance), so I need a tidy way for the API to be able to say something like "there's definitely a filter wheel connected here" or "these two devices may be filter wheels, but I can't tell; the user must choose". James
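
     To give an idea of what I mean, the sort of thing I'm imagining the API returning for each device is sketched below. The type and field names are entirely made up for illustration and aren't the real oacapture interface; the important part is the flag saying how certain the library is about what it has found:

        /* Hypothetical device-enumeration result, not the real API. */
        typedef enum {
          OA_DEVICE_CAMERA,
          OA_DEVICE_FILTERWHEEL,
          OA_DEVICE_SERIAL_UNKNOWN    /* could be a wheel, a mount, anything */
        } oaDeviceType;

        typedef enum {
          OA_IDENT_CERTAIN,           /* e.g. recognised by USB VID/PID */
          OA_IDENT_UNCERTAIN          /* generic serial port: the user must choose */
        } oaIdentConfidence;

        typedef struct {
          oaDeviceType       type;
          oaIdentConfidence  confidence;
          char               name[ 64 ];  /* e.g. "/dev/ttyUSB0" or a product name */
        } oaDeviceInfo;

     The UI would then only need to prompt the user about devices flagged as uncertain; anything the library can identify for certain could just be connected automatically.
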
  18. I think I'm pretty much there with the QHY5L-II now. I don't have HDR support enabled yet, but I think that's about it. I may well leave that for another time as I'm really not convinced it's particularly useful for astroimaging. I've also been working on adding a settings panel to show all the camera controls at the same time, and as part of that I've managed to sort out support for the "undocumented and private" controls for the SPC900, so in the next release they should all be available in that panel. Most are still available in the main window as well via the drop-down, but it is now possible to see all the settings at the same time. James
  19. That's a very nice image Experimenting with the exposure times is definitely worthwhile. I'm not sure there can be any hard and fast rules. I am currently (from memory) using 1/500th @ ISO200, but I have to use a mirror lockup delay otherwise I get ghosting in the image because of the vibration caused by the mirror moving. James
  20. That's a very good effort, I'd say. I'm wondering, just looking at the image, if we're seeing quite a bit more of the western limb than usual at the moment. Though it might well just be that I'm seeing quite a bit more of the Moon than has been usual over the last six months. James
  21. Always good to see people getting out there and trying it. Even better when the weather co-operates and it's possible to get a good run in. James
  22. Got the temperature reading sorted last night before bed. I also (re-)found the source for the MT9M034 I2C driver for the Linux kernel as used on the Beagle Board, which turned out to be quite a useful source of information in that I now know what most of the I2C registers are supposed to be. The two clearly aren't quite identical, but there's some interesting stuff to look into as a result. For instance, the QHY driver does binning in software after the image has been downloaded from the camera, whereas the Beagle Board driver makes it look as if that is possible on the camera chip itself. It may also support flipping the image horizontally and vertically in the hardware, which doesn't even get a mention in the QHY driver. James
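
     If the hardware flip really is there I'd expect it to come down to a read-modify-write of a couple of bits in one of the sensor's control registers, something like the sketch below. The register address, bit positions and helper function names are all placeholders until I've checked them properly against the Beagle Board driver:

        #include <stdint.h>

        /* Placeholder values: the real MT9M034 register address and bit
         * positions need to be confirmed from the Beagle Board driver. */
        #define SENSOR_REG_READ_MODE    0x0000
        #define FLIP_HORIZ_BIT          ( 1 << 14 )
        #define FLIP_VERT_BIT           ( 1 << 15 )

        /* Assumed to exist elsewhere: wrappers around whatever USB commands
         * the camera firmware provides for accessing the sensor's I2C
         * registers. */
        extern uint16_t sensorRegisterRead ( uint16_t reg );
        extern void     sensorRegisterWrite ( uint16_t reg, uint16_t value );

        void
        setImageFlip ( int horizontal, int vertical )
        {
          uint16_t v = sensorRegisterRead ( SENSOR_REG_READ_MODE );

          if ( horizontal ) { v |= FLIP_HORIZ_BIT; } else { v &= ~FLIP_HORIZ_BIT; }
          if ( vertical )   { v |= FLIP_VERT_BIT;  } else { v &= ~FLIP_VERT_BIT;  }
          sensorRegisterWrite ( SENSOR_REG_READ_MODE, v );
        }
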
  23. Thank you, Jake. I don't know what the Windows SDK is like, but I have the open source one for Linux and it's, umm, well, a bit odd, let's say. I knew I'd have to re-write the code anyhow as I want it in C and to match my existing API so I've really only used it as reference documentation. There is some stuff that I just don't get; for example it has some sort of HDR mode that is unpacked into an image in a most unusual manner, but I think I can leave that for the time being. James
  24. Couldn't swim at the same time as the kids tonight because on Fridays the club takes the entire pool, so instead I took the laptop and QHY5LII, sat in the sports centre cafe and ironed out most of the remaining issues with the camera. I need to sort out reading the temperature and do some tidying up, but I think the hard work for supporting this camera is all done now. James