Everything posted by JamesF

  1. I will have to come back and read through this properly when I have the time it deserves. Nice to see you back, Qualia. James
  2. I have a similar 2500 litre (about 660 US gallons, I think) tank that I'm turning into an anaerobic digester. The walls really aren't very thick at all. Only a few millimetres. I suspect that once you start destroying the integrity of the form it will lose much of its strength. I guess the six million dollar question is whether you can restore that more cheaply than building the observatory another way. James
  3. It's an intriguing idea. I think you may find, however, that once you cut the top off the tank it loses much of its inherent strength. For example, in one piece the sides will resist deforming sideways relatively well. Take the top off, however, and they'll probably wobble all over the place. You may well need to take measures to reinforce the new lip of the "wall" so it will carry the weight of the top and the hardware required to allow the dome to rotate. Cutting a door in the side may weaken it further. I don't think that means the idea won't work, just that it may need a little more consideration. James
  4. For the time being I think I'm going to ignore it, Nick. It looks as though there's a problem with FireWire cameras that I want to get sorted and I'm in the middle of improving the recording profile management. Once that's done I'll get another release out, especially as I need to do one to sort out support for the ASI USB3 cameras. After that I want to work on filter wheel support and whatever else is necessary to allow me to move to using it for RGB imaging of Jupiter on its return in late Autumn/early Winter. I also want to think about support for the Lodestar and the modifications required to create an all-sky cam/meteorcam version. Even without oaCapture I seem to have 101 other projects on the go at the moment, so lots of new stuff is just having to go to the end of the queue. James
  5. Something to aspire to. Using the Mak with an ASI120 you can probably go to 400x400 on Jupiter if your tracking is good. That should be plenty big enough. The other planets will all be smaller. James
  6. Ah, right. Planetary may well be ok anyhow, as you can probably come down to at most 640x480 if your tracking is good (unless you're using a C14, perhaps). Lunar is going to be more troublesome though, unless you're just targeting specific features. James
  7. That does make it sound like the disk is likely to be the problem. Did you have a particular target in mind for capture, or was this just for experimentation for the time being? James
  8. I have a couple of Aspire One netbooks and have managed to capture useful data for an image of Jupiter with one of them. Either the hard disk or its interface does seem to be quite sluggish on these small netbooks though. What camera are you using for capture and at what resolution? James
  9. It's been a quiet few weeks what with the children being off school and us going abroad for a holiday, but now that I'm home again and have got through all the stuff that inevitably seems to build up whilst we're away, I've settled down to some more work on oaCapture. I've added the small amount of support required for the new USB3 ASI120 cameras, which was pretty trivial to be honest. Looks like it could be an interesting camera, especially for solar/lunar imaging. Nice high frame rates at full resolution. Then rather than working on my own code I've been twiddling someone else's. Quite a few other people's, in fact. It turns out there are three of us who had taken an open source UVC library and extended it (myself to support the TIS cameras), so I've been merging all of those changes together to fold back into the original author's code. I did have an ulterior motive, however. Now that I'm all done bar one required bugfix for an issue that was already in the UVC library, and have written some code to convert YUYV and similar video format frames to RGB (there's a rough sketch of that sort of conversion at the end of this list), I've added support for the MS Lifecam. That's now working on Linux (though it probably already would have done via the V4L2 interface), but should also now work on OSX. As soon as I have the code cleaned up a little I'll give it a go. It may also be that I can support the Xbox camera on OSX, once I can work out exactly what format it is using to transfer the image. OSX support for the Lifecam and Xbox cameras doesn't of itself sound that exciting, but it will mean that there is a cheap native astro-imaging option for OSX. And let's face it, by the time you've sold your firstborn and bought a new MacBook there's not much cash left for anything else. The support for video format frames also takes me one step closer to one day perhaps supporting the SPC900 on OSX. I've also made a few UI changes that allow the various control panels to be docked either at the top or the right of the display, or even dragged off as separate windows altogether. Hopefully that should help a bit on small displays. It's also possible to hide all the controls so only the captured image is displayed, giving the largest amount of screen available to help with focusing on low resolution displays. James
  10. Well, I pretty much finished support for my Firefly MV this evening, dropped the sources on my MacBook and they built and ran first time without errors, so I'm quite happy with that. Pending some regression testing and a couple of small bugs to look at, I think it might be time to start looking at another release. James
  11. Further to my posting a couple of days ago, here's a screenshot of my IIDC Firefly MV capturing images. Rather pleased with that, I have to admit, especially as it's unencumbered with any of the Point Grey software and its associated licensing conditions. There's still some work to do and I've made a few unsafe assumptions about how the cameras work that are probably ok for this particular application at the moment but will need sorting out in the future. And it's not tested on OSX yet, though I see no particular reason that it shouldn't also work there. Shame I don't have a FireWire astro cam to plug in and test with it, but until recently I didn't even have access to a machine that had a FireWire interface any more. James
  12. That's excellent news. I'll look into the matrix format issue. It may be that I forgot to save it or something daft like that. James
  13. I've just checked the sources for the IEEE1394 library. There's a list of USB vendor and product IDs for cameras known to support IIDC over USB, which at the moment is restricted to the Firefly MV (colour and mono) and the Chameleon (again colour and mono). I'd guess dropping the appropriate VID/PID in for the Atik GP might work, but that might be something someone else has to try, though it wouldn't be hard for me to add the data to the code. I'd just have to use a bespoke build of the library. As I already do that for ffmpeg, libusb and libuvc, that wouldn't be a big deal. James
  14. Well, for this release I had no plans to support any Atik cameras, so anything's a bonus, really. Perhaps the IEEE1394 library will "just find it" if someone tries. I haven't poked about in the source so at the moment I have no idea how it finds my Firefly MV, but it certainly does. If I can get that working then I think I'll consider anything else a bonus. James
  15. That's rather handy. I didn't realise there were Atik cameras using the same standard. James
  16. I discovered last night that whilst Point Grey provide their own SDK with all sorts of "proprietary, do not distribute" FUD, their cameras largely seem to implement some sort of machine vision camera standard called either IIDC or DCAM that is related to IEEE1394. That's rather handy because there's an open source camera library that implements the standard and supports not only (the more usual, I think) FireWire-connected cameras, but also some of the USB ones. I compiled one of the demos from the source distribution and it grabbed frames from my USB Firefly MV quite happily, so it looks as though for the cost of writing a small amount of code to integrate the library with my API I might end up with support for all of the FireWire and USB-connected Point Grey cameras on Linux and OSX (there's a rough sketch of what using the library looks like at the end of this list). Need to stay focused though. Must get the UVC stuff for the TIS cameras sorted first. James
  17. There won't be too many visible changes in the next release as my main goal is to get something working on OSX and it'll mainly just be bugfixes otherwise, but as part of the OSX port I've now massaged the sources into a state where I have exactly the same code building working binaries on 32-bit and 64-bit Intel Linux, Raspberry Pi and OSX. Basically, from the top of the source tree it is possible to do "$ ./configure && make" and it does the rest. I think the main task remaining is to work out a tidy way of loading the firmware into the QHY cameras on OSX (I can do that manually myself, but apparently the average Apple user can't cope if it doesn't have a point'n'drool user interface). What I'll probably do is just have a bit of code at the start that matches the USB VID and PID values before the firmware is loaded and runs the relevant commands to do the work (there's a rough sketch of that sort of check at the end of this list). I also need to sort out wrapping everything up into an installable package for OSX. I'm probably going to leave one ugly OSX issue present for the time being because I'm not sure how to sort it out yet, which is that the buttons in the UI get rendered at what appears to be twice the desired size on Retina displays when they contain an icon. If I plug in my desktop display then everything looks fine, but there's clearly some sort of scaling issue relating to high pixel density displays somewhere that I haven't been able to run to ground yet. James
  18. (I should point out that I've never actually tried this myself. I am rather assuming it will work out the way I propose.) James
  19. My thinking regarding the C-mount is that it will help get the spacing correct. An EOS-to-C mount adaptor should give the correct spacing to bring the EOS lens to focus 17.5mm (I think) behind the C-mount flange. Adding a 5mm C-to-CS spacer then puts the ASI camera's sensor, which sits 12.5mm behind its CS-mount flange, at that distance, so the EOS lens comes to focus on the sensor. With the EOS-to-T2 converter you'd still need to make sure of the spacing to get the lens mounting flange 44mm (?) in front of the sensor (and woe betide you if it's already more than 44mm). James
  20. My camera has a threaded piece that screws into the T2 thread, with a C/CS thread on the inside. The CCTV lens that came with the camera fitted into that. It may not be the same any more. Mine was one of the early ones. James
  21. My ASI120MC has a CS-mount fitting as well, so one of these with a 5mm C to CS fitting might work if you can find one that will ship to the UK: http://www.ebay.com/itm/Canon-EOS-EF-EFs-lens-to-C-Mount-Film-Movie-Bolex-Video-Camera-CCTV-Adapter-Ring-/321292437097 Unless they don't provide the CS mount any more? James
  22. There is a new release: http://stargazerslounge.com/topic/215690-oacapture-005-beta-for-linux-is-released/ James
  23. I think what Mark is doing is comparing the achieved sensitivity (without the lenses and CFA) with the extrapolated optimal sensitivity based on the readings with the CFA and lenses. For example, if the RGB values for a given photosite total 31355, then ideally with no CFA the same photosite should record 31355. Because the microlenses are no longer present, the measured value is considerably lower. Gina, on the other hand, is comparing the actual signal values measured for (a group of four, in this instance) photosites before and after removal of the microlenses and CFA. In this case the readings are better "after" compared with "before". But they're not the same thing, so they can't be compared directly. Both are probably valid ways to look at the data. The difference is probably that with the microlenses and CFA present the resolution is lower even if more photons contribute to the readings for each photosite. (Obviously there's some approximation going on with the CFA example too, because at a red photosite the G and B values never actually happened. They're interpolated from the values recorded at the surrounding pixels.) James
  24. I'll try oaCapture out on distroastro for the upcoming release, too. I've not tried it on Mint releases prior to 15 until now. I would not be surprised if there were niggles with the SPC900 support because it did stop working somewhere between kernel versions 2.6.x and 3.2-ish, but one of my future jobs is to add a user-space driver for the camera as an alternative if the kernel-space one doesn't work. James
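
For the YUYV-to-RGB conversion mentioned in the oaCapture progress post above, a conversion of that general shape might look like the sketch below. This is not the oaCapture code itself: the function name and the integer BT.601-style coefficients are assumptions about how such a conversion is commonly written, and it assumes an even number of pixels per frame.

    #include <stdint.h>

    /* Clamp an integer into the 0..255 range of an 8-bit channel. */
    static uint8_t clamp8 ( int v ) {
      return ( uint8_t )( v < 0 ? 0 : ( v > 255 ? 255 : v ));
    }

    /*
     * Hypothetical converter: every four bytes of YUYV (Y0 U Y1 V) describe
     * two pixels sharing the same U and V values.  Output is packed RGB24.
     * The fixed-point factors approximate the usual BT.601 full-range
     * coefficients (1.402, 0.344, 0.714, 1.772) scaled by 256.
     */
    void yuyvToRGB24 ( const uint8_t* src, uint8_t* dst, int numPixels ) {
      for ( int i = 0; i < numPixels; i += 2 ) {
        int u = src[1] - 128;
        int v = src[3] - 128;
        int luma[2] = { src[0], src[2] };
        for ( int p = 0; p < 2; p++ ) {
          int y = luma[p];
          *dst++ = clamp8 ( y + (( 359 * v ) >> 8 ));           /* R */
          *dst++ = clamp8 ( y - (( 88 * u + 183 * v ) >> 8 ));  /* G */
          *dst++ = clamp8 ( y + (( 454 * u ) >> 8 ));           /* B */
        }
        src += 4;
      }
    }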
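
On the IIDC/DCAM posts above: the open source library being described sounds like libdc1394, and a minimal frame grab with it looks roughly like this. It's only an outline, it assumes the camera offers a 640x480 8-bit mono mode, and almost all error handling is omitted; none of it is taken from oaCapture.

    #include <stdio.h>
    #include <dc1394/dc1394.h>

    int main ( void ) {
      dc1394_t* ctx = dc1394_new();
      dc1394camera_list_t* list;

      dc1394_camera_enumerate ( ctx, &list );
      if ( !list || list->num == 0 ) {
        fprintf ( stderr, "no IIDC cameras found\n" );
        return 1;
      }

      /* Open the first camera found (FireWire or IIDC-over-USB). */
      dc1394camera_t* cam = dc1394_camera_new ( ctx, list->ids[0].guid );
      dc1394_camera_free_list ( list );

      /* Assumes the camera offers this mode; a real application would
         query the supported video modes and framerates first. */
      dc1394_video_set_mode ( cam, DC1394_VIDEO_MODE_640x480_MONO8 );
      dc1394_video_set_framerate ( cam, DC1394_FRAMERATE_30 );
      dc1394_capture_setup ( cam, 4, DC1394_CAPTURE_FLAGS_DEFAULT );
      dc1394_video_set_transmission ( cam, DC1394_ON );

      /* Grab a single frame from the DMA ring buffer and hand it back. */
      dc1394video_frame_t* frame;
      dc1394_capture_dequeue ( cam, DC1394_CAPTURE_POLICY_WAIT, &frame );
      printf ( "captured a %ux%u frame\n", frame->size[0], frame->size[1] );
      dc1394_capture_enqueue ( cam, frame );

      dc1394_video_set_transmission ( cam, DC1394_OFF );
      dc1394_capture_stop ( cam );
      dc1394_camera_free ( cam );
      dc1394_free ( ctx );
      return 0;
    }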
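
And for the QHY firmware idea in the OSX port post above, matching USB VID and PID values with libusb-1.0 before deciding to load firmware might look something like the following. The product ID and the loadQHYFirmware() helper are placeholders for illustration only, not the real values or the real oaCapture loader.

    #include <stdio.h>
    #include <libusb-1.0/libusb.h>

    /* Placeholder: a real loader would send the vendor-specific requests
       needed to upload the firmware image to the device. */
    static void loadQHYFirmware ( libusb_device* dev ) {
      ( void ) dev;
      printf ( "would load firmware here\n" );
    }

    int main ( void ) {
      libusb_context* ctx;
      libusb_device** devices;

      libusb_init ( &ctx );
      ssize_t numDevices = libusb_get_device_list ( ctx, &devices );

      for ( ssize_t i = 0; i < numDevices; i++ ) {
        struct libusb_device_descriptor desc;
        libusb_get_device_descriptor ( devices[i], &desc );
        /* 0x1618 is (I believe) the QHY vendor ID; the product ID here
           is just a placeholder for whichever camera needs firmware. */
        if ( desc.idVendor == 0x1618 && desc.idProduct == 0x0921 ) {
          loadQHYFirmware ( devices[i] );
        }
      }

      libusb_free_device_list ( devices, 1 );
      libusb_exit ( ctx );
      return 0;
    }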