Everything posted by JamesF

  1. As has been posted above as I was writing, I'm in agreement. It doesn't seem to make sense that the backfocus should reduce. James
  2. That would give you a larger image and change the point of focus, certainly. James
  3. In a colour camera each pixel corresponds to a particular colour in the image: red, green or blue. For pixels in red positions, for example, there is no green or blue data. It's derived from adjacent pixels of those colours and as such may not be entirely correct. Effectively the resolution is decreased slightly by doing this. Also, in order to make the pixels respond to the colours required you start with a mono sensor and add a filter for the required colour to the front of each pixel. That filter causes some loss of light and reduces the sensitivity of the sensor.

     With a mono camera you use a filter in the optical train for each of the three colours, so every pixel records data for each of the colours and you get no loss of resolution. Also, the external filters tend to allow more light to reach the camera sensor, so overall the sensor is more sensitive.

     Of course the (arguably) negative side of using a mono camera is that you have to take three sets of data, one for each of red, green and blue, and then combine them afterwards to get a colour image rather than getting all your data in one hit. And in the case of Jupiter you'll be trying to do all that data collection in the two-minute period before any distortion in the image due to the rotation of the planet becomes apparent. And you need to buy the filter wheel and a decent set of filters, which could easily cost more than the camera. But... the result of using a mono camera should be a much better image than is possible from a colour camera of the same sensor size and resolution. James
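To make the "derived from adjacent pixels" point in post 3 concrete, here is a minimal Python sketch (purely illustrative, not any particular camera's debayering algorithm, and the mosaic values are made up): it estimates the missing green value at a red pixel of an RGGB Bayer mosaic by averaging the four green neighbours.

```python
raw = [  # tiny RGGB Bayer mosaic, one 8-bit sample per pixel
    [120, 60, 118, 62],   # R G R G
    [ 58, 30,  59, 31],   # G B G B
    [119, 61, 121, 63],   # R G R G
    [ 57, 29,  58, 32],   # G B G B
]

def green_at_red(raw, y, x):
    """Estimate green at a red site (even row, even column in RGGB)
    by averaging the four adjacent green pixels."""
    return (raw[y - 1][x] + raw[y + 1][x] + raw[y][x - 1] + raw[y][x + 1]) / 4

print(green_at_red(raw, 2, 2))  # 60.25 -- an estimate, not a measured value
```

The interpolated value is plausible but was never actually measured at that pixel, which is why the effective resolution of a colour sensor is slightly lower than the pixel count suggests.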
  4. The 120MC takes a bit of time to get the hang of, but it's capable of good results. This is from the same camera with my 127 Mak: James
  5. Consider: the average user of the video mode of a consumer grade compact or DSLR camera is interested in the overall result. They won't be too fussed about noise or a bit of lost data unless it affects the video as a whole. They're certainly very unlikely to go pixel-peeping individual frames of a video. Manufacturers know this and take advantage of it to allow them to increase the resolution or produce frame rates that would otherwise be impossible with the same kit.

     When you want to use a camera for planetary imaging, however, every pixel of every frame counts. You're after the most accurate record of the light that hit the camera sensor. That's somewhat at odds with the situation above. If two adjacent pixels are different colours, even only very slightly different, you want to know. You don't want some camera firmware deciding that they're close enough that in a 25fps video no-one will notice if they're made the same colour to reduce the required storage space for the frame. James
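The "close enough that no-one will notice" behaviour described in post 5 can be illustrated with a toy quantisation step in Python. This is a deliberately crude stand-in for what real video codecs do (which is far more sophisticated), but it shows how two subtly different pixel values can be merged to save space:

```python
def quantize(value, step=8):
    # Snap an 8-bit pixel value to the nearest multiple of `step`.
    # Coarser steps mean smaller encoded data, but merged detail.
    return step * round(value / step)

a, b = 129, 131                    # two subtly different pixel values
print(quantize(a), quantize(b))    # both become 128: the difference is gone
```

Fine for casual video, but exactly the kind of information loss you don't want when every frame is destined for a stacking program.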
  6. "Proper" CCD (or other planetary imaging) cameras give you the data read out of the camera sensor unit. They don't mess about with it to reduce the data size or anything like that. As I said a few posts back, if you had the raw data for a 1920x1080 frame at 30fps for two minutes you'd be looking at 7GB-ish of data. That's exactly what a dedicated camera would give you in that situation. James
  7. I'd turn image stabilisation off. Not sure about focal length, I have to admit. I'm not at all sure how that will work out. It may work in your favour if you use something rather shorter than you did for the image you've already posted. James
  8. Could be. Does the manual not explain the difference between the two settings? James
  9. If you look at things the other way around, a raw (i.e. still in Bayer format as it came off the sensor) colour video at 30fps and 1920x1080 resolution of two minutes duration should be about 7GB, even if it's only 8-bit colour. I agree with Chris about the onion rings. I think it's an indicator of lack of dynamic range in the data. You probably need more exposure time per frame, though it's entirely possible the camera will not do that for you. They're not really designed with planetary imaging in mind... James
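The 7GB figure quoted in posts 6 and 9 is easy to check: 8-bit raw Bayer data is one byte per pixel, multiplied out over the frame size, frame rate and duration.

```python
width, height = 1920, 1080    # frame size in pixels
bytes_per_pixel = 1           # 8-bit raw Bayer data: one byte per pixel
fps, seconds = 30, 120        # 30fps for two minutes

total_bytes = width * height * bytes_per_pixel * fps * seconds
print(total_bytes)            # 7464960000
print(total_bytes / 2**30)    # about 6.95 GiB, i.e. "7GB-ish"
```

Anything substantially smaller than that coming out of the camera for the same frame size and duration means compression has happened somewhere along the way.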
  10. H.264 is MPEG-4 (Part 10, AVC) I believe. It's almost always lossy compression too. Unless you can find out exactly how the camera firmware works and decide what format you want from that, I think you might just have to accept that you're never going to get stunning results this way and work on the things you can change, such as accuracy of focusing. James
  11. AVCHD is, as far as I'm aware, layered on top of MPEG. If you take a raw video frame, convert it to MPEG and then convert it back you won't get what you started with, so data has been lost.

      JPEG is poor in two ways. First, it's 8-bit only, so if you have more than eight bits of data they'll get thrown away. Second, it's lossy compression. In my experience MJPEG is generally used where there's insufficient bandwidth for the full frame even with lossless compression, so the raw frame to JPEG conversion will be lossy too. In fact, if you look up JPEG on Wikipedia the first sentence says "JPEG ... is a commonly used method of lossy compression".

      Thinking about the way that JPEG actually works at an algorithmic level and how stacking works, it seems highly unlikely to me that JPEG images are ever going to stack as well as raw frames. If you think of stacking as a process for reducing the random element of noise then I think it might not be unreasonable to view JPEG conversion as actually modifying the noise in a manner that makes it harder to reliably remove the random component. James
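The "stacking reduces the random element of noise" view in post 11 is easy to demonstrate with synthetic data (made-up numbers, purely illustrative): averaging N frames shrinks the random error by roughly the square root of N, which is exactly the behaviour that lossy compression can disturb by making the per-frame errors less random.

```python
import random
import statistics

random.seed(1)                  # deterministic, just for the example
true_value = 100.0              # the "real" brightness of each pixel
n_frames, n_pixels = 500, 64

# Simulate noisy frames: true value plus Gaussian read noise (sigma = 10).
frames = [[true_value + random.gauss(0, 10) for _ in range(n_pixels)]
          for _ in range(n_frames)]

# Stack by averaging each pixel position across all frames.
stacked = [sum(f[i] for f in frames) / n_frames for i in range(n_pixels)]

single_err = statistics.pstdev(v - true_value for v in frames[0])
stacked_err = statistics.pstdev(v - true_value for v in stacked)
print(single_err, stacked_err)  # stacked error roughly sqrt(500)x smaller
```

This only works because the noise in each frame is independent and random; systematic artefacts introduced identically into every frame by a codec won't average away.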
  12. MJPEG is definitely not good for planetary imaging. JPEG is (sometimes very) lossy, so you'll struggle to get a decent image out of it. James
  13. I think that using AVCHD is probably not a good idea. Under the hood it's probably using lossy compression on the image data which is unhelpful. If you can get a lossless AVI format from the camera that would be great, but it's entirely possible the camera just won't do that in which case I'm afraid you're going to struggle to get decent images out of it. I do also wonder if the camera just isn't sensitive enough for planetary imaging. James
  14. If you think it would be useful though, there's a build-dmg.sh script that comes with the sources for oacCapture. By all means take that, rip out what might be useful to you and use it however you wish. It's often much easier when you can see how someone else has done it to start with. James
  15. If you want to use the OSX installer then you have to create a .pkg file, which is possible, but as far as I recall requires signing keys and registration with Apple etc. What I (and many others) do is to package everything as a .dmg disk image which means it's just a "drag and drop" job to install where you want it. However, that does mean that everything needs to be completely relocatable (because it can be installed anywhere) and it's "the Apple way" that all non system software dependencies are bundled with the package too. That includes any external libraries that aren't part of the core OS installation and suchlike (which also may need paths changing internally). That really seems to be what Mac users expect. I get the impression that in the main they're not happy about being asked to run scripts or do anything from the command line. James
  16. Sounds good. Unfortunately at the moment it looks as though I will be unable to help for a while. Unless it's possible to polar align without being able to see any stars whatsoever. For weeks. James
  17. Aha! You star, Robin. I upgraded the firmware, but no change. So I started fiddling with what I could in the non-ASCOM control program. The "Recalibrate" button moved the wheel, so it doesn't look like a hardware fault. But it still wouldn't move on a reset. From the manual it looks like the "movement" button should pop up a dialog box that allows me to check the speed, but, oh dear, it's crashed.

      So I switched back to Linux, connected to the serial port directly and entered "SA" to set the speed to 100%, then "R0" to reset the wheel, and it works! Weird problem. I wonder if there's some sort of noise when the cable is disconnected that causes the speed to be set to zero? I did try to read the current speed in Linux, but before I could do a proper reset the firmware seemed to be giving odd results to commands and I couldn't get it. James
  18. Sending a cold reset (R0) command to the wheel I get:

          Restart
          Xagyl FW5125V2
          FW 2.1.7
          Initializing
          Calibrating
          ERROR 3 - Wheel stalled
          ERROR 1 - Initialize failed to complete

      I'm going to have a go at a firmware update after dinner. James
  19. Well, bad news to report. The new PCB seems to have failed in exactly the same way as the old one. It was working happily last night, but tonight will no longer drive the motor. I've just emailed Xagyl, but at the moment I am not a happy bunny. James
  20. You do have to make sure the wires go back into the right holes, too. Replacing like with like is clearly no bother. It turns out however that whilst the screen-printing on the V2 PCB suggests (with a little box in one corner of the marker for the connections) that the orientation may need to change, it doesn't. James
  21. That's a shame, Michael. They've been quite helpful to me, though they have been a little vague about some details, which may be down to language issues. Hence my digging for a few more details here.

      If it's a very early model then I don't think the latest firmware would work anyhow. I seem to recall that the latest version I could upgrade my V1 wheel to was v2.2.0, whereas the latest firmware release is 3.3.0. The description of the firmware file sounds as I'd expect though. I believe the (downloadable) user manual explains how to do the upgrade. I think there is some weird thing where you have to upgrade to v1.9.9 first if you don't already have v2.0.0 or later, but other than that it was straightforward.

      Should it still not work, Ian King may still have some of the replacement V2 PCBs in stock, which I'd expect to fix it unless the motor has failed. Mine wasn't expensive and just required the two motor wires unsoldering from the original board and soldering back to the new one. It's a bit fiddly because they're very close together, but otherwise ok. If you're not sure about doing it yourself then perhaps we can sort something out by talking to Ian and posting the bits to me. Just don't expect it back next day. Or perhaps there's someone closer who could do the soldering job.

      The V2 upgrade has the bonus of moving all the electronics inside the wheel, so you don't need the specific Xagyl cable -- any standard USB to mini USB cable will do. James
  22. Ah, fair enough. I think there has to be some sort of associated EEPROM for a serial number to be present. If you have a pre-production model then I guess it's entirely possible the EEPROM isn't there. James
  23. Bah. It seems that 0403:6015 is another generic FTDI VID/PID, not specific to Xagyl. Your 525 wheel doesn't have a serial number at all, Robin? James
  24. That's a bit unkind if it happens to be some other device that the user doesn't want you fiddling with. I'm tempted by the argument that if a device should not be messed about with then some other application should have it exclusively locked, but wilfully sending data that might be regarded as invalid to an unknown device goes against the grain a bit. None of which means I won't resort to that option if all else fails. James