Everything posted by ollypenrice

  1. The old combination of Tak FSQ106 and Kodak 11 meg chip was a consistent collector of NASA APODs for years, at 3.5 arcsecs per pixel. I'm not into competitions, but here are some examples at that image scale:
     Rosette, Ha OIII LRGB 25 hours. ( Olly Penrice ) - AstroBin
     NGC7000 widefield with Paul Kummer ( Olly Penrice ) - AstroBin
     Breaking Wave around Gamma Cass from Les Granges. ( Olly Penrice ) - AstroBin
     Sh2 126 Two panel. ( Olly Penrice ) - AstroBin
     Don't fall foul of obsessing over the numbers. It's all about taking pictures. Olly
  2. I had this idea while working in a non-astronomical domain, but it concerns working with small, fiddly components which are easily lost, flicked onto the floor, etc. As well as putting them down in a vertical-sided tray, you can put a strip of double-sided tape on the floor of the tray and put items down on that to keep them secure. If you make it a long strip of tape you can also put the components down in the order you removed them so that, when re-assembling them, you have them in the right order. If the tray gets a nudge, they don't stir themselves up to confuse you! This may be a known trick but I hadn't come across it and it's working wonders for me. Olly
  3. But don't forget that the half-chewed bacon sandwich is incredibly rare. How many people eat half a thing so sublime and throw the rest away? Finishing your own and murdering your neighbour to steal his is a far more likely scenario... Olly
  4. I don't know the book but I process mainly in Photoshop. Am I a real imager? That's for others to decide, but others have given me a runner-up place in the Astrophotographer of the Year competition, 45 citations on Astrobin and dozens of magazine publications. I don't say this to blow my own trumpet but to defend Photoshop as post-processing software. There is a self-righteous contingent in imaging who assert that their graphics program follows the true, holy and pure path to the divine image. I think that, from my tone, you will understand that I find this claim to be utterly spurious. I don't like Adobe's business model but Photoshop has not become the world's premier image processing software for no reason. Nor do I process only in Photoshop. I take vital steps in Pixinsight, Registar, and APP but, if the sub-text in the 'real imager' bit is to say that Photoshop is not for real imagers, then I reject it robustly. Olly
  5. I've done this object twice with guests. It isn't my favourite but, hey, it's out there and we need to see it! Olly
  6. The proof of the pudding is in the eating... This is a good pudding! Olly
  7. Agreed. I host a robotic shed and have replaced lots of astro-specific mini computers, but I've yet to replace a bog-standard desktop next to the mount: the rigs with a desktop near the mount just cruise on and on. I'd choose a desktop with enough USB ports to avoid hubs any day. Olly
  8. When did cost ever deter astrophotographers??? If it did, there wouldn't be any... This is, of course, a very cute idea. The theory is compelling. If the focus could be adjusted at each individual wavelength (as it would be) the objective would not need to be so well corrected either. I wonder what the professionals make of it. They must have thought about it. Olly
  9. A RASA 14 has a focal length of 790mm. With a 2600 CMOS chip this would give an image scale of 0.98"PP (the arithmetic is worked through in the first sketch after this list). The optical resolution of a 14 inch is very unlikely to be the limiting factor, and the seeing is very unlikely to allow 0.98". This combination would take a lot of beating. Given the signal its aperture-per-pixel ratio would deliver, you could process fine detail very aggressively and pull out the faint stuff. Even the more affordable RASA 11 would give 1.25"PP with optical resolution to spare. I'd love the opportunity to live with that! Olly
  10. They both collect light from the globular and the galaxy. The long FL scope only brings the galaxy's light onto the chip whereas the short FL scope brings the galaxy's and the globular's onto the chip. And, yes, the shorter FL brings more photons onto the chip for this reason, but the number from the galaxy is identical for both. This is true in the case of parts of the image which fall comfortably above the noise floor and contain details worth distinguishing. However, if the photons in question are coming from a very faint part of the target, one which is struggling to get above the noise floor and contains little resolvable detail, they are not wasted. This is the difference between sufficient signal and sufficient resolution. No photon from a very faint tidal tail is ever going to be wasted. But I agree that sufficient resolution to reach the limit of the seeing is very much to be desired. Olly
  11. No, it does not collect fewer photons from the galaxy. The galaxy photons do not know the focal length of the objective into which they pass. The same number of photons from the galaxy enter any two scopes of equal aperture and all are focused onto the chip if the FL allows it. Conversely, however, it is the long FL instrument which 'wastes' the photons from around the galaxy. The incoming sky photons are identical in both cases. Olly Edit. Maybe the key thing to remember is that all parts of the objective contribute to all parts of the image. Those high school ray diagrams with two parallel incoming beams, one top and one bottom, are not wrong but they are so incomplete as to be potentially misleading.
  12. I think this may be a misleading way to think about it. If we think in terms of 'object photons' (photons from the object) then the number captured depends only on aperture, provided the object fits on the chip. When you have a FL which lets the object cover the chip you are not exploiting more object photons than would be captured by a shorter FL of the same aperture. What you are doing is putting them onto more pixels for an improved spatial resolution at lower intensity of signal (see the second sketch after this list). The shorter FL doesn't waste any object photons - they are all used - provided you don't saturate the bright parts. Only if you saturate the chip can you 'waste' photons because any more cannot be counted. Olly
  13. Indeed, more variables. I was thinking of final resolution, I guess. If I were to choose a galaxy imaging scope today it would be a RASA 14, in truth. Olly
  14. The problem with this assertion is that it rather glosses over so many variables. If you have the cash you can go out and buy an absolutely stunning refractor which needs nothing doing to it at all. In my case that's my TEC 140. It is plug and play. You cannot go out and buy a plug and play reflector, be it RC or Newt. The one that you do buy may have fatal flaws which need correcting before it can be used to a high standard. I agree with you that, once the multitude of sins contained in those two words 'Properly handled' are under control, the reflector should win. As for winning 'handily,' I'm not sure. Can you link to an image from a 10 inch reflector which beats the TEC image handily? It can certainly be beaten, but 'handily?' M51 TEC140 data only ( Olly Penrice ) - AstroBin Olly
  15. Although it's counter intuitive, dark sites are not dark to the adapted eye. I can walk about quite easily at our site, which reaches SQM22. That's when the sky is clear. When there's cloud and no moon, that is a dark site and I can't see to move around in it at all. You may still have a point, though. I combat sea-sickness by making sure I can see the horizon and concentrate on it. My money, though, would be on the OP simply having an EP which disagrees with his vision. Olly
  16. I think this will be a case of 'try it and see.' Being 1mm out might not matter since Tak claim a 45mm image circle for your system and your chip diagonal is only 28mm. This is a very good margin. It is on chips exploiting the outer part of the image circle that chip spacing becomes ultra-critical. While you are right to prefer an all-threaded set of connections between scope and camera, push-fit can work fine. I've had several push-fit systems giving perfectly good results. The original Sky 90 was notorious for being prone to mis-collimation. (This affects other Tak scopes as well, particularly the FSQ range.) I don't know if the later version has been improved. If you do get distorted stars, therefore, don't assume automatically that it is chip tilt. Rotate the camera through 90 degrees and take another test shot. If the distortion remains in the same orientation on the sky, it will be collimation. If the distortion follows the camera angle, it will be tilt. Olly
  17. I agree with this, Rodd. I have excellent transparency but I don't think my seeing is generally exceptional. My findings with the refractor are specific to my own site. Olly
  18. I wrote that Astronomy Now article several years ago and have never been tempted to change my mind. In fact the refractor was 5.5 inches, the TEC140. What would tempt me, though, would be the RASA 11 or 14 with a small pixel CMOS camera. The depth of signal from fast optics means, in reality, being able to process harder. Olly
  19. CCD cameras have plunged in value on the second hand market and remain no less good than they were before the CMOS arrival. Olly
  20. Good one. I like the colour, since this is often presented with too much blue. Good propeller, too. Olly
  21. No, with a small pixel camera. You can get a decent pixel scale from 135mm if your pixels are small enough. Our chip is the 2600. Olly
  22. There may be no point in going for Barlows. When it comes to 'getting up close' to a target, the significant number is 'arcseconds per pixel.' This is, even for millionaires, limited by the seeing. Living in SW England you are very unlikely to beat about 1.5 arcsecs per pixel whatever setup you use. Exceptionally you might, but even a sensible optimist will go no better than 1.2 or so. You can go below that, sure, and get a bigger image of your target, but it will contain no more detail than a smaller image; you could get the same result by resampling your smaller image upwards. The term 'empty resolution' describes larger images containing no new information (see the last sketch after this list). Olly
  23. I can never see M33 though I find M31 dead easy. Olly
  24. I did post this 2 panel done with Paul Kummer and Peter Woods on the DSI board but here it is for Samyang lovers. Wide open at F2, the Blue Horsehead region. This lens, the small pixel CMOS camera and Russ Croman's software come together with a sweetness rarely experienced this side of heaven! Olly
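
A worked check of the image scales quoted in posts 1 and 9, using the standard formula: image scale (arcsec per pixel) = 206.265 × pixel size (µm) / focal length (mm). The numbers plugged in below are assumptions where the posts don't state them: 530 mm and 9 µm for the FSQ106 with the Kodak 11 meg chip, 620 mm for the RASA 11, and the 2600's 3.76 µm pixels; only the RASA 14's 790 mm is given above. A minimal Python sketch:

```python
# Image scale in arcseconds per pixel from pixel size (um) and focal length (mm).
def image_scale(pixel_um, focal_mm):
    return 206.265 * pixel_um / focal_mm

print(round(image_scale(9.0, 530), 2))   # ~3.5  "/px: FSQ106 + Kodak 11 meg chip (post 1)
print(round(image_scale(3.76, 790), 2))  # ~0.98 "/px: RASA 14 + 2600 (post 9)
print(round(image_scale(3.76, 620), 2))  # ~1.25 "/px: RASA 11 + 2600 (post 9)
```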
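
Posts 10-12 argue that, for a given aperture, focal length does not change how many photons are collected from the galaxy, only how many pixels they land on. A minimal sketch of that arithmetic, with an invented photon count, galaxy size and pair of focal lengths purely for illustration:

```python
import math

# Two hypothetical scopes of equal aperture (so they collect the same total
# number of galaxy photons per exposure) but different focal lengths, both
# imaging onto 3.76 um pixels. All numbers are illustrative, not measured.
pixel_um = 3.76
galaxy_photons = 1_000_000        # assumed total photons from the galaxy per exposure
galaxy_diameter_arcsec = 600.0    # assumed apparent diameter of the galaxy

for focal_mm in (500.0, 1000.0):
    scale = 206.265 * pixel_um / focal_mm             # image scale, arcsec/pixel
    pixels_across = galaxy_diameter_arcsec / scale    # galaxy diameter in pixels
    n_pixels = math.pi * (pixels_across / 2) ** 2     # rough pixel count under the galaxy
    print(f"FL {focal_mm:.0f} mm: {scale:.2f} arcsec/px, "
          f"{galaxy_photons} photons spread over ~{n_pixels:.0f} px "
          f"-> {galaxy_photons / n_pixels:.1f} photons/px")
```

Doubling the focal length at fixed aperture quarters the photons per pixel, but the total collected from the galaxy is unchanged; that is the resolution-versus-signal trade described above.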
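
Post 22's 'empty resolution' point can be turned around: for a given pixel size there is a focal length beyond which you are sampling finer than the seeing allows. Assuming the 3.76 µm pixels of the 2600-class cameras mentioned in these posts, and the 1.5 and 1.2 arcsec-per-pixel limits quoted in the post:

```python
# Focal length needed to reach a target image scale with a given pixel size.
def focal_length_for_scale(pixel_um, arcsec_per_px):
    return 206.265 * pixel_um / arcsec_per_px

for scale in (1.5, 1.2):
    fl = focal_length_for_scale(3.76, scale)
    print(f"{scale} arcsec/px with 3.76 um pixels needs ~{fl:.0f} mm of focal length")
# ~517 mm for 1.5 arcsec/px, ~646 mm for 1.2 arcsec/px; Barlowing beyond this
# enlarges the image but adds no detail the seeing hasn't already blurred away.
```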