Everything posted by ollypenrice

  1. Not sure about this. What's your reasoning? I'm afraid I haven't been able to come to a conclusion either way... Olly
  2. You're asking the million dollar question, of course. As Vlaiv says, how do we identify terrestrial signal? Although I've never done it, one idea comes to mind: image the target at the lowest possible elevation and at the highest and compare the two. Whatever trend you find in the difference between low and high will also be the trend of the light pollution and might be extrapolated further. Just a thought. Olly
  3. Shock horror: I focus in luminance and then scroll RGB,RGB,RGB without refocusing between the filters. (The exception would be in shooting low targets, when I shoot red at the lowest altitudes and blue at the highest, but I can shoot down to the horizon here if I have to.) I don't find anything wrong with the way this works. Absolutely perfect focus per filter would always be nice but, if we think about how LRGB imaging works, we can see that the penalty for slightly imperfect focus in the colours is trivial. We are not chasing resolution in the colour layer. Indeed, most of us make it a low priority in our RGB processing. What we want is strong colour intensity and low noise, both of which are best served by abandoning the pursuit of detailed resolution. I'm using well colour-corrected optics (Tak FSQ106N and TEC140) with Baader filters. I believe most decent filters are parfocal, any shift in focus coming from the telescope's optics. And think of this: a luminance filter passes R, G and B simultaneously, with no way of focusing separately between wavelengths, yet it gives us our most resolved layer all the same. Olly
  4. I didn't know about these budget ultra-wide EPs and take your point. Olly
  5. My understanding is that the focal reducer makes the telescope's baffle tube the source of vignetting/reduction in effective aperture. For this reason you can open up the view to the widest extent possible by using a 2 inch visual back and widefield, long focal length, 2 inch EPs. The only reason for using the reducer visually would be to achieve a wider view at lower cost with less expensive EPs. The more expensive solution is preferable if you don't mind it. If we move on to imaging we enter the dreaded minefield known as the F ratio myth. It's a minefield for several reasons but... 1) A focally reduced scope does not take the same photograph as a native scope. If you're imaging an object which fits nicely on the chip at native, introducing a reducer brings in precisely no new light from that object. It brings in new light only from the region, now fitting on the chip, around that object. Starizona would do well to make this clear in their advertising. The reducer puts the same amount of galaxy light onto fewer pixels, 'filling' them faster but giving a smaller image. The other way to do that is to bin the pixels 2x2 or 3x3 etc. Which brings us to... 2) F ratio, of itself, does not matter. Exposure time is determined by the area of the objective relative to the area of the pixel. The professionals often use huge apertures of fairly slow F ratio and, critically, huge pixels. Olly
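The "area of the objective relative to the area of the pixel" point can be put into numbers. A minimal sketch, using hypothetical scopes and pixel sizes of my own choosing (not figures from the post):

```python
# For an extended object, the photon rate landing on one pixel scales with
# the objective's light-gathering area times the patch of sky the pixel
# covers: rate ∝ (D/2)^2 * (pixel_size / focal_length)^2.

import math

def rate_per_pixel(aperture_mm, focal_mm, pixel_um):
    """Relative photon rate per pixel for an extended source (arbitrary units)."""
    objective_area = math.pi * (aperture_mm / 2) ** 2   # light-gathering area
    pixel_scale = pixel_um / focal_mm                   # sky angle per pixel
    return objective_area * pixel_scale ** 2

# Two hypothetical scopes at the SAME f/5 focal ratio:
small = rate_per_pixel(100, 500, 4)     # 100 mm amateur scope, 4 um pixels
big   = rate_per_pixel(1000, 5000, 15)  # 1 m professional scope, 15 um pixels

print(big / small)  # → 14.0625: identical f-ratio, but the big aperture
                    #   with big pixels fills each pixel ~14x faster
```

This is also why binning 2x2 quadruples the effective pixel area and so mimics a reducer's speed gain without changing the optics.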
  6. I've made this change recently and have also used both OSC and mono versions of the same camera in the past. The first thing to say is that I think CMOS OSC cameras are much more convincing than CCD OSCs. (I've used both.) So I think the game has changed, recently. In the CCD era I was a firm advocate of mono, which is faster and more versatile than OSC. (It quite simply is faster because it can shoot luminance, red, green and blue simultaneously on each pixel. An OSC camera cannot do this and so is, most assuredly, slower. Assertions to the contrary cannot be correct.) This remains true of CMOS cameras as well, but the high sensitivity and low read noise of the CMOS camera changes the real-world balance of virtues, in my view, enough to make CMOS OSC worth thinking about. I'm also using the CMOS OSC in a fast astrograph, a Celestron 8 inch RASA at F2. This makes the time constraint a lot less pressing because we are getting so much signal in so little time that we don't really care about a ten or twenty percent increase in exposure time. It's a percentage increase in 'not a lot.' Yet another change has been introduced by the dual and tri-band filters now available for OSC. Several members on here have posted deep images of faint NB targets using this technology in fast instruments. We don't have such a filter yet but intend to try one. It is certainly frustrating to lose decent nights to the moonlight, though I have never imaged around full moon, even with a 3nm Astrodon filter in a mono camera. I'm enjoying using the CMOS-OSC-RASA combination. Olly
  7. Not only that, but, on a Dob, the focuser only ever operates horizontally so it's not required to hold weight against gravity. In this context a Crayford's smoothness has things going for it. I've grouched about the design often enough but, fair's fair, it's a serious option on a Dob. Olly
  8. Personally, I think it's incredibly simple: the claimed aperture must be the clear aperture or the marketing is fraudulent and should land the manufacturer in court. Olly
  9. This, if the objective lens is in good condition, is an under-rated telescope and can give pleasing views. I've had one for many years. It has little value on the used market because it's unsuitable for astrophotography and needs a substantial mount. (A new mount suitable for this scope would cost around £1000 but someone who already has such a mount might give £150 for the parts you have.) I don't see it as looking externally anything more than dusty and faded. Mine looks worse but works well. The condition of the lens is everything. Olly
  10. Could you tell us exactly what you want to do with the proposed scope? DS imaging? Planetary and lunar? I've used focal lengths from 1 metre to 2.6 metres for deep sky imaging of smaller targets and found little difference in the final results. Olly
  11. The oft-used artificial star is just a ballbearing illuminated from a little off axis so you don't see the source. Only the bit of the ball nearest the telescope reflects light down it, making it an excellent approximation of a point source. You'll find this system in professional optical shops. Olly
  12. I don't think it's a sensitive issue, it's just expensive. Also, some imagers don't feel they've done any real work if they haven't either set up a remote system of their own or done it the usual way, next to their setups. I'm not of that persuasion myself but I know that some are. One way to make it cheaper would be to team up with others and share the data collected. I don't know if the providers discourage this but it seems a perfectly legitimate thing to do to me. As a robotic telescope host it strikes me as none of my business what my tenants do with their data. Olly
  13. Artificial star? The downside is that reflectors may not give an accurate indication of their collimation when pointing at the low elevations implicit in this test. Is there a roof you could get to? (Half joking...) Olly
  14. Starnet won't run on our huge files from the ASI2600, Martin. I'm hoping it will work with Xterminator. Interestingly the RASA/ASI gives excellent star colour. Given the F2 and the reduced well depth one might have expected the reverse, but no, it's good. Stars are quite soft in the RASA so maybe the colour is coming from the weaker outer edges of the stars, I don't know. Olly
  15. New to me, too, and I love it. Tremendous sense of depth and nice to see dust combined with reflection nebulosity. Processing is bang on the money. Olly
  16. Nice. Having cropped it, you have a new and somewhat reduced dynamic range within which you might be able to extract more local contrast. The blues are a little flat but could probably be teased into more variety now you don't have to keep the background within range. Olly
  17. I agree. Assuming the glass is flat, any part of the chip cover will do to send a reflection down to the screen. It doesn't have to be the same spot on the glass. If the glass isn't flat you're doomed anyway! Olly
  18. Yes, I think it was just an internet glitch.
  19. But don't most men see that when they close their eyes? Olly
  20. Well... on this target, and without any NB filtration, I doubt you'd go as deep as you'd go with filtration, but that's not a surprise. There's an awful lot of very faint Ha and OIII in the Veil complex. This is what about 60x3 minutes per panel gave with just the RASA/ASI2600 OSC: I see it as a rather different kind of image and like it in its own right. The big takeaway from this rendition is the way it shows the cleaned sky within the loop. The SN shock front has swept out the central region of space within the roughly spherical shell. I guess this shows well because the RASA is so good at finding dust, which makes it good at showing where there isn't any, as well. On the other hand it doesn't give refractor stars, but if I can get Star Xterminator to download successfully I'll try a de-starring processing job. An interesting note: this data had no calibration at all and was taken through ABE before being given to Registar to have the 4 linear panels joined. The result was absolutely seamless, which is highly unusual in my experience. (...and very welcome!) Olly I can't remember but it's probably over forty. Olly
  21. This combines newly acquired RASA/ASI2600 OSC data with older CCD OIII/Ha/LRGB data from our Tak FSQ106N. The RASA part was in conjunction with Paul Kummer who owns the scope and camera in our robotic shed. Unfortunately the RASA's robotic focus motor has packed up and will have to go back for repair. However, Goran has said that he finds the RASA holds focus really well and so it does. We do a manual focus at the start of the night and, while we can't get it as tight as the computer can, it does hold steady for the night. RASA, about 3 hours per panel, four panels. Tak data was in 3 panels. Olly
  22. I don't see any artifacts arising from the hard process so I think the look of the image is down to the extremely strong signal. This is not a shark I'd want in my swimming pool!!! Olly (Not that I have a swimming pool.... )
  23. Ah, Amazon Prime... Don't get me started. They use Chronopost most of the time, which means delivery is inverse-decimated with about one item in ten being delivered. (That is literally true.) Interestingly, though, Amazon don't use Chronopost for their returns, they use reliable UPS. I guess they prefer a good service when they are the customer... But, yes, I did order a set of Allen keys online. This will not topple the mighty Trezzini hardware store and builders' depot, though. It's the best enterprise I've ever dealt with. Every employee is a trained technician who can give authoritative answers on anything from roof tiles to boiler thermostats. And it runs on trust in a way lost to the modern world. I drive in to the yard, load up, then go and tell them what I've taken. By contrast, using the big chains makes me feel like a thief since the assumption from the moment you enter the gate is that that's what you are. Olly
  24. Colour gradients appear in almost all astrophotos, even those from very dark sites like mine. Many of the dedicated software astro packages have tools for dealing with them. Pixinsight has ABE and DBE, Astro Art has gradient removal, APP has a tool, and so on. There's also Gradient Xterminator, a plug-in for Photoshop. Olly
  25. Another offshoot of decimal confusion arises when natural disasters or wars kill off almost everyone at the scene. The population is said to be 'decimated' whereas, historically, the term meant to execute one person in ten. (The idea was that it would focus the minds of the survivors...) Olly