Everything posted by ollypenrice

  1. There is a risk associated with manipulating optics which is just as non-zero as the risk you rightly say applies to contaminants on them. This is a matter of cost-benefit analysis and risk assessment. We know that the effects of a small degree of contamination on a mirror cannot be discerned by any observer at the eyepiece in a double-blind test. If it cannot be so discerned, what is the rational benefit of taking the risk of manipulating the optics? I can see none. You run the risk of damaging the optics for no perceptible gain in performance and, in this case, no perceptible gain means, quite literally, no gain, since perception is what we are interested in. Olly
  2. I've never used NB filters with an OSC camera and hadn't considered this scenario. Point taken. What I have done is a side-by-side comparison of an OSC versus a mono image of the Trapezium region of M42 in the same model of camera (bar the Bayer matrix difference) and the same optics. This was for a magazine article. Try as I might, I could discern no difference in perceptible final resolution of detail. I heartily agree that the best thing to do in this game is get on with it! Olly
  3. Personally I wouldn't Barlow the 90mm triplet. Not enough light per pixel. Experiment always wins, though... Olly
  4. The RASA 8 is a dream with this camera. FL is 400mm but small scale resolution is good. The RASA is not diffraction limited but it has so much more aperture than most 400mm FL scopes that it still has room to out-resolve even the diffraction-limited smaller ones. You also get so much signal in the time that you have more room for sharpening in post-processing. As for the effects on resolution of the Bayer matrix, I would say they are insignificant. You'll hear people argue that an RGGB set of pixels resolves as one large pixel but this is not true. The debayering software interpolates the missing information and, yes, this is a form of highly educated guessing but it is remarkably effective in the real world. In the days of CCD, I preferred mono over OSC but not because of resolution. I compared my mono and OSC Atik 4000s regularly and never found anything to choose on resolution. I'm now using two variants of this camera, one in the RASA and another in a Samyang 135. It is, in both cases, a cracking instrument. Olly
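For what it's worth, here is a minimal sketch of the kind of interpolation a debayering routine performs, assuming a simple RGGB pattern and plain bilinear weighting. Real packages use far more sophisticated algorithms, but the principle of estimating each missing colour sample from its neighbours is the same.

```python
# A rough illustration of bilinear debayering for an RGGB sensor.
# Real debayer algorithms are considerably smarter, but the idea of
# interpolating the missing colour samples is the same.
import numpy as np
from scipy.ndimage import convolve

def debayer_bilinear(raw):
    """raw: 2-D array from an RGGB sensor. Returns an H x W x 3 RGB image."""
    raw = raw.astype(float)
    h, w = raw.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1                      # R on even rows/cols
    g_mask = np.zeros((h, w)); g_mask[0::2, 1::2] = 1; g_mask[1::2, 0::2] = 1
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1                      # B on odd rows/cols

    # Kernels that average the nearest same-colour neighbours.
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0

    r = convolve(raw * r_mask, k_rb, mode='mirror')
    g = convolve(raw * g_mask, k_g,  mode='mirror')
    b = convolve(raw * b_mask, k_rb, mode='mirror')
    return np.dstack([r, g, b])
```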
  5. I'd start by thinking about image scale. You may find it difficult to get beyond about 1.5 arcseconds per pixel when it comes to finding real detail. You can certainly get decent galaxy results at that, and you need a guide RMS of about half that. 0.75" RMS should not be too difficult. Next, you want as much light as possible at that image scale, which means aperture, but it must be within range of what your mount can carry. Basically, I'd focus on those two numbers: image scale and aperture. Olly
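To put rough numbers on that, image scale follows directly from pixel size and focal length. The pixel size and focal length below are purely illustrative values, not a recommendation for any particular camera or scope.

```python
# Quick arithmetic behind the image-scale advice above.
def image_scale(pixel_size_um, focal_length_mm):
    """Image scale in arcseconds per pixel."""
    return 206.265 * pixel_size_um / focal_length_mm

pixel_um = 3.76     # illustrative modern CMOS pixel size
fl_mm    = 530.0    # illustrative focal length

scale = image_scale(pixel_um, fl_mm)
print(f"Image scale:      {scale:.2f} arcsec/pixel")   # ~1.46 "/px
print(f"Target guide RMS: {scale / 2:.2f} arcsec")     # roughly half the image scale
```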
  6. No, my point was that the secondary is a pretty big 'defect' in the scope's light path and yet it is not, in reality, a defect at all. Olly
  7. My views: A is much too contrasty, the faint nebulosity not sufficiently in play, the bright too insistent. B is the opposite and too flat. C is getting there and D is the best. There is a fundamental issue with the data, though. The bottom left hand corner is dominated by a greenish-brown background. I don't believe this is entirely spurious because I can see evidence of a dusty region there in my own data, partly because the star count is lower. However, I think the initial steps in DBE and colour calibration have exaggerated it considerably. (On the other hand, it may be an imaging 'First' on this target and I'm talking nonsense!) My curiosity is piqued and I'll take another look at all my raw data. My instant impression on all versions was that there was a lack of small scale texture. Where did you apply the NR? I would always avoid it on anything but the fainter signal. Right, I'm off in search of dust in that lower left corner... Olly Update: the brown dust to lower left is certainly present in my Rosettes and I hadn't paid it enough attention. However, I do think it's over-stated in yours, probably because early operations treated it very differently from the rest of the 'background.' Conversely, I've just given my brown dust a boost and like it, so I'm going round in circles!
  8. I agree with earlier posts. 9 mins is a long time for a DSLR. I'd try more and shorter. 2 darks will add noise, not reduce it, and on DSLRs you'd probably do better to use a master bias as a dark anyway. On processing, I'd be looking at: the flame being too red and not yellow enough; the blues being too green; and the bright stars being colourless and surrounded by colourless circles introduced in processing. I don't know how they got there, though. Does the software attempt to mask the bright signal, or have you masked the bright stars in some way? Whatever the answer, it has not worked in the image's favour. Personally, I never use any ready-made stretches. The stretch is the absolute number one post-processing priority, and it seems to me that learning the ropes of a stretch tailored to just the one image in question is the key thing. In this part of the sky there is no background at all. Your framing is full of nebulosity and your black point is good. Olly Edit: It really is useful to have Photoshop to do cosmetic repairs such as your satellite trails, though AstroArt has a 'Remove Line' feature so maybe other astro packages also have it. I don't know.
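As one illustration of a stretch you tune by hand for the image in front of you, rather than a one-click auto-stretch, here is a minimal asinh-style curve in Python. The stretch factor is a per-image judgement; the value shown is only a placeholder, and this is a generic sketch rather than any particular package's routine.

```python
# A tunable stretch applied to a linear, background-subtracted stack
# that has been normalised to the 0..1 range. The stretch_factor is
# the knob you adjust to suit the one image in question.
import numpy as np

def asinh_stretch(linear, stretch_factor=200.0):
    """Non-linear stretch: lifts faint signal while compressing highlights."""
    return np.arcsinh(linear * stretch_factor) / np.arcsinh(stretch_factor)

# usage (placeholder array name): stretched = asinh_stretch(stacked_linear, 150.0)
```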
  9. In one of his novels, the incomparable Mark Twain mentions the wreck of a great, flat-bottomed paddle boat, beached, helpless and forgotten, on a sand bar in the Mississippi. I can imagine the smile on his face as he composed its name - the Walter Scott... Olly
  10. To reassure yourself, look down the front of your telescope. Oh no! there is an almighty blot, right in the middle of the light path. Your secondary mirror! Does this obstruction ruin the view? Olly
  11. That really is super. It also made me chuckle because we happen to be doing just the same project here... Full Samyang 135 Orion enhanced in regions of interest by telescopic data. We have the full height in the can but need another strip down the left hand side to finish the framing properly. I think we'll be lucky to catch it this season because we've had some down time with mount issues, an electronic problem, a broken bracket (do not buy the Wega Samyang lens holder) and a local internet shutdown. You couldn't make it up. It's very interesting to compare your result with ours. We have no Ha, just OSC, so you're deeper on the gas and our balance of gas-dust differs, not surprisingly. I'd say we've made quite a few similar processing decisions and a few rather different ones, but it's probably reassuring to see that the images are remarkably similar, really. It shows we don't just make this stuff up! Anyway, a great piece of work with well-judged blending of the telescopic bits. Had I not imaged this myself in the Samyang I wouldn't have been able to tell it had been enhanced. It's seamless. Olly
  12. If it doesn't make an awful lot more sense than most of Walter Scott's writing, you'll be on a hiding to nothing trying to unravel it! Olly
  13. The speckling in the stellar cores is new to me and I've seen dewy sensors, cloud, and out-of-focus subs hundreds of times. I would certainly swap the power supply and cables, as newby alert suggests, and eliminate any third party bits like hubs. If you do send it back to Rui Tripa in Portugal you'll be in good hands, though. He's a gentleman. Olly
  14. It really is a beginner's mistake to obsess over the cleanliness of optical surfaces, especially when observing them with a light. Olly
  15. This is tremendous. What a vista. I've always thought it ironic that, the shorter the focal length of an instrument, the better it is for making mosaics. I was never tempted by mosaics when using long FLs but always want to use widefield setups to go wider still. Also, the apparent resolution improves to the point at which you'd hardly notice the difference between, say, a Samyang image and a very-multi-panel telescopic one. This is gorgeous, though personally I'd bring down the red saturation in the California. Olly
  16. And there's the problem. If you want to image at very high resolution you will, in no time at all, find yourself seeing-limited and having anything larger than a 6 inch refractor will get you nowhere, unless you locate in exceptional seeing. When you've finally done that, you will be imaging targets already imaged by professional telescopes and you will have little or no chance of offering a new perspective on a target. If, on the other hand, you aim to go wide and deep, who knows what you will find? Olly
  17. The only thing that would, now, tempt me away from the RASA 8 would be a RASA 11 or 14. There is nothing of which I'm aware, at any price, which I would take instead. Olly
  18. Too much PI for me. Easier to shoot blue! Olly
  19. Yes, I can't suspect NoiseXt, which is as close to perfect as anything ever gets. I run it on a top layer over the uncorrected image, then go to the sharp, detailed parts of the image and erase it from there if it's done any damage - but it hardly does any at all. It's rarely worth erasing it from the detailed parts. Olly
  20. It has to be worth a try. As for the mechanics of how to do it, how would I perform this subtraction? I tried opening L and R in AstroArt and applying the Subtract command to subtract R from L. It left very little behind, as I suspected. (I did check that I'd done the subtraction the right way round because, when I subtracted L from R, it left even less!) Also, what about the sub exposure lengths and number? I have always shot more L than RGB. That seems to be the whole point of doing LRGB. Olly
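For anyone wanting to try the same subtraction outside AstroArt, here is a minimal sketch in Python, assuming the L and R stacks have already been registered and normalised to a common scale. The file names are placeholders.

```python
# Subtract the red stack from the luminance stack and inspect what is left.
import numpy as np
from astropy.io import fits

L = fits.getdata("luminance_stack.fits").astype(np.float64)  # placeholder file name
R = fits.getdata("red_stack.fits").astype(np.float64)        # placeholder file name

residual = np.clip(L - R, 0, None)   # clip negatives rather than let them wrap

print("Median of L:     ", np.median(L))
print("Median of L - R: ", np.median(residual))
```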
  21. I'll buy 'em all and sell them on at a mark-up when people realize Vaiv's theory doesn't really work. 😁 Olly
  22. Ah yes, good point. I don't de-star when linear. I'd rather see any issues with the starless right away than find them only after the stretch. I like to do a roughly 2/3 stretch before de-starring so I can exploit the advantage of a starless final stretch. Olly
  23. Yes, I get the idea that L-(G+R)=B. However the B component of L has to pass through the same optics and electronics as the B does, so I don't see why it should be any better than B-filtered B. And then, when we add L to RG (B made by subtraction) it will be identical with the L in that part of the spectrum. Will the L then enhance the signal in the blue channel in the way that it does the R and G, which are not the same signal as the L? It seems to me that your method would be a bit like stacking the same sub 10 times (pointless) instead of stacking 10 different ones. (Please note that I don't expect to win this argument with you. I never do. I'm still curious, though.) Olly
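The "stacking the same sub ten times" point can be illustrated numerically. This is a toy simulation with made-up numbers, not real data: averaging ten independent noisy frames improves the signal-to-noise ratio by roughly the square root of ten, while averaging ten copies of the same frame changes nothing.

```python
# Toy SNR demonstration with simulated Gaussian noise.
import numpy as np

rng = np.random.default_rng(0)
signal = 100.0
noise_sigma = 10.0

# Ten statistically independent simulated frames...
independent = [signal + rng.normal(0.0, noise_sigma, 10000) for _ in range(10)]
# ...versus ten copies of the very same frame.
same_frame = [independent[0]] * 10

def snr(frames):
    stack = np.mean(frames, axis=0)          # average the frames
    return np.mean(stack) / np.std(stack)    # signal over residual noise

print(f"Single frame SNR:      {snr(independent[:1]):.1f}")   # ~10
print(f"10 independent frames: {snr(independent):.1f}")       # ~31.6, i.e. sqrt(10) better
print(f"Same frame 10 times:   {snr(same_frame):.1f}")        # ~10, no improvement
```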
  24. Go on, I'll bite! Tell me why I shouldn't... Olly
  25. Except on targets of truly extreme dynamic range (M42), I have always found it entirely pointless to shoot different sub lengths. Look at your linear data. If the core is not saturated at this stage, there is no need for it to be saturated at the end. That's down to careful stretching. Stretch it anyway, keeping the core down as best you can but accepting some saturation if you have to. Next, make a stack of as much of the shorter data as will give you an un-saturated core. Stretch that, concentrating on perfecting the parts which are saturated in the other. Finally, use an HDR tool or routine to blend them. Personally I use Layer Masking in Photoshop but there are lots of alternatives. Olly
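A rough sketch of that kind of blend, done here with a feathered saturation mask in Python rather than with Photoshop layer masking. Both inputs are assumed to be stretched, registered and scaled 0 to 1; the file names and threshold are placeholders.

```python
# Replace the saturated core of the long-exposure stretch with the
# short-exposure stretch, feathered by a smooth mask.
import numpy as np
from astropy.io import fits
from scipy.ndimage import gaussian_filter

long_img  = fits.getdata("long_stretched.fits").astype(np.float64)   # placeholder
short_img = fits.getdata("short_stretched.fits").astype(np.float64)  # placeholder

# Mask of regions where the long stack is at or near saturation.
mask = (long_img > 0.95).astype(np.float64)
mask = gaussian_filter(mask, sigma=5)   # feather the transition
mask = np.clip(mask, 0.0, 1.0)

blended = long_img * (1 - mask) + short_img * mask
fits.writeto("hdr_blend.fits", blended, overwrite=True)
```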