Everything posted by ollypenrice

  1. There is such an instrument down the road from me but it isn't used for imaging. If invited to use it (a big 'if') I think I'd try the Cat's Eye nebula. Olly
  2. Certainly a great finderscope and, even more certainly, the world's best guidescope. Who visited those festering little finder-guiders upon the world? I'm now using two of the wretched things and they are a constant faff, forever losing focus. I would be embarrassed to admit how long it is since I last refocused my ST80 guidescope and even more embarrassed to admit how much time goes by between occasions on which I scrape the accumulated crud off the lens... Everyone should have an ST80! Olly
  3. I had a play with a screen grab using a Photoshop trick which usually accentuates the Ha component in the red channel but, for once, it had little effect. I think your colour balance is a little cold - blue dominated - because M33 is pretty colour-neutral rather than blue, but it would seem that you may not be collecting much Ha signal as things stand. Olly
  4. Who cares about small stars any more? We have total control over them now... Olly
  5. Difficult one. It rather sounds as if the scope is a lemon, though Es Reid might be able to fix it. Will it really cost £150 to send it each way? That seems like a lot. A compromise might be to get him to make an initial assessment of what's likely to be possible. I agree with Tomato regarding pixel-peeping for stellar perfection. How bad is it? Do you see it if you don't look for it? Like Steve, I'm using a RASA 8 and also, now, a Samyang 135 F2. I've never enjoyed imaging so much in my life and I'm able to make the kind of pictures I've always wanted to make. Not surprisingly, it is very hard to achieve stellar perfection in these gloriously fast systems but does it matter? If it's OK to have darned great crosses springing out of stars, why is it such a big deal if small ones round the edges aren't perfect if you look closely? (Large, imperfect ones can be fixed in one click by a simple Photoshop routine saved as an action.) I also think that, sometimes, a product really is born a lemon. I know one imager who had a lousy new Paramount which it took him two years to get working, by which time he'd switched to a different mount. He reckoned it was now OK and sold it to another imager I know. It isn't OK, it is back in 'not working' mode again. This is made worse by Paramount's hard-nosed attitude to customer service and the absurd prices they charge for components. The same applies to another person I know who got stuck with a very expensive and very unsatisfactory OO UK astrograph. There is a case for cutting losses and bailing out, I guess. Olly
  6. I think, 'Phew,' might be the word we're looking for. An RMS of 0.6 is very unlikely to be your limiting factor for resolution. Olly
  7. Naturally I have never, ever, done anything like that... Ahem. Olly
  8. Thanks. Before post-processing, I resized Paul's linear mosaic from a width of 15862 pixels to 7000 pixels. This was intended as a first attempt at processing, to make life easier for some of the software routines. My intention was to re-do it at a larger size if all went well. I think, having seen it as it is, that I'll stick with this rendition unless Paul wants me to go for bigger. The web JPEG is resampled down a little more to a width of 6000 pixels. Regarding the faint dust, an idea I had a few years ago works even better now that we can use it on starless images. In Ps, create a copy layer of the starless and also copy that onto a layer mask. Equalize the layer mask (in Image, Adjustments), blur it and increase its contrast, then gently stretch through it. The idea isn't new, just this easy way of doing it. Olly
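The layer-mask routine described above can be sketched outside Photoshop. This is a hypothetical numpy port of the idea, not Olly's actual saved action: the function name, the gamma stretch, and the box-blur stand-in for Gaussian Blur are all my assumptions.

```python
import numpy as np

def dust_mask_stretch(starless, stretch=0.5, blur_passes=8):
    """Sketch of the mask-and-stretch idea: equalize a copy of the
    starless image, blur it, boost its contrast, then stretch through
    it so the stretch is applied in proportion to the mask."""
    img = np.clip(np.asarray(starless, dtype=np.float64), 0.0, 1.0)

    # Equalize: remap pixel values through the cumulative histogram.
    hist, bins = np.histogram(img, bins=256, range=(0.0, 1.0))
    cdf = hist.cumsum() / hist.sum()
    mask = np.interp(img, bins[:-1], cdf)

    # Blur the mask with a few crude box-blur passes (a stand-in for
    # Photoshop's Gaussian Blur).
    for _ in range(blur_passes):
        mask = (mask
                + np.roll(mask, 1, axis=0) + np.roll(mask, -1, axis=0)
                + np.roll(mask, 1, axis=1) + np.roll(mask, -1, axis=1)) / 5.0

    # Increase the mask's contrast around its midpoint.
    mask = np.clip((mask - 0.5) * 2.0 + 0.5, 0.0, 1.0)

    # 'Stretch through' the mask: a simple gamma stretch blended in by
    # mask value, so bright-masked regions are lifted most.
    stretched = img ** (1.0 - stretch)
    return img * (1.0 - mask) + stretched * mask
```

The design point is the same as in the Photoshop version: because the mask is derived from the image itself, the stretch is self-targeting and never needs hand-painted selections.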
  9. Yes, Rogelio pioneered this framing of targets, I think. He has a distinctive style in astrophotography, which is different from mine and Paul's, but he had a great idea in bringing these objects together. It is 99% straight RGB. For the bright part of the California I had a 30 hour HaOIIILRGB image which I blended in a low opacity but this didn't add to the extensions. It was really just a lazy way of getting the bright parts into good shape. Essentially this is straight RGB. After trying a full circle arrangement, which gave oddly tilted outer haloes, we settled on this arrangement and now we just get decent stars. Thanks. The blue extended 'pincers' are not new to this image, though. I was fascinated by them myself in an image I admired about ten years ago. I did a very clunky, undersampled camera lens attempt ages ago and, although it found them, it was very rough. The speed of the RASA makes it easy to get a nicer result. Olly
  10. Another with Paul Kummer, who organized capture and preprocessing. Post processing is mine. We aimed for 3 hours per panel but naturally there was weather-induced variation. RASA 8, ASI2600MC, Avalon Linear Fast Reverse, located here at Les Granges. ABE, Blur XT and SCNR green in PI then Photoshop for the rest. This is a thumbnail, so to speak, but there's a larger one on my gallery site in the link. This was unexpectedly easy to process. Link to larger version: https://ollypenrice.smugmug.com/Other/DUSTY-DARK-AND-MILKY-WAY-TARGETS/i-Rp94h6z/A Olly and Paul.
  11. Here's a version 2. I didn't use BlurXT on the first one because it damaged the stars. In V2 I did use it and found the starless nebula was better because BlurXT helped StarXT and also improved detail. When I came to replace the stars I used the ones which had not been through BlurXT so I could get the best of both worlds. I think this is going to be my standard approach with the Samyang. Is it a bit bright on the colour??
  12. I agree with Tomato. It sounds mechanical and shouldn't be touched without a Lucas say-so. All the Mesus here are ones with the fixed motors so I can't help. I hope you get it sorted quickly. Olly
  13. This has just come up in a recent and similar thread. I explained how I do it in Photoshop using blend modes Lighten or Screen, and another member provided the equivalent PixelMath formulae: max(img1, img2) for Lighten and ~(~img1 * ~img2) for Screen. The thread is here: I've never done this in PI but perhaps it will help. Olly
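For anyone who wants to see what those two PixelMath expressions actually compute, here is a minimal numpy sketch (my own illustration, assuming images scaled to [0, 1], where ~x is the inversion 1 - x):

```python
import numpy as np

def lighten(img1, img2):
    """Lighten blend: per-pixel maximum (PixelMath: max(img1, img2))."""
    return np.maximum(img1, img2)

def screen(img1, img2):
    """Screen blend: 1 - (1 - a)(1 - b) (PixelMath: ~(~img1 * ~img2))."""
    return 1.0 - (1.0 - img1) * (1.0 - img2)
```

Lighten only ever takes a pixel from the second image where it beats the first; Screen always brightens wherever either image has signal, so it can also lift background noise.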
  14. I should have added that the lens was wide open at F2. Olly
  15. First light for the setup I share with Paul Kummer and Peter Woods. Olly
  16. First light for a joint project with Peter Woods and Paul Kummer using a co-owned robotic setup based at my place. Samyang 135 lens, TS 2600 OSC camera, Avalon M-Uno mount. 2 panel mosaic, 56x3 mins for one side and 72x3 mins for the other. That's just what the sky decided to give us but it was an easy post process. Paul did the driving at capture and the pre-processing. I did the post processing on this version. Perhaps this beautiful nebula will get more attention now that the Samyang 135 is so popular. One thing I like about shooting only in broadband and with fast optics is the way you capture a higher proportion of dust relative to emission nebulosity than when adding Ha to RGB. A Photoshop adjustment of the hue in reds is good for emphasizing the dust. EDIT: Please scroll down for a version 2. Olly
  17. In Ps I do not add NB to broadband at the linear stage, I do it with fully processed images. Since I know what parts of the NB will be going into the final image (that is, the parts where the NB is brighter than the BB) I can stretch the two sets differently. The BB stretch must keep a low noise level in the fainter parts because it will be seen in the final image. The NB image can be stretched in the knowledge that its fainter parts will not be included in the final blend, so I can use a very extreme stretch. As long as the faint stuff in the NB is darker than the same faint stuff in the BB, I don't have to worry because it won't be added. In a nutshell, the stretches used in NB can (and in my view should) be very different from those used in the BB. Olly
  18. It would not be a good idea just to weight them in an average of some kind. What you want is to add the NB signal not found in the broadband to the OSC, but without adding the higher noise from the short NB exposure. It's vital to distinguish between the NB signal and the NB noise because you want all of the new NB signal and none of the noise. Any kind of global averaging will just give you a bit of both and do more harm than good. This must be possible in PI but I've no idea how to do it. In Photoshop it's a doddle. Olly
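The signal-versus-noise distinction above can be expressed very compactly. This is a hypothetical sketch of the idea, not the Photoshop doddle itself; the function name and the optional margin parameter are my own, and both images are assumed stretched and scaled to [0, 1]:

```python
import numpy as np

def add_nb_signal(bb, nb, margin=0.0):
    """Take the NB pixel only where it is brighter than the broadband
    (optionally by a margin), so the faint, noisy parts of the short
    NB exposure never enter the blend. Everywhere else the broadband
    pixel passes through unchanged."""
    return np.where(nb > bb + margin, nb, bb)
```

With margin=0 this is exactly a Lighten blend; a small positive margin makes the cut stricter, keeping NB noise that only marginally exceeds the broadband out of the result.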
  19. Lucky you! And a man of taste. These were unknown to me in the UK till I first saw one as a backpacking student in Greece in 1972. I was delighted by the idea of half an R50 and have wanted one ever since. I did a lot of miles on an R60/5 and an R100/7 but I can't reach the ground from the modern ones and I don't want all that power anyway, as I hit seventy. (Plus I can't afford them. ) Olly
  20. Thanks. I have to ask: does your forum name refer to BMW's old 250 single, of which there were quite a few in Greece at one time? Thanks, but I don't get that box. I get a cheap and nasty one like this: Quite honestly, mine looks as if it might have come out of a Christmas cracker. Yours faithfully, Miffed, of Etoile St Cyrice.
  21. Sorry, but I think it is a bad idea to use NB data as a luminance layer. Maybe twelve years ago everyone used to do this and the forums were full of cutesy pink nebulae dotted with blue-haloed stars! The thing is that Ha is a deep red and OIII lies on the blue-green border. That, therefore, is where they should go in the final image. If you use NB as luminance you will illuminate the whole image with the light from just those colours, so anything like blue reflection nebulosity will not be illuminated and the NB will, in effect, go incorrectly into all channels. What you need is a program with layers. In Ps you could put the NB layer over the OSC and try blend mode Lighten or blend mode Screen. Perhaps you can do this in Affinity? I think it must be possible in GIMP. I would do this with both images fully stretched, by the way. The point of this method is that the NB signal will be applied where it is brighter than the OSC, meaning it will go into the right places. Olly
  22. My machine manages the default pretty well, but where did you find these options? Olly
  23. Certainly, yes. How you do it depends on what processing software you use. In Photoshop I'd place the stretched NB data on top of the broadband as a layer and change the blend mode either to Screen or to Lighten. Olly