Everything posted by Ouroboros

  1. I feel the weighted batch preprocessing (WBPP) in PixInsight has been a game changer. I like the way it sorts and associates the different lights, flats and darks. Mind you, I’ve only used it for OSC, so fairly straightforward. I like the pre-run check with flow diagrams showing what it’s going to do with the different files. The free-to-view YouTube videos by Adam Block on WBPP are excellent.

     I decided some time ago that it suited me to learn one piece of dedicated software rather than do my processing across several. At the time PI seemed to be the application of choice, so I went for that, although more alternatives are now available. I can now use PI for everything and can even (!) end up with a pleasing result. I am amazed, though, at the superb results some people achieve with it. I think some people are just natural virtuosi and others, like me, are not. I like the logical methodology of PI, and it’s become considerably easier since the development of AI add-ons like BlurXTerminator etc.

     I’ve said this before on SGL, but the issue I have is that although I’m pretty much fine up to and including the point where the image is stretched to the non-linear state, after that the options broaden into what feels like a huge multi-dimensional parameter space. OK, I can finish off by doing curves, pushing up the saturation, applying a few masks, and adjusting and enhancing local contrast, sharpness and so on. But I’m often left feeling I can’t really see what’s wrong with my image, nor do I know how to make it look better. That probably comes down to a fault in myself rather than in PI. I have not found any of the reference books, like Warren Keller’s (excellent though it is), much help in this regard. Other suggestions welcome.
  2. My thinking has been very similar, and so far I’ve stuck with broadband OSC. 120s seemed a reasonable compromise for all the reasons you cite. It already takes overnight for my aged MacBook to process the subs; the poor old thing’s fan is whining away for hours on end. I’ve taken to raising it off the table slightly to allow air to circulate. On the other hand, lengthening subs runs a greater risk of having to throw away long subs ruined by wind shake, planes, satellites, intermittent cloud etc. Though I guess filters should reduce the contribution of satellite trails and planes.
  3. Interesting you use 120s. I thought I’d read that longer times are often used when using filters. Have you tried longer subframes?
  4. Just out of interest, @Lee_P, what gain do you use with your 2600 camera and dual band filters?
  5. Elp: “Open the front door, AI.” AI: “I’m sorry Elp. I’m afraid I can’t do that”. 😀
  6. Well, having just acquired an OAG with my little refractor, I’ll let you know whether I find it “bonkers” or not, @ollypenrice. So far so good. It works. It makes a nice compact system I can keep as a unit (without a ‘sticky out’ guidescope) that I can easily pack for travel and carry out to the mount. I do see certain advantages of a guide scope. They’re easily swapped between different telescopes, if you use the finder shoe that is. They’re useful for visual too. Teamed with a computer, ASIair or similar, they can be used in GOTO mode with plate solving to put you smack bang on the visual target through the main scope, assuming, that is, you have the two scopes lined up. I found that a real asset, and it seems to have re-enthused my visual astronomy. You can even acquire some images for your records. A sort of visual/EAA hybrid mode.
  7. There was an interesting piece during the PM programme earlier about ‘nowcasting’, in which AI is used to forecast what’s going to happen in the next few hours only. That, after all, is what we astronomers really want, isn’t it? Yes, it’s good to know when it might be clear (and it really is only a “might”) next Tuesday or whenever. But what we really want to know now, this evening, tonight, is what it’s going to do in the next few hours. The AI takes current Met radar, satellite and weather data and compares it with data obtained over many years to make a “nowcast”. I like the sound of that. It has the potential to be very useful. My money is on that being the most accurate.
  8. Great. I’ve always wanted something named after me! 😆
  9. The clear forecast for Saturday night must be accurate because other factors make it impossible for me to do any astronomy that night.
  10. Not sure if this counts, but I looked out of the window in the wee small hours to be treated to lovely views of Venus, a crescent Moon towards the east and Orion in all its magnitude towards the south. Tempting though it was, I slunk back to bed.
  11. As for my elongated guide stars, a quick google this morning suggests this is quite a common observation with OAGs. I understand it is due to picking off stars at the edge of the image field, where distortion will be most apparent. I didn’t adjust the position of the prism when putting the setup together; I just checked the prism wasn’t vignetting the main image in daylight. Regardless of the stretched stars, the guiding looks good to me, and I haven’t even fiddled with any of the settings.
  12. @pipnina Yes, I’m pleased with what I see, but then I don’t have the experience to know whether it is what I should expect. That’s why I would appreciate comment from experienced ‘opticians’. I also don’t really know what to measure.
  13. Took delivery of an Askar FRA300 telescope and OAG last week. I’ve had a humdinger of a cold the last four days but decided tonight I just had to try it out before the weather deteriorated in the south as forecast. The skies were misty, which didn’t really matter for some tests. Here are a couple of single subs, 120s and 300s respectively. The stars look pretty round right across the field, although towards the corners there is clear evidence of chromatic aberration. This surprises me a little because the ASI2600 is an APS-C sensor and the FRA300 is supposed to be designed for full-frame cameras. I want to analyse the images properly in PixInsight; these are just screenshots. Any thoughts on the quality of the stars, anyone? Zoomed into a corner …. I found the Askar OAG quite easy to set up, and the guiding seems quite good. I was a tad surprised how elliptical the guide stars looked. More like a streak. Is that normal for OAGs?
  14. Funnily enough it was the frustration of drop-outs using direct, wired USBs in PCs and Macs over ten years or so that led me to the ASIair which I have found to be (mostly) far more reliable.
  15. I have only just realised this having just acquired and fitted (though not tested) the Askar M54 OAG. It hadn’t occurred to me before. I guess you have to make sure you’re happy with framing first, then calibrate. I think I tended to do that anyway.
  16. Yes, very good. S@N did a prog on the VLT recently. Still available on iPlayer.
  17. Well done, Kostas. It’s an excellent image and cleverly arranged.
  18. The nice post lady brought another goodie today. I got it all connected up and the main and guide cams co-focused. Well, kind of anyway. It’s tricky when they’re not actually looking at the same thing. I’ll have to wait until I get some stars to focus properly. This is my first OAG, so I’m interested to see how well it performs.
  19. Small and perfectly formed. Askar FRA300 and Askar filter drawer. It was sent by FLO inside a box, inside a box, inside a box, inside a box. I can’t say it was under-packaged. Brilliant to have it redirected to a pick-up point too, so I didn’t have to wait in all day or worry about it being left on the doorstep. I’ve ordered the Askar OAG too.
  20. “Press and guess”. I like that phrase. Sums up my workflow. Actually I’m pretty much OK until I go non-linear; then I feel I’m faffing around in a multidimensional space of possibilities. I’m not quite sure where I’m going or how to get there. I always do get somewhere. I end up with an image. But I often feel there’s a better image lurking in the data, if only I knew how to get it out.
  21. That’s an extraordinary image, @Clarkey, irrespective of whether it’s your first go in PI. It took me a moment to recognise it as the NAN because it’s such a wide field. Isn’t there a lot going on in that region? Wow! I’m very interested to see how you progress with this image because I am hoping to do some dual-band Ha + OIII imaging myself, having only done OSC up till now. I have been watching some videos to try and get my head around how the processing of dual-band data differs, and I have to say I don’t yet have a clear idea of how to go about it. Hoping people will suggest a few more tutorials in response to your post. What, briefly, was your workflow to get that image?
  22. I decided to have a look at some of my subs to see how many pixels they shift between subsequent shots. This was data obtained with a 910mm FL main scope and a 176mm FL guide scope, focal lengths as determined by the ASIair, so there's a ×5.2 multiplier between the two. The pixel sizes of the main and guide cams are identical. I had set the dither to 5; presumably that's ±5. I'm not sure whether the 5-pixel dither is measured along the x and y axes or along the hypotenuse. Anyway, it doesn't matter a lot because of the large difference in focal lengths. I do get some odd measured dither values, i.e. more than 5 but much less than 5 × 5. I think the maximum dither was 13.5, but most were less than 5 along the hypotenuse. The oversize ones are probably down to a shift in pointing or something. Odd, because my stars were (mostly) round. Anyway, I think this confirms what ZWO told @Jim Smith, i.e. that dither is in main-camera pixels.
  23. Oh, yes. Sorry. You did say.
  24. I really like that, @kirkster501. I like the way the galaxy core is large and misty. Quite often the core of M31 looks too small and intense in images. I also like the subtle saturation in the white bits of the galaxy and the additional ‘highlights’ of colour on the disk’s edge. That’s coming on nicely. Perhaps for my taste the yellow stars are a tad too yellow, but it’s all down to personal taste. I don’t know whether you find this but if I process an image for too long I can’t assess it anymore. I have to leave it and come back to see it (almost) afresh.
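The trade-off described in post 2, that longer subs risk more thrown-away data, can be made concrete with a toy model. This is purely an illustrative sketch, not anything from the thread: it assumes ruinous interruptions (wind shake, planes, passing cloud) arrive at random at a guessed hourly rate, so a sub of length t survives untouched with probability exp(-rate × t).

```python
import math

def expected_loss_fraction(sub_len_s, events_per_hour):
    """Fraction of total integration time expected to be thrown away,
    if any single interruption ruins the whole sub.

    Assumes interruptions arrive at random (Poisson) at an assumed
    rate; a sub of length t survives with probability exp(-rate * t).
    """
    t_hours = sub_len_s / 3600.0
    return 1.0 - math.exp(-events_per_hour * t_hours)

# With an assumed (made-up) two interruptions per hour, longer subs
# waste a noticeably larger fraction of the night:
for sub in (60, 120, 300, 600):
    print(sub, round(expected_loss_fraction(sub, 2.0), 3))
```

The event rate is the unknown here; the point of the sketch is only that the wasted fraction grows roughly linearly with sub length for short subs, which is the intuition behind sticking with 120s.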
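The dither arithmetic in post 22 can also be sketched in code. This is a minimal illustration using the figures quoted there (910 mm main scope, 176 mm guide scope, identical pixel sizes); the 3.76 µm pixel size is an assumption for an ASI2600-class sensor, and the function name is made up for the example.

```python
def image_scale(focal_len_mm, pixel_um):
    """Image scale in arcseconds per pixel: 206.265 * pixel_um / focal_len_mm."""
    return 206.265 * pixel_um / focal_len_mm

# Figures from the post: 910 mm main scope, 176 mm guide scope.
# 3.76 um is an assumed pixel size; since both cameras are quoted as
# identical, it cancels out of the ratio anyway.
main_scale = image_scale(910.0, 3.76)   # ~0.85 "/px
guide_scale = image_scale(176.0, 3.76)  # ~4.41 "/px
ratio = guide_scale / main_scale        # ~5.17, the post's x5.2 multiplier

# If "dither 5" meant 5 *guide-camera* pixels, the main image would
# jump ~5 * 5.17 = 26 px between subs.  Measured shifts mostly <= 5 px
# (a 5,5 move along both axes gives a hypotenuse of ~7 px) fit a
# dither expressed in *main-camera* pixels instead.
print(round(ratio, 2))
```

So the observed shifts of mostly ≤5 px, with the occasional larger outlier from a pointing shift, are consistent with ZWO's statement that the dither setting is in main-camera pixels.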