Everything posted by Ouroboros

  1. @Nikolai De Silva Excellent. Amazing what they did with photographic plates, isn’t it? I bet someone had to guide those long exposures by eye for their whole duration too. Also amazing that those images were the best that could be achieved then with big, very expensive equipment … and that people can now produce better images in their own back gardens. The coloured plate of spectra is amazing. I checked my own copy of the Encyclopaedia Britannica, published in 1953, and it does not have such good images.
  2. @bdlbug I was interested and intrigued to see how you returned to the generalised hyperbolic stretch at steps 18/19. I don’t expect you to recall why you did that here, but is it something you do regularly? I’ve come across GHS as a means of stretching from linear to nonlinear in a more controlled way than the STF + histogram transformation method achieves, but I haven’t seen it used in the later stages as you have here.
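For anyone unfamiliar: GHS belongs to the same broad family as the arcsinh stretch, with far more control (stretch factor, local intensity, symmetry point). As a simplified illustration of why such stretches are gentler on faint detail than a plain histogram transform, here is a basic arcsinh stretch in numpy. This is illustrative only, not the actual GHS equations:

```python
import numpy as np

def asinh_stretch(img, d=50.0):
    # Lift faint signal strongly while compressing highlights.
    # d sets the strength. Illustrative stand-in for GHS: the
    # real process has several more parameters (b, SP, LP, HP).
    img = np.clip(img, 0.0, 1.0)
    return np.arcsinh(d * img) / np.arcsinh(d)

linear = np.array([0.001, 0.01, 0.1, 1.0])
stretched = asinh_stretch(linear)  # faint values boosted the most
```

Because the curve is steepest near zero, background nebulosity is lifted far more than the stars, which is exactly the controlled linear-to-nonlinear behaviour described above.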
  3. I see what you mean. I’ve only recently started using StarXTerminator. Perhaps a bit slapdash of me, but I have so far naively assumed that, unless the residues left behind are really bad, they’ll be hidden again when I put the two images back together*. Previously I used StarNet2 for a short while; I thought StarXT did a slightly better job on balance. I didn’t use filters with this image. It’s straight OSC. The haloes could be caused by faint mist, I suppose. Perhaps I should re-blink the original subs, weed out if necessary and reprocess. * I can see this doesn’t necessarily work, because the starless image is probably going to get stretched more than the stars image.
  4. Hey! Lots of responses. Thanks all. Yes, my experiments with TGVDN last night seem to confirm this. Yes. In fact I’ve played around with this image so much I’m having difficulty seeing it any more. Thanks for the breakdown of your workflow. I very much like your second attempt particularly, along with @Fegato’s, for the way you’ve brought up the background nebulosity. Indeed it is. My recollection is that I got three hours of good data and then the mist rolled in off the Atlantic. Thanks for taking the trouble to provide your workflow; I shall study that. You’re right: there are any number of ways to skin a cat, er, process an image. It’s one reason I take so long to process an image: three steps forward, two steps back. Regarding your comment about giving the data a bit of noise reduction at the linear stage: I think I have that stuck in my head as recommended procedure from when I first started learning PI. Warren Keller even has a section on it. Kill the noise before you stretch seemed to be the idea, but not to the extent that you kill the important details. Anyway, maybe that rule of thumb has been superseded. I also like the way you’ve developed the background: you’ve lost the purple colour and made it look more misty. Nice. You’ve brought up the Cave Neb nicely too.
  5. Yes. I often use a mask with it too, to target the background for example. Those residual green casts can be quite subtle, can't they? It's not until you remove them that it becomes obvious they were there.
  6. Thanks, @Fegato. Your background looks much better than in my effort, which encourages me to have another go. I think we all agree there is insufficient data, but then I’m thinking of this as a learning exercise. I am amazed how quickly you produced a result. I work very slowly, not least because I don’t quite know where I’m going, so I spend ages on each process trying something, going back, trying something else. But how else can you come to understand what is, after all, a very complicated processing tool which can be applied in any number of ways to achieve different results? Watching videos only takes one so far.
  7. OK. Let's see if this works … This is 90 x 120s of pre-processed data. All I've done is a slight dynamic crop of edge effects. Enjoy! NGC281_dc.xisf
  8. Well, I just tried that by halving the resolution (ie reducing the image height and width by a factor of two, if that's what you meant) and it made no discernible difference. Thinking about it, what you're looking at is reduced resolution anyway, because I reduced the image size by a factor of about 5 before saving to jpg to post. And I don't think it is a particularly good image, in that the framing could have been better (ie the nebula could have been more central), but that's down to the data acquisition stage.
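For what it's worth, "halving the resolution" in the 2x2 software-binning sense can be sketched like this (assuming a single-channel image with even dimensions):

```python
import numpy as np

def bin2x2(img):
    # Software 2x2 binning: average each 2x2 block of pixels,
    # halving both width and height. Assumes even dimensions.
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

img = np.arange(16, dtype=float).reshape(4, 4)
small = bin2x2(img)  # shape (2, 2); small[0, 0] == 2.5
```

Averaging blocks trades resolution for per-pixel signal-to-noise, which is why it can sometimes hide small-scale blotchiness, and why it made no difference here if the blotches are larger than the binning scale.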
  9. Very nice pic. It’s interesting comparing the two. It took me a while to get my eye in spotting the similarities and differences. Interesting to see how the different techniques compare and what each brings out. I’m not sure my blotches appear in your image, but I’m going to reprocess mine and see what I get. Ta!
  10. Yes, thanks for this suggestion. I’m going to revisit the data later when I’ve got some time to experiment with different amounts of noise reduction at different stages. Just got to experiment.
  11. Yes. I use SCNR a lot, usually later and very successfully to remove residual green after background modelling and colour calibration. The green I’m referring to above is the bright green colour cast seen in many OSC images immediately after preprocessing, before you do anything else. I have read it arises because OSC Bayer sensors have twice as many green photosites as red or blue. Out of interest, I just applied SCNR to a freshly preprocessed, green-cast image: it goes sepia and looks yuck! I think it’s better to apply SCNR when you can see what it’s doing in a controlled way, possibly masked too to protect certain areas.
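For reference, my understanding of SCNR's "average neutral" protection is that it caps green at the mean of red and blue. A minimal numpy sketch (unmasked, whole image; the amount parameter mirrors the blend control):

```python
import numpy as np

def scnr_average_neutral(rgb, amount=1.0):
    # SCNR-style green reduction: cap G at (R + B) / 2.
    # amount blends between original and corrected green.
    # Sketch of the 'average neutral' method, applied unmasked.
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    g_capped = np.minimum(g, 0.5 * (r + b))
    out = rgb.copy()
    out[..., 1] = (1 - amount) * g + amount * g_capped
    return out

# A green-dominant pixel gets pulled back toward neutral
pix = np.array([[[0.2, 0.8, 0.3]]])
fixed = scnr_average_neutral(pix)  # green capped at 0.25
```

Because the correction only ever reduces green, applying it to an image with a global green cast shifts everything toward red/blue, which is exactly the sepia result described above.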
  12. Thanks for your thoughts on this. I’ll revisit the data and have another go by trying those things. I feel astro image processing is a constant learning experience; in a sense an image is never ‘done’. I’ve picked up a lot in the last year or so, and feel it’s good practice to go back and redo old data. The weather is lousy anyway, so there are few opportunities to get new data.
  13. You might be right about when they first launched it. In fact the process included a box to check for linear data; that’s gone now. I apply it early not only because it works extremely nicely on linear data but also because I thought it was recommended to do some noise reduction before stretching.
  14. You’re right, but I usually found DBE applied early went a long way towards dealing with the green cast too. I normally only do background neutralisation within SPCC. It seems to work anyway.
  15. ~F/6.5 with flattener/reducer. But yes, @Fegato, not oodles of data, I agree. I did do DBE; I just missed it off my list. Traditionally I’ve always done DBE immediately after dynamic crop, but I read recently that it didn’t matter when you did it during the linear stage. This time I did it on the starless image, after doing dynamic crop, SPCC, BlurXTerminator and NoiseXTerminator on the original starry image, if you see what I mean. I’m not sure DBE would make much difference, because the mottled features are on a smaller scale than the DBE background model, which is usually slowly varying. While we’re on the subject of applying DBE to the starless image … I too have followed Adam Block’s method on this. Have you noticed how applying DBE to an uncolour-calibrated starless image does not get rid of the green colour cast you often see at the beginning of processing? Yes, I applied NoiseXTerminator on the unstretched linear image; I thought that was recommended.
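The "slowly varying" point can be made concrete: DBE fits a smooth model to the large-scale background and subtracts it, so anything smaller than the model's scale survives. A rough numpy sketch of that idea, with a first-order polynomial (a plane) standing in for DBE's spline model over hand-placed samples — an illustration, not what DBE actually computes:

```python
import numpy as np

def subtract_background(img, mask):
    # Fit a plane a + b*x + c*y to pixels flagged as background
    # by mask, then subtract the model (keeping the mean level).
    # Stand-in for DBE's smooth spline model: anything varying
    # faster than the model, like small-scale mottling, survives.
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    A = np.column_stack([np.ones(mask.sum()), xx[mask], yy[mask]])
    coef, *_ = np.linalg.lstsq(A, img[mask], rcond=None)
    model = coef[0] + coef[1] * xx + coef[2] * yy
    return img - model + model.mean()

# Synthetic frame: flat sky plus a left-to-right gradient
h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w]
frame = 0.1 + 0.002 * xx
flat = subtract_background(frame, np.ones_like(frame, dtype=bool))
```

On this synthetic gradient the result is essentially flat; small-scale blotches added on top would pass through untouched, which is why DBE is unlikely to cure them.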
  16. I have recently had another go at processing 3 hours of OSC data of the Cave Nebula acquired last year with my SW ED80 and ZWO ASI2600MC. I’m reasonably pleased with the resulting colours and the look of the image overall, but I’m less happy with the blotchy colour in the fainter areas of nebulosity. Maybe it’s real, of course, or maybe three hours just isn’t enough. My workflow has been roughly as follows:
      • Dynamic crop.
      • SPCC.
      • BlurXTerminator.
      • NoiseXTerminator.
      • StarXTerminator to separate nebula from stars.
      • Generalised hyperbolic stretch of the starless and stars images.
      • Several selective applications of TGV Denoise using masks (to protect brighter areas).
      • Curves applied to the starless image for contrast and emphasis of brighter regions.
      • Colour saturation applied to brighter areas and to suppress some unsightly purple areas in the background.
      • Some Local Histogram Equalisation on brighter areas and some Dark Structure Enhance.
      • Recombination with stars.
      Desaturating the colour in the background just looked artificial and unrealistic, so I ditched that approach. Any suggestions of possible solutions?
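On the masked TGV Denoise step above: applying noise reduction through a mask is just a per-pixel blend between the denoised and original images. A sketch in numpy, using a simple box blur as a stand-in for TGV (which is a far more sophisticated, edge-preserving algorithm):

```python
import numpy as np

def box_blur(img):
    # 3x3 box blur via np.roll (wrap-around edges; fine for a demo).
    acc = np.zeros_like(img)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return acc / 9.0

def masked_denoise(img, mask):
    # Per-pixel blend: mask=1 takes the denoised pixel, mask=0
    # keeps the original. The blur stands in for TGV Denoise --
    # a sketch of the masking mechanism, not the real algorithm.
    return mask * box_blur(img) + (1.0 - mask) * img

rng = np.random.default_rng(0)
img = 0.5 + 0.05 * rng.standard_normal((64, 64))
mask = np.zeros_like(img)
mask[:, :32] = 1.0        # denoise left half, protect right half
out = masked_denoise(img, mask)
```

A smooth greyscale mask (rather than the hard 0/1 one here) gives the gradual protection of brighter areas described in the workflow.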
  17. @Craney I've been using the new Sat24. It's better than I initially thought. Have you found a way of moving the map? It seems to be stuck with the UK in the centre. Sometimes it's good to see further out, to what's coming.
  18. It is very interesting. At one time I was 100% in favour of recommending the ASIair, but my experience of late, since recent software updates, is that it has become flakier as a platform. All this has been debated at length elsewhere. I don’t know why this has happened. Perhaps ZWO have overstretched themselves by adding too many bells and whistles to the air; I’ve no idea. Anyway, it’s worth knowing about alternatives in case jumping ship becomes necessary.
  19. That’s an interesting thought. I had assumed it was solely a commercial decision to tie people in to the ZWO stable of cameras. In fact I thought they were being quite clever in enabling a lot of different types of DSLR cameras to work with the ASIair because they know that a lot of people transition from a DSLR to a dedicated camera.
  20. I agree with @Bluesboystig’s comments. I’ve never tried All Sky alignment, but I am aware it exists. As for distance … 25m is out of the question in my experience. Even ten metres from inside the house is pushing it. It’s not so much that it won’t make contact with a tablet or phone, but you’ll see slower downloads and more drop-outs. I own two ASIair Plus units and neither achieves the working distance claimed by the manufacturer. As it happens that’s not too much of a problem for me. In your circumstances I’d look at extending your home WiFi and controlling the ASIair on your network; you’ll then be able to be anywhere in your house. I don’t know what other equipment you have, but an ASIair limits you to cameras from the ZWO range (apart, that is, from some DSLRs), though mounts from other manufacturers are supported.
  21. It's ironic isn't it? Just as we might have believed that some progress was being made controlling ground based light pollution - what with dark sky areas etc - they think of ways of light polluting the skies from above.
  22. The plan is for tens of thousands, isn't it? I see the Amazon chappie (not to be outdone) is getting ready to start launching his lot. Look up in 20 years' time and the night sky will be a fine white mist of satellites, I reckon.
  23. Personally I’d say that’s OK. That’s basically the length of a 50m cable-reel extension, isn’t it? If you can easily run a shredder or hedge trimmer down the garden, you can certainly run an astro rig. My astronomy shed is powered from the end of a 13A 50m extension cable. I run an astronomy rig, a small light, small ancillaries like power supplies for a laptop etc, plus, on occasion, a fan heater. It copes fine with that. My set-up incorporates a residual current trip too, just in case.
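As a rough sanity check on that 13A budget (assuming UK 230V mains; the load figures below are illustrative guesses, not measurements):

```python
# Rough power budget for a 13 A, 50 m extension on UK 230 V mains.
# Individual load figures are illustrative guesses, not measurements.
mains_v = 230.0
max_a = 13.0
capacity_w = mains_v * max_a          # 2990 W available

loads_w = {
    "mount + cameras + controller": 60,
    "dew heaters": 30,
    "laptop power supply": 90,
    "small light": 10,
    "fan heater": 2000,
}
total_w = sum(loads_w.values())       # 2190 W
headroom_w = capacity_w - total_w     # 800 W spare
```

The fan heater dominates; the astronomy kit itself is a trivial load. One caveat worth remembering at loads like this: a cable reel should be fully unwound, since a coiled reel is rated for far less current than an unwound one.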