Everything posted by ollypenrice

  1. Drawtube or camera chip? Try rotating the camera 180 and see if the offset follows that. Your offset vignetting is pretty typical of many systems (even better than some I've used) and wouldn't keep me awake as long as your stars are good across the chip. Olly
  2. I would strongly recommend the 6nm over the others. It will block more moonlight and other LP if you have it, give you smaller stars and higher contrasts. Personally I use a Baader 7, an Astrodon 5 and an Astrodon 3. I don't find the Astrodons to need longer exposure. Rather the reverse if anything. Olly
  3. While this is obviously a very good image it is not an exceptionally deep one. There is considerably more outer halo than we see here. Total exposure time is short, at 50 minutes, but the aperture is large. It would be interesting to see how much longer it would have taken to capture a fuller halo. Olly
  4. I like the image as well. The cutting of rings is simplified by a compass-cutter from graphic outlets. For larger diameters you might need to improvise your own. Olly
  5. It's already had an almighty tweak as it is! It was a whopper. I entirely deleted the reflector layer from it in order to lose the spikes. Unfortunately its halo swallowed up the smaller stars around it so any further restraint would produce a visual black hole around it. Indeed there already is one, unfortunately. Olly
  6. If it were complicated it would be beyond me, Alan! I stack the images from the different scopes entirely separately so I end up with L from each, RGB from each and Ha from each. I then flatten the backgrounds of each of these separately in Pixinsight's DBE. Finally I open both Lums in Registar and combine them, then do the same with both RGBs and both Ha images. From then on it's the same as usual. (With an enormous amount of palaver it would be possible to align and calibrate every individual sub for subsequent stacking in a single run. I'll pass on that!!) Olly Edit: Registar is a doddle to use. It's expensive but saves me hours and hours since I'm always combining old and new data, data from different scopes, etc., and I use it for mosaics as well.
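[Editor's note: for the curious, the final combine step amounts to something like this numpy sketch. It is my illustration only -- Registar's actual algorithm is not described in the post, and it also performs the registration itself; the `combine_stacks` name and the weighting are hypothetical.]

```python
import numpy as np

def combine_stacks(stack_a, stack_b, weight_a=0.5):
    """Weighted average of two stacks (e.g. from two scopes) that have
    already been registered onto the same pixel grid. Weights might
    reflect relative total exposure or noise estimates."""
    return weight_a * stack_a + (1.0 - weight_a) * stack_b

# Two toy 'luminance' stacks already aligned to the same grid
a = np.full((4, 4), 100.0)
b = np.full((4, 4), 200.0)
combined = combine_stacks(a, b, weight_a=0.75)  # 75% weight on stack a
```

A weight of 0.75 might correspond to stack A carrying roughly three times the exposure of stack B.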
  7. An interesting set of points made by Vlaiv on Rodd's M101 dataset thread sent me back to my own data to see if I was guilty of making the spiral arms too blue. I think I was, but they refuse (by any reasonable means!) to become less blue than this. I had a lot of data, over thirty hours, from Yves Van den Broek's ODK14 and our TEC140. Ha LRGB. The reference image in the other thread was the Hubble Team's though it concentrates on the core. Olly
  8. I just saw a bloke on FB drink a 1 litre bottle of vodka down in one. By next week someone will have made that two. This is my opinion of the internet mania for isolated concrete pillar foundations of many tonnes. I image at 0.9"PP with a pier plonked on the continuous floor of an observatory. It's a decent floor - maybe three tonnes of concrete, but that includes the warm room. I have other, smaller, bases for scopes as well. One guy demonstrated that you could detect some movement at the EP if you stamped heavily around the telescope. True. And the solution might be...? Our large four-scope remote observatory has a continuous floor of six tonnes. Nobody complains. If you have a problem with trains, trucks etc passing, will a huge isolated lump fix it? The vibrations came to you through the ground... Olly
  9. This would be perfectly true if you ever imaged a starfield at the resolution of a lunar-planetary imager and with comparably short sub-exposure times - but you don't. It's the combination of very high image resolution and very short exposure time which makes the movement visible. Various 'steps' in the movement are frozen. Longer exposure and lower resolution will blur out this level of movement. That's why nobody can capture M31 with the detail found in a Damian Peach Jupiter image. One last question. Sharpening algorithms take various forms but they are derived from the data, so they are more than cosmetic until they are overdone, at which point they create and even 'sharpen' artifacts. I can't speak for other people but I think most of us in deep sky imaging will avoid sharpening stars but will sharpen galactic details and, perhaps paradoxically, gassy structures in nebulae. (Why not sharpen stars? Because their visibility at all in our pictures is little more than an artifact. A perfect imaging rig of amateur resolution would make them all land on just a single pixel. In the final image they would hardly be visible at all. Optical laws mean that, in fact, they spread onto many pixels, and imaging reality follows the star chart convention of brighter stars looking bigger. We stick with that.) We can have some confidence in our sharpening algorithms when they extract new details which are in accord with those of far higher resolution instruments like the Hubble, and when they also agree with independent capture and processing results from other amateurs. In general they do this. Olly Edit: Thinking again about sudden shifts in stellar position, I would suspect that play in the focuser or camera adapters (or even the chip) would be the prime candidate.
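[Editor's note: the point that sharpening is "derived from the data" can be made concrete with the classic unsharp mask, sketched below in numpy. This is a generic illustration, not any particular program's implementation; the simple box blur stands in for the Gaussian a real tool would use, and the function names are hypothetical.]

```python
import numpy as np

def box_blur(img, radius=1):
    """Very simple box blur (a stand-in for a proper Gaussian blur)."""
    padded = np.pad(img, radius, mode='edge')
    out = np.zeros_like(img, dtype=float)
    size = 2 * radius + 1
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / size ** 2

def unsharp_mask(img, radius=1, amount=0.8):
    """Sharpen by adding back the difference between the image and a
    blurred copy. No new detail is invented -- existing local contrast
    is amplified. Overdo `amount` and noise gets amplified into
    artifacts, which is the 'overdone' failure mode described above."""
    img = img.astype(float)
    blurred = box_blur(img, radius)
    return img + amount * (img - blurred)
```

Run on a step edge, the contrast across the edge increases; run with a large `amount` on noisy data, the noise grows with it.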
  10. Yes, something like that for me too. If I use a standard log stretch on my Sony data and keep going till the background reaches 23, the fainter object signal is stretched beyond its noise floor and the background is also noisy. (I end up with some background pixels at 23 but a great many at all sorts of values below that.) So I log stretch till only about background 20 and stop. Two steps follow. In Curves pin the background at 20 and fix it above that, then lift the curve a little where it's below 20. This lightens the overly dark pixels for a smoother sky. Then I pin the background at 20 and fix it below that so I can stretch a little more of the object signal. With the Kodak chips none of this is necessary. I get a consistently better background (and stars) from the Kodaks. Olly
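[Editor's note: roughly, the stretch-to-background-20-then-pin routine above could be sketched in numpy like this. These are hypothetical helpers of my own, assuming an 8-bit 0-255 scale; real Photoshop log stretches and Curves moves are interactive, not closed-form formulas.]

```python
import numpy as np

def stretch_to_background(img, target=20, max_val=255):
    """Power (gamma) stretch, solving for the exponent that puts the
    current background median exactly at `target` on an 8-bit scale --
    a stand-in for iterating a log/levels stretch by eye."""
    norm = np.clip(img / max_val, 1e-6, 1.0)
    bg = np.median(norm)          # assumes most pixels are background
    gamma = np.log(target / max_val) / np.log(bg)
    return (norm ** gamma) * max_val

def pin_and_lift(img, pin=20, lift=3):
    """Pin values at/above `pin`, lift darker pixels toward it for a
    smoother sky -- a rough analogue of the first Curves move described
    (pin the background, raise the curve a little below it)."""
    out = img.astype(float).copy()
    dark = out < pin
    out[dark] = out[dark] + lift * (1 - out[dark] / pin)
    return out
```

Because the gamma stretch is monotonic, the background median lands exactly on `target` while brighter object signal keeps its ordering.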
  11. No, not 30. I've always regarded 23 as an upper limit but I tend to work with it higher than that and bring it down as a last adjustment. (You can't bring it back up when it's gone!) When I started I used to make my backgrounds darker than that and a guy on the French forum commented unfavourably, recommending 23. Around the same time Nik Szymanek said something similar. Olly
  12. I always measure my background sky level in Photoshop. There's a current discussion about background sky values here: I think you've made a good call on the stretch. The barred spiral on the lower left is beginning to show a little grain as it is, so it's presumably at its limit. I have an image of this target with 12 hours' data but with more aperture (140mm) and from a darker site, I imagine. The outer halos are a little brighter but there's nothing faint in mine which isn't in yours. Olly
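[Editor's note: measuring the background level outside Photoshop is simple enough. Here's a minimal numpy sketch -- a hypothetical helper, assuming you pick a star-free patch by eye, which is essentially what the Photoshop eyedropper measurement amounts to.]

```python
import numpy as np

def background_level(img, patch):
    """Median of a star-free patch: the numpy analogue of sampling the
    background sky value in Photoshop (0-255 scale). `patch` is a
    (row_slice, col_slice) pair chosen by eye."""
    rows, cols = patch
    return float(np.median(img[rows, cols]))
```

The median rather than the mean keeps a stray faint star in the patch from biasing the reading.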
  13. Poor seeing, AKA atmospheric distortion, has never given me any visually apparent stellar movement on the chip. Its frequency is too high so that even a one second sub has largely blurred it out. I do use longer subs to focus, typically 3 seconds, in order to blur the seeing effects out fully and give stable full width half max values. Bad seeing shows itself this way - poor and unstable FWHM values. I think Vlaiv's suggestion of a PE issue is more convincing. Olly
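[Editor's note: for reference, a crude FWHM estimate from a background-subtracted 1-D star profile can be sketched as below. This is a generic illustration -- focusing tools typically fit a Gaussian rather than counting samples above the half-maximum, but the idea is the same.]

```python
import numpy as np

def fwhm(profile):
    """Full width at half maximum of a 1-D star profile: subtract the
    background (taken as the minimum), then count how many samples sit
    at or above half the peak. Crude, but adequate for comparing
    successive focus frames; unstable values flag bad seeing."""
    p = profile - profile.min()
    half = p.max() / 2.0
    above = np.where(p >= half)[0]
    return int(above[-1] - above[0] + 1)
```

For a Gaussian profile the true FWHM is about 2.355 sigma, so a sigma-3 star should read roughly 7 samples wide.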
  14. I did. Sorry, I'll correct that. Thanks. Olly
  15. Seems to work! You might run into difficulties if trying to extract very, very faint stuff like IFN or tidal tails, maybe. Anyway, super result. Olly
  16. That's very kind of you, Ray. I don't think all the filters are worth the extra and for nearly a year I've been working almost entirely with Baaders. I do have a Mr and Mrs Gnomus-owned Astrodon Ha in one camera but my own Astrodon Ha is sitting in its box because it's unmounted and the appropriate filterwheel is for mounted filters. I've only ever used Baader LRGB. I'd say that the premium Ha filters, especially the 3nm, and the OIII are certainly worth having. My Astrodon 3nm Ha is way more moon-proof, gives tiny stars and striking contrasts and may even pass more wanted signal. It certainly isn't slower but it's a beggar to focus! (No stars!!!) Both my 1.25 and 2 inch Baader OIII filters produce bright star halos. This is a bind but isn't fatal simply because I don't do pure NB imaging. When I add OIII to LRGB I can cosmetically remove the halos when I use them to lighten the green and blue channels. I'd like a premium OIII filter, however. Olly
  17. Another thought. (Sorry but this is a great question!) Here's what a background of 23 looks like with zero-value text on it. We must also leave the general background light enough to allow dark patches of obscuring dust to show themselves against it. Here some of the dust is only at a value of 9. Olly
  18. There's a difference between clipping object signal and clipping skyglow. I think we can all agree that clipping object signal is never the right thing to do. However, skyglow is also genuine and natural. At a very dark site on a clear night you will, when dark adapted, be able to find your way around quite easily. I live in such a place and only need a head torch when I've been imaging and looking at a PC screen. If it's a visual-only night a dark site is not dark. However, if it goes cloudy I can see nothing at all - which shows you that starlight is light. This argues for a non-black background. Leaving the background reasonably high is a sign of confidence in an imager, in my view. For this reason I aim for somewhere between 20 and 23 in Photoshop. Where, in that range, I end up depends partly on the camera: my Sony chip makes it almost impossible to get beyond 20 while my Kodak chips are happy to produce a lighter sky. I don't know why. The other factor concerns the object itself. It's probably a matter of which level of contrast looks best because, in any image, there is a visual interaction between one part and another, as endless illusory images demonstrate. Over-enthusiastic clipping will also remove all noise from the background. This simply looks wrong. The sky does not look like a shiny, flat, jet black surface. Note surface. A clipped sky in an image does look like a surface but the sky itself is not a surface, it's a depth. This is related to a point in visual perception theory: a high gloss without grain prevents a picture from creating the desired illusion of three dimensionality because it is surface-like. A little grain breaks that 2D surface up, allowing the 3D illusion to work. Finally, once you've looked at lots of astrophotos, you can usually guess why a sky is jet black. Usually it's because an inexperienced imager has used clipping to remove gradients. They can, however, be removed without clipping. Olly
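[Editor's note: a quick numerical sanity check for a crushed sky might look like this numpy sketch -- a hypothetical helper of my own, assuming an 8-bit 0-255 scale. A healthy image per the advice above would show a near-zero clipped fraction and a background median somewhere around 20-23.]

```python
import numpy as np

def sky_report(img, black_point=0):
    """Return (fraction of pixels clipped to black, background median).
    A large clipped fraction means the sky has been crushed into the
    flat, noiseless 'surface' the post warns against."""
    clipped = float(np.mean(img <= black_point))
    return clipped, float(np.median(img))
```

Gradients, by contrast, show up as a background median that drifts between patches in different corners of the frame.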
  19. Lovely clean image of a great target. Not even flats? Olly
  20. The 4SE is an Alt-Az mount if I'm not mistaken. Do you have it on an equatorial wedge to convert it to equatorial? Olly
  21. Once the LRGB is fully processed I split the channels, activate red, paste the Ha on top of it and set the blend mode to Lighten. I can click it on-off to see what it's doing. I don't want it to lighten the sky so, if it is, I can black clip it a tad or use curves and lower the bottom. Maybe it's not doing much? In that case I'll pin the background where it is in curves and stretch it above that till it's making a difference. You can stretch it above the noise floor so long as the noise is only affecting the darker parts of the Ha. These won't be applied to the red channel where the red is lighter than the Ha. With galaxies, where the Ha tends only to be bright blobs, you can stretch the hell out of it! Flatten Ha onto red and merge channels in RGB. If the Ha has come out with too much force it doesn't matter in the slightest. Just paste the new HaLRGB onto the original LRGB and reduce its opacity to taste. Olly
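[Editor's note: the key fact that makes this workflow safe is that Photoshop's Lighten blend mode is just a per-pixel maximum, so a dark, noise-clipped Ha background cannot touch the red channel. A numpy sketch of that blend (generic, not a Photoshop internal):]

```python
import numpy as np

def lighten_blend(red, ha):
    """Lighten blend: per-pixel maximum of the two layers. The Ha only
    shows through where it is brighter than the red channel, which is
    why stretching the Ha hard over bright emission is harmless as long
    as its background stays below the red channel's background."""
    return np.maximum(red, ha)
```

If the Ha comes through too strongly, the opacity trick in the post corresponds to blending `lighten_blend(red, ha)` back toward the original `red` by a chosen fraction.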
  22. I'm astonished - quite literally - that this was unguided. A guider might possibly have tightened up the stars some more. They are round but this does not, contrary to popular assumption, prove perfect tracking. If errors are about equal on both axes you'll get round stars but still lose a little resolution. But would guiding have given you a better result? Quite possibly not on this occasion. It would allow longer exposures on faint targets but CMOS cameras depend far less on this than CCD and your skyglow would limit exposure time anyway. And then I'm doubly astonished by your location because I've often processed guests' images from light polluted areas and they look nothing at all like yours when opened up. They are literally bright orange. It's often said that OSC cameras are worse hit by LP than mono. Well yours is doing fine! I saw your signature said London but I thought you must have been imaging from somewhere else. Here's a version with a bit more pop. The JPEG on here looked a bit muddy when I uploaded it. Olly PS Sorry, I forgot to fix the bright star artifact, top left. Very easy, though.
  23. OK, so I ran DBE in Pixinsight to get the background to neutrality - equal values in R, G and B - and reduced the green noise slightly with SCNR then went into Photoshop. Stretched in Levels till the background reached 22 then used curves. Pinned the background at 22 and fixed it below that, then gently gave it a bit more stretch above the background. A few tricks to intensify and adjust the colour, some sharpening of the globular core and here it is. The data are very good. An excellent capture. Tight focus, no tracking errors, low noise. The background was not bad at all, suggesting either a decent site or a very good LP filter! Red and blue stars are well separated in terms of colour. That was fun! (And it's raining here...) Thanks. Olly
  24. More seriously, it does seem a good package. I must have a look at it. Of course, now I have to produce a flat, grey background with the cropped data. Uh-Oh! 🤣 Olly
  25. Yes but PI would sort out the background sky! (Runs for cover...) Olly