Everything posted by ollypenrice

  1. I don't think the original is too blue so much as too cyan. On this forum Vlaiv has posted plenty of good stuff on imagers' tendency to push the blues too far in spiral arms and I think he has a point. I was guilty of this in my earlier M33s so I went back to the data and did a reprocess in which I scrupulously avoided specifically pushing the blues. Normal colour calibration gave me this in HaLRGB. It's a refractor image. Sorry if this is heretical. Olly
  2. I can't comment on the app and am allergic to mobile phones, but a way of measuring the sky is very useful. If the app works then fine. But why are they useful?

     - If you've just come away from a computer screen, certainly at my age (67), you won't be able to assess the sky visually for about ten minutes. Is it clear, is it not? Very hard to tell, so use the SQM.

     - You're up and running but the results on the screen are dodgy. Is it the sky? De-adapted, you can't tell by looking. Use the meter.

     - You run an astronomy gîte and want to boast convincingly about your sky quality? Quote the meter!!! 🤣 (Go on, ask me! It's sometimes 22.)

     Olly
  3. Move the telescope. That's the secret. I also think that the natural movement of hand-held binoculars explains why so many people on this thread have seen it with them. Get the scope pointing where M33 ought to be, loosen the clutches, and scan the area by hand, moving the tube slowly. When you move past it you have a fighting chance of seeing an extended patch of very slightly brighter sky. That's M33. When using a Dob I do this a lot: I 'nod' the scope up and down, and that slight movement makes things pop into view. I dare say it varies from person to person, but this technique is incredibly effective for me. I live at a very dark site with lots of clear nights but I have never come anywhere near seeing M33 naked eye. M31 is dead easy. Olly
  4. That's right. The plate solution will tell the mount where it's pointing and that's all it needs to know in order to find the target and return to it if necessary. The key thing is that tracking and guiding have nothing to do with target finding. They are independent processes. BTW, my Mesu mounts only have a one star alignment option anyway. Olly
  5. Atiks usually come with a metal chip window cover. I do my darks off-scope with this cover on because I find excluding all light remarkably difficult. As Andrew says, a delay between subs would be a good idea too. Olly
  6. Absolutely not. The 3 star alignment is entirely internal to the mount and is concerned with helping the mount find the target accurately. On a short hop from Beta Andromedae to M31 a single star alignment will be plenty good enough to locate the target. Once found, the mount's internal planetarium has no further role in the proceedings. PHD will lock onto your chosen star and follow it. The only thing PHD needs is for your polar alignment to be accurate. It cannot correct for that. If your mount isn't polar aligned accurately enough, the guide star will be held in place by PHD but the image will slowly rotate around it during the shoot. The longer the subs you take, the better the PA has to be. Slight rotation between subs will be corrected by the stacking software. Olly
  7. It's quite easy and Photoshop offers my favourite way of doing it. Process your basic colour image first and save it. Don't bust a gut trying to drag out the faintest of the red Ha signal contained in the image because you have that in your narrowband capture, which you'll add later.

     Next you'll need to align (AKA co-register) your Ha image so it fits over the RGB properly. I dare say you can do that in DSS, I don't know. It can be done in Ps manually but it's ages since I did that. If there is any rotation between Ha and RGB, that's the fiddly bit to sort out manually.

     Process your Ha image as a monochrome. If you are aiming to add it to RGB you may want to process it a little differently from the way you'd do it for a standalone Ha image. Personally I push it harder if it's going into an RGB image. I go for starker contrasts and I don't worry if the darkest points are a little noisy because the dark parts won't end up in the final image.

     Adding the Ha: Select and copy (Ctrl-A, Ctrl-C) the Ha. Open the RGB and split the channels in the Channels menu. Minimize the B and G channels with the R channel active. Paste (Ctrl-V) the Ha onto the R. In the Layers menu change the blend mode from Normal to Lighten in the drop-down. This means that the Ha influences the R only where it is brighter than the R. Where it is darker it is ignored.

     Before flattening the Ha onto the R, check the following: Is the Ha lightening the R background? If it is, you can open Levels and bring in the black point, or use Curves and lower the bottom part of the curve slightly while holding up the top. Is the Ha doing as much as you'd like to make the object show? If not, use Curves, pin the bottom and lift the middle range till the Ha comes nicely into play. (If the Ha is overpowering, ignore that for a fix later on.)

     Flatten, then Merge Channels, choosing R (now modified), G and B for the right channels. If you feel the new HaRGB has too much Ha, paste it onto the RGB as a layer and reduce its opacity till you like it.

     What about Ha as luminance? Generally a bad idea because it gives pink nebulosity and bright blue star halos. It also holds down blue reflection nebulosity. However, it may be useful to paste it on top of your new HaRGB image in blend mode Luminosity at a very low opacity like 10 to 15%. Try it and see. (There's a rough code sketch of the Lighten and Luminosity blends after these posts.)

     Some examples of this method: https://www.astrobin.com/full/327970/0/ https://www.astrobin.com/full/301531/0/ Olly
  8. No, I agree an OAG is vastly to be preferred for a reflector. Olly
  9. In my understanding it has to demosaic first because colour must be attributed according to the Bayer pattern of that particular chip. If it aligned on the stars first, the alignment would ignore the Bayer matrix and would stack a red pixel onto a blue onto a green, etc. etc., and the colour information would be lost. (There's a toy illustration of this point after these posts.)

     I'll need to watch the video again, but I'm left shaking my head when I hear that there is only one best way to stretch data. The stretch is the most important intervention in processing and cannot be reduced to a single method, for three reasons. 1) Every image is different. 2) An imager might have a specific set of objectives for a dataset. Sometimes this can be extracted at the expense of that. We prioritize what it is we want our image to show. 3) Some of the most effective stretching techniques work by blending different stretches, simply because one stretch cannot do everything.

     This doesn't mean I don't think Mark's stretch is an excellent one. It is. Olly
  10. Good! You have a well colour-balanced and even background sky which is so important in galaxy imaging. Olly
  11. Fabulous. The holding down of the stars makes such a difference. Olly
  12. Nice separation of reflection and emission.
  13. Although, bizarrely, the French in Britain outnumber the British in France, or they did recently. I heard on the radio that London had the fourth highest French population of any city in the world, including French ones! Olly
  14. In the end, astronomy and neighbours simply don't go together. In the UK isolated houses are expensive. Fortunately, over here in France, the reverse is true. Olly
  15. Easy to like this one. I suspect a colour gradient with green dropping off towards the left hand side since that side becomes increasingly magenta. The colour balance on the right hand side looks superb. Olly
  16. I think reducing the colour saturation considerably would give a calmer, cleaner image. Otherwise it looks great to me. Olly
  17. Wow, that is very classy with a seamless penetration into the Trapezium region. A most exceptional M42 at any level. Olly
  18. 'Do you think this movement has something to do with the strange filaments moving through the M45?' I think it's M45 which is moving, but yes. However, I had a surprise. I thought that the 'arm' of smaller stars was leaving a trail behind it but, with sufficient data, this turned out not to be the case. This is from mine: The arm of stars is marked by green dots. The two green lines mark the start of trails which I initially thought came from the arm of stars but clearly they don't. What has caused them? Good question! 'I have for a while been thinking that it would be of value to make several "final" versions that make different statements.' Yes, I absolutely agree. Olly
  19. I like it. What I know about processing so far is probably about one percent of what might be known, but I think you have to decide what an image is about. You go after this at the expense of that, or you go after that at the expense of this, and so on. In my case I had imaged the Pleiades several times with guests and retained the data each time. But I had, in my memory, the recollection of a remarkable image in a book called something like 'The New Astronomy.' I've lost this book so I'm reliant on memory, but I think it showed a selective wavelength image which clearly demonstrated that the cluster was moving through interstellar dust. (Previously it had been assumed that the nebulosity surrounding the cluster was its natal cloud.) I was driven by the desire to trace that 'wake' through the dust, so I organized both my next captures and my processing to find it if I could. The victims of this approach include a natural look, small stars, etc. One day someone will produce an image which will do everything but, so far, I don't know of anyone who has managed it. A great piece of advice from a friend (advice which he only gives to himself) is 'Make your own picture.' Olly
  20. Fabulous. I can't see a mono doing significantly better on this target. Olly
  21. On the fainter stuff and its contrasts, yes.* I had been quite disappointed by these regions in the original. On the main cluster, in search of the evidence of the 'wake' through the interstellar medium, I had more success with Unsharp Masking, with the Threshold value kept high to prevent over-sharpening of small details. (A sketch of a thresholded unsharp mask appears after these posts.) Olly

     * Edit. Once I had the equalized mask in place I did a further stretch in Curves, but only one which lifted the lower values, since these were the regions which benefitted. So: a lift low down, returning early to a straight line.
  22. I wonder about this. I've given the reds no preferential stretching here. Even in the shallower data sets which eventually found their way into this one I was seeing reds/browns just to the left of the cluster in this orientation, and they appear as a brownish yellow in plenty of other images. I don't think the dust is all that brown further out, though my habit of lowering the cyan component in the reds may tilt them that way. After the discovery of the IFN we all started wondering about its colour, so Tom and I went as deep as we could at a time when it was usually just showing as grey. We processed the data independently as a deliberate policy but both ended up with this: I looked for a piece on this by Rogelio Bernal Andreo, who concludes that the IFN should indeed be brown, but couldn't find it. And then there's the phenomenon of ERE, extended red emission, which is still unsettled amongst the professionals as I understand it. Conclusion: I have no idea! Olly
  23. Target-specific, I think. I do skim round the edges of any image to look at the possibilities for stellar framing. Sometimes there's something you can exploit, sometimes not. However, I avoid like the plague rotating my cameras away from orthogonality with RA and Dec. Olly
  24. Gorann's excellent RASA M45 image reminded me that my own M45 was processed before I'd thought to use the Ps Equalize function for enhancing faint contrasts, so I went back to the data for another play. This was always a somewhat head-banging image, but I wanted to show how the cluster is leaving a 'wake' as it ploughs through the gas and dust of space. (The technique involves pasting an 'equalized' copy of the image onto a layer mask, blurring it and increasing its contrasts, and then stretching through that. A rough code rendering of the idea is sketched after these posts.) I've gone for an unconventional orientation partly to refresh my own eyes on the target, partly to put the visually 'heavier' dust at the bottom and partly to exploit two nice stellar arcs, the smaller one upper left and the larger one lower right, which trace an S shape through the picture. LRGB from Tak 106N/Atik 11000/Mesu 200. Barking mad integration of over 25 hours. Olly
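A rough sketch of the blend arithmetic described in post 7, for anyone who wants to see it outside Photoshop. It assumes co-registered float images scaled 0-1; the function names, array layout and the 15% opacity figure are illustrative choices, not part of any particular package, and the luminance overlay is only a crude approximation of the Photoshop Luminosity blend mode.

import numpy as np

def blend_ha_into_red(rgb, ha):
    # Photoshop's 'Lighten' applied to the red channel: the Ha contributes
    # only where it is brighter than the existing red, and is ignored
    # elsewhere.
    out = rgb.copy()
    out[..., 0] = np.maximum(rgb[..., 0], ha)
    return out

def ha_as_weak_luminance(rgb, ha, opacity=0.15):
    # Very low-opacity luminance-style overlay: nudge overall brightness
    # towards the Ha without swamping the colour (a crude stand-in for the
    # 'Luminosity' blend mode at 10-15% opacity).
    lum = rgb.mean(axis=-1, keepdims=True)
    boost = opacity * (ha[..., None] - lum)
    return np.clip(rgb + boost, 0.0, 1.0)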
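A toy illustration of the demosaic-before-align point in post 9. The array size and RGGB layout are arbitrary; the only claim is that shifting a raw Bayer mosaic during star alignment would stack samples of one colour onto sites of another.

import numpy as np

# Label each photosite of a small RGGB mosaic with its colour.
h, w = 4, 4
bayer = np.empty((h, w), dtype='<U1')
bayer[0::2, 0::2] = 'R'
bayer[0::2, 1::2] = 'G'
bayer[1::2, 0::2] = 'G'
bayer[1::2, 1::2] = 'B'

# A one-pixel alignment shift applied to the *raw* mosaic...
shifted = np.roll(bayer, 1, axis=1)

print(bayer)
print(shifted)
# ...puts every red sample on a green site and vice versa, so stacking the
# two frames pixel-for-pixel would mix the colours. Hence: demosaic first,
# then register and stack.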
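A hedged sketch of the thresholded Unsharp Mask mentioned in post 21, assuming a 2-D luminance array of 0-1 floats. The parameter names mirror the usual Photoshop controls (Amount, Radius, Threshold) but the values are placeholders, not the settings actually used on the image.

import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(img, radius=5.0, amount=0.6, threshold=0.05):
    # Sharpen only where the local contrast exceeds the threshold, which is
    # the effect of a high Threshold setting: larger-scale structure gets
    # the boost while small, faint details are left alone.
    blurred = gaussian_filter(img, sigma=radius)
    detail = img - blurred
    mask = np.abs(detail) > threshold      # suppress low-contrast detail
    return np.clip(img + amount * detail * mask, 0.0, 1.0)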
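And a rough rendering of the Equalize-mask stretch from post 24, again assuming a 2-D 0-1 luminance array. The real thing is done interactively in Photoshop; the function names, the simple gamma stretch and all the parameter values here are stand-ins for that workflow, not a faithful reproduction of it.

import numpy as np
from scipy.ndimage import gaussian_filter

def equalize(img, bins=1024):
    # Histogram equalisation as a stand-in for the Ps Equalize command.
    hist, edges = np.histogram(img.ravel(), bins=bins, range=(0.0, 1.0))
    cdf = hist.cumsum() / hist.sum()
    return np.interp(img.ravel(), edges[:-1], cdf).reshape(img.shape)

def stretch_through_mask(img, blur_sigma=20.0, contrast=2.0, gamma=0.5):
    # Build the mask (equalized copy, blurred, contrast boosted), then use
    # it to weight a stretch so the faint contrasts picked out by Equalize
    # decide where the stretch is applied.
    mask = equalize(img)
    mask = gaussian_filter(mask, sigma=blur_sigma)
    mask = np.clip(0.5 + contrast * (mask - 0.5), 0.0, 1.0)
    stretched = img ** gamma               # a basic, fairly hard stretch
    # Weight the stretch per pixel; use (1 - mask) instead if you want to
    # protect the bright core rather than lift the faint dust.
    return mask * stretched + (1.0 - mask) * img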