Everything posted by ollypenrice

  1. Yes, HOO would benefit from RGB stars because it comes close to RGB. In fact I tend to work the other way round. I'll shoot an RGB, sometimes quite quickly, and then add deep Ha to red and deep OIII to green and blue in blend mode lighten. I prefer to work this way round than to add RGB stars at the end. Olly
  2. You don't say which Nikon you'll be using, nor do you say whether or not you'll be using the focal reducer, but even if you are using the focal reducer and have a relatively large pixel Nikon you will probably be imaging at well below an arcsecond per pixel. A good EQ6 (they vary) can deliver a guided precision of about 0.5 arcseconds RMS, which limits you to an image scale of about 1 arcsecond per pixel. In short that means that your autoguided mount is unlikely to be able to support the resolution at which you are trying to image. That's not the end of the world but it does mean that your mount will need all the help it can get, and that means an OAG. An OAG will react to slight mirror movement when a guidescope won't (not the same light cone), and an OAG will be working at the FL of the imaging scope. Also, there will be no differential flexure between guider and imaging system. Basically you are asking more of your system than it has any real hope of delivering (but it will deliver images none the less), so you need to give it all the help you can. This calculator http://www.12dstring.me.uk/fovcalc.php will tell you your image scale if you plug in all your kit details. Again, a well tuned, autoguided EQ6 will deliver about 0.5 arcseconds RMS, which limits you to an imaging precision of about twice that, so 1 arcsec per pixel. Don't panic. Just don't expect to be able to present the final image at full size. Olly
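The arithmetic behind that calculator is simple enough to sketch. This is a minimal illustration, assuming a hypothetical 4.3 um pixel camera on a 2000 mm scope; plug in your own numbers:

```python
# Image scale from pixel size and focal length.
# 206.265 converts radians to arcseconds per (um/mm).

def image_scale(pixel_size_um: float, focal_length_mm: float) -> float:
    """Arcseconds per pixel for a given sensor and focal length."""
    return 206.265 * pixel_size_um / focal_length_mm

# Illustrative values, not from the post:
scale = image_scale(4.3, 2000.0)   # ~0.44 arcsec/px
guided_rms = 0.5                   # a well tuned EQ6, per the post
supported = 2 * guided_rms         # ~1 arcsec/px the mount can support
print(f"{scale:.2f} arcsec/px imaged vs ~{supported:.1f} arcsec/px supported")
```

If the first number is well below the second, as here, the mount is the limiting factor.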
  3. So much of this image is absolutely great. However, the HSO palette does not replicate RGB, so I'm not sure what you're trying to achieve by combining nebulosity from one palette (HSO) with stars from another (RGB?). Olly
  4. You're going from strength to strength here. Olly
  5. The other thing to check is the famous date format. It is not day, month, year but month, day, year (the bizarre American system) unless Skywatcher have changed it in recent years. Olly
  6. How long your short subs should be is very kit-dependent. For my own M42 efforts ( https://www.astrobin.com/380941/?nc=user and https://www.astrobin.com/321869/?image_list_page=2&nc=&nce= ) I used 11 seconds for the Trapezium stars and 50 seconds for the Trapezium region, then 15 minutes for the outlying parts. That's using cooled monochrome CCD. When assessing a sub length, look at what is saturated (burned to white) in the linear sub. Anything saturated is saturated for good. You can do nothing with it. For how to blend different exposure lengths you cannot beat this method: https://www.astropix.com/html/j_digit/laymask.html Regarding the filter (which I've never used), my understanding is this: it will block wavelengths other than Ha, OIII and H Beta. In order, those wavelengths lie roughly in the red, the green and the blue. So will they approximate to RGB? No. Although H Beta is blue, it is emitted by much the same gases as are shining in Ha, so it will only add rather faint blue signal to what is already there in the Ha. The OIII line lies right on the blue-green border, so that, too, will pass a small amount of the longer wavelength blue. But neither the H Beta nor the OIII will pass the bulk of the signal from blue reflection nebulosity. Quite a lot of the Running Man feature is made up of broadband blue, so the filter is stopping it. I'd try the filter thus: Shoot and process a full spectrum RGB without it. Shoot and process a version with the filter, such as you already have. Apply the NB version over the full spectrum one in Photoshop's Blend Mode Lighten. That way the NB image will only brighten those parts where it has allowed you to find more signal. Where the RGB image has more blue, it should be unaffected by the top layer. (Warning, I haven't tried this. Normally I split my RGB channels and add Ha to red and OIII to green and blue in blend mode lighten. You might have to split channels in both your RGB and NB images and do it as I do, but it might work done 'all in one.' I don't know.) Olly
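The split-channel workflow described above (Ha into red, OIII into green and blue, each in blend mode lighten) can be sketched with NumPy. This is a minimal illustration of the idea, not anyone's actual pipeline; it assumes aligned, normalised arrays:

```python
import numpy as np

def blend_lighten(base: np.ndarray, top: np.ndarray) -> np.ndarray:
    """Photoshop-style 'Lighten': keep whichever pixel is brighter."""
    return np.maximum(base, top)

def add_nb_to_rgb(rgb: np.ndarray, ha: np.ndarray, oiii: np.ndarray) -> np.ndarray:
    """Add Ha to the red channel and OIII to green and blue,
    each in blend mode lighten, as described in the post.
    rgb is (H, W, 3); ha and oiii are (H, W), all on the same scale."""
    out = rgb.copy()
    out[..., 0] = blend_lighten(rgb[..., 0], ha)
    out[..., 1] = blend_lighten(rgb[..., 1], oiii)
    out[..., 2] = blend_lighten(rgb[..., 2], oiii)
    return out
```

Because lighten only ever brightens, broadband blue that is already stronger than the narrowband signal is left untouched, which is exactly the behaviour the post relies on.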
  7. It's good, but I'll make constructive comments. To control the core on M42 you really will need short subs. It's one of very few objects for which this is essential. The whole Trapezium region is saturated here. For me there is green where there ought to be blue. What's showing as green here is not OIII emission but reflection nebulosity. Your OSC camera without the filter would have shown this. What is your filter bringing to the table? Olly
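The saturation check mentioned in these posts (inspect the linear sub, since anything burned to white is lost for good) can be automated. A minimal sketch, assuming a 16-bit linear frame; the full-well value is illustrative:

```python
import numpy as np

def saturated_fraction(sub: np.ndarray, white_level: int = 65535) -> float:
    """Fraction of pixels clipped to white in a linear sub.
    If this is non-trivial over the region you care about (e.g. the
    Trapezium), the sub is too long for that region."""
    return float(np.mean(sub >= white_level))
```

A rising value as you lengthen exposures tells you where to cap the short subs for the core.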
  8. I suspect I didn't know it, despite having read Stephen Inwood's excellent biography of Hooke. https://www.amazon.co.uk/Man-Who-Knew-Too-Much/dp/0230768458 However, I'd better check because I'm quite likely to have forgotten this part of his story. (What I do remember is that Hooke took a ride on horseback out of London and bruised his testicles in the process. It struck me that history affords little privacy to those with whom it busies itself!) 😁lly
  9. It's always best to do your own testing. Sometimes I find this confirms the received wisdom but sometimes I find it doesn't. Olly
  10. If you have Photoshop or a program with similar functions there is something you could try. 1) Concentrate on a zone in the image where the elongations are all going the same way. Make a copy layer. 2) Rotate the image so that these elongations are now horizontal or vertical on your screen. 3) Set the blend mode in Layers to Darken. 4) Go into Filters - Other - Offset. You'll be able to nudge one layer (either vertically or horizontally, depending on how you rotated your image) relative to the other to mask the elongations. Use a large, well feathered eraser to remove those parts of the image damaged by this modification, keeping only those zones in which it's an advantage. 5) Flatten and repeat with a different rotation for other parts of the image. 6) Restore the original orientation of the image. Make no mistake, this is essentially a bodge, but done with care it might considerably enhance your image. Olly
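The core of that trick (steps 3 and 4: a nudged copy blended in Darken mode) can be sketched in a couple of lines. A minimal illustration, assuming a single-channel array; like Photoshop's Offset with wrap-around, `np.roll` wraps pixels at the edge:

```python
import numpy as np

def darken_offset(img: np.ndarray, shift: int, axis: int = 1) -> np.ndarray:
    """Blend the image with a nudged copy of itself in 'Darken' mode:
    each pixel keeps the darker of the two values, which eats into
    star elongation along the direction of the shift."""
    nudged = np.roll(img, shift, axis=axis)
    return np.minimum(img, nudged)
```

As the post warns, this darkens everything along the shift direction, so it only works applied selectively and erased back elsewhere.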
  11. I'll just throw in one caveat here: if other reviewers are anything like me they won't accept for review a product which they think will be a waste of their time. The disruption to the observatory (my reason for not doing reviews any more) is considerable and not rewarding unless the product is decent - so doubtful products don't find reviewers while promising ones do. There's an invisible selection process going on in the background, I think. Olly
  12. A pretty cursory search on the internet will find a considerable number of disappointed customers. That's a plain fact. I have nothing to say about whether or not they are right to be disappointed - that's for their readers to decide - but I do know that one of my own customers was threatened with legal action by OOUK regarding anything he posted about them. Perhaps they'll threaten me over this post. They are welcome to try... Olly
  13. There are no star-removing filters, as others have said, though tight NB filters drastically reduce star size. Still, you want broadband star colour so that's not the solution. The thing about post processing, where you can reduce star sizes, is not to look for one magic wand. It's all about patience and small iterations, one small improvement at a time, often in layers. (Fight with masks in Pixinsight if you like but they are not for me.) Careful use of Curves in stretching also helps. There is no point in stretching anything brighter than your brightest nebulosity, so identify that point in Curves and don't stretch above it. It really is all about tiny steps, there is no One Big Fix. For all that, the image with which I'm the least satisfied in my collection is the Veil, despite every effort! Olly
  14. That's worked! There's a certain charm to persuading data from different sources to come nicely together. I love your thread title! It proves we're all mad... 😁lly
  15. I think the outer halo would be helped by a more neutral background sky. It's pretty blue 'as is' and, since the outer halo has also emerged as blue in your mapping, the contrast is diminished. As you say, I think this will be a processing job to tinker with. Olly
  16. 84 hours is the one! 🤪 Splendid. Olly
  17. Sorry, I'm with Adam J. The sun 'joke' is an in-joke for astronomers. When you post on the net you have no idea who will be watching or how they will interpret what you say. They may well not 'get' your irony at all. (Irony is the term, by the way, rather than sarcasm.) And, I'm sorry again, I found the rest of the humour pretty turgid as well. Olly
  18. I've done five first year level courses under this consortium and found them excellent on all counts. Going further would have meant sorting out my maths, a potentially impossible challenge. I was a teacher at the time of my astronomical studies and, therefore, had an interest in the business of teaching and learning as a subject in its own right. What struck me was how well thought-out the courses were from a pedagogical point of view. I felt I was being very well taught. Olly
  19. No, I don't have it Vlaiv. I constructed a big mosaic recently using panels captured and pre-processed in APP by Yves Van den Broek and I also used his APP-generated mosaic as a template, but I post-processed exclusively in PI, Registar and Photoshop. The first thing I'd say is that many M31s (including my own) are probably too colourful with the arms too blue and the core too red. Personally I don't like to use any automated stretches, including DPP. The stretch is the defining operation in post processing and I like to do it slowly, in small increments, and by hand. Once I get the background up to a reasonable brightness I stretch only above that. Olly
  20. Stars are point sources. They don't follow the 'F ratio rule' even if you believe in 'the F ratio rule' - which I don't! (I believe in area of aperture per area of pixel.) Olly
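One way to read the "area of aperture per area of pixel" idea is as a figure of merit for extended objects: light gathered by the aperture, spread over the patch of sky each pixel sees. This formulation is my own sketch of the idea, not Olly's formula:

```python
import math

def extended_object_speed(aperture_mm: float, focal_length_mm: float,
                          pixel_um: float) -> float:
    """Rough 'speed' figure of merit for extended objects:
    aperture area (mm^2) times the sky area per pixel (arcsec^2).
    Stars, being point sources, don't obey this extended-object logic."""
    scale = 206.265 * pixel_um / focal_length_mm   # arcsec per pixel
    area = math.pi * (aperture_mm / 2.0) ** 2      # aperture area
    return area * scale ** 2
```

At a fixed focal length and pixel size, doubling the aperture quadruples this figure even though the f-ratio also changes, which is the sense in which aperture per pixel, not f-ratio alone, tells the story.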
  21. I agree with Alacant. Try to get a rough focus by day. The distant horizon is a good bet. Olly
  22. That's absolutely gorgeous: the texture of the surface is tangible, making you want to run your hand over it! Olly
  23. Very smooth, Carole, and a good field given how big this object is. To my eye the black point looks a little high and perhaps you might coax out a little more local contrast? Olly
  24. Late to this party but it's a great party!! Stunning image. I do love a wide M45 because it all makes so much more sense when you see the lot. And, as others have said, it's a treat to see you back, Peter. Olly
  25. 1) Do you have red lighting anywhere in the observatory or use a red torch when setting up, etc? Could that have been getting into the tilt plate gap? 2) I don't like the look of the original red flat at all and the arcs certainly shouldn't be there. Whatever is causing them (we hope the tilt plate gap, now fixed) may well have been at a very different intensity between the flats' light source and the lights' light source, so they wouldn't be expected to calibrate out. If the flats were under-correcting, the percentage interference from whatever caused the arcs in the flats was lower than the interference during the shooting of the lights. That would fit (I think) with red light in the observatory getting in through the gap during the imaging run. I would paint that tape with barbecue or stove paint as well. You need a pigment-based paint rather than a dye-based one for blackening optics. Black dye paints can be reflective in IR. Olly