Everything posted by ollypenrice

  1. Direction may be a variable due to the proximity of the light source, as in the ray diagram from Oddsocks cited earlier in the thread. Filters only work within a range of angles of incidence, and if scattered light is falling outside that range, artefacts would be expected.
  2. We've certainly seen this before on SGL in the context of flats. Members having trouble with flats discovered that they had inadvertently shot the flats in fast readout mode and they were not working. Just a guess, but does the software choose a fast readout if the exposures are short or make some other decision for us of which we're not aware? My spell with over-correcting flats was cured, for a while, by capturing them in a different program (AstroArt rather than Nebulosity) though we continued to shoot the lights in Nebulosity. And then, one day, this also stopped working. This kind of thing makes me despair of finding neat solutions or explanations for everything. I still do suspect the camera but include in that the capture software. Brendan didn't have this with his DSLR. Can an increase in sensitivity fully explain the sudden appearance of this problem? Should we be looking again at the filters? What if they behave differently under the different illumination produced by the sky and by the flats? Filters are quite batch-specific and could be faulty. Olly
  3. Looking at the with-without files gave me effects I've never seen before. Without flats: the image looked like a perfectly normal vignetted image with a smooth gradient between dark corners and a bright middle, slightly offset to the left. Such offsets are common. I tried both ABE and DBE in Pixinsight and the gradient maps (effectively artificial flats) looked exactly like the kind of flat I'd have expected given the vignetting. However, it was totally useless when applied to the light using division. To my great surprise it did not flatten the image, though it looked as if it ought to do so. With flats: the image is flatter than it was and does, still, look to me like one on which flats have over-corrected but also introduced other oddities. Again, PI's DBE and ABE failed to flatten it. Conclusion: nothing even slightly definitive, but I think the flats are trying and failing to work. They are not totally useless but they are not consistent with the lights, so they don't flatten. If we have two effects, fogging on the camera and Oddsocks' faulty flats light path (panel too close), we might find ourselves with a chaotic and unpredictable set of consequences. All I can suggest is to eliminate the fogging and try changing the flats method. (Panel further away.) The underlying data look good, frustratingly. Olly
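     For anyone wanting to reproduce the "division" step outside a stacking program, here is a minimal numpy sketch (the function and array names are my own, and the synthetic data is only for illustration) of applying a normalised master flat to a light frame:

        import numpy as np

        def apply_master_flat(light, master_flat):
            # Divide a calibrated light frame by a normalised master flat.
            # Both inputs are 2-D arrays of the same shape, already bias/dark
            # subtracted. Normalising the flat to a mean of 1 means the division
            # removes the vignetting pattern without changing the signal level.
            return light / (master_flat / np.mean(master_flat))

        # Synthetic example: a vignetted light and a flat showing the same fall-off.
        yy, xx = np.mgrid[0:1000, 0:1000]
        vignette = 1.0 - 0.3 * (((xx - 500.0) ** 2 + (yy - 500.0) ** 2) / 500.0 ** 2)
        light = 1000.0 * vignette
        flat = 20000.0 * vignette
        corrected = apply_master_flat(light, flat)
        print(corrected.min(), corrected.max())  # near-identical values: the flat matches the light

     When the flat and the light do not share the same illumination pattern (fogging, a too-close panel, scattered light), the same division leaves a residual gradient instead of a flat frame, which is the "trying and failing to work" behaviour described above.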
  4. The most important choice of all is which wonderland to go for. In the blue corner we have Las Vegas and, in the red, the Atacama desert... 😁lly
  5. 62 MB at 2.2 MB per second with my internet... I would do as David/Oddsocks suggested and move the flats light source further away. Oddsocks' diagram is very potent and, though I'd forgotten it, I remember being impressed by the idea at the time. I'd still like to see two simple JPEGs: a log stretch with flats and a log stretch without. Vlaiv and I are not seeing the same things in the files we have seen so far. Olly
  6. Here we see four bright corners and a bright middle, exactly what we expect from over-correcting flats. Obviously the wipe has had a strange effect on them but it's all there - bright corners and centre. I can't download the files, but it would be helpful to see a stack (with flats) stretched but with no other intervention and, likewise, a stack without flats given the same stretch. It's always worth remembering in any troubleshooting operation that there may be more than one defective component, so I don't say it's only over-correcting flats, but I think they are playing a part. Olly
  7. Because the image has a dark centre and bright corners. Here I quickly measured the background sky values in Photoshop. I'll put money (or would if I had any 😁) on this being the inverse of what a flatless stack would produce. Olly
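     If you'd rather script that check than click around with Photoshop's eyedropper, a rough sketch (the function names and patch size are arbitrary choices of mine) comparing the median background in the corners with the centre:

        import numpy as np

        def patch_median(img, y, x, size=50):
            # Median of a small patch centred on (y, x): a crude stand-in for
            # sampling the background sky with an eyedropper.
            half = size // 2
            return float(np.median(img[y - half:y + half, x - half:x + half]))

        def corner_vs_centre(img):
            # Compare the mean of the four corner readings with the centre reading.
            h, w = img.shape
            m = 60  # margin from the edges, in pixels
            corners = [patch_median(img, m, m), patch_median(img, m, w - m),
                       patch_median(img, h - m, m), patch_median(img, h - m, w - m)]
            centre = patch_median(img, h // 2, w // 2)
            return float(np.mean(corners)), centre

     Corners reading brighter than the centre point towards over-correcting flats; corners reading darker point towards plain uncorrected vignetting.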
  8. "The scope isn't up to the job." Edited: I don't think it's the scope, but Vlaiv is right about light leaks and reflections. My doubts come from having seen so many threads about over-correcting flats; they afflict all kinds of optics. "The flats are over-correcting." I feel you can also be sure of this. "The flats need to be >1 sec." Has to be worth a try, but also leave a multi-second delay between each flat in the sequence; this also has to be worth a try. "Also, if I have to resort to putting sheets of paper in front of the light panel to achieve this, will inconsistencies in the paper cause artefacts in the flats?" No, the paper is too far out of focus. Think of that primary mirror sitting in the middle of the light path: you don't see it. However, if it's going to worry you, just stand by the panel during the flats sequence and rotate it from time to time. "What do people think my chances would be of getting better results with an ASI533MC, which is looking like an increasingly attractive alternative?" Possibly quite good, because this problem is almost certainly arising somewhere in the camera-readout-software-calibration-stacking parts of the operation. All of these will experience change if you change the camera, but you risk finding that the problem persists, especially if you stick with the same manufacturer, whose software may have common factors between models. Olly
  9. One way to mitigate a stack which has been over-corrected by flats is to blend it with a stack without flats at whatever relative weighting gives the best result. You can do this in many programs but it's best to use one which gives a screen-only stretched view of the linear data. That way you can put one on top of the other and weight them while seeing what you'll get. I used to do this with our problem rig using AstroArt. It isn't a solution but it kept us in business. Olly
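     A minimal sketch of that blend, assuming the two registered linear stacks are loaded as numpy arrays (the names and the 0.5 starting weight are placeholders, not a recommendation):

        import numpy as np

        def blend_stacks(stack_with_flats, stack_without_flats, weight=0.5):
            # Weighted average of two registered linear stacks. 'weight' is the
            # fraction taken from the flats-corrected stack; the rest comes from
            # the uncorrected one.
            return weight * stack_with_flats + (1.0 - weight) * stack_without_flats

     In practice you would vary the weight while watching a screen-stretched preview until the over-correction and the residual vignetting roughly cancel, much as described with AstroArt above.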
  10. OK, looking at the 5 greyscale images in this post, all but image 2 show very obvious inverse vignetting as caused, routinely, by over-correcting flats. In image 2 something else is going on as well. I really can see no room for doubt about this but, to confirm it, you just need to make stacks without flats and see that the bright corners and dark centre will be transformed into dark corners and bright centre. So far this is all perfectly mainstream but, where it gets difficult, is in working out why the flats are over-correcting. I operated a setup (not my own) on which we never got to the bottom of it. What's even odder is that the over-correcting-flats problem popped up one night despite absolutely no identifiable changes to any aspect of capture or pre-processing. I would certainly follow the suggestions of Nicolas regarding flat exposures. When I was struggling with this problem Harry Page, who knows what he's talking about, suggested that there might be a problem with RBI (Residual Bulk Image) when capturing flats in quick succession. It might be worth trying a multi-second delay between flat exposures. (Harry was talking about CCD so I don't know whether this is relevant to CMOS or not, but it's a dead easy experiment.) This is not an alternative set of comments to Vlaiv's. It's perfectly possible to have two problems at once. I would certainly check that there is no small equipment light coming on and off during your runs. We've just found that, after a change in our RASA cable routing, a tiny camera light was reflecting off our cable guides and creating a huge gradient. Obviously the front-camera RASA is particularly sensitive to this, but I'm wondering if you have something which isn't lit when you check the system but which decides to light up once you're not looking! What you should avoid at all costs is using the black point to clip out the gradient. Gradients must be adjusted out, not clipped out. To do this the software needs the full range of brightnesses both above and below the target background brightness. If you black clip, you leave it with nothing to work with on the side below the target. Olly
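     To illustrate that last point, a toy numpy comparison (not a substitute for DBE or ABE; to keep it short, the background model here is simply the true gradient) of adjusting a gradient out versus clipping it out with the black point:

        import numpy as np

        # A synthetic linear frame: a faint extended object on a strong sky gradient.
        yy, xx = np.mgrid[0:500, 0:500].astype(float)
        gradient = 200.0 + 0.5 * xx                  # left-to-right sky gradient
        signal = np.zeros_like(gradient)
        signal[200:300, 200:300] = 30.0              # faint extended object
        frame = gradient + signal

        # Adjusting out: subtract a background model, then restore a target level.
        # Pixels below the background survive for later processing.
        background_model = 200.0 + 0.5 * xx
        adjusted = frame - background_model + np.median(background_model)

        # Clipping out: raising the black point discards everything below the
        # chosen level and still leaves the slope in what remains.
        clipped = np.clip(frame - 320.0, 0.0, None)

        print(adjusted[250, 250] - adjusted[100, 100])   # object still ~30 above the sky
        print(float((clipped == 0).mean()))              # a large fraction of the frame lost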
  11. Very clean with outstandingly tiny stars. This gives a great 'look.'
  12. I'm not sure about the build quality of WO scopes. I think they concentrate rather too much on making them look pretty, though there are certainly good ones. Anyway... I think you're starting at the wrong end of the thinking process if you begin with the telescope. To my mind this should be the last item to specify, not the first. My order would be... 1) What do you want to photograph? Small targets requiring high resolution or wider field targets needing more field of view? To answer this you'll need to explore what various chip sizes at various focal lengths cover on your chosen targets. It's possible to fall into a no man's land between long and short FL if you're not careful. You can do this on a software planetarium allowing all the permutations to be modeled. Of all the decisions, this is the most important. 2) Which camera? If you already have it, that's a given. If you don't, or might want to upgrade in the future, you need to think about chip size for field of view and pixel size for resolution. (Note, pixel size and not pixel count.) 3) Which telescope will meet these needs? Is the focal length right for the FOV you'd like? Does the corrected circle cover the size of chip you'll be using? If shooting broadband (OSC or LRGB) you'll want very good colour correction. This will be less important if you're only shooting narrowband. At a given focal length, more aperture speeds things up, but a faster F ratio is much more expensive to deliver with good quality. (Don't worry about F ratio in isolation, it's F ratio at a given focal length that matters.) In reality there's a lot of back-and-forth between 2) and 3) above once you've started to close in on your choice. Olly
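     For the field-of-view and sampling sums in 1) and 2), a quick back-of-the-envelope sketch (the chip dimensions, pixel size and focal length are only example figures):

        import math

        def fov_arcmin(sensor_mm, focal_length_mm):
            # Field of view along one sensor dimension, in arcminutes.
            return 2 * math.degrees(math.atan(sensor_mm / (2 * focal_length_mm))) * 60

        def image_scale_arcsec_per_px(pixel_um, focal_length_mm):
            # Sampling in arcseconds per pixel: depends on pixel size, not pixel count.
            return 206.265 * pixel_um / focal_length_mm

        # Example: a 23.5 x 15.7 mm chip with 3.76 um pixels on a 530 mm refractor.
        print(fov_arcmin(23.5, 530), fov_arcmin(15.7, 530))  # roughly 152 x 102 arcmin
        print(image_scale_arcsec_per_px(3.76, 530))          # roughly 1.46 arcsec per pixel

     A software planetarium will overlay the resulting rectangle on your chosen targets, which is the easiest way to see whether the framing actually works.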
  13. If you're not using a moving rear lens element such as a flattener or a reducer, isn't the back focus simply the focal length of the lens measured from its rear element? If the ST 80 is F5 then its focal length is 80 x 5 = 400mm. Olly
  14. And long lived. One of mine has just croaked after about 25 years... Olly
  15. All sorts of reasons for using a Telrad.
      - You can place a Telrad circle over the target on a chart (a transparent overlay in Wil Tirion's atlas, a virtual one to scale on software planetaria) and mentally note its position relative to a couple of nearby naked-eye stars. You really only need two. It is then easy to replicate the Telrad's position on the sky relative to those stars, and there's rarely any need to star hop. One can drop onto the target almost directly from the Telrad, or directly when lucky. This is where the circle beats a dot, because the circle gives a sense of scale. Certainly the way my visual perception works (and this does vary between people), there is no substitute for those circles.
      - Initial alignment stars: you know without doubt that you're on the bright star in question and not accidentally centering on one nearby.
      - It is nice to have this very direct link between the telescopic and the naked-eye view. 'So where is this object, exactly?' is a question often asked by our beginner guests, and the Telrad answers that for them precisely.
      - My imaging rigs, like others no doubt, are complex in shape and bulky (two large refractors in parallel on a large GEM with CCDs and filterwheels in the back). There is literally no single position in which a finder is accessible at all positions in the sky. Because the Telrad has no parallax it can be viewed at a distance and sufficiently off-axis to be workable anywhere these rigs look. (My mounts don't have a home park position, so I need a one-star alignment to start the session.)
      - It will do many of the things a laser will do if you don't want to, or can't, use a laser.
      For me the Telrad is one of those items which, once tried, you'll never want to be without. (A bit like heated handlebar grips on a motorcycle! 🤣) Olly
  16. This nicety was lost on me but, yes, I'm happy to spare any intervening life forms! Olly
  17. Unlike capture skills, processing skills never reach any kind of limit... Olly
  18. Proportion and common sense come into this. As others have said, lasers can be a boon for those operations which get harder as you get older. For a quick polar alignment onto Polaris, for instance, just aim one through your polar scope to be sure Polaris will be what you're seeing. That cuts the knee-grovelling, neck-straining time in half. They are also a godsend for showing people around the night sky. 1 mW is plenty. Very obviously, there are places and circumstances in which you shouldn't use them, and we all know what those are. I don't think this thread should turn into one of those car manuals which tell you not to drink the battery acid. Olly
  19. A very good Eagle, especially since it was flying so close to the ground! 😁 I think it would stand more contrast in processing. You can use a standard contrast slider or, if you have a program with Curves, you could fix the bottom of the curve where it is and lift the curve above that. This will avoid boosting the noise. On any Ha nebulosity you can, if you have Photoshop, go to Adjustments-Selective Colour and lower the cyans in red (top slider to the left.) This is often spectacularly successful in boosting Ha emission. Olly
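     For those without Photoshop, the same fixed-bottom Curves move can be approximated on data normalised to 0-1 with a simple piecewise-linear curve (a sketch only; the anchor and lift values are arbitrary):

        import numpy as np

        def lift_curve(img, anchor=0.05, lift=0.15):
            # Contrast boost that pins the bottom of the curve at 'anchor' so the
            # faint background (and its noise) is left alone, while values above
            # the anchor are lifted, mimicking dragging a Curves point upward.
            xs = np.array([0.0, anchor, 0.5, 1.0])
            ys = np.array([0.0, anchor, 0.5 + lift, 1.0])
            return np.interp(np.clip(img, 0.0, 1.0), xs, ys)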
  20. The galaxy's good but, as you suggest, the residual gradient in the background sky is a distraction. I'd be surprised if Pixinsight's Dynamic Background Extraction couldn't fix this as a first operation with the data still in linear form but, even just working quickly with a screen grab, Photoshop allowed a bit of manual cosmetic intervention on the background. Done with more care, I think it could work. Olly
  21. Here, even more than in the OP's post, the rings are not only distinguished by brightness but by colour. Not only that: the colours seem to be organized somewhat like an opened-up white light spectrum. (Or perhaps a sequence of RGB, RGB, etc.?) Could this be connected with the debayering process or some other aspect of colour processing? Or is it in the captured data due to some optical effect? What would happen if you took your debayered master flat, discarded its colour information, and then restored it to RGB format? It would be monochrome and identical in all channels but, since it would be in RGB format, the stacking software should accept it. I certainly think that this looks, above all, like a colour problem, but is it optical or electronic in origin? No idea. Olly Edit: I'd certainly want to look at the master flat colour-by-colour to compare the rings.
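     A sketch of that experiment, assuming the debayered master flat is loaded as an (H, W, 3) numpy array (the function name is mine):

        import numpy as np

        def monochrome_master_flat(flat_rgb):
            # Average the debayered RGB master flat to a single luminance channel,
            # then copy that channel back into all three so the file is still in
            # RGB format and stacking software should accept it.
            mono = flat_rgb.mean(axis=2)
            return np.repeat(mono[:, :, np.newaxis], 3, axis=2)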
  22. ZWO are not the only makers of astro cameras or even of CMOS astro cameras. There are many brands. You most certainly don't have to use ZWO products. Olly
  23. I'm lost. What does pixel size have to do with field of view? If the chips have the same number of pixels then, obviously, the one with the bigger pixels will have a larger chip giving a larger FOV. Olly
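     A concrete example of the arithmetic (the 6248 x 4176 pixel count matches a common APS-C CMOS chip; the 2.9 um figure is a hypothetical comparison):

        # Two chips with the same pixel count but different pixel sizes:
        for pixel_um in (3.76, 2.90):
            width_mm = 6248 * pixel_um / 1000.0
            height_mm = 4176 * pixel_um / 1000.0
            print(pixel_um, "um pixels ->", round(width_mm, 1), "x", round(height_mm, 1), "mm chip")

     Same pixel count with bigger pixels means a bigger chip and a wider field at any given focal length.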
  24. Thanks Peter. Great idea for loading it up. Once there I wouldn't be moving it! Thanks for this info. The pier is concrete, so very sound. Olly
  25. Excellent question. I photographed a rainbow last week and wondered how much my regular DSLR was missing. Olly