Everything posted by ollypenrice

  1. 'The TEC180 represents a small increase in resolution for a large increase in spending.' This came from a reliable source - TEC's own website! In truth they've now changed it but I'm all but sure it used to say this because it always made me laugh. However, it remains perfectly true. 0.66 arcsecs for the 180 against 0.8 arcsecs for the 140 and a threefold increase in cost. Will this affect resolution in imaging? Hard to say. On nights of very good seeing it might, just. On most of the nights I experience I don't think it would. However, my friend not far away has an AP175 and just looking at this scope, let alone through it, would entertain me for hours! Refractors in this class just look fabulous. Olly
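As a rough sanity check on those figures, here is a minimal Python sketch using the Dawes limit (an empirical criterion of roughly 116 / D arcseconds for aperture D in mm, not anything from TEC's site); the quoted 0.8 and 0.66 arcsecs are close to what it gives for 140mm and 180mm.

```python
# Rough check of the quoted resolution figures using the Dawes limit,
# an empirical criterion of about 116 / D arcseconds for aperture D in mm.
for name, aperture_mm in [("TEC140", 140), ("TEC180", 180)]:
    dawes_arcsec = 116.0 / aperture_mm
    print(f"{name}: ~{dawes_arcsec:.2f} arcsec")

# TEC140: ~0.83 arcsec
# TEC180: ~0.64 arcsec
```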
  2. This isn't a beginner image. It's tightly focused, well guided and has a flat background sky. Whatever your 6 inch optics are, they are very good. Personally I use nothing larger than 5.5 inches to get 0.9"PP. Before getting too gung-ho with the blue it's worth a look at the astrophysical data to find the colour index of selected stars. It's available from many planetarium packages (I use SkyMap Pro) and can be compared quickly with a B-V colour index chart like this one. https://www.researchgate.net/figure/Dependence-of-color-index-V-t-mdinstTH-from-color-index-Bt-V-t-for-different-spectral_fig1_275388313 Blue stars have negative or low B-V values, red stars have high ones, as the chart shows. Clearly there's no need to check all the stars in your image but checking a few gives you an idea of your colour calibration. Olly
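For anyone who prefers to script the check, a minimal sketch of the idea in Python; the B-V values and the colour thresholds below are illustrative placeholders, not catalogue data, so substitute real values from your planetarium package.

```python
# Illustrative sketch: classify a few stars by B-V colour index.
# The values below are placeholders; take real ones from a planetarium
# package or a catalogue when checking your own colour calibration.
sample_stars = {
    "star_A": -0.10,  # hypothetical blue star
    "star_B":  0.65,  # hypothetical sun-like star
    "star_C":  1.40,  # hypothetical red star
}

for name, b_minus_v in sample_stars.items():
    if b_minus_v < 0.0:
        colour = "blue"
    elif b_minus_v < 0.8:
        colour = "white/yellow"
    else:
        colour = "orange/red"
    print(f"{name}: B-V = {b_minus_v:+.2f} -> {colour}")
```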
  3. 'Rotate 180' in Artemis does require rotated calibration files. A meridian flip doesn't. On the other hand a meridian flip does require the stacking software to know that a rotation of the calibrated image is necessary. Mine, Astro Art, needs to be told to support rotation up to 180 degrees. Some just do it without being asked. Because of the danger of forgetting, I never use 'rotate 180' in Artemis. Instead I open the earlier image I'm using as a framing reference in another package and rotate that so it matches the incoming images. Olly
  4. Agree with Francis - nailed! Super. I often end up blending two separate processings. Why not? Olly
  5. Sure. The advantage of the OAG is that it allows for mirror movement in the SCT. This is usually called 'mirror flop' but the term is an exaggeration. It doesn't flop; it may just move a bit. I would use an OAG with any reflector for this reason alone. Olly
  6. This might be a time to remove and replace noise, in that case. I remember a guest asking me, regarding this Ps filter, 'Add noise? Why would you want to add noise?' At the time I agreed with him but what can happen is that noise on a larger scale can only be removed by the usual smoothing algorithms which leave the 'vaseline on the lens' look. This can be mitigated by the careful addition of a little Gaussian noise to put the texture, the grain in smoothed parts, back to the image's overall level after smoothing. I did this in a recent image. I'm not saying which or where because if it didn't offend you I got it right! More data is the right way but then you're always tempted to stretch the new data a bit harder which means you're tempted by NR which means you want more data... In a west country accent: I like zider. When I drinks zider it makes I smell. When I zmells people gives I money to go away. With this money I buy zider... Olly
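A minimal Python/NumPy sketch of the same idea, assuming SciPy is available; the smoothing and grain amounts are illustrative guesses, since in practice the level is judged by eye in Ps.

```python
import numpy as np
from scipy.ndimage import gaussian_filter  # assumes SciPy is available

rng = np.random.default_rng(0)
frame = rng.normal(0.5, 0.02, size=(512, 512)).astype(np.float32)  # stand-in data

# Heavy smoothing removes the grain but risks the 'vaseline on the lens' look...
smoothed = gaussian_filter(frame, sigma=3)

# ...so a little Gaussian grain is added back to bring the texture of the
# smoothed region up to the level of the rest of the image. The amount
# here is a guess; in practice it is judged by eye.
grain_sigma = 0.01
regrained = smoothed + rng.normal(0.0, grain_sigma, size=frame.shape).astype(np.float32)
```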
  7. A focal reducer would widen your field of view and reduce your pixel scale. More sky will land on each pixel. In a perfect world this would reduce the level of detail in your image but in the real world your image scale in a C9.25 is almost certain to be unreachable because tracking errors and atmospheric seeing will be the factors which limit your real captured detail. The reducer will be a big help. Many will assert that the reducer would also reduce exposure times but this is not as simple as it seems. Beware of the Hyperstar website which is misleading. 'Exposures that take an hour at F10 take mere seconds with the Hyperstar lens.' In my opinion this sentence implies that they are the same exposures - that is exposures of the same target - but they are not because a galaxy which fills the frame at F10 will be a tiny thing surrounded by stars and sky with the Hyperstar. If that's what you want, fine, but there is no free lunch. The F6.2 reducers are widely available second hand. And the idea that an F2 system can ever be 'easy' will find few takers in imaging circles. I'm agonizing over whether or not to say this but... you are about to devote a fair amount of cash to set up the C9.25 for deep sky imaging. If you later regretted this you wouldn't be the first and I've had the same regrets over a similar SCT. Modern cameras are getting smaller and smaller pixels which means that longer focal lengths are becoming unnecessary and even counter-productive in some cases. And they do make life difficult. Olly PS The reducer does not work with the Hyperstar. They are alternatives.
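To put numbers on the pixel-scale change, here is a rough Python sketch assuming a C9.25's native 2350mm focal length, a ~0.63x reducer and an illustrative camera with 3.8 micron pixels (the camera values are assumptions, not from the post above).

```python
# Illustrative pixel-scale sums for a C9.25: 2350 mm native focal length,
# a ~0.63x reducer, and an assumed camera with 3.8 micron pixels.
pixel_um = 3.8
native_fl_mm = 2350.0
reduced_fl_mm = native_fl_mm * 0.63

for label, fl_mm in [("f/10 native", native_fl_mm), ("with reducer", reduced_fl_mm)]:
    scale = 206.265 * pixel_um / fl_mm   # arcseconds per pixel
    print(f"{label}: {scale:.2f} arcsec/pixel")

# f/10 native:  ~0.33 arcsec/pixel
# with reducer: ~0.53 arcsec/pixel
```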
  8. This is vital. I work in Ps and use entirely different NR techniques for L and RGB. It's all very easy in Ps, too. You simply create two layers, apply slightly too much NR to the bottom one, then use the colour select tool on the top layer and erase as much or as little of the noise as you like while looking at the result in real time. You can also select and erase specific kinds of errant individual pixel from the top layer, revealing their denoised counterparts individually. This reduces the effects of pixel to pixel interaction. Olly
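Outside Photoshop the same idea can be expressed as a mask blend. This is only an illustrative Python/NumPy sketch, with a crude luminance threshold standing in for the colour select tool and the hand erasing described above.

```python
import numpy as np
from scipy.ndimage import median_filter  # assumes SciPy is available

rng = np.random.default_rng(1)
rgb = rng.random((256, 256, 3)).astype(np.float32)  # stand-in colour image

# Bottom 'layer': a slightly over-denoised copy of the whole image.
denoised = median_filter(rgb, size=(3, 3, 1))

# Erasing parts of the top layer is equivalent to a per-pixel mask that
# reveals the denoised layer only where chosen. A crude luminance
# threshold stands in for the colour select and eraser work done by hand.
luminance = rgb.mean(axis=2, keepdims=True)
reveal = (luminance < 0.2).astype(np.float32)   # dark, noise-prone areas only

result = reveal * denoised + (1.0 - reveal) * rgb
```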
  9. I tried all these tests, Vlaiv, and they made no difference whatever. With one setup the flats always over-corrected. I tried different exposures, different brightnesses of source, using bias as dark, using 'correct' flat darks... The over-correction continued in precisely the same way. Comparing differently acquired flats as Ps layers with blend mode difference showed them to be nearly identical. I was capturing everything in Nebulosity. Eventually I shot some flats in Astro Art and they worked for many months. Then one morning they started over-correcting again! They just did. Nothing that I'm aware of had been changed. More recently a set of flats from the Moravian did the same, captured in SGP. Because of the shutter, exposures are quite long so I calibrated with 'correct' flat darks, not bias. I shot another set using the same values and they worked. I can't explain it. The only thing I conclude from this is that my Atiks, running in Atik's Artemis Capture, have never ever generated problem flats. Why do I like Artemis Capture??? Only cameras running in other software have given trouble - but this could be chance or human error. If it's error I don't know where it came from because I ended up using a written checklist and following it to the letter. I also followed it with guests watching over my shoulder.
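For reference, a generic sketch of the calibration being described (standard practice, not any particular capture package's code): flat darks, or a master bias when the flat exposures are very short, are subtracted from the flats before they are averaged, normalised and divided into the light.

```python
import numpy as np

def master_flat(flats, flat_dark):
    """Average flats after subtracting a matched flat dark (or a master
    bias when the flat exposures are very short), then normalise to a
    mean of 1 so the division preserves the light frame's level."""
    stack = np.mean([f - flat_dark for f in flats], axis=0)
    return stack / np.mean(stack)

def calibrate_light(light, master_dark, mflat):
    # Over-correction shows up here: if the master flat's vignetting
    # profile is stronger than the light's true profile, this division
    # brightens the corners too much.
    return (light - master_dark) / mflat
```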
  10. I think lum is a good idea because you can harmlessly use NR on the colour beneath it. Olly
  11. I think the core is tremendous. If some aspects of it are slightly unfamiliar I think that's because they are better than we usually see. The progenitor star is well controlled and the very tricky pinks are clear as day. For me the core is right on the money. If the image has a problem (I stress 'if') it's that noise levels in the brown dust (which I think is just the right colour) are far higher than in the core. For this reason it hasn't been possible to push the brown contrasts as high as would match the core, either. Rodd is not shooting from a dark site so this is always going to be a problem. Sure, more exposure beats noise reduction but, when I run into this problem in post processing, I take a look at the noisy area in Ps at close to pixel scale and see what I can do about it. I don't like proprietary NR routines because they use pixel to pixel communication (blurring) which gives the slimy look. I prefer to work on groups of similarly errant pixels in isolation from their neighbours. I'm not above over-smoothing and then adding Gaussian noise in Ps either. I'd rather not but it's been known... I suspect that really crafty custom NR in the dust could bring it, aesthetically, into line with that splendid core and make the image harmonious with itself from corner to corner. Olly
  12. In imaging the critical difference is in focal length and so field of view. In visual it's a difference in light grasp. Since neither is a light bucket it's likely to come down to issues like portability. While I might prefer to image with an 80 (depending on pixel size) I'd rather look through a 100. Olly
  13. I think what we are asking is, 'Can a scope with cone error be used for polar aligning?' Since none of us is likely to own a scope mounted without cone error I think the answer has to be 'yes.' Olly
  14. The real issue is that you need good flats and if there is anybody out there who always gets good flats, which just work and never over-correct, I'm all ears. I follow the same routine each time and mostly it works, but always? Certainly not. I wish I knew why this is, but I don't. Olly
  15. Any time, Michael... Obscure galaxies have a certain charm! Olly
  16. This, without doubt. However, I know what John means, below, ... but I've always taken my pleasures seriously. (Why give precedence to your woes???) When the great cyclist Ray Booty's 'straight out hundred' record was under attack for the first time in over thirty years he was asked if he could remember much about the bike he used for his record. He said something like, 'Yes, quite a lot, I rode to work on it this morning.' That's a serious bike! Olly
  17. Resolution in the colour-shooting camera is a little lower than in the luminance-shooting camera, so a bit more colour doesn't go amiss. However, the weightings are also a matter of historical accident - how much from which filters we kept on the first shoot a couple of years ago. This image is just based on everything found in the collective pot! In all honesty I think we had reached data saturation, here, in the sense that we had enough data for more of the same to have little effect. I think, in the end, that I may have included 9.25 hours of L rather than the 7 I quoted but the weighting of two luminance runs was done by eye using Ps layers. In processing I spent more time on star control than on anything else. The galaxy just looked after itself. Olly
  18. Another one from an enjoyable fortnight with our guest Paul Kummer, though this uses a previous dataset as well. The increase in data does allow some little advantages to be teased out even though the first set was pretty good. Here we have around 7 hours of luminance and three hours per colour. There's no NB in this. It's full size here with the 0.9"PP of the TEC/Atik 460 used for luminance and cropped heavily. Dual TEC140, Mesu 200, Atik 460, Moravian 8300. Rig co-owned by myself, Tom O'Donoghue and Mr and Mrs Gnomus. Olly
  19. Another of these theories advocated an odd number when using Median to combine them - but I use average anyway! The idea that anybody could possibly tell the difference in the final image is - shall we say - fanciful. Olly
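For what it's worth, the two combines look like this in a NumPy sketch (illustrative noise-only frames); with an even frame count the median simply averages the two middle values, which is where the odd-number theory comes from, and the difference in a deep stack is tiny.

```python
import numpy as np

rng = np.random.default_rng(2)
# Stand-in stack: 16 noise-only frames around a constant level of 1.0.
frames = 1.0 + rng.normal(0.0, 0.1, size=(16, 128, 128))

mean_stack = np.mean(frames, axis=0)
median_stack = np.median(frames, axis=0)  # with an even frame count, the
                                          # two middle values are averaged
```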
  20. I generally do about 30. Once you have the exposure time right they are not much bother to shoot. In all honesty I've never done a comparison but Per, who was into the technical side of imaging in a way that I'm not, did just 15 of everything - darks, bias, flats etc. Olly
  21. Interesting range of opinions! My contribution (potentially heretical) for what it's worth:
      1) There is no way in the world that you will notice differences in flattening between lights with small adjustments in focus. We are not doing photometry here, just getting rid of dust bunnies and vignetting. You will find it near impossible to get flats so accurate that the greatest deviation from accuracy comes from small focus changes. Just forget it.
      2) My flats last anywhere between a week and six months in my observatories. Every night??? You must be joking!
      3) Visible dust bunnies don't usually come from dust on the filters in my case, and they don't vary from filter to filter except very exceptionally (once every three years?). So I use a luminance flat for everything, but see my next point.
      4) Any slight flattening irregularities in an RGB layer will be re-illuminated by the luminance layer, by definition, so if that is well flattened you have the definitive flatness you need. Even NB imagers sometimes use their Ha layer as luminance, so if the Ha is well flattened so are the other two layers.
      I take my imaging seriously but that includes weeding out unproductive activities. Flats are very important indeed but what I describe above covers my workflow most of the time. On the rare occasions it doesn't work I shoot the extra flats necessary - but that is a very rare event. I take a practical rather than a religious approach to flats, based on hundreds of hours of exposure every year. Olly
  22. It's really best not to make FL your primary concern but to consider 1) Resolution in arcseconds per pixel and 2) Field of view. Here's a calculator: http://www.12dstring.me.uk/fovcalc.php Around 2 arcsecs per pixel is a nice friendly resolution and 500mm will give a good FOV on a DSLR chip. Finer resolution requires finer guiding and better seeing (more stable with less turbulence.) With a good mount and a good site it is worth going down to about an arcsec for small targets like galaxies or planetary nebulae but a 4 inch scope is a great place to start. You could get the same FL with more aperture in a very fast astrograph but 'fast' tends to mean tricky and expensive. I know of no fast, easy astrograph. You are right that it is possible to image at fine resolutions which will produce no genuine additional detail. This is sometimes called 'empty resolution.' Olly
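The same sums the calculator does can be written out directly. This Python sketch assumes an APS-C sized DSLR chip of roughly 22.3 x 14.9mm with ~4.3 micron pixels behind a 500mm scope; those sensor values are illustrative assumptions, so swap in your own camera's figures.

```python
import math

# Illustrative values: an APS-C sized DSLR chip (~22.3 x 14.9 mm,
# ~4.3 micron pixels - assumed, not from the post) behind a 500 mm scope.
sensor_w_mm, sensor_h_mm = 22.3, 14.9
pixel_um, fl_mm = 4.3, 500.0

scale = 206.265 * pixel_um / fl_mm                       # arcsec per pixel
fov_w = math.degrees(2 * math.atan(sensor_w_mm / (2 * fl_mm)))
fov_h = math.degrees(2 * math.atan(sensor_h_mm / (2 * fl_mm)))
print(f"{scale:.2f} arcsec/pixel, field ~{fov_w:.1f} x {fov_h:.1f} degrees")

# ~1.77 arcsec/pixel, field ~2.6 x 1.7 degrees
```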
  23. Yes, put up a JPEG. It will do fine for our purposes here. The smaller stars look miles out of focus and seem to show the shadow of the secondary. Could it be the distance between the chip and the CC which is causing this? A specific distance will have to be respected here and will be stated by the CC maker. This has to be done before any attempt is made to focus. Olly
  24. I liked your post very much and many points resonated with me. This last one, though, misses a key point about astrophotography and, more specifically, about image processing, which is (for me) the most rewarding part. Your choice of the term 'intimacy' is an excellent one, but it applies just as perfectly to image processing as it does to visual observing. It's a different kind of intimacy, sure, but it's a closely related one. Our data often contains the merest hints of signal just above the background sky and who knows what faint structures might lie, tantalizingly, within that signal? They might be tidal tails from ancient interactions, they might be vast extensions of better known and brighter nebulae, they might be ghostly outer spiral arms rarely seen or they might be traces of the Integrated Flux Nebulosity. Then again, careful sharpening might find new structural details on the finest scales. The object becomes a friend, gradually revealing more of itself, providing answers yet posing new questions. What, for example, might be the source of that particle wind which must surely be sweeping through Orion from the west, imploding the western side of Barnard's Loop and shaping the Meissa Nebulosity? No visual observation can even reveal that this phenomenon exists because it is both too large and too faint for any eyepiece, yet for me, processing an Orion image, it became a source of fascination, an intimate interaction with that constellation. I don't feel I know an object until I've imaged it. Looking at someone else's image doesn't work. I need to work through the data processing to feel confident that I know it. Olly
  25. I've had a few visit here and they've all been mighty impressive. Olly