Everything posted by ollypenrice

  1. I have to join the chorus singing that the OAG is asking for trouble... but your call. I wonder if this thread might be of any use to you? It deals with camera sensor tilt adjustment but, off the top of my head, this simple jig might also be made to work for testing other parts of the imaging train for tilt. It's a daytime test. The other way to do daytime/cloudy-night testing is to make an artificial star by gluing a ball bearing onto a black card and lighting it up with a flashlight. You then observe or image this from a good distance. It really does make a decent 'test star.' Olly
  2. Pin sharp and the galaxy details agree with other good renditions. Plus the tail. This really is crisp. I felt the colour balance to be slightly out. The background was looking a touch blue-magenta on my calibrated screen and the galaxies seemed just a bit cold. Measurement confirmed this: Photoshop gives the background about 13 in red and green and 20 in blue over most regions. That's a big disparity. Olly
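     For anyone who wants to check their own background numbers outside Photoshop, a minimal sketch (assuming numpy and Pillow are available; the file name and patch coordinates are hypothetical) that reports the median R, G and B of an empty patch of sky:

         import numpy as np
         from PIL import Image

         # Hypothetical export of the finished picture as an 8-bit RGB file.
         img = np.asarray(Image.open("m51_final.tif").convert("RGB"), dtype=float)

         # Sample a patch of empty background sky (coordinates are illustrative).
         patch = img[50:150, 50:250]

         # Median per channel; a neutral background gives three similar values.
         r, g, b = (np.median(patch[..., c]) for c in range(3))
         print(f"Background medians  R: {r:.1f}  G: {g:.1f}  B: {b:.1f}")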
  3. My wife wouldn't agree. On our visits to England she's always keen to try some feesh and sheeps. Olly
  4. But he got back to narrate the story! As an Archbishop in the House of Lords said about Gulliver's Travels, 'For my part, I don't believe a word of it.' * Olly *This is honestly true, by the way.
  5. I'm still pondering the time frame of the H G Wellsian traveller going backwards in time. He's not going backwards in his 'pre-time-travel' time frame because that would simply mean running himself and his experience in reverse. What he must be doing is continuing to move forwards in his own new time frame (he opens the time machine door, then steps out, then looks around, etc...) Now, what if he only went back half an hour to the time when he was about to get into the damned thing? He'd be able to watch himself doing so. There would be two of him. Does this mean that the time traveller's time frame and his original are now parallel? (He can't re-join his old time frame because there would be two of him.) The more I think about this, the more convinced I am that H G Wells must have been making it up. Shocking. Olly
  6. In popular imagination a time traveller goes back in time as a separate entity from the self who arrived at the point when he or she began their journey into the past. There are, therefore, two of them co-existing at the same time on this backward journey, one travelling forwards and a different one travelling backwards. This seems odd... And, while travelling backwards, what does the time traveller see? People moving and speaking like a film played in reverse? Perhaps travelling backwards in time is an unpleasant and incomprehensible experience because the laws of time travel turn out to require you to replay your existing experiences backwards without being able to become a separate entity? Not a very rewarding experience, being stuck inside a self walking and speaking and thinking backwards in a world in which everyone is doing likewise. And then, after this, you simply replay your previous experiences in the right direction without being aware of replaying them (because you weren't aware of replaying them first time around!) Ah! ...and here's the thing: how do you know this isn't exactly what you are doing while reading this? Olly
  7. 'No, less. I regard you as a single lump of proles...' 😁lly
  8. If we travelled back to a time in which we were there as a returner, it would not be the same past as a past in which we were not there as a returner. The decision to return or not return can therefore create two pasts. I think our incomprehension of time is the biggest intellectual obstacle we face. Olly
  9. Does testing on a double star really tell us what detail can be resolved in fainter, more extended objects? Stars are exceptional things, optically. Olly
  10. Like Wulfrun I'd use binocular as an adjective and binoculars as the noun. Since I don't believe unreservedly in the concept of linguistic 'correctness' I don't know what the dictionary says and wouldn't be influenced by it on this one. However, the version without the 's' could be bisexual for me: 'This is a nice binocular' sounds bearable, even though I wouldn't use it. In that phrase the noun, whatever it might have been (binocular what?), is left unspoken but is somehow hanging there in the air. It has, it seems to me, left some of its noun-ness implicit in the adjective. I can't see myself using the phrase, though, when I can be entirely comfortable with, 'These are nice binoculars.' Now I'm imagining an obsequious tailor proffering his wares to a customer with the phrase, 'I think Sir will find this to be a particularly fine trouser...' 🤣 Olly
  11. I can't agree with this. What you 'photograph' in a single exposure on celluloid film will include:
      - Colours determined by the chemists who brewed the film's emulsion.
      - Colours affected by the colour correction of the optics.
      - System noise, including the limited responsiveness of the film and the irregularities of your own light path.
      Somewhere in the middle of that lot, yes, the object will also contribute to the image! 😁 Now, when we go through the palaver of multiple stacking, dark subtraction, colour calibration and flat fielding, we are reducing the system noise and increasing the contribution of the object to the final picture. The fidelity of the image, which can only mean its resemblance to the original out there in nature, is enhanced by the palaver we go through. There is nothing natural about any form of photography. I cannot, alas, grow a camera in my veg plot. It is, in all its forms, a technological process and requires ever increasing complexity to reproduce increasingly accurate representations. A simple camera will take a less accurate (and therefore less natural?) photograph. I believe the idea that a basic point-and-shoot Box Brownie takes a more natural picture than a Leica Q is fallacious. In a nutshell, astrophotography is about extending the sensitivity and dynamic range of our eyes, and 'natural' astrophotos are ones which we suppose resemble what our eyes would see if they didn't need this technological help. The technological help AP provides does not resemble the technological help provided by a telescope because of the intractable 'surface brightness problem.' (Bright enough to see soon means too big to fit in our field of view.) Olly
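     To put the 'palaver' in concrete terms, here is a minimal sketch of the usual calibration arithmetic (assuming numpy, with the light, dark and flat frames already loaded as 2D arrays; the function and variable names are mine, not from any particular package):

         import numpy as np

         def calibrate_and_stack(lights, master_dark, master_flat):
             """Dark-subtract, flat-field and median-combine a list of light frames.

             lights      : list of 2D arrays (raw light frames)
             master_dark : 2D array, combined darks at matching exposure/temperature
             master_flat : 2D array, combined flats with their own dark removed
             """
             flat_norm = master_flat / np.mean(master_flat)     # unity-mean flat
             calibrated = [(light - master_dark) / flat_norm for light in lights]
             return np.median(calibrated, axis=0)               # stacking beats down random noise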
  12. Another variable is the quality of the optics. It's not enough just to think in terms of the raw numbers involved in sampling rate when making the comparison. I haven't really imaged with an SCT but I have owned and used the Meade 127 triplet along with a TEC 140 and Takahashi FSQ106. (I've also used the TEC with and without its field flattener, which throws another curved ball.) The much shorter FL of the 106 means it's not a good comparator, but it is worth comparing the TEC and the Meade. The TEC produces sharper images. And so it should, of course, given its aperture advantage and vastly higher price - but it does. The Meade's colour correction is very good for the price, very good indeed, but the premium refractors can beat it, meaning, of course, that more of the spectrum is in perfect focus. The curved ball from the TEC flattener is this: according to TEC, the flattener has no effect on colour correction and only flattens the field. According to most owners, including me, it does improve the colour correction, producing tighter stars and significantly reducing blue bloat. I'm in no doubt about this despite my respect for Yuri Petrunin. So my point is that we're not just comparing numbers, we're comparing specific instruments. Bearing that in mind, I'm keen to get that Meade ACF out onto a mount! Olly
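     For the 'raw numbers' side of that comparison, the usual image-scale formula is arcseconds per pixel = 206.265 x pixel size (microns) / focal length (mm). A quick sketch, where the focal lengths are my assumed round figures rather than measured values:

         def image_scale(pixel_um, focal_mm):
             """Arcseconds per pixel for a given pixel size and focal length."""
             return 206.265 * pixel_um / focal_mm

         # Assumed, approximate focal lengths; 9 micron pixels as on a KAI-11002 chip.
         for name, fl_mm in [("TEC 140, ~980 mm", 980), ("Meade 127, ~950 mm", 950)]:
             print(f"{name}: {image_scale(9, fl_mm):.2f} arcsec/pixel")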
  13. It's an F10, though I have the AstroPhysics telecompressor as well. I still have the Atik 11000, yes, though I've lent it to Tom at his robotic installation here for the moment. As you say, the large pixels would be well suited and it wouldn't matter if the full frame of the chip wasn't covered. Crop and go! Olly
  14. I have a 10 inch ACF which, absurdly, I've never used. I need to press it into service because I'm sure it will be good, especially now that there is star-removal software which can be used in order to produce smaller stars. (The previous owner got great images apart from slightly oversized stars.) Olly
  15. Hi John, I would first finish the LRGB image and accept the cores as they are. You already have that. I'd then open up the linear RGB and stretch it only for the quality of the core, ignoring the rest because it won't be used. The idea is to get the outer parts of the core (or the area just around it) to a brightness as close as possible to that part of the main image; you need that for a seamless blend later. However, I'd use a stretch (probably done manually in Curves) to keep the core itself as free from saturation as possible. Next comes the blending of the gentler core with the rest. The proper way involves a layer mask. I'll use 'short' for the new RGB image and 'long' for the main LRGB here.
      1. Copy-paste the short over the long as a layer.
      2. Add a layer mask to the top layer.
      3. Copy-paste the long onto the layer mask. Considerably increase its contrast to make the light parts lighter and the dark parts darker, then give it a big Gaussian blur, maybe 3 to 5 in Ps.
      4. Adjust the short till you see a seamless blend, using Levels or Curves and the colour saturation tool.
      Now for the bad boys' way! (It usually works on round targets like cores and stars...)
      1. Paste the LRGB on top of the short RGB stretch.
      2. Take a fully feathered eraser a little larger than the core and apply it centrally over the core on the top layer. Now you can see the short RGB core in the context of the main image.
      3. Play with the RGB layer in Curves till you get a seamless blend.
      Before doing any of this, be sure to check that your RGB core isn't saturated at the linear stage. If it is, you'll need short exposures. I would only bother with RGB because the point of luminance is to get more signal and what we want here is less. I can't help with the gain question since I've only recently started using CMOS but, as ever, I'd experiment. Olly
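     The same layer-mask blend can be sketched numerically outside Photoshop. A minimal, illustrative version (assuming numpy and scipy, with 'long' and 'short' already stretched, aligned and scaled to the 0-1 range; the parameter values are guesses to be tuned by eye):

         import numpy as np
         from scipy.ndimage import gaussian_filter

         def blend_core(long_img, short_img, blur_sigma=4.0, contrast=3.0):
             """Blend the gently stretched core ('short') into the main image ('long').

             The mask is a contrast-boosted, blurred copy of the long image, so the
             bright core takes its values from 'short' and the rest stays as 'long'.
             """
             lum = long_img.mean(axis=-1) if long_img.ndim == 3 else long_img
             mask = np.clip((lum - 0.5) * contrast + 0.5, 0.0, 1.0)  # lighter lights, darker darks
             mask = gaussian_filter(mask, blur_sigma)                # the big Gaussian blur
             if long_img.ndim == 3:
                 mask = mask[..., None]                              # broadcast over RGB
             return mask * short_img + (1.0 - mask) * long_img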
  16. Funny you should say that because that's exactly what I was doing when re-processing my Shark and it did work well. I tend to under-process the dusty stuff and hadn't used the 'equalized copy as layer mask' method on the original. I'm preparing copies of my images for print and find that they need more separation between faint stuff and background in print than they need on the screen. That's why I tried the equalize trick and I found I liked it for screen viewing as well. I think the method is essentially a form of 'inverse layer masking' but it is far quicker and less complicated. Olly
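     For anyone curious to try the equalize trick outside Photoshop, a very rough sketch of the idea (assuming numpy and scikit-image, with the image as a float array in the 0-1 range; Photoshop's Equalize and layer blending differ in detail, so treat this as illustrative only):

         import numpy as np
         from skimage.exposure import equalize_hist

         def lift_faint_stuff(img, strength=0.3):
             """Boost faint nebulosity more than the background, using an equalised copy as the mask."""
             lum = img.mean(axis=-1) if img.ndim == 3 else img
             mask = equalize_hist(lum)                            # equalised copy, scaled 0-1
             if img.ndim == 3:
                 mask = mask[..., None]
             boosted = np.clip(img * (1.0 + strength), 0.0, 1.0)  # a stronger version of the image
             return mask * boosted + (1.0 - mask) * img           # blend through the mask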
  17. Oh, that's an interesting one. It seems to have got under my radar so you've inspired me, Carole. You've made a good job of it, too. (It's quite topical because I've been re-processing the Shark today. We could end up with quite an aquarium!) Olly
  18. I think your OIII is working beautifully to bring out what is a very faint outer shell, often missed by imagers doing this target. I know the OIII features are not 'in your face' but I really think they lift the target. The colour variety is good. Mine is in LRGB with Ha and OIII and, apart from the OIII shell, the rest of the nebulosity is very samey in colour. I might ease off on the saturation, though! Olly
  19. Heh heh, I'm indifferent to the business of capture but love the processing! I very much like your processing of the nebulosity, which is boldly visible but still fluffy and - well - nebulous. It's easy to end up with it looking rather solid. I'm less keen on the stars, which seem admirably tiny but somehow a bit 'spikey,' somewhat sharp and hard. There's a band of Ha crossing the Witch. Well worth having that, in my view, if you're feeling inspired. Olly
  20. Those are great. In your place I'd want to be looking at the slightly saturated cores in both. What I've found effective is not to take short exposures for the cores but to do an image in which you only concern yourself with the cores and only use the RGB data. With less signal it tends to be less over-exposed than the L. You can then layer mask or otherwise blend the less-stretched core into the present LRGB. Olly
  21. Looks great. I love these dusty targets when you see them hanging there in space and suspended in three dimensions. Olly
  22. I've now come to the conclusion that, even at 0.9"PP, I'm oversampled. We also need to remember that we don't post unprocessed images, we post processed ones. The more signal you have, whether from time or from aperture, the harder you can process and the better the result will be. A better real resolution with inadequate signal will not give a better image than a slightly inferior real resolution with a lot more signal. Olly
  23. Nice, but the whole image is surely brighter than it needs to be? Both the disk and the background could be darker so that's an easy one. If you shape the curves carefully you can also reduce the redness of the background. I don't think there's an easy way to do that in Levels without black clipping. (Could be wrong about that, though.) Olly
  24. Sure, but I suggested a comparison, ideally, of 3x5 and 5x3, though longer comparisons would be better. Now when I have a stacked dataset I don't measure it, I process it. I judge it on how well it behaves under processing because, at the end of this long process, I'm not going to measure the final image, I'm going to look at it. Along the way I'll find strengths and weaknesses in a dataset. It's not enough simply to spot them or measure them: what I want to know is which defects I can process out and which I can't. That tells me which defects I prefer! No defects is rarely an option. So, as you know, I remain a pragmatist. One of my rigs produces uneven background sky at pixel scale (a scattering of over-dark pixels.) I can fix that. Another rig likes long subs but saturates stellar cores. I can fix that as well. What worries me are defects I can't fix. Experiment lets me find them and avoid them. Olly
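     For anyone who does want numbers behind a 3x5 versus 5x3 style comparison before processing (reading it as the same total time split into fewer long subs versus more short ones), a rough per-pixel SNR sketch; the signal rates and read-noise figure are made-up, illustrative values:

         import math

         def stack_snr(n_subs, sub_minutes, obj_rate, sky_rate, read_noise):
             """Approximate stacked SNR: shot noise from object and sky, plus read noise once per sub."""
             t = n_subs * sub_minutes
             signal = obj_rate * t
             noise = math.sqrt(obj_rate * t + sky_rate * t + n_subs * read_noise ** 2)
             return signal / noise

         # Same total time, different sub lengths: the read-noise term is what changes.
         print(stack_snr(3, 5, obj_rate=2.0, sky_rate=20.0, read_noise=5.0))
         print(stack_snr(5, 3, obj_rate=2.0, sky_rate=20.0, read_noise=5.0))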