Everything posted by ollypenrice

  1. Remember to allow for the orientation of your camera relative to RA and Dec. It is best, by default, to have one side of your camera parallel with RA and, therefore, the other with Dec. If the long side is parallel with Dec we informally call it 'portrait', or 'landscape' if the long side is along RA. It's easy to align the camera. Just take a 5 second sub and slew slowly during the capture. You'll get star trails showing your present camera angle. There is no 'right way up' for our targets, but it is easier to model the field of view on the sky if your camera is orthogonal with RA and Dec and, also, you can come back to a target on another night and replicate the framing easily. Replicating a random camera angle is a nightmare. Olly
  2. The first thing to do is obtain planetarium software which lets you put in your chip size and focal length so you can model your field of view on the sky. The one I use is no longer available, but I think there are free alternatives, maybe Stellarium or Cartes du Ciel. Perhaps someone could come in with specifics? Olly
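The field-of-view modelling such software does comes down to simple trigonometry on chip size and focal length. A minimal sketch of the calculation in Python; the sensor and focal-length figures are purely illustrative, not tied to any particular camera or scope:

```python
import math

def fov_degrees(sensor_side_mm: float, focal_length_mm: float) -> float:
    """Angular field of view along one side of the sensor, in degrees."""
    return math.degrees(2 * math.atan(sensor_side_mm / (2 * focal_length_mm)))

# Illustrative numbers: an APS-C sized chip (23.5 x 15.7 mm) on a 530 mm refractor.
width_deg = fov_degrees(23.5, 530)    # ~2.5 degrees
height_deg = fov_degrees(15.7, 530)   # ~1.7 degrees
print(f"Field of view: {width_deg:.2f} x {height_deg:.2f} degrees")
```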
  3. The image you post is an example of what I mean to aim for. I've worked on lots of M31 datasets and am working with a guest on his new set at the moment. As I said earlier, I've invariably noticed that they tend towards a fairly monochromatic colour in the stack, more or less green or tending towards magenta (which is one axis). What we should ideally see is a yellowish-red, stellar population II core with traces of blue population I stars in the spiral arms. At first these colours are insufficiently colour-contrasted to stand out from the dull greenish initial appearance. The colours of the image in your link are often latent in the greenish data, as a crude boost of saturation will show - but saturation is not the way to extract them. I only use it as a test to see what is latent in the data. I use two methods to boost colour contrast in Photoshop, both saved as actions so they are one-click thereafter. I apply them as a layer so that the opacity slider lets me decide how much of the modification to retain. 1) The Lab colour trick. Image-Mode-convert to Lab Colour. Channels. Make 'a' channel active and go to Image, Adjustments, Brightness and Contrast and boost the contrast massively. I use 30. Make 'b' channel active and do the same, boost contrast to 30. Convert back to RGB colour. 2) The Soft Light Trick. Make two copy layers, so there are three layers in all. Change the blend mode of the top layer to Soft Light and flatten onto the middle layer. Give the middle layer a slight blur (about 0.6 on the Gaussian blur slider.) Change the blend mode to Colour. Flatten. On a really reluctant, colour-weak image I might run 3 iterations of each, alternating the actions. Sometimes only one iteration is needed. The results are much more subtle than Saturation boosts and don't provoke the colour noise of saturation either. Olly
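The Photoshop actions above don't translate directly to code, but the Lab colour trick can be sketched numerically: boosting contrast in the 'a' and 'b' channels amounts to stretching them away from neutral. A rough illustration using scikit-image; the boost factor and file names are placeholders, not equivalents of the Photoshop value of 30:

```python
# Rough numerical analogue of the 'Lab colour trick': stretch the a and b
# channels (the colour axes) while leaving L (luminance) alone.
import numpy as np
from skimage import color

def lab_colour_boost(rgb: np.ndarray, boost: float = 1.3) -> np.ndarray:
    """Increase colour contrast by scaling the a and b channels in Lab space.

    rgb is expected as floats in the range 0-1.
    """
    lab = color.rgb2lab(rgb)
    lab[..., 1] *= boost   # 'a' channel: green-magenta axis
    lab[..., 2] *= boost   # 'b' channel: blue-yellow axis
    return np.clip(color.lab2rgb(lab), 0.0, 1.0)

# Hypothetical usage on a stretched 16-bit image:
# from skimage import io
# img = io.imread("m31_stretched.tif").astype(np.float64) / 65535.0
# boosted = lab_colour_boost(img, boost=1.3)
# io.imsave("m31_boosted.tif", (boosted * 65535).astype(np.uint16))
```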
  4. Possibilities: The guide camera or scope dews up during the run. A cable or other obstruction starts to drag. You have a voltage issue, notably in the supply to the mount. Polar alignment is out, meaning that the effects of the error increase over time from when calibration was performed. (Note: I'm not absolutely sure about this one!) If you use PHD2 to guide, run the guide assistant. If you're not using it, try it. Olly
  5. Because I image from a dark site I don't have too much work to do on colour, but I do have a little. What I do (which is very little!) has also worked on guests' data from light polluted sites giving a bright orange wash-out. I just use dynamic background extraction on the linear data in Pixinsight. Sometimes I'll use automatic background extraction when I can't really find any neutral background in the image. This usually gets me very close, though often a green bias exists at least somewhere in the image. This can be fixed with SCNR green, applied as sparingly as possible. Being happier in Photoshop, I take the modified linear image into Ps for stretching. By placing a few colour sampler points (set to 5x5 average) on the image background sky, I can see if my background channels are equal, as I like them to be. I try to end up at 23/23/23 in RGB. If the background sky is right, I tend to find that the rest of the image will follow. I do think, though, that you will need one of the dedicated and astro-specific gradient tools in your workflow. Many astro processing packages offer something. M31 is an oddball on colour. Unless worked on, it tends to have a fairly nondescript, murky and monochromatic look, located somewhere on the green-magenta axis. Prising the reds away from the blues is the trick, but don't overdo it. Olly
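The same background check can be done outside Photoshop by averaging a few small patches of empty sky and comparing the channel means with the 23/23/23 target mentioned above. A minimal sketch; the file name and sample coordinates are hypothetical:

```python
# Average 5x5 patches at chosen background points and print the R, G, B means.
import numpy as np

def sample_background(img: np.ndarray, points, half: int = 2) -> None:
    """Print the mean R, G, B of a (2*half+1)-square patch around each (y, x) point."""
    for (y, x) in points:
        patch = img[y - half:y + half + 1, x - half:x + half + 1]
        r, g, b = patch.reshape(-1, 3).mean(axis=0)
        print(f"({y}, {x}): R={r:.1f} G={g:.1f} B={b:.1f}")

# Hypothetical usage on a stretched 8-bit RGB image:
# from skimage import io
# img = io.imread("stretched_image.tif")
# sample_background(img, [(100, 150), (900, 1200), (1500, 400)])
```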
  6. Under UK skies I wouldn't make imaging my main diversion, nor would I see it as an alternative to daytime sporting or intellectual activities. Even here in Provence, with hundreds of clear nights each year, I have a plethora of other entertainments available, some of them expensive and some not! If I lived in Cumbria, I would do a lot of golf-free golf (by far the best kind) by walking those hills and dales and relishing every kind of weather for its own charm. I'd take a camera to satisfy my picture-making itch, too. Remember, fate might have sent any of us to live next to Spaghetti Junction... Olly
  7. It is mainstream physicists, versed in physical laws, who have questioned the validity of our perception of space, just as they have questioned the validity of our perception of time. I won't buy the simulation hypothesis at any price for the same reason that I won't buy another well-known hypothesis which simply postpones the question of how all this began. I spend many happy hours exploring, vicariously, the depths of what we call space so I'm not dismissing it. It exists as a perception, at the very least. It just may not be quite what it seems. A book I read on time made the elegant point that the existence of a past, a present and a future was... a theory of time. The same can be said of our three spatial dimensions. Olly
  8. Yes, it matters with alt-az, especially if you do anything less than a 3 star alignment (which will calibrate out any tilt errors in the top of the pier). You say you level, but you don't! You tilt the RA axis to your equatorial angle, whatever that is. In my case that's 44 degrees from horizontal. Quite a lot! Olly
  9. The process you've described, which is perfectly sensible for visual observing, has no requirement whatever for a level base. Polaris can be centred in the polarscope from a pier top which is wildly off horizontal. If the polar axis points at Polaris, it points at Polaris and that's that. Who cares where your pier is pointing? This is an old story with a long history on SGL, but you can save time by concentrating on where your polarscope is pointing because that is what matters. Trust me: I'm an old gimmer! Olly
  10. We should not really think of tracking and guiding in terms either of exposure time or focal length. These give a very crude estimation of what goes on. We should think of the tracking-guiding accuracy as the extent to which a point of light on the sky is held in the same place on the camera chip. During one rotation of its driven worm wheel a good, budget mount will allow that point of light to drift by maybe 30 seconds of arc and back. Once autoguided, that may come down to 0.5 seconds of arc, or to 1 second at worst. That means you have improved your tracking accuracy by between 30x and 60x. In other words, it is transformed. When we get into the detail we find that the guiding error should broadly be no worse than half the image scale. A short refractor might be imaging at 3 arcseconds per pixel, so it needs to guide with an error of half that, 1.5 seconds. When your guiding is any worse than that you are beginning to move your camera relative to the sky and you are blurring the image. Although it isn't free, an autoguider performs a near-miraculous task. You'd probably need to spend an extra £5000 to go from a mount with a 30 second error to one with a 3 second error, a 10x improvement. An autoguider can give you a 30 to 60x improvement for a couple of hundred. Olly
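To put numbers on this, the image scale follows from pixel size and focal length, and the guiding target is then half of it. A short worked example; the pixel size and focal length are illustrative, chosen to land near the 3 arcseconds per pixel mentioned above:

```python
# Image scale in arcseconds per pixel, guiding target, and the improvement
# factor from autoguiding, using the figures quoted in the post above.
def image_scale_arcsec(pixel_size_um: float, focal_length_mm: float) -> float:
    """Approximate image scale: 206.265 * pixel size (um) / focal length (mm)."""
    return 206.265 * pixel_size_um / focal_length_mm

scale = image_scale_arcsec(pixel_size_um=5.4, focal_length_mm=370)  # ~3.0 "/pixel
guide_target = scale / 2                                            # ~1.5 "
improvement = 30 / 0.5                                              # 30" drift guided down to 0.5"

print(f"Image scale:      {scale:.2f} arcsec/pixel")
print(f"Guiding target:   {guide_target:.2f} arcsec")
print(f"Autoguiding gain: {improvement:.0f}x")
```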
  11. I wouldn't let the sensor shape trouble you since it's easy to crop a square out of whatever you have. If you like everything else about a particular camera I'd just do that, myself. In my view CCDs have not deteriorated since CMOS appeared, and we all know what great pictures they can make. However, the market may think that they're old hat and drive down prices radically. Once it's serviced I'll probably have my Atik 11000 up for sale in a big observatory shake-up, but I'm not expecting it to fetch very much. Essentially I think that CCDs may become real bargains in the foreseeable future. Olly
  12. It's certainly in a different league from any CCD OSC I ever tried. I think you might be right. Currently we have the ZWO edition and the TS version is due here this week. Both will be working at F2 so it will be hard to make comparisons with older rigs. Olly
  13. ...though in some rarefied regions of theoretical physics it has been suggested that space may be illusory. Now, can I imagine a spaceless universe??? And, if I could, where would I put it? Olly
  14. Heh heh. I'm really quite optimistic! I have, however, given up all hope of predicting the consequences of new technologies. The printing press enjoys a revered place in history - and it's hard to argue with that. (Even for me, and I like to think I can argue with anything. 👹) But... a reasonable person might consider the internet, with its social media, to be a liberating extension of the printing press, a kind of printing press for all. The reality is, as it seems to me, that it has turned into a means by which raving lunatics unleash themselves upon a collectively compliant collection of lunatic-worshipers. My optimism envisages, out there, a world in which two terrestrial phenomena never evolve: predation and technology. How serene that might be! It might even be sufficiently sublime to make the possibility of eternal life just about bearable! Olly
  15. Along with two of my robotic shed clients, Paul Kummer and Peter Woods, I'll be joining this club. We're jointly funding a TS 2600 OSC CMOS camera, the Samyang, a ZWO focus motor and a lot of bracketery including FLO's adapter for eliminating the tilt-prone bayonet fixing. We aim to use the system at F2 and, on some targets, enhance regions of interest with higher res panels from the RASA 8 with its 2600 camera as well. I processed some data from another copy of this lens and thought it stunning, especially when using StarXTerminator to hold down star sizes ruthlessly. This is Paul's Samyang data with my telescopic data for the Cone and Rosette applied for enhancement. Olly
  16. Yes. The fixed part of the mount is tilted N-S anyway, to your local latitude's angle with Polaris. The E-W axis just affects the time, which is re-calibrated by the star alignment. The very EQ6-like Takahashi EM200 mounts cannot be levelled since the legs are of a fixed length. If doing a drift alignment, a level mount reduces the interaction between axes during the alignment process, but not by much. Time would be better spent refining polar alignment than in aligning your pier with the centre of the Earth! That's all you're doing, in the end... Olly
  17. A surprising thought: might high technology adversely affect the evolution of high intelligence? Might higher intelligence be more likely to evolve without technology than with it? There is a debate in evolutionary biology about whether human evolution has stopped, slowed down or speeded up. One school says that we have learned to modify our environment, using technology, so that we no longer need to adapt to it. This makes me wonder whether, in 'outsourcing' many of our mental functions to computers, we have removed a key selection pressure driving the evolution of intelligence. We might argue that the use of IT drives new kinds of mental function among the engineers, and it might, but they are a minority who might not out-breed the computer-dependent majority. It's an entertaining thought that technology might put an upper limit on the evolution of intelligence. Usually it's assumed that high technology goes with high intelligence. Maybe not! Olly
  18. When I was in my teens my father and I set out, systematically, to find out if there was any way at all that we could suggest going for a walk without our dog spotting it. There wasn't. There just wasn't. We never succeeded. I do believe there's something in this. Olly
  19. Is the practice of thinking indulgent on a forum devoted to astronomy? I'd have thought that it might even be regarded as acceptable on a forum devoted to shopping! Olly
  20. I don't think that extraterrestrial life has to be all that different from our own life. I think it might well be quite similar. However, I also think it might be utterly and completely different. These are not alternatives: the universe, spacious hotel that it is, might have room for both. Regarding communication, though, I have to come back to the point that we cannot meaningfully communicate with chimps who are incredibly similar to ourselves. Surely this must give us pause. We are far too used to American actors pretending to be aliens and being even more like us than chimps! Olly
  21. I agree. I think it's wonderful that what is imaginable has expanded. Long may it continue. Two things can end this expansion: 1) reaching the point of being able to imagine the whole of reality and 2) losing the necessary power of imagination. I don't like the sound of either of them. I'm not sure which would be worse but I suspect, in my heart of hearts, that they would be one and the same thing because I don't believe we can imagine all that there is. Sure, but if it's a lot simpler, all you need to do is tell us how. How do we imagine things far outside our own experience? Olly
  22. OK. So the question is refined, but perhaps it becomes rather academic? If you mean by 'in principle', imaginable by beings other than us, there is precious little difference between your point and Michael's when he suggested that some life forms might be unimaginable. I assume he meant by us. However, if you meant that we have reached the point of being able to imagine everything, I'd have to ask why you think this. Why have we reached this infinity when chimps have not? Of course, I don't think you meant either of these things! What I don't understand is where you draw the line between 'imaginable' and 'imaginable in principle.' My view is that humans may not be capable of imagining all that can be imaginable in principle. (Phew, it has taken me a long time to get to a point which I might, with more wit, have reached some time ago...) Olly
  23. I doubt that it would be significant, though. Olly
  24. But could you imagine all of them? And, if you had, how would you know? I think, going back to Michael's point that some things may be beyond imagination, your position and his may be equivalent. We can imagine an unlimited imagination (because we can say 'unlimited imagination') but that falls short of actually having one - or of proving that anyone has one. Wouldn't it be profoundly unscientific to say, I can imagine anything? How could it be falsified? When you asked whether or not there was any evidence for a limit upon the imagination, I guess my answer should have been, 'No, how could there be? Science doesn't work by proving negatives.' Olly
  25. What would be the difference between these statements? I don't know what it is, but I can imagine it. I cannot imagine what it is. Doesn't your position require you to distinguish between them? Olly