Everything posted by ollypenrice

  1. No problem. I've done something similar myself, i.e. used other images to estimate where the DBE samplers could realistically be placed. It's reassuring to see more agreement between the images, too. Olly
  2. Pinched optics would align with a recent onset of the problem if the temperature has been dropping. Olly
  3. I'm a lot less persuaded than some that artificial intelligence exists at all. Clearly the term does refer to something new in the way that our machines operate, but should it be called intelligence? It belongs in a long tradition of using anthropomorphic terms in computing. 'Memory,' for example, is used of a computer but not of a page of text - yet my dictionary 'remembers' the meaning of every word printed in it. Anyway, regarding the 'automatic image' which some see as lying in the future (and it might well do so) we face a question which is already very familiar to us. If we want to listen to music we can press a button and listen to a recording. If we want an excellent meal we can go to a restaurant. So does this mean we won't want to play an instrument or cook a meal? Surely not. We do these things, and many other things, because we like doing them. If all we want is a picture of M31, there are thousands of them we can have in front of us in seconds and for free. But, surely, what we really want is to make the picture ourselves, no? Olly
  4. Humans seem bent on finding another way of eliminating themselves, since nuclear weapons did not deliver on this task as quickly as it had been supposed that they would... Olly
  5. That's right. The day will probably come when a camera can replicate precisely the intensity and colour produced by light falling on the eye. This will hardly make an old master. The masters see something in what they are looking at and paint not what is there, but what they see. Olly
  6. ^^Yesss! Auto-processing and I'm out. Processing is the bit I enjoy, the enjoyment coming in two flavours. 1) I feel like an archaeologist combing through the data to find what it contains. This is thrilling. It's like observing when you patiently work away at what you can see in the eyepiece. 2) I like to make a picture. Note, a picture. I don't mean an invisible window onto reality - because I don't believe that's possible anyway. I want my picture to be a picture. In my stumbling forays into regular photography I want the same thing. What I'm looking at is one thing, the picture I make of it is another. I want it to say something about what I was looking at but I want it, also, to look like a picture. I never expect it to be what I'm looking at. The difference is subtle but it exists. This is a picture. It looks like a picture. It doesn't quite look like what I see when I stroll down the road and look in that direction and that's what I wanted.
  7. I think that all this confusion is linguistic in origin. Vlaiv posted a visual representation of the lightpath and, years ago, when this issue came up, I drew myself a ray diagram and came to the same conclusion. It's in the verbal descriptions that the ambiguities arise and, from memory (and this may be unjust), it was a piece of text on the QSI website which was ambiguous. Trust the drawing. Olly
  8. Here's chapter and verse from OPT. Scroll down for the section on filters and backfocus. The filter increases the backfocus by a third of its thickness, so you need to add a spacer of that amount to compensate; a worked example follows below. https://optcorp.com/blogs/deep-sky-imaging/how-to-set-the-correct-back-focus Olly
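     For anyone who wants to check the arithmetic behind the 'one third' rule: a plane-parallel filter of thickness t and refractive index n pushes the focal plane back by t(1 - 1/n), which is about t/3 for ordinary glass (n ≈ 1.5). The Python sketch below is illustrative only; the 3 mm filter thickness and the index are assumptions, not figures from the OPT article.

         # Back-focus shift from a plane-parallel filter.
         # Assumes ordinary glass with n ~ 1.5, so the shift is ~ t/3.
         def focus_shift(thickness_mm: float, n: float = 1.5) -> float:
             return thickness_mm * (1.0 - 1.0 / n)

         # Example: a 3 mm filter calls for roughly 1 mm of extra spacing.
         print(f"{focus_shift(3.0):.2f} mm")  # -> 1.00 mm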
  9. I agree. Being over-sampled is no big deal and you'll get a cleaner image if you downsample to a size which realistically matches your seeing. In doing so you won't lose any real resolution, only the 'empty resolution' of over-sampling. Olly
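     To put numbers on 'realistically matches your seeing': image scale in arcsec/pixel is 206.265 × pixel size (µm) ÷ focal length (mm), and a common rule of thumb is about two pixels per FWHM of seeing. The sketch below is illustrative; the pixel size, focal length and seeing figures are assumptions, not anyone's actual setup.

         # Estimate how far over-sampled a setup is, and a sensible downsample factor.
         def pixel_scale(pixel_um: float, focal_mm: float) -> float:
             # arcsec per pixel
             return 206.265 * pixel_um / focal_mm

         scale = pixel_scale(pixel_um=3.76, focal_mm=1000)  # ~0.78"/px (illustrative)
         seeing_fwhm = 3.0                                  # arcsec, assumed typical seeing
         target = seeing_fwhm / 2.0                         # ~2 samples per FWHM (rule of thumb)
         print(f"{scale:.2f}\"/px now; ~{target:.2f}\"/px would do; downsample x{target / scale:.1f}")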
  10. The whole business of linking sensor size to focal length in the term 'crop sensor' leads us right up a gum tree! The idea that a crop sensor increases focal length is false. How can it? Focal length is a property of the glass in the lens. The term is useful in regular photography as a way of comparing the image produced by the same lens when used in full frame or APSc cameras. Clearly the APSc field of view is reduced but, more importantly, there are effects of perspective which also change. On a full frame camera, features in the foreground and background will have the same relative size as the naked eye view in a 50mm lens. The 1.6x formula allows a photographer to work out how to retain those proportions with a smaller sensor. None of this applies in AP. The only effect your chip size has is to increase or reduce your FOV. Also, pixel size, not the megapixel count often quoted in normal photography, defines your resolution. Olly
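     A small sketch makes the point concrete: with the same 50mm lens, only the field of view changes between sensor sizes, never the focal length, and the familiar 1.6x figure is simply the ratio of the sensor widths. The sensor dimensions below are the usual full-frame and Canon APS-C widths; the lens choice is illustrative.

         import math

         # Angular field of view along one sensor dimension, in degrees.
         def fov_deg(sensor_mm: float, focal_mm: float) -> float:
             return math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm)))

         focal = 50.0                        # the same lens on both bodies
         ff = fov_deg(36.0, focal)           # full frame: ~39.6 deg
         apsc = fov_deg(22.3, focal)         # Canon APS-C: ~25.1 deg
         print(f"FF {ff:.1f} deg, APS-C {apsc:.1f} deg, crop factor {36.0 / 22.3:.2f}")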
  11. I think it's done by finding the radial velocity of the stars from their Doppler shift. When the stars of a cluster share a radial velocity and an approximate distance, it's a safe bet that they are moving together and, therefore, formed together. If their proper motion is fast they would have moved well away from their birthplace and from any relic hydrogen left over from then. On top of that, hydrogen is strikingly absent in this region. There is Extended Red Emission (ERE) but no significant Ha. Compare this with what we see in the Rosette Nebula which surrounds the cluster NGC2244. Finally, and this may be an amateurish remark, I think we can see in the dust surrounding the M45 cluster a kind of wake left by the stars as they move through it. Olly
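     For reference, the radial velocity estimate mentioned above comes straight from the Doppler relation v = c × Δλ/λ. The sketch below is only illustrative; the wavelengths are made-up numbers around the H-alpha rest line, not measurements of the Pleiades.

         C_KM_S = 299_792.458  # speed of light, km/s

         # Non-relativistic Doppler estimate; positive = receding, negative = approaching.
         def radial_velocity(rest_nm: float, observed_nm: float) -> float:
             return C_KM_S * (observed_nm - rest_nm) / rest_nm

         # Illustrative only: a line near H-alpha (656.28 nm) observed slightly blueshifted.
         print(f"{radial_velocity(656.28, 656.27):.1f} km/s")  # ~ -4.6 km/s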
  12. I wouldn't want to offer the image that I took with Paul Kummer as being somehow definitive, but it was taken as a mosaic in which four of the panels met at the cluster itself. Since vignetting is panel-specific and the mosaic went together happily, I'm inclined to think that there is no dark region around the cluster, though the idea of radiation pressure creating one is plausible in principle. I've always found the same shapes in the outlying dust clouds whenever I've done M45 in widefield. It is a popular target with guests. ABE, in particular, is prone to taking background samples which are too close to a bright central object. This often happens when it's applied to galaxy images. The result is that it over-corrects on the basis of having read outer glow as background sky. This may well have happened in your (excellent) image. Increasingly, I follow the advice of Rogelio Bernal Andreo and place as few background samples as possible in DBE, keeping very well clear of any nebulosity. The idea is to find the broad gradient, not localized lightpath irregularities. Olly Edit: another candidate for your dark circle would be over-correcting flats, which are quite common. If this were the cause, you'd also get over-correction of dust bunnies, but there's no obvious sign of that. Perhaps you didn't have any significant dust bunnies on your flats, though? If you did, the fact that they are not over-corrected in the image suggests that over-correction won't be the cause.
  13. This image's contribution to the field of view is entirely new to me and entirely admirable. I never suspected it. I think that what I posted some time ago was a reference to the distinctively shaped and very extended outer glow of the main galaxy. I wasn't thinking of anything like this and hats off to the team who found it. Great stuff. Olly
  14. There is a problem here! An image which doesn't stick out of the crowd too much does stick out of the crowd - because there is not much wrong with it. AP being very difficult, most images, mine certainly included, have quite a lot wrong with them. Producing a clean, simple, unforced, slightly understated-looking image is, in fact, phenomenally difficult. But it looks like we agree that it's something to aim for. Olly
  15. You explained it perfectly. Stars shouldn't look 'stuck on' and, sometimes, a degree of artifice is required to make them look natural. And round we go again!! Olly
  16. A very fluffy grenade, Martin, or so I hope! Lest anyone think otherwise, I really do think this challenge was a great idea, and a timely one. There was no irony intended. Regarding the difficulty, I've said for over a decade that the background sky and the stars are the hardest parts of an image to get right and, when you do get them right, the rest will follow. Olly
  17. What is natural? The question is ambiguous. Does it mean What is natural in the physical universe? or What is natural in the perceived universe? If the former then, as Steve says above, stars should not occupy more than a pixel each and the starless image is, by this definition, the most 'natural.' If we are talking about the perceived universe, however, the stars will have dimension in accordance with their visual magnitude. But... in the perceived universe, who is doing the perceiving? In the case of imaging, the answer is The camera. What does the camera perceive? This: Wow, that's a winner, eh? £13,000 well spent! So now we say, Well no, it's still natural if you stretch it... And move the black point... And use noise reduction as long as it still looks natural. And tweak the colour balance... And remove gradients... This argument is beginning to look a bit lame, don't you think? Introducing a simplistic dichotomy like Science or art? does the conversation no favours because imaging does not have to be either, and because the use of art-like techniques can enhance the degree of objective information contained in the image. Take the image, above, of... erm... 👹 If you don't know, ask yourself how much information it contains. In my image processing I am trying to extract from my data that which allows me to emphasize selected aspects of the view collected by the setup. And, surely, so is everyone else. Emphasis, by definition, cannot be applied to every aspect of the data, so the emphasis I lay upon the stars is my decision, one of dozens during the processing. There is no resemblance between emphasizing and inventing, inventing being acceptable in art but not in AP. As a means of star control, removal and replacement has one huge advantage: it preserves, if done properly, the relative dimensions of the stars as captured, in a way that previous methods could not. Olly
  18. Agreed on all counts - including not taking the camera off the RASA!! Olly
  19. I very much like your image with its frank and effective emphasis of the stars, and I think that the original challenge was an excellent and timely idea. However, would it be possible to have this discussion without such loaded and prejudicial language? Dismissing the work of others as a 'craze' is not, in my view, what we do on SGL. I have no personal axe to grind, here, since I have never posted a starless image. However, I certainly don't think that the stars always belong at the forefront of every image. Have you imaged IC342? Do you think that the stars should be at the forefront of IC342? This is a genuine question and this whole discussion might be best located elsewhere. I heartily applaud Fegato's Cascade. A very good call on the balancing of the stars against the background. Olly
  20. Sometimes, but not all that often, refractors have easily collimatable front cell assemblies in which the whole group of elements can be tilted as one. I've seen some Altair Astro triplets which use antagonistic pairs of screws, parallel with the scope's axis and accessible from the front. These scopes are sold under other brand names as well. I've also collimated the front doublet element of a TeleVue Genesis on which the technique is to back off three radial screws round the lens cell, move the cell by hand, not with the screws, and fix it in position with the screws. I personally would only attempt these operations on a bench while observing an artificial star, and on an instrument with user-friendly adjustment facilities - which are the exception rather than the rule, I think. What I would not do at any price is attempt to collimate individual elements within the cell. If the scope gives a good star test I wouldn't do it at all. Olly
  21. I've enjoyed some memorable talks from great speakers. I never saw the trade show as the main attraction and, these days, I have one of those in my back garden! 🤣 Olly
  22. What you're trying, here, seems to me to be analogous with LRGB imaging in the sense that the mono camera catches more signal but does not distinguish between gases/colours. You are relying on the Bayer matrix of the OSC camera to make that distinction. The question is, how much of this distinction do you need and will this system provide enough of it? Your result strikes me as promising, since it bears a strong resemblance to the colours and colour separation I obtained using Ha OIII LRGB. (I remember being very surprised by the colour of the OIII outer shell but yours is similar.) A related question would be, Does the extra signal of the mono camera over the OSC compensate for its inability to add to the colour differentiation? I'd also be thinking that, in what is close to a natural colour image, natural star colour would be nice. Olly