

ollypenrice

Members · Posts: 38,132 · Days Won: 304

Posts posted by ollypenrice

  1. 1 hour ago, Elp said:

    I was a bit confused by this, unless you're saying the old data is only in stretched, combined LRGB form. I have in the past imaged RGB only with a DSLR and then combined mono lum afterwards in PS as a luminance blend, with that layer on top. What it did was tighten up the blurry RGB layer.

    Alternatively with the old data if it's still linear could you not split the channels, stack together with the new split RGB data, then recombine into RGB with lum on top? Just waffling ATM as I don't know what data you have at hand.

     

    14 minutes ago, tomato said:

    Ah, not so straightforward if you don’t have the original linear stacks.

    I do have the linear stacks but have never applied L over RGB at the linear stage because I have always wanted to process them differently, the L for deep stretch and high detail, the RGB for low noise with detail unimportant.

    I'll see what happens...

    Olly

    • Like 1
  2. 5 minutes ago, tomato said:

    I must be missing something here, but why can't OSC RGB be combined with LRGB at the linear stage? I'm now using an ASI178 mono for Lum and an ASI678c for RGB; previously I was using another ASI178 mono with filters to capture RGB alongside the Lum on the dual rig. In PI I channel-extract the OSC stack to produce R, G, and B files; these are registered and stacked with the R, G and B stacks from the mono-and-filters camera. They are at different imaging scales, but APP handles them, no problem. Then the combined RGB stacks are registered and combined with the original lum stack from the mono camera.

    Here is an example of just such a combination:

    [Attached image: Image08AP.jpg]

    This is a very good question and, while I was typing, a little worm in my ear said, 'Have you ever tried it?' I haven't - and that is a very bad reason for not trying it now! I don't think I'll try it in PixInsight because I'd end up in a psychiatric ward, but I'll try it for sure. Your image is great. I have this object myself but have never been happy with it.
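
    A minimal numpy sketch of the per-channel workflow tomato describes, assuming the stacks are still linear and already registered to a common pixel grid (the exposure weights and toy arrays are purely illustrative):

```python
# Hypothetical sketch: split an OSC stack into R, G, B, combine each channel
# with the corresponding mono-camera stack by exposure-weighted average,
# then recombine into a single linear RGB image.
import numpy as np

def combine_channel(osc_chan, mono_chan, osc_hours, mono_hours):
    """Exposure-weighted mean of two registered single-channel linear stacks."""
    return (osc_hours * osc_chan + mono_hours * mono_chan) / (osc_hours + mono_hours)

# Toy data standing in for real linear stacks (H x W x 3 OSC, three mono stacks).
rng = np.random.default_rng(0)
osc = rng.random((100, 100, 3)).astype(np.float32)
mono = {c: rng.random((100, 100)).astype(np.float32) for c in "RGB"}

r, g, b = (combine_channel(osc[..., i], mono[c], osc_hours=4, mono_hours=6)
           for i, c in enumerate("RGB"))
rgb = np.dstack([r, g, b])   # linear RGB, ready for the lum layer later
print(rgb.shape)             # (100, 100, 3)
```

    In PI the split/recombine steps would be ChannelExtraction and ChannelCombination; the sketch above just shows the arithmetic.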

    Olly

    • Thanks 1
  3. 1 hour ago, vlaiv said:

    At this point in time, I'm finding it much easier to answer what drives me away from SGL rather than to it.

    It's become part of my life, so I'm not thinking in terms of being drawn to it (much like you don't think about what draws you to your family: it's your family, you know). However, there are things that I must actively combat in order not to be driven away from it.

    I was distinctly unimpressed by another member accusing you of talking nonsense, just recently. You do not talk nonsense and, even if you did, SGL is not a place where that kind of remark is acceptable.

    Which brings me directly to why I like it here. It's so civilized.

    And helpful. There are lots of areas in which I have no competence at all, notably IT and electronics, and I can post a question and get an answer here within an hour.

    It's also a chance for me to chat in my native language, which is relaxing.

    Olly

    • Like 16
  4. 10 hours ago, osbourne one-nil said:

    Ah - so it's purely the aperture which improves things, not the f ratio. So my 8" SCT should be as effective in theory? I guess it then comes down to collimation and optical quality?

    Yes - that makes perfect sense. You've done well; it's not easy making me understand something!

    The active ingredient in 'F ratio' is aperture. That's why, in regular photography, aperture is usually referred to as 'F stop.' This is OK when neither the focal length nor the pixel size is being changed while the aperture is being changed by opening or closing the diaphragm. There is no 'F ratio myth' here - but there is no diaphragm on a telescope!

    The F ratio myth creeps in when an astrophotographer becomes fixated on F ratio while 1) comparing scopes of different focal length and 2) ignoring pixel size.

    When comparing two setups of the same focal length and with the same camera you can rely on F ratio to indicate the speed of capture because aperture is the only variable.

    It's probably worth adding that resolution (of fine detail) and signal strength are, in theory, different entities. In real world imaging, though, they have an interconnected relationship. Few of us expose for long enough to get the absolute best out of our systems. Many fine details in galaxies are quite faint. The more signal we have, the more we can sharpen - and that's where signal strength impacts upon the final resolution of detail and, if we are honest, perhaps, the final impression of detail. This is where a faster system will be a benefit.  (Here I'm using Vlaiv's definition of a faster system, not just a faster F ratio.)
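
    As a back-of-an-envelope sketch of the point above: the signal rate landing on one pixel scales with (pixel size / F ratio)², so aperture and pixel scale both matter. The numbers below are illustrative assumptions, not measurements:

```python
# Relative photon rate per pixel (arbitrary units): (pixel_um / F_ratio)^2.
def rate_per_pixel(aperture_mm, focal_length_mm, pixel_um):
    f_ratio = focal_length_mm / aperture_mm
    return (pixel_um / f_ratio) ** 2

pixel = 3.76  # um - an assumed, typical modern CMOS pixel size

# Same focal length and camera: doubling the aperture quadruples the rate,
# exactly as the F-ratio comparison predicts (aperture is the only variable).
print(rate_per_pixel(200, 1000, pixel) / rate_per_pixel(100, 1000, pixel))  # 4.0

# Across different focal lengths, pixel scale matters too: an F/15 scope with
# 5x-binned (5x larger effective) pixels matches an F/3 scope, same camera.
print(round(rate_per_pixel(100, 1500, 5 * pixel), 6) ==
      round(rate_per_pixel(100, 300, pixel), 6))  # True
```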

    Olly

    • Thanks 1
  5. 21 hours ago, Elp said:

    For you, sir, I would have thought it'd be child's play.

    The problem is that the RASA will be OSC and the present image LRGB. The ideal time to combine them is at the linear stage, but the L and RGB can't be combined before stretching. Perhaps the OSC could be copied into one luminance version and one RGB version, the present L attached to the luminance version and the present RGB to the RGB version. These concoctions often sound possible on paper but might give nothing but nonsense in reality!

    Olly

  6. 1 minute ago, vlaiv said:

    I wonder why people insist on the "speed" of the telescope as being the crucial thing when imaging galaxies?

    Most galaxies are very small in angular size - maybe a dozen arc minutes at most. That is about 700 px or less across the image of the galaxy if one samples at the highest practical sampling rate for amateur setups, which is 1"/px.

    Now take any modern sensor that has more than 12 MP - that is 4000x3000 px or more. At 4000 px / 700 px ≈ x5, the sensor is at least x5 larger than you actually need in terms of pixels. You can bin x5 and still capture the galaxy in its entirety, plus some surrounding space.

    Btw, binning x5 will make an F/15 scope work as if it were an F/3 scope - so what is the point of going for an F/4 Newtonian when you can comfortably use a compact Cass type - be that SCT, MCT, RC or CC - and produce an excellent galaxy image?

     

    My take on this would be: get the largest aperture that you can afford and comfortably mount and use, and adjust your working resolution to the 1-1.2"/px range for the best small-galaxy captures.

    I take this point but have little experience of binning with CMOS cameras. What's your view on how this should be done and how effective it is?
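
    For what it's worth, the usual software approach is simple block averaging of the calibrated linear frames; a minimal sketch (the x5 factor and test frame are just illustrations):

```python
# Software binning for a CMOS stack: average each k x k block of pixels,
# trading resolution for per-pixel signal-to-noise. Assumes a 2-D linear image.
import numpy as np

def software_bin(img, k):
    """Bin a 2-D image by a factor k using block averaging."""
    h, w = img.shape
    img = img[:h - h % k, :w - w % k]                    # trim to a multiple of k
    return img.reshape(h // k, k, w // k, k).mean(axis=(1, 3))

lum = np.arange(25, dtype=np.float64).reshape(5, 5)
print(software_bin(lum, 5))   # [[12.]] - the mean of all 25 pixels
```

    Summing instead of averaging scales the numbers differently but gives the same signal-to-noise result.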

    Olly

  7. 56 minutes ago, vlaiv said:

    That is really not the reason why images turn out green :D

    It has nothing to do with the number of pixels, but with relative sensitivity in different parts of the spectrum.

    Most sensors have their strongest QE in green, and this will result in a green cast if the image is not colour corrected (often wrongly referred to as white balance). Some cameras have their peak in the red part of the spectrum (usually those very sensitive in IR) - and those produce reddish-tinted images.

    @LaurenceT

    You have "white balance" controls in your capture software which you can tweak to get a colour-neutral image. Alternatively, do the colour balance afterwards in software, at the processing stage.

     

    Agreed. I don't find any difference in green noise or balance between OSC and mono captures.

    If the image really is strongly green then an incorrect debayer is very likely.
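
    A minimal numpy sketch of the colour-correction step vlaiv describes - scaling R and B so a statistic of each channel matches green. Using the channel medians as the neutral reference is an assumption for illustration:

```python
# Neutralise a colour cast by scaling each channel so its median matches
# the green channel's median (gain values here are derived, not hard-coded).
import numpy as np

def neutralise(rgb):
    """Scale R and B so each channel's median equals the green median."""
    med = np.median(rgb, axis=(0, 1))   # per-channel medians
    gains = med[1] / med                # green's gain is exactly 1.0
    return rgb * gains

rng = np.random.default_rng(2)
img = rng.random((32, 32, 3)) * np.array([0.8, 1.3, 0.7])   # green-heavy cast
balanced = neutralise(img)
meds = np.median(balanced, axis=(0, 1))
print(np.allclose(meds, meds[1]))   # True - channel medians now agree
```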

    Olly

  8. 12 hours ago, Elp said:

    Looks good, don't know if it'll be worth pointing the RASA in the vicinity as it'll be small in the FOV, but could possibly be used to blend that faint outer region into the above.

    It's on the list. A marriage made in Heaven or a processing task from Hell???

    Olly :grin:

    • Haha 1
  9. A recent thread on the tidal extensions of M63 made me go back to 25 hours of old linear data (TEC140/Atik 460). The problem, of course, is to try to wring the neck of the faint extensions while preserving a decent core. Well, I'm not sure. Opinions welcome!

    [Attached image]

    Olly

    • Like 21
  10. 2 hours ago, AstroGS said:

    @ollypenrice this is how it looks after (my personal best) stretching. I tried, through the available PixInsight tools (that I know of), to remove the halos but, since they are integrated in the nebulosity, I am finding it difficult to remove them without leaving a residual shadow.

    [Attached image]

    I don't see why the version below wouldn't work.

    [Attached image: OIIIPsrepairweb.jpg]

    This was given a soft stretch in Ps Levels and de-starred in StarXt. The artifacts from the two big stars were lassoed and given Content Aware Fill before the stretch was taken further using custom shaped curves. Terrible shame to throw it away!

    (If it would be any good to you I could dropbox you the 16 bit Tiff. Just PM me an email address.)

    Olly

     

    • Like 2
    • Thanks 1
  11. Are you absolutely sure the OIII can't be used? I'm a bit of a hard-core 'fix it and use it' merchant! Could you not stretch it, de-star it and use Content Aware Fill on the haloes? I don't think there would be any need to put the stars back since you could use the OIII as a colour layer while leaving the stars as they are.

    It is a lovely image and, as you say, dramatic because the colour change brings the Jellyfish forward, visually. It's just a shame to lose the outer OIII shell.

    Olly

    • Like 2
  12. 38 minutes ago, dciobota said:

    So this kinda makes me wonder how the 135 Sammy would do wide open in that area for about 20 hours or so?  My thoughts would be that at least from my skies gradients would probably be a big issue, although facing northish is my darker sky area.  Be a worthy project for someone with really dark skies.

    I think that, at Dec 44, we're probably going to pick up IFN, which complicates matters, maybe?

    Anyway, interesting stuff.

    Olly

  13. My interpretation would be a little different. I'm inclined to think that the arms outlined in blue might be normal spiral arms lying in the galactic plane.

    The loop outlined in red seems a little odd since it leaves the galaxy and returns to it. The solid red line suggests this strongly. The closely dotted red line might be part of the same loop and the sparsely dotted line is pure speculation. My hunch (no more than that) is that the red loop lies a little out of the galactic plane. From the point of view we have here, I'd put the solid red loop as rising above the galactic plane and the closely dotted loop as dropping below it. In this interpretation the red loop would be the result of interaction.

     

    [Attached image: M63ExtensionsANNOTATED.jpg]

    In cross section it would look like this, the galactic plane in blue and the loop in red, as above.

    [Attached image: Possiblesectiopn.jpg]

    There is pretty good agreement between three independently captured and processed images here, and I found that neither StarX nor BlurX had any effect on the faint structures. I used DBE in PI and then stretched in Ps. I began with a pure log stretch, pushed too far, and kept it for reference. My final stretch often involves kinking the curve to emphasize contrasts, but I check it against the reference stretch to avoid invention. My rule is: 'Emphasis, yes; invention, no.'

    Olly

    I'm not quite sure what you're asking. Are you suggesting that you treat your first stack as one light which you then put into a set of new lights as if it were one new sub? If so, you can't do that: the stacking software would treat the whole stack as a single sub and weight it accordingly.

    To save time, you can make a new stack and then combine that with the existing stack, weighted according to exposure time. If your first stack had 2 hours and your second had 1 hour, you'd weight them 2 to 1.
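
    The 2:1 exposure weighting above, as a one-liner on registered linear stacks (the arrays below are stand-ins for real data):

```python
# Combine two registered linear stacks, weighted by total exposure time.
import numpy as np

def weighted_combine(stack_a, hours_a, stack_b, hours_b):
    """Exposure-time-weighted mean of two registered stacks."""
    return (hours_a * stack_a + hours_b * stack_b) / (hours_a + hours_b)

a = np.full((10, 10), 2.0)   # stand-in for the 2-hour stack
b = np.full((10, 10), 5.0)   # stand-in for the 1-hour stack
out = weighted_combine(a, 2, b, 1)
print(out[0, 0])   # 3.0  (= (2*2 + 1*5) / 3)
```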

    Olly

  15. OK here is my lum data processed only to reveal any outer extensions. It has 14x30 mins, average combined with 36x15 mins (16 hours). I gave it an initial stretch, de-starred it, noise reduced it and then made a copy which I equalized in Ps. This produces absolutely extreme contrasts and brings out the faintest data. I then used the equalized version as a layer mask and stretched again through that.
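
    A rough numpy analogue of the equalise-then-mask trick just described, to make the mechanics concrete: the aggressively equalised copy acts as a mask, so the extra stretch lands most strongly where the mask is bright. The gamma value and test frame are illustrative assumptions, not the actual Ps settings:

```python
# Stretch an image 'through' an equalised copy of itself: histogram-equalise
# to get an extreme-contrast mask, then blend a hard stretch with the
# original, weighted by that mask.
import numpy as np

def equalise(img, bins=65536):
    """Histogram-equalise a linear image to [0, 1]."""
    hist, edges = np.histogram(img, bins=bins)
    cdf = hist.cumsum() / img.size
    return np.interp(img, edges[:-1], cdf)

def masked_stretch(img, gamma=0.3):
    """Blend a hard gamma stretch with the original through the equalised mask."""
    mask = equalise(img)
    stretched = img ** gamma
    return mask * stretched + (1 - mask) * img

rng = np.random.default_rng(3)
faint = rng.random((64, 64)) ** 4     # mostly-dark test frame
out = masked_stretch(faint)
print(out.shape)                      # (64, 64), values still in [0, 1]
```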

    I would say we have three clear spiral arms on the right, the uppermost one having an angular bend rather like the arms of M101. What is perhaps more relevant to the consequences of an interaction is the broad band of luminosity which seems to be drawn from the lower edge of the galaxy and out of the field of view in the lower right-hand corner. This is the area I struggled with first time around. This time I think it might be a long, broad tidal extension, though the two bright clumps on the edge of the frame here will be starglow from two bright stars. The wider FOV and greater speed of the RASA rig ought to make or break this hypothesis. To be continued.

     

    [Attached image: M63Extensions.jpg]

    Olly

    Edit: Trying to turn this level of stretch into a presentable image isn't easy. I think the RASA will do better.

    [Attached image: M6325HrTECRPweb.jpg]

    • Like 6
  16. My M63 is here.

    [Attached image]

    I'm not totally certain about which outlying features you mean, so maybe you could annotate them on your image?

    Mine is quite an old image, processed before the X-suite was available, and I've long been meaning to revisit the linear data. This thread is a good prompt to do so. From memory I was far from confident while distinguishing extensions from gradients, particularly in the lower part of the image, I think the lower right in this orientation. I'll see what happens with a new look today, assuming I can find the original linear stacks.

    I've also been thinking about shooting this in the RASA, which would have two advantages. 1) The photographic speed of the setup would find more faint signal and 2) the wider FOV would give the gradient removal software far more information about the structure of the gradient to be removed.  Naturally it would be nice to rely on flats for separating signal from uneven illumination but, for whatever reason, we rarely can.

    Olly

    • Like 3