Posts posted by Datalord

  1. Another bin4 target. I'm really enjoying the fast acquisition, but it just isn't as easy to process as the bin2 or bin1 data I usually use. That said, it's a pretty good result for 4 hours of acquisition.

    514731362_tupipnebula.thumb.jpg.9aed865e7ddc41942abedd049b363283.jpg

    image.png.3682a132d1ed9ac67fc5f75316eee633.png

    • Like 5
  2. I have a related problem, although it's not about focus, but I think my solution to the problem is applicable for you.

    My narrowband filter for Ha is 3nm, while O3 and S2 are 8nm. The obvious issue is that 8nm lets in a lot more light, so the stars bloat a bit more than in my Ha subs. I solve it by putting the stacked S2 and O3 through a MorphologicalTransformation in PI to reduce the star sizes to match the Ha before I combine the images.
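    For anyone without PI, the same star-shrinking idea can be sketched in Python with a grayscale erosion. This is only an illustrative approximation of the technique, not PixInsight's actual MorphologicalTransformation; scipy is assumed, and the test frame is synthetic:

    ```python
    import numpy as np
    from scipy.ndimage import grey_erosion, gaussian_filter

    def shrink_stars(img, selem_size=3, blend=0.7):
        """Shrink bright point sources with a grayscale erosion, then
        blend with the original so the result isn't over-processed
        (same spirit as a morphological transformation in PI)."""
        eroded = grey_erosion(img, size=(selem_size, selem_size))
        return blend * eroded + (1.0 - blend) * img

    # Synthetic frame: one bloated 'star' on a flat background.
    frame = np.zeros((64, 64))
    frame[32, 32] = 1.0
    frame = gaussian_filter(frame, sigma=4.0)  # bloated 8nm-style star

    tight = shrink_stars(frame, selem_size=5, blend=1.0)
    ```

    A `blend` below 1.0 keeps the reduction subtle, which matters when the goal is just matching the Ha star profile rather than shrinking stars outright.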

  3. After some heated debate in another thread with @vlaiv, I wanted to try out whether I could get anything useful out of bin4 at a pixel scale of 2.08"/pp. So yesterday I found that M57 was in a good position and I started a run.

    I will say that I used everything I could on this data. I drizzled, cropped, PI, PS, back to PI, then Topaz Gigapixel to enlarge, before some more PI and PS.

    1695469049_M57@05x.thumb.jpg.28af75813d31085fc6fcd0eea8cdf61d.jpg

    Decent result. Processing was a bit harder than I'm used to because I had to readjust all settings to a different pixel scale, so more trial and error on that side.

    image.png.fb3d22bf7980c8253e8bf94b024b10d7.png

    Some further comparisons on the drizzle vs non-drizzle in the R channel:

    image.thumb.png.c5206c3155b1d5b6138b79958f7c00f4.png

    I had intended to process both to compare the final result, but I simply didn't bother to process the non-drizzled version. I find the drizzled version much better for gauging further processing steps that would be impossible on the other version.

    Here's another fun one. This is the combined RGB before stretching! There's SO MUCH information in these bin4 pixels:

    image.png.20dd9f715109e207e5bd587b576fdf1a.png

    My conclusion is that I managed to get an RGB result with quite a lot of detail on a tiny object in just 3 hours. I doubt I'll make bin4 my go-to mode in the future, but I do appreciate having it in my toolbox now. Thanks Vlaiv!
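    As a side note, the pixel scale each binning mode gives you follows directly from scale ("/px) = 206.265 × pixel size (µm) × bin / focal length (mm). The rig numbers below are hypothetical, picked only so that bin4 lands on the 2.08"/pp mentioned above:

    ```python
    def pixel_scale(pixel_um: float, focal_mm: float, binning: int = 1) -> float:
        """Image scale in arcsec/pixel for a given sensor and scope."""
        return 206.265 * pixel_um * binning / focal_mm

    # Hypothetical rig: 9 um pixels on a 3570 mm focal length.
    for b in (1, 2, 4):
        print(f"bin{b}: {pixel_scale(9.0, 3570, b):.2f}\"/pp")
    ```

    The scale is linear in the bin factor, so bin4 is simply four times the bin1 scale.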

    • Like 11
  4. I've been struggling with processing my stars and yesterday I wanted to do something about it. The trouble is that the stars grow, but the diffraction spikes on medium-large stars don't really get enhanced the way I would hope. The spikes are there, hidden in the data, but too faint.

    So I made a Photoshop Action (see bottom) which adds a spike in the same colour in a separate layer, adds noise and blur, and then lets you set the opacity to control how prominent the spike should be.

    1. An example of a star with a stunted spike.

    1.thumb.png.60654a51400f34dc0eecae0b60e00fb5.png

    2. Select the star from the background layer.

    2.png.01381c300d39eba323a0d7fc713887fa.png

    3. Play the DiffractionSpike action.

    3.png.a384f494d17c1714aaf82b11d69a32ef.png

    4. The spike is created. Its size is relative to the size of your selection on the star, and it is placed slightly off centre.

    4.thumb.png.c57a6a5694da442c1a0411fc89e5f0c5.png

    5. The layer is selected, so Ctrl+A and Ctrl+T lets you transform and move the spike into position.

    5.png.0ca6dff0a3aad4da28f6089cf5c82b6a.png

    6.png.48f844ddc4c4ee1e053779a23fc15048.png

    6. Blur the center of the spike.

    7.png.5b9bc1d5d2106b5ce5649fabb2be0bc4.png

    8.png.a5c03c885a180521e46e3b3eec9e3f69.png

    7. Finally, set the opacity of the new layer to somewhere between 10 and 20.

    9.png.ba854ffdeda3449114057130e59a00d6.png

    End result

    10.png.a12c65b09f3f1a3474c9fc59c17c2bc3.png

     

    And here seen in full.

    994624340_Abell2199-denoise@05x.thumb.jpg.14664798bdae15c85abc98c6e3e28b94.jpg
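    The same spike → noise → blur → low-opacity-blend recipe can be sketched outside Photoshop. This is a hypothetical numpy/scipy approximation of what the Action's steps do, not the Action itself:

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(42)

    def make_spike(size=101, width=1, noise=0.02, blur=1.0):
        """Synthesise a 4-point spike layer: a thin cross, softened
        with Gaussian blur, plus a little noise so it doesn't look
        artificially clean (mirrors the spike -> noise -> blur steps)."""
        layer = np.zeros((size, size))
        c = size // 2
        layer[c - width // 2: c + width // 2 + 1, :] = 1.0  # horizontal arm
        layer[:, c - width // 2: c + width // 2 + 1] = 1.0  # vertical arm
        layer = gaussian_filter(layer, sigma=blur)
        layer += rng.normal(0.0, noise, layer.shape)
        return np.clip(layer, 0.0, 1.0)

    def blend(base, spike, opacity=0.15):
        """Step 7: blend the spike layer over the base at low opacity."""
        return np.clip(base + opacity * spike, 0.0, 1.0)
    ```

    The 10-20% opacity from step 7 maps onto the `opacity` parameter; everything else (layer size, blur radius) is an assumption for the sketch.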

     

    The Action:

    Datalord PS Actions.atn

     

     

  5. Well, as I mentioned, I'm much more comfortable with the longer exposures because they make a difference in the fainter areas of the images. That said, the M82 I posted last night uses 600s bin2 L and 180s bin2 RGB. But 1200s bin1 Ha.

    Comparing the cameras, I have a slightly bigger pixel size, but also a 41,000e- full well compared to 25,000e- on the 8300, so I should be able to cope with a somewhat longer exposure time. But then there are differences in aperture and so on.

    Honestly, I think I'll just go back to 180s bin2 RGB and 300s bin1 L. It seems like the sanest approach, especially for galaxies and star fields.

  6. 33 minutes ago, wimvb said:

    So, it's not so much about data capture as it is about processing that data. In that case, I'd not change the exposure time. Although, if each of the colour channels shows enough details, you should probably shorten the exposure time for luminance (if it's the same as your colour exposure time). L captures all the colours at once, and there's more risk of bloated stars. You should be able to at least halve your L exposure time, and double the number of exposures.

    Hmm, I usually do twice the exposure time, but at bin1 instead of bin2, so shouldn't that account for the halving?

    But it's definitely something I can reduce drastically, at least as a test.
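    The arithmetic behind that question: a bin2 superpixel sums 2×2 native pixels, so signal per output pixel scales with bin² as well as exposure. Doubling the exposure while dropping from bin2 to bin1 therefore recovers only half the per-pixel signal. A quick sketch (the flux value is hypothetical, and read noise and well depth are ignored):

    ```python
    def signal_per_output_pixel(flux_e_px_s: float, exposure_s: float, binning: int) -> float:
        """Electrons collected in one output (super)pixel: binning b
        sums b*b native pixels, so signal scales with b**2 and time."""
        return flux_e_px_s * exposure_s * binning ** 2

    # Hypothetical flux of 2 e-/pixel/s:
    rgb_bin2 = signal_per_output_pixel(2.0, 180, 2)  # 180s at bin2
    lum_bin1 = signal_per_output_pixel(2.0, 360, 1)  # 360s at bin1, twice the time
    print(rgb_bin2, lum_bin1)  # the bin1 sub still has half the per-pixel signal
    ```

    Matching per-pixel signal at bin1 would take 4× the bin2 exposure, not 2×, which is why doubling only "accounts for half".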

  7. 51 minutes ago, ollypenrice said:

    Ah yes, that's a universal problem. It's not just the saturated core you're talking about, then. One thing that looks promising is using Starnet or Straton to de-star a copy of the stretched image then put the linear original on top as a layer in Blend Mode Lighten and stretch in situ till you have the stars as large/bright as you want them. In Blend Mode Lighten only the stars will appear during the stretch.

    Yes, but that messes up any hope of diffraction spikes... 

    I should probably be content. My images are getting pretty good, but Hubble is always grinding my eyeballs! 😂

  8. On 08/05/2020 at 18:27, ollypenrice said:

    For stellar cores what would you need? 3x15 seconds per colour? Pretty quick.

    That sounds incredibly short, but I get your point. I need to find the right exposure to get the stars below the clipping point, then mask the core into the final image.

    On 08/05/2020 at 18:27, ollypenrice said:

    In fact I don't worry about saturated stellar cores because I find I can pull the colour into the core from the outside using Noel's Actions Increase Star Colour or by doing it longhand. 

    Yeah, I don't actually have a problem getting colour into the center of the stars. I only have a problem with them getting large while processing.

    14 hours ago, wimvb said:

    Yes. Have a look here. It's an extreme example, but it shows the technique quite nicely. 

    https://pixinsight.com/tutorials/NGC7023-HDR/index.html#High_Contrast_Small_Scale_Structures

    I've used this before. I'm not sure it gets me what I want. Somehow the stars become fuzzy blobs. Colourful fuzzy blobs, but blobs nonetheless.

  9. 4 hours ago, ollypenrice said:

    I can't help suspecting that a little more sharpening could extract extra detail from the core if done selectively.

    It's a tough balancing act. I've already put it through a ton of sharpening, in PS through the Smart Sharpen with a layer on top to selectively pull out the sharpening, as well as Topaz Sharpen AI.

    I think I have a tendency to zoom in too much to pixel peep, see artifacts, and then tone the sharpening down to the detriment of the overall image. Maybe I should give it another squeeze.

  10. I was never happy with my previous processing of M82. I shot it in December last year and processed it with the knowledge I had then, and it just wasn't great. I squirmed a little whenever it turned up in my screensaver. So I dug it up last night to look at it again.

    1147210121_M82@05x.thumb.jpg.3c8922808ef019cbe5564766f8ae8718.jpg

    image.png.52d730d47b55c775f6bbe335121d99aa.png

    For comparison, here is the old version:

    70695237_M82_old@05x.thumb.jpg.0f5bc5e873149c373d29828542b6d2e4.jpg

    • Like 8
  11. 6 hours ago, swag72 said:

    I need to go and do some reading about what they even are LOL!!!!

    Wait until you start wondering how far away they are and how you figure that out. Once you start messing with Z values over 1, your brain starts hurting. Supposedly I managed to capture a Z=3.1 quasar in an image, which, depending on the value you adopt for the expansion rate of the universe, puts it something like 21 billion light years away today.

    That said, it would be really nice to have a single catalog for these quasars we could just plot into an annotation.
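    For the curious, that "where is it today" (comoving) distance can be sketched with a short numerical integration. The cosmological parameters below are typical flat-ΛCDM values, not a definitive choice, and as noted above the answer genuinely shifts with the H0 you adopt:

    ```python
    import numpy as np

    H0 = 70.0              # Hubble constant, km/s/Mpc (assumed)
    OMEGA_M, OMEGA_L = 0.3, 0.7
    C_KM_S = 299_792.458   # speed of light, km/s
    MLY_PER_MPC = 3.2616   # million light years per megaparsec

    def comoving_distance_gly(z: float, steps: int = 10_000) -> float:
        """D_C = (c/H0) * integral_0^z dz'/E(z'), with
        E(z) = sqrt(Om*(1+z)**3 + OL). Trapezoid rule; result in Gly."""
        zs = np.linspace(0.0, z, steps)
        integrand = 1.0 / np.sqrt(OMEGA_M * (1.0 + zs) ** 3 + OMEGA_L)
        dz = zs[1] - zs[0]
        integral = dz * (integrand.sum() - 0.5 * (integrand[0] + integrand[-1]))
        return (C_KM_S / H0) * integral * MLY_PER_MPC / 1000.0

    print(f"z=3.1 quasar, comoving distance: ~{comoving_distance_gly(3.1):.1f} Gly")
    ```

    With these parameters the z=3.1 quasar comes out at roughly 21 Gly comoving distance.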

    • Like 2
  12. 5 hours ago, wimvb said:

    Not even close to yours, @Datalord, but OTOH only 1 hour of data. If I add 5 hours of L, and a decent amount of RGB ...

    There's a reason this galaxy wasn't discovered until 2005. 😋

    Excellent post there. It's interesting to think that there is so much hidden right in front of us. 400kly is nothing compared to our other galaxy targets. 

  13. 52 minutes ago, dannybgoode said:

    Bet that was a bit of a faff to process

    That would be an understatement. I had no concept of what it would mean when I started it. Honestly, I saw a blob on cartes du ciel and sent the telescope over there to image before I knew what this would mean.

    53 minutes ago, dannybgoode said:

    extremely well done though

    Thanks!

    • Like 1
  14. 1 hour ago, andrew s said:

    do you think I you could pick which was which in an unbiased test?

    In this particular case, yes, I think I would. The main difference for this particular image is the noise in the black. Whether that is an artifact of me rushing the processing a bit, or a difference in how binning and the subsequent upscaling treat low-signal parts of the image, is something I can't say from this one.

    image.thumb.png.1ceeb352741034aa8c9f80076189b5a3.png

    I also want to do this in a real test with true bin4-captured images, where the full implications of well depth and lower exposure time are in play. This becomes a purely academic exercise if I can't save imaging time.

    36 minutes ago, vlaiv said:

    I think that again we got derailed in what we are trying to accomplish here, and I would like to point out a few things:

    1. The proposal for the split-image processing approach was to show how much difference, if any, there is in detail, in support of the theoretical approach above, and because you mentioned that you actually see the difference. That would give us the chance to inspect that difference closely.

    2. I'm not trying to push a certain approach on you. If you feel comfortable doing it like you have done so far and are happy just continue to do so. I would personally bin the image and process it like that and would leave it at said resolution. I would not upsample it back, as I think there is no point in doing so.

    3. In my view drizzling is not going to produce anything sensible in this case (but that is just my view).

    1. For the sake of this specific test, I see a difference, but it is so small that I will consider bin4 on par with bin1. I need the real-world bin4 test to come to a proper conclusion as to how it will influence a real image.

    2. Well, there is quite a lot of reason for that if you want to print it or put it on a 4K monitor. If I do it myself, I can at least control the upscaling and not leave it to some random display driver doing whatever it wants.

    3. I'll let that be up to a test as well. If I can shoot in bin4 with the same detail captured, but get maybe 10 times more frames, I'll let another experiment dictate. I remember another thread where I compared drizzle to non-drizzle and concluded that the black parts were where the biggest benefit was. But I must experiment.
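    For reference, software binning itself is nearly a one-liner. A minimal numpy sketch of a 4×4 sum, showing the roughly 4× per-pixel SNR gain on a synthetic flat frame (the frame statistics are made up for illustration):

    ```python
    import numpy as np

    def software_bin(img: np.ndarray, b: int = 4) -> np.ndarray:
        """Sum b x b blocks of pixels (software binning). Summing keeps
        the photometric meaning; divide by b*b for an average instead."""
        h, w = img.shape
        h, w = h - h % b, w - w % b  # trim to a multiple of b
        return img[:h, :w].reshape(h // b, b, w // b, b).sum(axis=(1, 3))

    # Synthetic flat: mean 100 e-, sigma 10 e- of noise per pixel.
    noisy = np.random.default_rng(1).normal(100.0, 10.0, (256, 256))
    binned = software_bin(noisy, 4)  # 64 x 64; ~4x better per-pixel SNR
    ```

    Summing 16 pixels multiplies the signal by 16 but the (uncorrelated) noise by only 4, hence the SNR gain; whether hardware bin4 with its single read per superpixel does better in practice is exactly what the real-world test would show.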

    • Thanks 1