
Luke Newbould

Members
  • Posts: 107
  • Joined
  • Last visited

Posts posted by Luke Newbould

  1. I found it a major improvement in my area when we finally got switched to LED streetlights! Sure, the light is broadband and thus practically unfilterable, but they are well shielded and not overly bright, so I feel every effort has been made to minimise environmental impact. Some light at night is a necessary evil living here on a main road, I guess.

    Our old sodium lamps were very poorly designed units - light everywhere! That said, they were probably designed and installed at around the time of the Norman conquest so I can't judge too harshly!

    These days I'm covered in bruises on my shins from walking into garden furniture in the dark during imaging sessions, and I couldn't be happier about it! 😂

    • Like 3
  2. Interesting thread! I only just saw I got tagged in this, so I'm sorry about the late response.
    I've not had the chance to run sensor analysis on the cameras I'm currently testing, unfortunately; they just got unboxed, briefly tested, then mounted on my scopes, so I can't offer anything for reference, I'm afraid. First light tests went great, though, at least!

    Regarding the variance observed between P1's graphs and the ones in this thread, my gut feeling is that there are a few things at play:

     

    First is likely a 'silicon lottery' issue - probably safe to say if you got 100 of these cameras on the bench to test, you'd get 100 very slightly different end results! (same issue as in other fields, PC overclockers run into this issue all the time for example)


    Second would be the test-to-test variance in SharpCap's sensor analysis itself, whether through slight variations in the test environment (unavoidable without a laboratory-grade test bed, I'd guess!) or just the general shot-to-shot 'noise' in the measurements from each frame. Something makes it so that SharpCap sensor analysis numbers never truly match across five tests in a row. I reckon the graphs published by P1 could just be from a great sensor, plus taking the best graph out of 10, for example?

     

    Thirdly, environmental variables could be at play with equipment this sensitive and numbers this small! Testing on a day where you can practically feel the static crackling in the air? Using a regulated, well-earthed power supply? The test computer's earth connection? Strong sources of unshielded magnetism nearby? The list goes on, but there's probably something at play that's near impossible to anticipate too. Worth consideration, IMO!
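    To illustrate the 'best graph out of 10' point, here's a tiny Python sketch (all numbers invented purely for illustration, not real SharpCap figures) of why cherry-picking the prettiest of several noisy measurement runs tends to understate the true figure:

```python
import random

random.seed(42)  # reproducible illustration

TRUE_READ_NOISE = 1.50   # hypothetical "true" read noise in electrons
MEASUREMENT_SD = 0.05    # assumed shot-to-shot spread of the analysis run

def measure_read_noise():
    """One simulated sensor-analysis run: the true value plus random measurement noise."""
    return random.gauss(TRUE_READ_NOISE, MEASUREMENT_SD)

# Run the "analysis" ten times, as you might when benchmarking a camera.
runs = [measure_read_noise() for _ in range(10)]

mean_result = sum(runs) / len(runs)  # honest summary of all runs
best_result = min(runs)              # publishing only the prettiest graph

print(f"mean of 10 runs: {mean_result:.3f} e-")
print(f"best of 10 runs: {best_result:.3f} e-")
# Picking the lowest of several noisy runs tends to understate the true
# figure -- that's selection bias, not a better sensor.
```

    The same logic applies to gain or full-well graphs: averaging repeated runs gives a fairer picture than the single best one.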

     

    Hope that helps! 🙂

  3. That's really interesting, mate! It seems we're doing a similar thing now since NoiseXterminator came out; I'm finding it works well and saves time on the process for sure 👍

    I'm still tending to finish images off with a little dab of Topaz, though, while the image is still starless, rather than another NoiseXterminator pass. More experimentation needed, though, haha!

 
    • Like 1
  4. Cracking shot there mate! Unreal what you've captured with just 2.8 hours! 🙂

     

    I really know what you mean about these summer sessions, it feels like I'm just getting settled down and then it's time to stop, haha!

     

    Reckon I'll have a go at this myself over the summer, lovely region.

     

    Thanks for the shout-out by the way! 

    • Thanks 1
  5. Awesome job! A nice dusty Iris, and I love that Pacman rendition; unreal when you consider the amount of moon you had to deal with! Top effort, mate. It's easy to feel like not bothering when the moon is lighting up the sky like a floodlight, but this certainly shows it's worth the effort!

    Clear skies 🙂

    • Thanks 1
  6. Hey there @Budgie1! I'm glad you found my little video overview useful, and thank you for the shout-out too! 🙂 

    I wholeheartedly agree with the points made in this thread; when used carefully it's a thing to behold! Taken too far, artefacting can occur as @Clarkey mentions, or things go a bit like a painting as @ONIKKINEN mentioned!

    @Lee_P brings up a great point there about using it on starless images, then adding the stars back in afterwards. It's always been at its best for me when used on a starless image; much easier to control the outcome!

    I wonder where this software will be in a few more years; it's mind-boggling!

    Clear skies 🙂 

    • Like 1
    • Thanks 1
  7. @tomato Hey mate! Another lovely job, well done indeed! :)

    I know what you mean about the standard salmon-pink look most of this type of data takes on, which can still look effective on the right target, of course! That said, on most targets I prefer a colourful approach, like what you've done here! :)

    Keep it up mate, you have really gotten the hang of it!

    • Thanks 1
  8. @Lee_P Thank you, Lee! I'm so happy to have helped you in some way, mate. You really have made a beautiful image there, and genuinely I think anyone looking at it would have a nigh-on impossible time determining whether it had been done with a mono camera or not. That really highlights what a great job you did with your data, and the power of these new OSC cameras and filters, plus the processing options we have available these days!

    Bravo!! 👍

    • Thanks 1
