Everything posted by bdlbug

  1. On the subject of Topaz - probably should take this over to the processing section, but just a comment: if it's over-applied it can generate strange, regular square artifacts across the entire image, either due to the way noise is distributed through the image (as I don't dither), or as a consequence of the way they apply the AI algorithm in overlapping square samples of the image. You can check in PS by applying the Equalize function, and any regular artifacts are highlighted clearly. B
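The Equalize check described above can be sketched in a few lines of NumPy. This is a hedged sketch, not the author's actual workflow: the function name, bin count and the faint synthetic grid pattern are all illustrative assumptions.

```python
import numpy as np

def equalize(img: np.ndarray, levels: int = 256) -> np.ndarray:
    """Map pixels through the image's own CDF (similar in spirit to
    Photoshop's Equalize); low-contrast regular artifacts become obvious."""
    flat = img.ravel()
    hist, bin_edges = np.histogram(flat, bins=levels)
    cdf = hist.cumsum().astype(np.float64)
    cdf /= cdf[-1]                                  # normalise CDF to [0, 1]
    centers = (bin_edges[:-1] + bin_edges[1:]) / 2  # bin midpoints
    return np.interp(flat, centers, cdf).reshape(img.shape)

# A nearly flat frame hiding a faint periodic grid, as a stand-in for
# the square artifacts described above (pattern is illustrative)
rng = np.random.default_rng(0)
frame = 0.5 + 0.001 * rng.standard_normal((64, 64))
frame[::8, :] += 0.0005                             # faint grid every 8 rows
eq = equalize(frame)
```

Because equalisation spreads small brightness offsets across the full output range, faint regular patterns that are invisible in the raw frame stand out clearly afterwards.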
  2. Another one from me, again taken in March, plus I added 3hrs of Ha when we had some clear sky but a 60-70% moon, which has helped to add a bit of bling to M81 and highlight the filaments around the M82 core. Total of 7.5hrs imaging time: 1.5hrs on lum, 3hrs on Ha and 1hr each RGB. Again all 120s subs at unity gain on the ASI1600. Used a similar flow to my previous post of the Leo Triplet. Without being too self-critical, the stars in the RH corner are a bit weird; I think it could be focuser droop or flex, as the nearly vertical angle I was capturing at was more extreme than I've done before and I was babysitting the rig in case of a clash with the pier. Thanks Bryan
  3. Taken over 2 nights during the March new moon. Total of 4hrs LRGB, Baader filters, using 120s subs at unity gain with the ASI1600MM and AT106EDT refractor. Integrated with APP, processed initially with PI applying DBE and a light deconvolution, then into PS for stretch, colour balance etc. I also tried out a new noise reduction plugin for Photoshop, Topaz DeNoise AI, which I think works rather well, as the 4hrs of data had considerable noise and this new tool removed it without destroying the detailed structures of the galaxies. I'm using it under the 30-day trial. I also used the IDAS D2 filter, as I have just had LED lights installed on the footpath behind my house; it seems to be an effective filter for LP. Thanks for looking and, as ever, any comments/feedback welcome, Bryan
  4. I use CEWE - you download the application to your PC/laptop and can set the image not to be pre-processed. The foam-backed prints are reasonably priced and I'm happy with the nebulae and galaxies I got printed.
  5. @rl point taken, I know it's a trait ingrained in me as a design engineer: always looking at the problems still to solve rather than at how much has been achieved. Thanks for the positive feedback, appreciate it. @celestron8g8 thank you
  6. First light for my new ASI1600MM-Pro - my original ASI1600MM cooling failed, long story, will try to get a repair done. This is a 4hr total exposure image, so not quite as deep as Emil's version posted yesterday. I used APP, PI, PS and the new Topaz AI noise reducing tool, which worked rather well. I also put the IDAS D2 filter back into the optical chain, as the path behind my house has been changed to LED and has messed up broadband RGB with horrendous gradients - the IDAS D2 doesn't remove gradients 100% but at least I got colour into the image. I found a great article on the Sky & Telescope website, Yanking Markarian's Chain, written by Bob King - hence the topic title. 30 subs of each filter LRGB x 120s. Note: I am aware there is a faint, slightly curved, wide-ish line from the top middle-right of the image down to the bottom right - very confused, as no subs had any satellite tracks and none of the flats show this artifact - so for the time being I'm posting this but will continue to investigate. Thanks again for looking and any comments you may have, Bryan
  7. Agree with David; my recent experience imaging B33 with a Baader Ha (7nm) filter showed a marked difference between two sessions in the Alnitak halo and ASI1600 microlensing, which definitely demonstrates a dependence on accurate focus, the spacing of the FF/FR and the orientation of the filter. I got improved results by accident: I had worked out I needed to reduce the Riccardi F/R back focus spacing by about 1-2mm to eliminate some elongation in stars at the corners of images, but when I re-assembled the optical train I put the filter wheel on 'backwards', and the difference is below. There is still microlensing in both images, which can be cosmetically improved in PS, as per Olly's comment above, but the large halo, which is very difficult to remove in any post processing, is significantly reduced. So I would definitely suggest to the OP, @Miguel1983, if it is possible, to reverse the orientation of the EFW, re-run a Ha session on B33 and compare. I am using an AT106EDT triplet APO.
     Before modifying back focus and initial filter wheel orientation
     After changing FF/FR spacing and reversing filter wheel orientation
     Bryan
  8. Started to get into projects using a Baader Ha (7nm) filter on the ASI1600. The clear nights I got over last weekend were an unexpected bonus, but with the moon making its presence known, imaging was always going to be Ha. M42 is just under 3.5hrs of integration from last Friday night; Horsehead and Flame is just over 4hrs integration time from Monday evening plus additional subs from December last year. Both images use 120s subs at unity gain. I used various levels of stretching on the M42 image, which I then used in a ridiculous number of masked layers in PS to compress the high dynamic range in M42. I also used quite a bit of artistic licence on the Horsehead and Flame with Alnitak to reduce the halo and micro-lensing from the ASI1600 CMOS sensor. Processing used APP to stack and integrate, then mostly Photoshop, but I did experiment with PI and used StarNet++ to work out how to stretch the gaseous stuff and not bloat the stars... not sure that was entirely a success. I also did a Ha wide field of Orion and the belt a couple of years ago using a Samyang 135mm, to which I added a mosaic tile that brought in a bit of Barnard's Loop - added that to give perspective on the close-up images. My plan is to build on the mosaic when Orion comes back in the autumn - I know it's not gone completely yet, but by 9pm it's past the meridian and lost in tall hedges. I do have a rather outrageous colour image of M42 that I am hesitant to post up, as I used it as an experiment to combine all my DSLR and ASI1600 versions and maximise what I had... here's a thumbnail; let me know if you would like a full resolution version posted.
     M42 Great Orion Nebula
     Horsehead Nebula (Barnard 33) and Flame Nebula (NGC 2024) in Orion
     SH2-276 Barnard's Loop, M42 Orion Nebula & Horsehead & Flame Nebula
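The masked-layers idea for compressing M42's dynamic range can be approximated in code. This is a hedged sketch, not the actual PS layer stack used in the post: the asinh stretches and the choice of the soft stretch as its own blend mask are illustrative assumptions.

```python
import numpy as np

def masked_stretch(img: np.ndarray, soft_k: float = 2.0,
                   hard_k: float = 50.0) -> np.ndarray:
    """Blend a gentle stretch (protects the bright core) with an aggressive
    stretch (lifts faint nebulosity), weighted by a smooth luminance mask."""
    img = np.clip(img, 0.0, 1.0)
    soft = np.arcsinh(soft_k * img) / np.arcsinh(soft_k)   # mild stretch
    hard = np.arcsinh(hard_k * img) / np.arcsinh(hard_k)   # strong stretch
    mask = soft                    # bright pixels lean on the mild stretch
    return mask * soft + (1.0 - mask) * hard

frame = np.linspace(0.0, 1.0, 11)  # stand-in for a normalised M42 luminance
out = masked_stretch(frame)        # faint values lifted, white point kept
```

Faint input (e.g. 0.1) comes out far brighter while the top end still maps to 1.0, which is the same compression the stack of masked PS layers achieves by hand.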
  9. Mick, appreciate the feedback. I also thought that the red was intense, but my daughter thought it looked impactful - it's probably the result of me being a bit heavy-handed in blending Ha with Lum to highlight the Ha detail (I think I read it should be <15%; I may have doubled that!) - I know, how could anyone think to do such a thing... I used blend mode Lighten for the main overlay of Ha into the LRGB. @carastro has a Cone image posted up as well and also had a couple of variations on how to balance the saturation of red. I actually prefer Carole's framing, as her image has captured more of the dark molecular cloud below the Cone; I managed to miss most of that above the Cone in my framing. I'll perhaps have another processing session later this evening and see what's possible. Bryan
  10. Thanks Adrian, yes, imaging in Ha definitely delivers depth that's not apparent in RGB, which supports the discussion I had with Ian King at IAS: achieving consistent, quality LRGB results in light-polluted skies is difficult, but going narrowband opens up a whole new universe 🙄 However, I prefer broadband images and I'm yet to embrace the narrowband 'style' or invest in the additional two filters. Bryan
  11. @astro mick I’m intrigued why you initially thought ..Oops Bryan
  12. All, like many posts, last weekend was my first imaging opportunity since the beginning of December. Given the poor and very damp weather I brought everything in from the observatory in December, so last Friday, with the prospect of a decent few days of clear sky, I set it all back up again. Like a lot of things in this hobby, it didn't quite go to plan - but I did get polar aligned and a few test images on Friday night before cloud rolled in. Saturday was an evening of frustration, as my mount was not behaving; I grabbed two subs on the Cone and then clouds. We didn't do as well for clear skies as I had hoped for in Northants over the weekend, as the cloud just seemed to arrive mid evening. Anyway, Monday was the night... Started early, got set up and started imaging about 6:30pm, took the dogs out for their walk, came back into the obsy about an hour later only to find everything dead - muppet here had forgotten to plug the laptop into the power socket, so everything had crashed and APT lost all its settings (hopefully no-one overheard my choice words) - so a complete reset of APT settings (I've now saved a backup) and got back imaging about 8:30pm through to nearly 3am.
      Details:
      66 subs Ha (7nm) 120s: 2.2hrs
      30 subs each LRGB 120s: 4hrs
      ASI1600MM (-20degC)
      AT106EDT + Riccardi 0.75x FF/R
      Processed in APP, PS
      I'm posting the Ha (7nm) Baader as well as the combined LRGBHa - I think they each give a different perspective of this region of the sky. 6hrs on this target is not really enough, so my processing has pushed things and noise as usual is the enemy, so a bit of NR applied via masks etc. Thanks for looking, Bryan
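As a sanity check on the session totals quoted above (pure arithmetic, nothing assumed beyond the post's own numbers):

```python
def hours(subs: int, sub_length_s: int) -> float:
    """Total integration time in hours for a set of equal-length subs."""
    return subs * sub_length_s / 3600.0

ha = hours(66, 120)        # 66 x 120s Ha subs -> 2.2 hrs
lrgb = hours(30 * 4, 120)  # 30 x 120s in each of L, R, G, B -> 4.0 hrs
total = ha + lrgb          # 6.2 hrs on target
```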
  13. Love the name, it's such a foreboding presence; I think a Lord of the Rings name is very fitting. For me 5hrs on this target is just the very start; it's really very dim and difficult from our UK back gardens. You're right to think more subs are needed - I reckon you need double the sub time, and probably more with a DSLR, as I imaged NGC1333 recently with a dedicated CMOS camera, the ASI1600, had over 10hrs of capture and still had to remove over half, as the luminosity just gets overwhelmed by light pollution from my back garden. Massive step up from the 2016 version - another session on this target in 2022 then.
  14. Stu, myself and Mrs bug are doing exactly the same: we are off to London tomorrow to see a show, then Sunday down to the Maritime Museum for both these exhibitions. Thanks for the tip about the National Rail 2for1, just printed off my vouchers.
  15. Brian, that's come out rather well, despite your misgivings about the sub quality - congrats Bryan
  16. Brian, I just posted up last night my attempt at NGC1333; it's not an easy target in LRGB from UK back gardens. My subs looked really rubbish and even after integration the image needed to be heavily stretched and noise-reduced to squeeze the detail out. I have just revisited luminance for this target tonight as the skies are very good, and as a bonus on Sunday the school pitch floodlights are off all evening as there's no 5-a-side 😀 That brings me up to over 10hrs on this target now - the most I've ever collected (and dumped), as the luminance is really tricky, at least that's my experience. Best of luck with your attempt at NGC1333 in LRGB. Bryan
  17. A stunning image of the reflection nebula NGC1333 was posted by @ollypenrice back in October and it inspired me to image the same target from my back garden just outside Northampton. Well, it's not easy: despite the lack of clear skies in the past 6 weeks I have managed to get about 8hrs of subs over 3 nights - but all the luminance was discarded, as the integration was so noisy and mostly a collage of white. Last night was again plagued by low-level mist/fog but I got 3hrs in before I gave up. I used the RGB from late October mixed with RGB subs from last night, which I integrated in APP to create a super luminance. I have then spent most of today, over many iterations, teasing the last bits of usable signal out of all the channels with PI and PS actions - thanks to Olly and his technique to enhance the molecular cloud structure using the Equalize function in PS 👍 I doubt I can remember how it all came together, but I believe I have got a recognisable image of NGC1333. Captured using the ASI1600MM in 120s subs and AT106EDT. There's probably about 5hrs of data in this version of the image. Thanks Bryan
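A "super luminance" in its simplest form is a combination of the calibrated R, G and B stacks into one higher-SNR frame. APP's actual integration is quality-weighted and more sophisticated; the equal-weight mean below is a hedged stand-in on synthetic data, just to show why the combination helps.

```python
import numpy as np

rng = np.random.default_rng(1)
signal = np.full((32, 32), 0.4)      # stand-in for the true sky brightness
# Three noisy stand-in channel stacks sharing the same underlying signal
r, g, b = (signal + 0.05 * rng.standard_normal(signal.shape)
           for _ in range(3))

super_l = (r + g + b) / 3.0          # equal-weight synthetic luminance
# Averaging three independent stacks cuts random noise by roughly sqrt(3)
noise_single = np.std(r - signal)
noise_super = np.std(super_l - signal)
```

That roughly sqrt(3) noise reduction is why a super luminance built from RGB subs can substitute for discarded luminance data.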
  18. Almost a year ago I captured an image of M33 and at the time used the new IDAS D2 light pollution filter. This week, Wednesday 2nd, I got out and captured 3hrs of subs, 1hr Ha (7nm) and 2hrs RGB, but I removed the D2 filter from the light path. I processed the data using Astro Pixel Processor and Photoshop CC2019, then added the luminance layer from last year's session. The result is below. My personal opinion is that I have got more colour in the stars than in the image I took last year through the D2 - maybe I am improving my processing skills, but I suspect trying to capture RGB behind a D2 filter attenuates the RGB signal and the result is less than satisfactory. Captured using 120s subs with the ASI1600 running at -20degC and unity gain behind an AT106EDT on an AZEQ6GT. Thanks for looking, Bryan. The original M33 topic from October 2018 can be read here
  19. There are quite a few excellent Milky Way images being posted at the moment and I particularly like that we get views of the Milky Way not only from the UK but from members posting from locations all over the world, and also when we UK astronomy types holiday abroad and can get some astro kit packed under the radar into the luggage - I managed to convince my better half that I simply must have my tripod and ball head along with the very important wide-angle lens, despite the fact we were on a safari and the 500mm zoom was the default lens. Anyway, it was a holiday of firsts: first time in Africa, first time south of the equator and first time in such dark, dark skies. I had heard that the Milky Way core could cast shadows - I thought that was urban myth - nope, it's incredibly bright when you are under truly dark skies. So below is my image, consisting of around twenty 20s unguided exposures using an unmodified Canon 600D with a Samyang 14mm lens at ISO3200. I used Microsoft ICE to stitch all the images together, then Photoshop to warp into a 'flat' image, with further processing to reduce noise and increase saturation. The image was taken from a location called Miramboi, a tented lodge between Tarangire National Park and Lake Manyara, about 3 deg south of the equator. There are a few clouds in the image, and my impromptu mosaic plan left a few gaps, hence the 'triangle artifact' bottom centre of the image... next time, if there ever is a next time, the Star Adventurer will get packed! Bryan
  20. Thanks for the quick replies - the cable setup described sounds very straightforward and I will give it a try. As the posts above state, I don't really need to guide, but finding out what is possible as a backup, if I did put a longer FL lens on the Star Adventurer, is very useful to know. Thanks again.
  21. All, I have been searching the forum for information, and there is some, but I am still confused as to exactly what cables go where when guiding a Star Adventurer through its ST4 port using a small guide scope, camera and a laptop running PHD2. I am absolutely up to speed with using an EQDirect cable plugged into the handset socket of my AZEQ6 mount and into USB on my laptop running PHD2, creating pulse-guide corrections based on images from a Lodestar X2 in my guide scope. That is all very clear and I am happy I know what I am doing. What's really confusing me - and I don't want to plug things in where they shouldn't be - is how guiding is set up in the Star Adventurer ST4 configuration. Before I get into detail: yes, I understand the Star Adventurer can only be controlled in RA, and I'm not sure I really want to guide, as I've got a polestar and the Star Adventurer adapter so I can get the mount accurately polar aligned. I am looking at a wide-field capture setup with a 50mm or shorter lens on my ASI1600, which should allow 60s+ unguided, but given the current weather I'm looking at how to exploit all the features I bought with this new mount - and working out the correct configuration is driving me nuts. So, to set up a Star Adventurer to be guided, do I use the same wiring configuration as with my main mount, but plug my EQDirect cable into the ST4 socket on the Star Adventurer, with everything else the same and PHD2 issuing ST4 commands based on images from the guide cam? OR... do I have to use an ST4 cable and connect the Lodestar X2 ST4 output to the Star Adventurer ST4 input, and then the Lodestar X2 USB connection to the laptop - but then I don't understand how PHD2 will send tracking commands into the ST4 input.
I've seen screenshots where people have posted their guide plots with a Star Adventurer using PHD2 - so clearly it can be done. Has anybody done this and can post a diagram or annotated image of what cable goes where, possibly with an explanation of why? If there is a post or link that has this info I'd appreciate that as well. Thanks
  22. Just bought a C11

    1. JB80

       Very nice!

    2. tingting44

       lucky [removed word]! did u get it from FLO?
