Everything posted by A320Flyer

  1. Would this work with NINA? That would be very useful. Are you willing to share?
  2. Sorry, just saw your post. Unfortunately, I don’t have a design for this. The difficulty is that there is a limit to what can be printed on the small-sized heated bed of the printer. Up to about 120mm aperture for a refractor is about as far as you can go. Bill
  3. Cheers. I have the scope but waiting on the flattener. Tried imaging with my A183 but stars were horrible without the flattener. Your stars are excellent so I’m looking forward to getting the FF. Bill
  4. Very nice. Did you use the dedicated 0.8x reducer?
  5. Hi, Here is my attempt. Processed in PixInsight with the following workflow:
     - Combined SII, Ha and OIII to give an SHO colour image
     - Stretched using HT
     - Colours remapped to the Hubble Palette with Curves, using appropriate ColourMasks
     - Saturation increased
     - Ha used as Lum
     - Slight Deconvolution
     - NR using TGVDenoise and MMT
     - Stretched using HT
     - Contrast increased on high-SNR areas using iterations of LHE
     - Detail increased in the background using HDRMT
     - Lum merged with SHO
     - Dark Structure Enhance using the DSE Script
     - Small stars decreased in size/intensity using MT and a star mask
     - Transferred to Photoshop for final colour balance and slight sharpening of the smaller details
     I rotated the canvas 180deg so that the highlights appear to be lit from above, as I think this looks more natural. Sh2-155.tif
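     For anyone curious about the arithmetic behind the SHO channel combination above, here is a minimal Python/numpy sketch. The file names and the simple min/max rescale are illustrative assumptions, not the exact PixInsight operations used:

        import numpy as np
        from astropy.io import fits

        # Hypothetical master file names for the three narrowband channels
        sii  = fits.getdata("SII_master.fit").astype(np.float64)
        ha   = fits.getdata("Ha_master.fit").astype(np.float64)
        oiii = fits.getdata("OIII_master.fit").astype(np.float64)

        def rescale(img):
            # Normalise a channel to the 0..1 range before combining
            return (img - img.min()) / (img.max() - img.min())

        # Hubble palette mapping: R = SII, G = Ha, B = OIII
        sho = np.dstack([rescale(sii), rescale(ha), rescale(oiii)])
        fits.writeto("SHO_colour.fit", sho.astype(np.float32), overwrite=True)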
  6. Having seen the latest very interesting posts on small observatories, attached is a write-up I did for my local Astro Society on a very small imaging observatory in my back garden. It's been operating very successfully now for a couple of years without any real issues. I have only very recently modified the lift-off lid so that more of the panel against the fence remains in place - this made the roof a lot lighter and much easier to flip open. I have since also installed an EQ6R-Pro with dual imaging capability, which is significantly bigger than the HEQ5 it was originally designed for, so I had to cut down the pier a little to accommodate that - otherwise it's all as described. I hope this might be useful to anyone thinking of doing something similar. Bill The Garforth Nano Observatory.pdf
  7. I got myself a Sharpstar 61EDPH II so I had to make a Flats Box for it. I've attached the files in case they're of any use to anyone. Bill Sharpstar 61EDPH ll Flats Box.zip
  8. I look at an RGB image a bit like a cake mix or a cocktail - it’s difficult to extract the individual ingredients once they are all mixed together. In an RGB image, the luminance is something we perceive from the mix of channels, so if we want to extract a good approximation of it, we need to say roughly how we think it is made up. Hence RGBWorkingSpace (RGBWS). Here is a quote from a well-known PI guru on the PI Forum: "Well, it gets pretty deep into color theory, but for the purposes of PI it modifies the R/G/B channel weights for the purposes of extracting something closer to true Luminance when extracting L* from an RGB image. You want them to be weighted equally, which is why you change the weights to 1,1,1. The human eye is most sensitive to green, so with default channel weighting in an RGB image the green channel participates more in the calculation of L* than the other channels. If you're trying to get an equivalent L image that might have come from an L filter, there's no green bias there - all wavelengths pass equally thru the filter. It doesn't seem to do anything because it doesn't really change how the data is displayed, just how it is interpreted behind the scenes."
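     To make the weighting concrete, here is a small numpy sketch contrasting a green-biased luminance (using the sRGB/Rec.709 weights purely as a stand-in for a default working space) with the equal 1,1,1 weighting described above. The random image is just a placeholder:

        import numpy as np

        rgb = np.random.rand(100, 100, 3)   # placeholder for a linear RGB image

        # Green-biased weighting (sRGB/Rec.709 figures, used here only for illustration)
        w = np.array([0.2126, 0.7152, 0.0722])
        lum_weighted = np.tensordot(rgb, w, axes=([2], [0]))

        # Equal weighting, i.e. RGBWorkingSpace set to 1,1,1: (R + G + B) / 3
        lum_equal = rgb.mean(axis=2)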
  9. 4. Since an RGB image does not have a separate luminance channel, it is usual to set all channels to have equal weighting prior to extracting the luminance. This ensures that the R, G and B channels all contribute equally. You could use 0.33, 0.33, 0.33 as long as they are all equal. If you used 1.0, 0, 0, say, you would effectively just extract the Red channel. 2. No mask. Just three HT and four Curves, something like this: No NR on RGBHa; the Convolution carried out in step 9 sufficiently dealt with the noise. After the PI process, I transferred to Photoshop to smooth the IFN. I duplicated the image, made a mask for the galaxies and blurred the background using the Dust & Scratches filter, reducing the opacity to suit. I prefer PS for doing this but if you wanted to do it in PI, you could probably do it by using MLT to switch off the lower layers and blend the results with PixelMath and a luminance mask. Hope this helps. Bill
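     The background-smoothing idea described above boils down to blending a blurred copy back in through a mask that protects the bright areas. A rough Python sketch of that blend, with a Gaussian blur standing in for Dust & Scratches or MLT and placeholder data:

        import numpy as np
        from scipy.ndimage import gaussian_filter

        img = np.random.rand(512, 512)             # placeholder for the image luminance
        blurred = gaussian_filter(img, sigma=5)    # smoothed copy of the background

        # Luminance-style mask: bright galaxies protected, faint background smoothed
        mask = np.clip(img / img.max(), 0, 1)
        smoothed = mask * img + (1 - mask) * blurred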
  10. Thanks for your comment. I hope this helps. Referring back to my saved PI Project:
      Preparation:
      1. Dynamic Crop and DBE on all masters.
      2. Renamed masters as Ha, Lum, Red, Green, Blue.
      3. Added Ha to Red using PixelMath, straight Red+Ha, rescaled, no mask. Renamed this as RedHa.
      4. Added Ha to Lum using PixelMath, straight Lum+Ha, rescaled, no mask. Renamed this as LumHa.
      Colour (RGBHa):
      1. Channel Combination using RedHa, G and B. Renamed this as RGBHa.
      2. BN and CC on RGBHa.
      3. Masked Stretch on RGBHa.
      4. RGBWorkingSpace on RGBHa with all channels as 1.0.
      5. Extracted the Lum from RGBHa.
      6. Several small saturation boosts in Curves, using the extracted Lum as a mask.
      7. SCNR.
      8. HT to balance the channels.
      9. Convolution to slightly blur.
      10. Repeat RGBWorkingSpace on RGBHa with all channels as 1.0.
      11. Set this RGBHa aside for later.
      Luminance (LumHa):
      1. Noise Reduction on the LumHa using Jon Rista’s method (https://jonrista.com/the-astrophotographers-guide/pixinsights/effective-noise-reduction-part-1/).
      2. Stretch using several small iterations of HT until the galaxies are well defined, and then use several small iterations of Curves to bring up the background and IFN whilst keeping the galaxies controlled.
      3. HDRMT to compress the dynamic range of the galaxy cores, using 5 wavelet layers, de-ringing on, lum only on, no mask.
      Combination:
      1. If required, extract the Lum from RGBHa. Balance this Lum using LinearFit, with LumHa as the reference. Add this extracted Lum back into RGBHa using ChannelCombination, using CIE L*a*b*, with a* and b* disabled. This ensures that RGBHa is at a similar brightness as, and ready to receive, LumHa as luminance data. This step may not be necessary if both the luminance and colour are already at similar levels.
      2. Add LumHa into RGBHa with ChannelCombination, using CIE L*a*b*, with a* and b* disabled.
      3. Make a star mask for the smaller stars and reduce their size/brightness using MT.
      4. Rename as LumHaRGBHa.
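      As a rough illustration of steps 3 and 4 of the preparation (straight Red+Ha and Lum+Ha, rescaled) and of the LinearFit-style brightness matching in the combination phase, here is a Python sketch. The least-squares fit is only an approximation of what PixInsight's LinearFit does, and the random arrays are placeholders for the masters:

        import numpy as np

        red = np.random.rand(256, 256)   # placeholders for the Red, Ha and Lum masters
        ha  = np.random.rand(256, 256)
        lum = np.random.rand(256, 256)

        def rescale(img):
            return (img - img.min()) / (img.max() - img.min())

        red_ha = rescale(red + ha)       # step 3: straight Red + Ha, rescaled
        lum_ha = rescale(lum + ha)       # step 4: straight Lum + Ha, rescaled

        def linear_fit(img, reference):
            # Scale and offset img so it best matches the reference (least squares)
            slope, intercept = np.polyfit(img.ravel(), reference.ravel(), 1)
            return intercept + slope * img

        # e.g. balance the Lum extracted from RGBHa against LumHa before recombining
        balanced = linear_fit(red_ha, lum_ha)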
  11. Thanks for all the help on this. I used PHD2's Polar Drift Align routine to get better PA. I then re-ran the GA. I let it run for longer than the minimum recommended 2 mins and I noticed that the RA drift line started to come back on itself (see attachment). This indicated to me that the RA drift was primarily down to PE. So I used EQMod's Auto PEC routine to record a PE curve and play it back as PEC. I then re-did PHD2's GA and this gave almost perfectly flat RA and DEC drifts (albeit with the general raggedness caused by seeing). I followed this up with an imaging session where guiding RMS was consistently less than 0.5 arcsec. Again, thanks for all the help. Bill
  12. Thanks for your response. Yes, it does seem to guide OK but I'm a little concerned that it should really track better than it does. All the other screen grabs I've seen of a GA plot generally show a random variance - not a dive like I've got. Should I be concerned?
  13. Hi, I'm trying to set up PHD2 for guiding. Everything is connected and I can calibrate and guide. However, when I run the Guiding Assistant to fine-tune the settings, I see a marked drift in the RA axis (see attached screen grab and log file). This is a brand-new EQ6R-Pro, straight out of the box. I have pretty good focus on the guide star, good balance and reasonable PA. Any suggestions as to the cause and how to rectify it would be very much appreciated. Thanks Bill PHD2_GuideLog_2021-08-22_163523.txt
  14. Most sit on the outside of the dewshield. It’s only the SW150P that sits inside the tube. I had to do it that way because of the limit on the size I could get on my print bed. HTH
  15. Here's my attempt. Processed in PI with Ha added to Red, Ha added to Lum then combined together Bill
  16. Pictures of my SW150P and SW ED72 boxes, plus an exploded Inventor pic of the SW150P box.
  17. Hi. My name is Bill and, a couple of years ago, my friend Chris and I began making Flats Boxes for our local Astro Society members. They were 3D-printed and custom made to fit perfectly over the top of your scope. They came as a kit of parts that was easily put together using very basic DIY skills. They took about 1hr to build (just a tiny bit of soldering) and produced an excellent flat field.
      We were going to try and market them to other Astro Societies as a way of fundraising but the pandemic put paid to that. Nevertheless, as a “Thank You” for all the help that this forum has given us over the years, Chris and I would like to donate the designs and print files to the community.
      There are Flats Box designs to fit the following scopes: ED80-102 Esprit 120 Starwave 70ED SW ED72 SW 150P/150PDS Takahashi FSQ-85EDX TMB105-65 Thomas Beck TS70 WO12134 LZOS
      The designs have been done using Autodesk Inventor, which was then used to export the STL files for printing. Both the IPT and STL files are included in the attached Zip file. All you need to supply are some daylight LEDs, some lengths of cable and a power supply. A basic set of instructions and an information leaflet are attached. They were generally printed at 50% infill using RepetierHost.
      Obviously, no warranty etc, etc, is provided. All we ask is, if you modify one of the designs for use on a different scope, maybe consider posting your design back here for others to use. We hope they are of some use to members. Clear Skies Bill and Chris. Flyer.pdf Instructions.pdf Flats Boxes.zip
  18. Hi, Excellent data once again. This is my go at processing. Mostly done in PixInsight, with final tweaks in Photoshop:
      - Slight Crop on each channel
      - DBE on each channel
      - Slight NR (MLT) on R, G and B
      - Combined R, G and B into RGB colour
      - RGB stretched with HT
      - SynthLum created by summing Lum, R, G and B
      - Slight NR (MLT) on the SynthLum
      - SynthLum stretched with HT
      - HDRMT on the centre of the Iris
      - Small iterations of LHE on the centre of the Iris, progressing outwards to the dusty clouds with increasing Kernel Radius
      - Combined the SynthLum with the RGB using Channel Combination
      - Starnet++ to shrink the stars slightly, then MT to further reduce the size of the very smallest stars
      - Transfer to Photoshop as a 16-bit TIFF
      - Curves, Levels and Selective Colour tweaks
      - Flatten and save as JPEG
      Cheers Bill
      In hindsight, I felt that using the SynthLum made the stars too soft and I lost definition. This is basically the same LRGB but using straight Lum.
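      The SynthLum step above is just a straight sum of the four masters followed by a rescale; a minimal Python sketch with hypothetical file names:

        import numpy as np
        from astropy.io import fits

        # Hypothetical master file names
        lum, r, g, b = (fits.getdata(f).astype(np.float64)
                        for f in ("Lum.fit", "Red.fit", "Green.fit", "Blue.fit"))

        synth = lum + r + g + b                                      # sum the four masters
        synth = (synth - synth.min()) / (synth.max() - synth.min())  # rescale to 0..1
        fits.writeto("SynthLum.fit", synth.astype(np.float32), overwrite=True)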
  19. Hi, Excellent data. So pleased to be given a chance to process it. Anyway, this is my go at processing. Mostly done in PixInsight, with final tweaks in Photoshop:
      - Slight Crop on each channel
      - DBE on each channel
      - Slight NR (MLT) on Ha, R, G and B
      - Combined R, G and B into RGB colour
      - BN and CC of the RGB image
      - RGB stretched with Masked Stretch to preserve the colour
      - Ha stretched with HT to match the mean intensity of the RGB
      - Ha added to the RGB in blend mode Lighten (MAX function in PixelMath) to give HaRGB
      - Lum given a very light Deconvolution, using a mask to protect the stars and background
      - Slight NR (MLT) on the Lum
      - Lum stretched with an initial application of ArcSinh, then HT to match the mean intensity of the HaRGB
      - Slight LHE on the galaxy arms (using the same mask used for Deconvolution)
      - Ha added to the Lum, again in blend mode Lighten, to give HaRGB
      - HaRGB intensity matched to Lum with LinearFit
      - Combined the Lum with the HaRGB using Channel Combination to give LumHaRGB
      - Transfer to Photoshop as a 16-bit TIFF
      - Curves, Levels and Selective Colour tweaks
      - Slight reduction in size of the smaller stars
      - Flatten and save as JPEG
      Cheers Bill
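      The "blend mode Lighten (MAX function in PixelMath)" step above is simply a per-pixel maximum of the two images; in Python terms, with placeholder arrays standing in for the stretched channel and the Ha master:

        import numpy as np

        channel = np.random.rand(256, 256)   # placeholder for a stretched target channel
        ha      = np.random.rand(256, 256)   # placeholder for the stretched Ha master

        # Lighten / MAX blend: keep whichever image is brighter at each pixel
        blended = np.maximum(channel, ha)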
  20. Here is my effort using PixInsight. I think I might have over-processed this, so here is a more subtle version. Bill
  21. I use an ASI1600 on a SW ED72 with the OVL FF and have the same issue. I have gradually increased the spacing and am currently at 58.5mm. Stars are much better now with only the very edge stars showing elongation. I have read elsewhere that the sweet spot is 60mm. The problem is that CCD Inspector reports the curvature getting worse but I see the stars getting better. Try increasing your spacing. Cheers Bill
  22. Hi, Would someone with this setup be able to advise on the optimum spacing distance they have in order to get round stars across the full frame of a ZWO ASI1600. I am having a bit of trouble dialling in the correct distance. I started at the quoted 55mm and gradually increased. I do see an improvement but am currently at 58.5mm and wonder how much further might be necessary. Thanks Bill
  23. Next clear night I'll make sure my guide scope is accurately aligned.
  24. When I play the images through PI Blink, there is a consistent movement from left to right (RA), no stutters, no discernible up/down movement. If PE was an issue, it would have "corrected" itself on each worm cycle. Dithering is random on both axes, so I would not have thought it would have manifested itself as a consistent left/right movement. My PA is pretty good as there is very little drift in DEC. This particular target/imaging session was almost at the zenith, so I'm thinking DF would be minimal? I have my mount East-heavy as this gives me much better guiding RMS and dither recovery. Could I have it TOO East-heavy? I'm using MaximDL multi-star guiding, so I would have thought that at the start of each image any RA drift would have been corrected as the guiding kicked back in? I might try single-star guiding with/without dither to see if that reveals anything. Thanks for all your suggestions. Bill
  25. Hi folks, I have noticed that there is a significant drift between the first and last images in an imaging session, even though I am guiding. I want to try and minimise this as much as possible.
      For background information, the imaging setup for this particular session was: ZWO ASI 1600 MM Pro, imaging through a SW ED72 on an HEQ5 Pro mount with belt mod. Guiding with a ZWO ASI 120 mono via a SW 50ED guide scope. Image acquisition via MaximDL 6, guiding via MaximDL 6 using EQMod pulse guiding. Guiding RMS was between 0.5 and 0.9 arcsec/px with a small dither of 1.0 pixels between each 300 sec exposure and a settle criterion of 0.25 pixels with a 15 sec delay. My imaging camera is fairly accurately aligned with the long side of the chip in RA.
      As you can see from the attached images, stars are pretty round but there is a significant drift in RA between Image1 and Image51. As I understand things, MaximDL stops guiding between exposures to allow download of the image and then recommences guiding. With the above settle criteria, this is about 25 secs between exposures. Polar aligned using SharpCap with a PA error of less than 10 sec. Any suggestions or help would be very much appreciated. Cheers Bill
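      As a back-of-the-envelope way of quantifying the drift, the pixel shift between Image1 and Image51 can be converted to arcseconds using the image scale and divided by the session length. The figures below are nominal/illustrative (3.8 micron pixels for the ASI1600, 420 mm focal length for the ED72, and a made-up 30 pixel shift), not measured values from this session:

        pixel_um  = 3.8        # ASI1600 pixel size in microns (nominal)
        focal_mm  = 420.0      # SW ED72 focal length in mm (nominal)
        scale     = 206.265 * pixel_um / focal_mm      # arcsec per pixel (~1.87)

        drift_px  = 30.0       # hypothetical shift measured between first and last frame
        frames    = 51
        session_h = frames * (300 + 25) / 3600.0       # 300s exposures plus ~25s gaps

        print(f"Drift: {drift_px * scale:.1f} arcsec over {session_h:.1f} h "
              f"({drift_px * scale / session_h:.1f} arcsec/h)")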