Posts posted by wimvb

  1. Some of my images from 2023

    Equipment for all images:

    SkyWatcher 190MN on an AZ-EQ6 mount, and a ZWO ASI294MM camera

    Processing: PixInsight

    And IV, the satellite that wasn't

    [Attached image: M31-M32-AndIV-ngc206_annotated.jpg]

    M81 and M82 in HaRGB, a 2 panel mosaic

    [Attached image: M82_M81_HaRGB_mosaic.png]

    Tadpoles in Ha

    [Attached image: IC410_HaRGB1.jpg]

    pgc 12421, the truly hidden galaxy:

    [Attached image: pgc14241_16h_synlrgb_bxtv2_2.jpg]

    ugc 12632

    [Attached image: ugc12632_LRGB_v42.jpg]

    • Like 3
    • Thanks 1
  2. 4 hours ago, gorann said:

    Would make life easier🤣

    Apparently not (after reading this thread). (😉 implied)

    But back on track: the total integration time needed to achieve a certain quality level depends very much on light pollution, probably more than on equipment or anything else. This makes it very difficult to compare results with others, especially if you don’t know how and where those other images were taken.
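
    As a rough back-of-the-envelope illustration of why the sky dominates (pure photon statistics, read noise and dark current ignored; the function name and the rates below are made up for illustration, not anyone's real numbers):

    def time_for_snr(snr, obj_rate, sky_rate):
        """Exposure time t such that obj_rate*t / sqrt((obj_rate + sky_rate)*t) reaches snr."""
        return snr**2 * (obj_rate + sky_rate) / obj_rate**2

    dark_site = time_for_snr(10, obj_rate=0.05, sky_rate=0.5)    # e-/s per pixel, illustrative
    bright_site = time_for_snr(10, obj_rate=0.05, sky_rate=5.0)  # ten times brighter sky
    print(bright_site / dark_site)                               # roughly 9x more integration time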

    • Like 1
  3. After stacking, I combine the three master images into an RGB image and process from there. If you don't use the RGB palette (e.g. in narrowband processing), you may need to do a linear fit before combining the different masters. Linear fit puts the very lowest pixel value at 0 and the very highest at 1 (or 65535 in 16-bit), without disturbing the linear relationship between pixels.
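
    Not the PixInsight workflow itself, just a minimal numpy sketch of the rescaling idea described above (the dummy random arrays stand in for the three registered masters):

    import numpy as np

    def rescale(img):
        """Map the lowest pixel to 0 and the highest to 1, without changing
        the linear relationship between pixel values."""
        lo, hi = float(img.min()), float(img.max())
        return (img - lo) / (hi - lo)

    # Dummy stand-ins for the registered R, G and B masters.
    master_r, master_g, master_b = (np.random.rand(200, 200) for _ in range(3))

    # Rescale each channel to the same [0, 1] range, then stack into an RGB cube.
    rgb = np.dstack([rescale(ch) for ch in (master_r, master_g, master_b)])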

  4. 8 hours ago, Paul M said:

    You can select the whole image as a search for Simbad and filter for object type.

    That's what the TypeCat script in PixInsight does.

    11 hours ago, alan4908 said:

    It would be good to have a PI script that allowed you to only display objects above a particular apparent magnitude,

    The annotation script can supposedly do this if magnitude is supplied in the catalogue, although I've never got it to work reliably.
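
    For anyone who wants to try this outside PixInsight, here's a rough astroquery sketch (not the PI script); the field centre, search radius and magnitude cut are made up, and the votable field and column names ("flux(V)", "FLUX_V", "MAIN_ID") are from older astroquery releases, so check them against your version:

    import numpy as np
    import astropy.units as u
    from astropy.coordinates import SkyCoord
    from astroquery.simbad import Simbad

    sim = Simbad()
    sim.add_votable_fields("otype", "flux(V)")               # object type and V magnitude

    centre = SkyCoord("02h40m24s +39d03m48s", frame="icrs")  # illustrative field centre
    result = sim.query_region(centre, radius=1.0 * u.deg)

    # Keep only objects brighter than magnitude 15; missing magnitudes are treated as faint.
    mags = np.ma.filled(result["FLUX_V"], 99.0)
    bright = result[mags < 15.0]
    print(bright["MAIN_ID", "OTYPE", "FLUX_V"])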

  5. On 18/01/2024 at 23:29, Skipper Billy said:

    -.- . . .--. ....... .- - ....... .. - ....... -- .- - . -.-.-- !! 😀

    Well put.

    On 19/01/2024 at 00:08, dannybgoode said:

    I got the keep at it mate - it is the bit I have quoted I can't work out...

    -.-.-- = !

    So, really: keep at it mate!!!

    Regarding your computer issues, I empathize, having had my fair share of those (at the moment my old laptop seems bricked). If you're not afraid to use a command shell, I can wholeheartedly recommend you get a Raspberry Pi. I just got the StellarMate OS, but also have Astroberry. Last week I was learning and configuring StellarMate, but when I got tired of it, I just switched the micro-SD card and could go imaging with my working Astroberry distro. The switch didn't take more than 5 minutes. I did this two nights in a row.

  6. Last week I purchased the StellarMate OS and app, and the last few days I've been playing with it. Despite 6 years' experience with INDI and KStars, I still had some trouble configuring everything. The StellarMate app isn't always easy to navigate. On my Samsung Galaxy Tablet, some of the buttons in the app were unresponsive at times, and the app was especially slow when clicking on a number field to open the numeric keyboard. Also, the configuration of optical trains assumed ST4 guiding by default (which I don't use); it took me a while to figure that out. And I still haven't come to grips with autofocusing, but that may in part be due to poor sky conditions.

    Earlier this week I experienced how easy it is to switch capture software with a Raspberry Pi and INDI. We've had a few clear nights, and I started configuring StellarMate, but after a while I just swapped SD-cards in the Raspberry Pi and did my imaging in Astroberry. So easy!

    Here it is: first light. Earlier this evening we had another gap in the clouds and I pointed my telescope at Messier 34, which was conveniently placed in the South. I got 55 minutes of data before the clouds moved in, 4 x 5 minutes per filter. I had to discard one green sub due to star trails.

    Processed in PixInsight (without any noise reduction). There are even a few small galaxies in the background.

    [Attached image: M34_55min_RGB.jpg]

    • Like 5
  7. 8 hours ago, gorann said:

    I wonder if ZWO or QHY have it.

    Probably not, since the power plug has a standard pin = +, barrel = - configuration.

    17 hours ago, tomato said:

    When discussing if the money saved buying a cheap camera from China is worth it, the issue of what happens when it goes wrong is rightly raised as a concern.
    Well, I could be in that boat now as I somehow applied a short to my RCIMX571 when plugging in the power, resulting in a burning odour and a whiff of smoke. Looking on CN the general view is don’t bother trying to send it back, and my view was if it lasted a few years it won’t owe me anything, although I’m obviously a bit miffed that my actions have broke it.

    I’m just wondering if anybody has sent one back for repair and what were your experiences.

    As others wrote, contact them first. But if they're reluctant to deal with it, it should be easy to assess the damage. The electronics are probably just behind a cover held in place by a few screws. Unless the printed circuit board is fried, only some components will need replacing.

  8. 1 hour ago, tomato said:

    Thanks to @wimvb's encouragement I have continued my attempts to add in the meagre 1 hr of dual band data I collected. This is my best attempt so far, adding the red channel from the NBZ RGB into the red channel of the RGB broadband data. I've cropped it so the Ha regions can be seen a bit better, but of course that is revealing my processing artefacts also.

    [Attached image: Image05Hacrop.jpg]

    That's it!

    • Thanks 1
  9. I use this method with a few tweaks. One of them is to use a mask that covers everything but the galaxy. 

    https://pixinsight.com/examples/M31-Ha/index.html#Continuum_Subtraction

    For OSC, I would (a rough numpy sketch of these steps follows the list):

    • Extract the channels.
    • Use PixelMath on red (R) and Ha: Ha - F×(R - med(R)), where F is a number lower than 1, usually 0.4-0.6 for me. This creates a new image: Ha-R.
    • Add this new image back into the red with PixelMath and a mask: R + F×(Ha-R - med(Ha-R)), where F is a new factor, depending on how strong you want the Ha to be.
    • Combine the channels, with this latest image going into the red channel.
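
    Not PixelMath, but a rough numpy version of the two expressions above, in case it helps to see the arithmetic spelled out (the function name and default factors are just for illustration; pixel values are assumed to be in the 0-1 range):

    import numpy as np

    def add_ha_to_red(ha, r, f_sub=0.5, f_add=1.0, mask=None):
        # Continuum subtraction: remove the broadband contribution from the Ha frame.
        ha_cs = ha - f_sub * (r - np.median(r))
        # Blend the continuum-subtracted Ha back into the red channel.
        boost = f_add * (ha_cs - np.median(ha_cs))
        if mask is not None:
            boost = boost * mask      # e.g. 1 on the galaxy/nebula, 0 elsewhere
        return np.clip(r + boost, 0.0, 1.0)

    # new_red = add_ha_to_red(ha_image, red_image, f_sub=0.5, f_add=0.8)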

    Hope this helps

    • Like 1
  10. 22 hours ago, geeklee said:

    Lovely FOV @wimvb.  You've done well with NGC 1023A, a nice little faint cloud with the contrasting stronger blue you mention.  The Ha is subtle but worthwhile addition - very nice.

    Thanks, Lee. Without that colour contrast, which was extremely faint, ngc 1023 would be just another blob. The small blue cloud makes a difference.

    • Like 1
  11. 4 hours ago, Rodd said:

    The challenge is only using data from good seeing

    Agreed, I've all but given up on that luxury. For my latest image (ngc 1023 and ic239), I measured a fwhm between 3.5 and 4". I didn't even bother with collecting luminance. I'm considering giving up on galaxy hunting for a while, and shoot nebulae instead.

  12. Two and a half larger galaxies and a bunch of smaller ones. The spiral galaxy on the right is ic 239; the one and a half on the left are ngc 1023 and ngc 1023A. The latter is an irregular satellite galaxy of ngc 1023. It is ever so slightly bluer than its larger companion.

    Also in the field of view are a number of smaller galaxies, many of which, quite unexpectedly, are closer by than ngc 1023 and ic 239. So they really are smaller.

    Since ic 239 is a spiral galaxy with active star formation, I decided to collect H-alpha as well as RGB. So far this image has a total integration time of 11 hours, of which 5 hours is H-alpha and 6 hours RGB. During capture there was a layer of high cloud that softened the details; fwhm was 3.5-3.8 arc seconds.

    As always, I used the SW 190MN with ZWO ASI294MM camera. Processed in PixInsight.

    [Attached image: ic239_240107_HaRGB.jpg]

    • Like 10
  13. 52 minutes ago, Adreneline said:

    Maybe it is all down to the bad old Moon laughing at you and mocking your attempts to image 😆

    My Ha filter has a bandwidth of 7 nm, quite wide for a narrowband filter. A 3 nm filter would give more contrast, but since I rarely image nebulae, I hadn't planned on investing in one.
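
    Rough numbers, assuming mostly broadband light pollution/moonlight and sky-limited subs: the sky background through a narrowband filter scales roughly with its bandwidth, so a 3 nm filter passes about 7/3 ≈ 2.3x less sky than my 7 nm one, which works out to roughly a sqrt(2.3) ≈ 1.5x gain in nebula SNR for the same total exposure.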

    • Like 1
  14. On 30/12/2023 at 01:02, Adreneline said:

    Any chance you can post a side-by-side with the straight Ha - no tone mapping - just grey-scale?

    Sorry for the late reply; with the holidays and all, I totally forgot about it. But here's a side-by-side of the starless Ha and the tone-mapped image, along with the raw Ha image with only a simple stretch applied.

    [Attached image: sidebyside.jpg]

    [Attached image: Pacman_Ha.jpg]
