Hibou

Advanced Members
  • Content Count
    100
  • Joined
  • Last visited

Community Reputation

21 Excellent

About Hibou

  • Rank
    Star Forming

Contact Methods

  • Website URL
    http://neutronoptics.com/

Profile Information

  • Gender
    Male
  • Interests
    Scientific imaging.
  • Location
    French Alps
  1. Hi Rob. My short guide is mainly for non-astronomy use of Starlight Live (SLL), so I have left things out; HiloDon has posted a more complete guide. Max Pixel Displacement: I understand that SLL tries to correct for small shifts of the FOV between frames (tracking errors etc). It uses the positions of bright stars to register the current frame against the sum/average, correcting for the shift. Max Pixel Displacement would then be the maximum shift allowed, and Max FWHM must be used for selecting stars for registration. My scientific cameras have a fixed FOV, and no stars, so SLL can't correct frames for FOV shift, but you should certainly experiment with it for astro-imaging. Defective Pixel Removal: I am not sure how that works in SLL. I assumed it tries to replace anomalously bright/dark pixels with the average of surrounding pixels, but it doesn't remove all bright pixels, I suppose because some of the isolated pixels might be stars, not noise, and of course it wouldn't work if adjacent pixels were bright. For my purposes I would switch it off and use ImageJ to clean up the image, removing noise, de-interlacing etc (a rough sketch of that sort of cleanup follows the list of posts below), and only then bin if required. ImageJ is like Photoshop for scientists, including astronomers. I prefer collecting raw un-binned, unfiltered data and cleaning it up afterwards. Alan.
  2. It's been a while now since Paul Shears posted (and even longer for me). I use Starlight Live 3.3 for scientific imaging, and have a short guide at http://neutronoptics.com/starlight.html. I agree that it's a great application, and it would be even better if there were an "automatic" option for the image display, as in programs such as Nebulosity, Artemis Capture, Artemis Infinity etc. As it is, you have to manually bracket the intensity distribution between the black and white levels. Does anyone know if Paul is still working on this great application, or even how to contact him? Alan
  3. Astrojedi may be interested to see that Atik now offer an option for colour software binning in their Infinity stacking software. Software binning is not quite as good as hardware binning of the CCD charge - it's more like stacking (a short sketch of software binning follows the list of posts below). Yes, my original point was that bigger chips will always win for intensity if you can adjust your focal length to give the same FOV, and even if the pixels are then too small, you can always bin to make them "larger" and collect more light. Astrojedi argued that hardware (charge) binning was not possible for colour CCDs, and appeared to dismiss software binning. Glad we are in agreement now that the facts are straight :-) BTW, my favourite chip is now the 1" Sony monochrome CCD (Atik 460EX and fast-readout Atik VS60). It's the biggest CCD Sony make, with great sensitivity and exceptionally low noise. Bigger is better.
  4. As a new member, I was indeed surprised to receive such frank criticism from AstroJedi, but was not really offended. I regret that others were. One call to order may have been sufficient, but with 3 strikes, I'm out :-) Thank you for setting up and maintaining such an interesting discussion group, which I am pleased to see I can still follow even as a non-member. Kind regards, Hibou.
  5. You can of course use the standard Artemis Capture application that comes with the Atik Core Software to capture multiple frames for off-line stacking. It will work with the Infinity camera just as with other Atik cameras. Infinity (and Starlight Live) need guide stars to correct for drift of the FOV over the longer total imaging times needed for faint objects. But there is also a valid reason why you might want to live stack for large objects like planets - so that you can start with a short exposure and continue exposing until you judge that the exposure is sufficient. When I asked Paul, the Starlight Live author, to include an option to do that, he obliged with a setting of "Max pixel displacement = 0", which simply stacks frames without registration (a minimal sketch of unregistered stacking follows the list of posts below). ChrisG at Atik did a similar thing for the Beta-3 Infinity software with an advanced setting called "Calc Image Movement". He wrote: "When deselected, the image stacking will assume that the object is in exactly the same place, and won't bother to calculate the shift and rotation. Thus the stacking will always succeed and you should be able to use it for your application". That will allow stacking even for Jupiter, but of course it won't compensate for the atmospheric shimmering seen in Spacedout's video.
  6. No-one is doubting your qualifications, Jorman. Whatever you think of the rest of us, some people did try to help, and still got nowhere.
  7. The XCM2000 uses a 70% bigger KAI-2020 RGGB CCD with 1602x1202 larger 7.4 μm pixels (with lower QE and higher noise, but cooled to reduce dark current). And it cost twice as much. If you expected an uncooled UltraStar-C with a smaller chip to do better than stacking with an XCM2000, I can understand your disappointment. I can (almost) see Astrojedi's point :-)
  8. Well, v3.0.1 works for me on a 64-bit Win-7 machine. When you launch it, it asks you to select a FITS file, and displays it when you double-click it. If you see a black display it may be that your levels are not set appropriately; select "automatic" levels. Mind you, it doesn't seem to do much. You can do more with FITS files using Nebulosity (demo) or ImageJ (free), both of which will work on your system.
  9. Logically, one can't be "condescending" and keep it to oneself, as I'm sure you know :-)
  10. Let's wait and see :-) We still don't have a FITS file. The screen shots already show that the display levels are set strangely. "Stretching" the display means selecting the 8 bits you can see from the 16 bits that the camera outputs (a minimal stretch sketch follows the list of posts below). You should at least bracket the observed intensity range between the black and white levels, which the screen shots don't show. If you really can't upload FITS files, you could at least drop them onto the demo copy of Nebulosity, which actually tries to set reasonable levels. As for the remarks about "toys" and the lack of sensitivity of the Ultrastar-C (and by inference the Infinity-C), I... don't agree :-). The ICX825 is one of Sony's latest and most sensitive CCDs, with ~75% QE and relatively large 6.45 μm pixels. OK, you are losing much of the light with the RGGB filters, but compare it with most other colour CCDs used here. Apart from the ICX829 (Lodestar), which has even larger 8.6 μm pixels, which ones are brighter? I suspect there are going to be some red faces here. Hope it's not mine :-)
  11. As Don remarked, the background in your images looks very dark compared to his, where you can see the noise. Are you sure you have adjusted your levels correctly? :-) Sorry for this naive question, but you did say you were unfamiliar with the software. Perhaps you should post a raw 16-bit FITS file that you are unhappy with, together with imaging times and conditions. These jpeg files are not all that informative. I can't believe that other people find the Ultrastar-C and the similar Infinity-C just "very expensive toys", as you write in your frustration thread :-)
  12. I looked up the transmission of a narrow-band Hα filter (below). Yes, you should use a monochrome CCD, and then just colour everything in software with the bandwidth of the filter :-)
  13. Windows has all sorts of key-combination shortcuts that can be typed inadvertently. I find grandchildren are good at that :-)
  14. That's normal. You are just seeing the intensities of the pixels through their filters. If you zoom right in you will see the filter pattern, which must be interpreted by LL itself or e.g. by Nebulosity to see the different colour intensities (a simple sketch of splitting out the RGGB planes follows the list of posts below). I am not familiar with FITS Liberator, so I don't know if it can interpret the filter pattern.
  15. This is because somehow you have changed the size of your SL window. Type "Size" into Search on your computer and select "Make text and other items larger or smaller". Then drag the slider all the way to "Smaller" and click "Apply" (see screen capture below).
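For anyone who wants to try this sort of processing offline, here is a rough Python sketch of the defective-pixel cleanup described in post 1: replace pixels that deviate strongly from their local neighbourhood with the neighbourhood median. It is only an illustration, not how Starlight Live or ImageJ actually do it, and the 3x3 window and 5-sigma threshold are arbitrary assumptions.

```python
# Rough sketch only - not Starlight Live's actual defective-pixel algorithm.
import numpy as np
from scipy.ndimage import median_filter

def remove_defective_pixels(frame, sigma=5.0):
    """Replace pixels that deviate strongly from their 3x3 neighbourhood median."""
    frame = frame.astype(np.float64)
    local_median = median_filter(frame, size=3)          # median of each pixel's neighbourhood
    deviation = frame - local_median
    bad = np.abs(deviation) > sigma * deviation.std()    # flag strong outliers (hot/cold pixels)
    cleaned = frame.copy()
    cleaned[bad] = local_median[bad]                     # replace outliers with the local median
    return cleaned
```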
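Here is a similarly rough sketch of the software binning discussed in post 3: summing n x n blocks of pixels into one "larger" pixel after readout. Unlike hardware (charge) binning it cannot avoid the extra read noise, but it does collect the signal into fewer, bigger pixels. The function name and the 2x2 default are just illustrative, not Atik's implementation.

```python
# Rough sketch of software (post-readout) binning.
import numpy as np

def software_bin(frame, factor=2):
    """Sum factor x factor blocks of pixels into one 'larger' pixel."""
    h, w = frame.shape
    h, w = h - h % factor, w - w % factor                # trim edges so the blocks divide evenly
    blocks = frame[:h, :w].astype(np.float64)
    return blocks.reshape(h // factor, factor, w // factor, factor).sum(axis=(1, 3))
```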
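Post 5 mentions stacking frames with no registration at all ("Max pixel displacement = 0"). A minimal offline equivalent is simply to average a series of FITS frames as they come, assuming the FOV really is fixed; the file pattern here is only an example.

```python
# Minimal sketch of stacking without registration, assuming a fixed FOV.
import glob
import numpy as np
from astropy.io import fits

def stack_unregistered(pattern="frame_*.fits"):          # illustrative file pattern
    """Average a series of frames with no alignment between them."""
    files = sorted(glob.glob(pattern))
    total = None
    for name in files:
        data = fits.getdata(name).astype(np.float64)
        total = data if total is None else total + data
    return total / len(files)
```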
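Posts 2 and 10 talk about bracketing the intensity distribution between the black and white levels, i.e. "stretching" the 16-bit data into the 8 bits a display can show. A simple linear version looks like the sketch below; if no levels are given it brackets the observed intensities with percentiles, roughly what an "automatic" display option might do. The percentile values are arbitrary assumptions.

```python
# Simple linear stretch from 16-bit data to an 8-bit display range.
import numpy as np

def stretch_to_8bit(frame16, black=None, white=None):
    """Map the chosen black/white levels onto 0-255; clip everything outside them."""
    frame = frame16.astype(np.float64)
    if black is None:
        black = np.percentile(frame, 0.5)                # automatic black level (assumed percentile)
    if white is None:
        white = np.percentile(frame, 99.5)               # automatic white level (assumed percentile)
    scaled = (frame - black) / max(white - black, 1.0)
    return (np.clip(scaled, 0.0, 1.0) * 255).astype(np.uint8)
```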
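Finally, post 14 describes the raw Bayer (RGGB) filter pattern that the capture software has to interpret before you see colour. The sketch below only pulls out the four colour planes at quarter resolution; a proper debayer would also interpolate each plane back to full resolution, and the assumption that the pattern starts with R at pixel (0, 0) depends on the camera.

```python
# Sketch of splitting an RGGB raw frame into colour planes (not a full debayer).
import numpy as np

def split_rggb(raw):
    """Extract R, G, B planes from a raw frame, assuming R is at pixel (0, 0)."""
    r  = raw[0::2, 0::2].astype(np.float64)
    g1 = raw[0::2, 1::2].astype(np.float64)
    g2 = raw[1::2, 0::2].astype(np.float64)
    b  = raw[1::2, 1::2].astype(np.float64)
    green = (g1 + g2) / 2.0                              # average the two green sites
    return r, green, b
```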