
A Question of "Seeing"



Hi,

I've been reading some books and articles lately that seem to quantify the quality of seeing in terms of arcseconds. For example, a typical statement might be "If your seeing is better than 4 arcseconds....".

What I'm wondering is how you actually measure that?

Suppose I have a CCD camera and scope setup and I've computed the arcsecond resolution of my camera's pixels. Let's say for discussion purposes that I have 1 arcsecond per pixel.
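(For reference, I worked that out with the usual plate-scale formula: arcsec per pixel = 206.265 × pixel size in microns / focal length in mm. A quick Python sketch, with made-up numbers chosen just to land near 1 arcsec/pixel:

# Plate scale from pixel size and focal length.
# The camera and scope figures below are hypothetical.
pixel_size_um = 4.85       # pixel size in microns (hypothetical)
focal_length_mm = 1000.0   # telescope focal length in mm (hypothetical)
pixel_scale = 206.265 * pixel_size_um / focal_length_mm
print(f"{pixel_scale:.2f} arcsec/pixel")   # -> 1.00 arcsec/pixel
)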

Also let's assume my optical system is optimally focused.

If I am viewing a star image on the CCD and I notice that the diameter is approximately 4 pixels (thus 4 arcseconds), is that a measure of the "seeing"? Or is it some combined measure including seeing and diffraction and other things?

(I know measuring star image diameter is a whole other topic with techniques such as FWHM and HFD often used. But say that my CCD software tells me that the HFD of the star image is approx. 4 for the above example.)

How can I quantify how good/bad my local "seeing" is at a given time?

Thanks for any advice.

David


Your discussion is pretty close.

The atmospheric effects just spread more light around the Airy disk (all things being equal).

The accepted measure is the FWHM of the star's image.

If you look at the profile of a typical star, you see a Gaussian "bell-shaped" curve; the width at half the peak intensity is the FWHM. This also represents the resolution of the system.

The physical shape of the curve can extend well beyond the FWHM, but the maximum energy is contained in this smaller area.
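If you want to put an actual number on it, here's a minimal sketch (assuming numpy and scipy are available; the profile values, aperture and pixel scale are all made-up examples) that fits a Gaussian to a 1-D cut through a star image, converts the fitted width to a FWHM in arcseconds, and then subtracts the diffraction contribution in quadrature to estimate the seeing alone:

import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amplitude, centre, sigma, background):
    # 1-D Gaussian plus a flat sky background
    return amplitude * np.exp(-0.5 * ((x - centre) / sigma) ** 2) + background

# Hypothetical 1-D cut through a star image (counts per pixel)
profile = np.array([101, 103, 120, 210, 480, 620, 490, 215, 118, 104, 102],
                   dtype=float)
x = np.arange(profile.size)

# Fit the Gaussian, seeding the fit with rough guesses from the data
p0 = [profile.max() - profile.min(), float(np.argmax(profile)), 1.5,
      profile.min()]
(amp, centre, sigma, bkg), _ = curve_fit(gaussian, x, profile, p0=p0)

# FWHM of a Gaussian is 2*sqrt(2*ln 2)*sigma, about 2.355*sigma
pixel_scale = 1.0  # arcsec/pixel, from the example in the question
fwhm_arcsec = 2.0 * np.sqrt(2.0 * np.log(2.0)) * abs(sigma) * pixel_scale

# Diffraction's share: the FWHM of the Airy core is roughly 1.02*lambda/D
aperture_m = 0.2        # hypothetical 200 mm aperture
wavelength_m = 550e-9   # visual band
diff_fwhm = np.degrees(1.02 * wavelength_m / aperture_m) * 3600.0

# Treat the contributions as adding in quadrature (a Gaussian-ish
# approximation, so the result is only a rough seeing estimate)
seeing = np.sqrt(max(fwhm_arcsec**2 - diff_fwhm**2, 0.0))
print(f"measured FWHM {fwhm_arcsec:.2f}\", diffraction {diff_fwhm:.2f}\", "
      f"seeing ~{seeing:.2f}\"")

Incidentally, for a circular Gaussian PSF the HFD mentioned in the question works out numerically equal to the FWHM (both are 2*sqrt(2 ln 2)*sigma), so for well-behaved stars either measure tells you much the same thing.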

Ken


Hi, the quick way to judge how good/bad the seeing conditions are is how much the stars are "twinkling". I realise this doesn't give an exact answer, but most nights you can't get one anyway, as the seeing can vary from moment to moment. Ed.


I've been reading some books and articles lately that seem to quantify the quality of seeing in terms of arcseconds. For example, a typical statement might be "If your seeing is better than 4 arcseconds....".

What I'm wondering is how you actually measure that?

You get a feel for it - with smaller scopes, in terms of the size of the diffraction pattern; with larger scopes, in terms of the size at which you expect to see planets at a given magnification.

You also have to distinguish between "slow" seeing (image moves around but stays essentially sharp) & "rapid" seeing (image is all smeared out).

