Telescope Focal Length?



Hi all, probably a stupid question: does the focal length of a telescope mean the same thing as on a camera lens? For example, a 700mm focal length on a full-frame camera is equivalent to 700mm, but on an APS-C camera it's equivalent to 1120mm. At what sensor size does the focal length of a telescope become correct and true?


44 minutes ago, Cornelius Varley said:

The focal length of a telescope doesn't change. A 700mm focal length telescope will always be a 700mm telescope irrespective of the camera attached to it. What does change is the field of view which is governed by the size of the sensor.

Just seen this thread. Would a smaller sensor be better for picking up faint stars, or the opposite, or neither? I am not clued up on cameras, hence my asking. I am familiar with field of view, so a larger sensor gives a wider field? And does a larger sensor mean more pixels?


2 hours ago, Cornelius Varley said:

The focal length of a telescope doesn't change. A 700mm focal length telescope will always be a 700mm telescope irrespective of the camera attached to it. What does change is the field of view which is governed by the size of the sensor.

So it's similar to DSLR lenses: the field of view changes? What size sensor on a telescope would give the same field of view as an ordinary lens of the same focal length?


Forget this 'crop factor' business because it is meaningless and causes confusion. You have the following key variables with a given focal length:

Chip size. This determines field of view. (A small chip does not 'get you closer' or more 'zoomed in,' as the term crop factor might imply.)

Pixel size. This determines your image scale in arcseconds per pixel and, in so doing, determines your resolution of detail. It also determines the size at which a given object will appear on your PC screen at 'full size.' The object's image size as projected by the scope varies only with focal length, but its size on the PC screen depends only on how many pixels the projected image lands on. More pixels (i.e. smaller pixels) means a bigger image. When we say an image is being viewed at full size, we mean that one camera pixel is given one screen pixel.

So we now see that a longer focal length scope with larger pixels might actually produce a smaller final screen image of an object than a shorter FL scope with a smaller pixel camera.

The key values, again, are 1) Field of View in degrees and minutes of arc. 2) Image Scale in arcseconds per pixel. These two values are independent of other values and specify the image completely.

Olly
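The two key values above can be computed directly. Here's a minimal Python sketch; the 700mm focal length, 4.8 micron pixels, and 23.5mm sensor width are illustrative assumptions, not values from any particular setup:

```python
import math

ARCSEC_PER_RAD = 206265  # arcseconds in one radian

def image_scale(pixel_um: float, focal_mm: float) -> float:
    """Image scale in arcseconds per pixel."""
    return ARCSEC_PER_RAD * (pixel_um * 1e-3) / focal_mm

def field_of_view_deg(sensor_mm: float, focal_mm: float) -> float:
    """Field of view along one sensor dimension, in degrees."""
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm)))

# A 700mm scope with 4.8 micron pixels on a 23.5mm-wide sensor:
print(image_scale(4.8, 700))         # ~1.41 arcsec/pixel
print(field_of_view_deg(23.5, 700))  # ~1.92 degrees
```

Note that the two functions take different inputs: pixel size alone fixes the image scale, sensor size alone fixes the field of view, exactly as described above.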

 


For many years, 35mm film was the most popular format and so this formed folks' standards of lens "length". So a 500mm was a fairly extreme telephoto, a 24mm was a very wide but not crazy-wide lens, and 50mm approximated naked-eye perspectives. "Crop factor" is just a way to relate the piece of the image circle that a smaller sensor lops out to the 35mm mental standard; a 35mm lens on an APS-C sensor looks about the same as a 50 on 35mm.

But focal length is focal length, so if you put a 35mm sensor behind, say, a 500mm focal-length telescope, you will get a picture identical to a 500mm telephoto on a 35mm film camera.

The image circle at the focal plane of a 500mm telescope is exactly the same, no matter what sensor you put there to capture the light. Bigger sensor? More of the image circle. Smaller? Less.

Since sensors can have wildly different pixel densities, the same number of pixels might represent all of the image circle for one sensor, or a tiny piece of it for another. For example, my Pentax DSLR has pixels about 4.8 microns across. My ASI183 has much smaller pixels (2.4?), so a given object would cover twice as many of them in each direction, and appear bigger if those pixels were shown on a computer screen at the same scaling.

However the Pentax sensor is physically bigger, so it can use a bigger percentage of the image circle and so capture a bigger object.

You would think that a vast number of tiny pixels would always be the best, and yet it turns out to be not quite the case. The smaller the pixel, the more susceptible it is to noise, generally speaking, which is why cell phones perform so much more poorly in low light than DSLRs. So it's a classic engineering tradeoff. Another factor is the limiting resolution of the optics, but even more so the wavering induced by motion in the atmosphere ("seeing"); a really super-high-resolution sensor merely devotes lots of pixels to lovingly depicting the blur as a star or feature jumps around during the exposure. For most folks, the sweet spot is between 1 and 2 arcseconds per pixel.

Astronomy.tools has some neat calculators for playing around with this stuff, e.g. CCD suitability.
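The pixel-density point can be sketched numerically. Assuming a hypothetical 60 arcsecond object and a 500mm focal length (made-up figures for illustration), the 2.4 micron camera puts twice as many pixels across it as the 4.8 micron one:

```python
ARCSEC_PER_RAD = 206265  # arcseconds in one radian

def pixels_across(object_arcsec: float, pixel_um: float, focal_mm: float) -> float:
    """How many pixels a given angular size lands on at the focal plane."""
    # Small-angle linear size at the focal plane, in mm:
    object_mm = focal_mm * (object_arcsec / ARCSEC_PER_RAD)
    return object_mm / (pixel_um * 1e-3)

# A 60 arcsecond object through a 500mm scope:
print(pixels_across(60, 4.8, 500))  # ~30 pixels (Pentax-sized pixels)
print(pixels_across(60, 2.4, 500))  # ~61 pixels (ASI183-sized pixels)
```

Shown at one camera pixel per screen pixel, the second image of the same object would be twice as wide and twice as tall.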


10 hours ago, rickwayne said:

For many years, 35mm film was the most popular format and so this formed folks' standards of lens "length". So a 500mm was a fairly extreme telephoto, a 24mm was a very wide but not crazy-wide lens, and 50mm approximated naked-eye perspectives. "Crop factor" is just a way to relate the piece of the image circle that a smaller sensor lops out to the 35mm mental standard; a 35mm lens on an APS-C sensor looks about the same as a 50 on 35mm.

But focal length is focal length, so if you put a 35mm sensor behind, say, a 500mm F-L telescope, you will get a picture identical to a 500mm telephoto on a 35mm film camera.

The image circle at the focal plane of a 500mm telescope is exactly the same, no matter what sensor you put there to capture the light. Bigger sensor? More of the image circle. Smaller? Less.

Since sensors can have wildly different pixel densities, the same number of pixels might represent all of the image circle for one sensor, or a tiny piece of it for another. For example, my Pentax DSLR has pixels about 4.8 microns across. My ASI183 has much smaller pixels (2.4?), so a given object would cover twice as many of them in each direction, and appear bigger if those pixels were shown on a computer screen at the same scaling.

However the Pentax sensor is physically bigger, so it can use a bigger percentage of the image circle and so capture a bigger object.

You would think that a vast number of tiny pixels would always be the best, and yet it turns out to be not quite the case. The smaller the pixel, the more susceptible it is to noise, generally speaking, which is why cell phones perform so much more poorly in low light than DSLRs. So it's a classic engineering tradeoff. Another factor is the limiting resolution of the optics, but even more so the wavering induced by motion in the atmosphere ("seeing"); a really super-high-resolution sensor merely devotes lots of pixels to lovingly depicting the blur as a star or feature jumps around during the exposure. For most folks, the sweet spot is between 1 and 2 arcseconds per pixel.

Astronomy.tools has some neat calculators for playing around with this stuff, e.g. CCD suitability.

If I am getting this correctly, let's assume they brought out a new camera. If the camera was your Pentax, with pixels 4.8 microns across, but the new part was that these pixels could be halved or quartered, down to 1.2 microns (with electronic manual control in increments), would this mean that the image on your PC screen could look as if it was zoomed in or out, depending on the pixel size chosen?


9 minutes ago, Anvil Basher said:

If I am getting this correctly, let's assume they brought out a new camera. If the camera was your Pentax, with pixels 4.8 microns across, but the new part was that these pixels could be halved or quartered, down to 1.2 microns (with electronic manual control in increments), would this mean that the image on your PC screen could look as if it was zoomed in or out, depending on the pixel size chosen?

No. The field of view is controlled by the size of the sensor; the pixel size doesn't matter. Most digital cameras allow you to change the resolution (pixel count) of the image, but that leaves the field of view the same. On some cameras you can use a 640x480 cropped mode, which only uses a centre area of the sensor 640 x 480 pixels in size, and this would have the appearance of a zoomed-in image.
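That distinction can be checked with a quick Python sketch; the 700mm focal length, 22.3mm sensor width, and 4.8 micron pixels here are illustrative assumptions, not values from the thread:

```python
import math

def fov_deg(sensor_mm: float, focal_mm: float) -> float:
    """Field of view along one sensor dimension, in degrees."""
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm)))

FOCAL = 700.0  # mm, illustrative

# The same 22.3mm-wide sensor gives the same FOV whether it holds
# 6000 or 3000 pixels across; pixel count never enters the formula:
print(fov_deg(22.3, FOCAL))  # ~1.83 degrees either way

# A 640-pixel crop of a 4.8 micron sensor uses only 640 * 0.0048 = 3.07mm,
# hence the 'zoomed in' look of cropped video modes:
print(fov_deg(640 * 4.8e-3, FOCAL))  # ~0.25 degrees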


1 hour ago, Cornelius Varley said:

No. The field of view is controlled by the size of the sensor; the pixel size doesn't matter. Most digital cameras allow you to change the resolution (pixel count) of the image, but that leaves the field of view the same. On some cameras you can use a 640x480 cropped mode, which only uses a centre area of the sensor 640 x 480 pixels in size, and this would have the appearance of a zoomed-in image.

Yes, it would give the appearance of being zoomed in. Thanks, I think I understand the camera a little more now. The CMOS cameras for scopes have a set field of view, regardless of pixel count. Do DSLRs have the ability to change the FOV without lenses? If not, can a DSLR with a lens attached be mounted on a scope? I just remember my days of 35mm film cameras (which I was quite big into), and I thought I could change the FOV with a ring on the camera... or was it on the lens? I cannot remember.


So you're both right here, but for different reasons. Let's say your telescope's magnification is such that the Orion Nebula takes up 10 millimeters' diameter on the sensor. A camera with smaller pixels will make it look bigger on the computer screen -- if you blow up the resulting images to the same image-pixel-to-screen-pixel ratio. The content of the overall uncropped image will depend strictly on the size of the sensor. So if your sensor were a 10mm circle, it would have only the Orion Nebula on it. 20mm, a bit of the background too. 200mm, and the Nebula will be a pretty tiny part of the overall image.
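The projected size at the focal plane follows from the small-angle relation s = f * theta. A hedged Python sketch; the 1 degree angular size and 573mm focal length are made-up numbers chosen only to reproduce the 10mm figure above:

```python
import math

def size_at_focal_plane_mm(angle_deg: float, focal_mm: float) -> float:
    """Linear size of an object's image at the focal plane: s = f * theta."""
    return focal_mm * math.radians(angle_deg)

# An object spanning roughly 1 degree through a 573mm focal length
# lands on about 10mm of sensor, regardless of what sensor sits there:
print(size_at_focal_plane_mm(1.0, 573))  # ~10.0 mm
```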


2 hours ago, rickwayne said:

So you're both right here, but for different reasons. Let's say your telescope's magnification is such that the Orion Nebula takes up 10 millimeters' diameter on the sensor. A camera with smaller pixels will make it look bigger on the computer screen -- if you blow up the resulting images to the same image-pixel-to-screen-pixel ratio. The content of the overall uncropped image will depend strictly on the size of the sensor. So if your sensor were a 10mm circle, it would have only the Orion Nebula on it. 20mm, a bit of the background too. 200mm, and the Nebula will be a pretty tiny part of the overall image.

Indeed (though telescopes don't magnify!). As I said in an earlier post, I think the easiest way to think about this is to ask how many pixels you put under the scope's projected image of an object. If the object's projected image has 100 x 100 pixels under it with one camera and 200 x 200 under it with a second one, the final image will be twice as high and twice as wide from the second one.

Olly

 


Take the below as an example: a 102mm scope with a focal length of 714mm. The larger rectangle is the size of the ASI1600 chip; the smaller one is the ASI120 chip. Both have the same pixel size of 3.75 microns. As you can see, both M81 and M82 fit into the field of view of the larger chip. Now look at the smaller chip: only M81 fits on the sensor. But this doesn't mean you are getting a better 'zoomed in' image. The resolution is exactly the same, as both sensors have the same size pixels and are using the same focal length. Say you take an image with the ASI1600. It might look OK at fullscreen, but if you zoomed in on the computer screen to only fit M81 it would look a little more grainy and pixelated. This is exactly the same image you would get with the smaller sensor: a little grainy and pixelated until you zoomed back out to match the same field of view as the larger camera! Only a longer focal length gets you the 'zoomed in' effect you're thinking of.

[Attached screenshot: SkySafari 6 Pro field-of-view comparison]
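A rough Python check of the two fields of view described above. The chip widths (17.7mm for the ASI1600, 4.8mm for the ASI120) are approximate figures I'm assuming here; check the spec sheets for exact values:

```python
import math

def fov_arcmin(sensor_mm: float, focal_mm: float) -> float:
    """Field of view along one sensor dimension, in arcminutes."""
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm))) * 60

FOCAL = 714.0  # mm, the 102mm scope above

# Approximate chip widths:
print(fov_arcmin(17.7, FOCAL))  # ASI1600: ~85 arcmin wide
print(fov_arcmin(4.8, FOCAL))   # ASI120:  ~23 arcmin wide

# The image scale is identical for both, since the pixels are the same size:
print(206265 * 3.75e-3 / FOCAL)  # ~1.08 arcsec/pixel
```

Same scale, different framing: the small chip crops the view but resolves no extra detail.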


Right. Assuming same-size pixels. But that assumption doesn't hold across many cameras, so you actually have to look at both. If the 120 had the same number of pixels, crammed super-densely into its much smaller physical rectangle, the area depicted would be exactly as you portray -- but the UltraSuperMega120's image of M81 wouldn't be grainy and pixelated when blown up to the same size as the 1600 image; it would be smooth and hi-res. Zoomed-in-looking, in fact, just as if you'd stacked a bunch of Barlows or had a longer focal-length objective.

That's obviously a crazily exaggerated example -- a sensor with pixels that tiny is impractical at the current state of the art -- but the principle is valid. In fact, it's the same principle as using a short-focal-length eyepiece visually: you're taking a smaller portion of the image circle and making it the entire visual field.


No, seeing and scope resolving power would still be limitations. At the focal length I quoted above, you would still only resolve so much. So if you zoomed in at your hypothetical pixel scale, the image 'blur' would just be spread out over many more pixels.

 

Example: the Dawes limit is a simple formula for calculating the resolving power of a scope, approximately at least. The formula is:

 

Resolution in arc seconds = 11.6/scope aperture in centimetres.

So my scope would be 11.6 ÷ 10.2cm

So the resolving power of my scope is 1.137 arcseconds. If I wanted to observe a double star with 2 arcseconds of separation, my scope should show empty space between them. If the separation were less than an arcsecond, no amount of magnification could resolve it; the double star would always be a single blob in this scope. The same holds true if you took a picture of that close double with the tiny-pixel camera: the blob would just be spread out over more pixels.
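The Dawes arithmetic above, as a one-line Python function:

```python
def dawes_limit_arcsec(aperture_cm: float) -> float:
    """Dawes' limit: approximate resolving power of a circular aperture."""
    return 11.6 / aperture_cm

# The 102mm (10.2cm) scope from the example above:
print(dawes_limit_arcsec(10.2))  # ~1.14 arcseconds
```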


Oh, yes, absolutely, if we're talking in practical terms. What's the fun in THAT? I was just nattering about the geometry, to be sure that the OP got that. (I bet I've long since bored them into fleeing this thread.)

And while I'm at it I hereby decree that the ASI 120 x 10^8, or whatever this camera is going to be, not only defeats seeing and the Dawes limit, but its nanometer-scale photosites exhibit no quantum tunneling to each other because they're all individually plated on the sides with unobtanium. So there.


On 26/04/2019 at 20:26, david_taurus83 said:

Take the below as an example: a 102mm scope with a focal length of 714mm. The larger rectangle is the size of the ASI1600 chip; the smaller one is the ASI120 chip. Both have the same pixel size of 3.75 microns. As you can see, both M81 and M82 fit into the field of view of the larger chip. Now look at the smaller chip: only M81 fits on the sensor. But this doesn't mean you are getting a better 'zoomed in' image. The resolution is exactly the same, as both sensors have the same size pixels and are using the same focal length. Say you take an image with the ASI1600. It might look OK at fullscreen, but if you zoomed in on the computer screen to only fit M81 it would look a little more grainy and pixelated. This is exactly the same image you would get with the smaller sensor: a little grainy and pixelated until you zoomed back out to match the same field of view as the larger camera! Only a longer focal length gets you the 'zoomed in' effect you're thinking of.

[Attached screenshot: SkySafari 6 Pro field-of-view comparison]

Thank you, I understand it now, cheers!!

