
M31 Luminosity


Guest


The weather was terrible yesterday evening and last night, but I had to go out about 10pm. There was a break in the clouds which allowed me to see a strip of stars including part of the Milky Way, Andromeda and Pegasus. The sky seemed amazingly clear and the Milky Way looked superb.

I have a 14-inch f4.5 Skywatcher dob and I set up to look at M31 using a wide-angle eyepiece. I have observed it several times before. The central part was clear, looking like a kind of flying-saucer shape, and I could just see a faint disk which extended outside the field of view on either side. It looked excellent. So I was wondering about those terrific photos of M31 you see. I know enough to realise that I won't see that kind of view using a 14-inch dob, but I did wonder how realistic those photos are, and whether they artificially boost the luminosity of the disk part of the galaxy relative to the inner part. I wondered if the photos make the outer part look more substantial than it really is.

Cheers

Steve

 

 


Interesting question. I've tried photographing M31 a few times and do struggle with balancing the core against the dust lanes and spiral arms - usually, getting detail in the outer areas results in overexposing the core. Insofar as I'm interested in maintaining "correct" images (i.e. images that could be used for astrometry or photometry) I have not tried stretching different parts of the image to different extents, but I'm sure that's one approach some people use to draw out the detail. However, there are bound to be other approaches.
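For anyone curious what "stretching different parts to different extents" might look like in practice, here is a minimal illustrative sketch (not anyone's actual workflow, and the function name, thresholds and gamma values are my own assumptions): faint regions get a strong gamma stretch, bright regions a gentle one, blended through a soft mask. Note this deliberately sacrifices photometric correctness, which is exactly why you might avoid it for measurement work.

```python
import numpy as np

def masked_stretch(img, threshold=0.5, core_gamma=1.0, faint_gamma=0.4):
    """Illustrative sketch: stretch faint regions (arms, dust lanes) harder
    than bright ones (the core), blending via a soft mask so there is no
    visible seam. Input is assumed normalised to 0..1."""
    img = np.clip(img.astype(float), 0.0, 1.0)
    mask = np.clip(img / threshold, 0.0, 1.0)   # 0 = faint pixel, 1 = bright
    bright = img ** core_gamma                  # gentle stretch for the core
    faint = img ** faint_gamma                  # strong lift for faint areas
    return mask * bright + (1.0 - mask) * faint

# Toy pixel values: faint arm, mid-tone, bright core
data = np.array([0.001, 0.25, 0.9])
out = masked_stretch(data)
```

The faint pixel ends up boosted by roughly a factor of sixty, while the bright one is left essentially untouched - which is the effect, deliberate or not, that can make the outer disk look more substantial in photos than at the eyepiece.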

The question about the arms looking more substantial than they really are is the one that interests me most, though, since vision is a mental as well as an optical process, and one that is fundamentally different from how a camera works. For example, if we were to somehow get much closer to Andromeda, there is no point at which it would look like the images we see when we photograph it. It would be much closer to what you see through that 14-inch.

 


Indeed they do.

Part of image processing is something called a non-linear stretch. This brightens the dark parts of the image much more than the light ones. It does, however, keep the "brightness order": parts that are darker in reality will still be darker in the image. This is done so that the entire structure can be visible in the image. The human eye has a very large dynamic range, while the devices we use to display images are very limited - take a computer monitor, for example: it has a couple of hundred levels of brightness, not nearly enough to accommodate either the true range of light from the target or the eye's response. So you end up in a situation where hundreds of thousands of levels of shade need to be represented in just a couple of hundred. If you linearly brighten the image to display the faint parts of the galaxy, the whole central region will be "saturated" - white, without any detail.
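A common choice of non-linear stretch is the asinh function, and the idea above can be sketched in a few lines (this is a generic illustration, not any particular software's implementation; the parameter `a` controlling the strength is my own assumption):

```python
import numpy as np

def asinh_stretch(img, a=0.01):
    """Non-linear (asinh) stretch: lifts faint pixels far more than bright
    ones while preserving brightness order, since asinh is monotonic.
    Output is rescaled so the brightest pixel maps to 1.0."""
    img = img.astype(float)
    img = (img - img.min()) / (img.max() - img.min())  # normalise to 0..1
    return np.arcsinh(img / a) / np.arcsinh(1.0 / a)

# Toy pixel values spanning a large dynamic range:
# sky background, faint arm, disk, inner disk, core
data = np.array([0.0, 0.001, 0.01, 0.1, 1.0])
stretched = asinh_stretch(data)
```

Here the faint-arm pixel is brightened by more than an order of magnitude while the core stays at full brightness, yet the ordering of all five values is unchanged - darker stays darker, exactly as described above.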

