
Is Vega the standard?


GazOC


Not a dumb question, but it does open a can of worms, Gaz.

The magnitude system was devised mostly arbitrarily, by eye, in the time of Hipparchus, around 150 BC. Ptolemy later divided the stars into six brightness groups, with the brightest being 1st magnitude and the faintest 6th.

The first serious attempt to put numbers on it came from William Herschel around the end of the 18th century. His method was to compare pairs of stars in two telescopes, covering part of one lens until the brighter star appeared the same as the fainter one, then calculating the brightness difference from the change in aperture. The two stars could then be assigned more accurate magnitude values.

The scale developed after Herschel's measurements is logarithmic: each step of one magnitude corresponds to a brightness ratio of about 2.5, so roughly every 2.5 magnitudes is a factor of ten in brightness. The brighter stars were remeasured to fit this scale in the mid-1800s, which is why some stars now have negative values.

All this describes only apparent magnitude. Absolute magnitude takes the star's distance from us into account: from how bright a star looks and how far away it is, you can work out how bright it would appear from a standard distance of 10 parsecs, or about 32.6 light years.
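For the absolute-magnitude part, here is a minimal sketch of the standard distance-modulus calculation (the function name is mine, and the Vega figures are approximate, just for illustration):

```python
import math

def absolute_magnitude(apparent_mag, distance_parsecs):
    """How bright the star would appear from the standard distance of 10 parsecs.

    Standard distance modulus: M = m - 5 * log10(d / 10 pc).
    """
    return apparent_mag - 5 * math.log10(distance_parsecs / 10)

# Vega is roughly apparent magnitude 0.0 and about 7.7 parsecs (25 light years) away,
# so it would look a little fainter from 10 parsecs:
print(absolute_magnitude(0.0, 7.7))  # ~ +0.57
```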

More details are available in any standard astronomy text.

I hope this helps.


Hi Astroman & Gaz.

I had a look in my dictionary of Astronomy and it said "In order to make the magnitude scale precise, the English astronomer N.R. Pogson proposed, in 1856, that a difference of five magnitudes should correspond to a brightness ratio of exactly 100:1. This is now the universally (I think they mean amongst Homo Sapiens - not sure about LGMs :D ) adopted scale of magnitude. The zero of the scale was established by assigning magnitudes to a group of standard stars near the north celestial pole, known as the North Polar Sequence".
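To see what that definition works out to per magnitude, a quick back-of-the-envelope check (nothing more than the arithmetic of Pogson's rule):

```python
# Pogson: 5 magnitudes = a brightness ratio of exactly 100:1,
# so one magnitude is the fifth root of 100.
per_magnitude = 100 ** (1 / 5)
print(per_magnitude)         # ~2.512
print(per_magnitude ** 2.5)  # 10.0  -- 2.5 magnitudes is a factor of ten
print(per_magnitude ** 5)    # 100.0 -- back to Pogson's definition
```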

Interesting - I never knew that - you learn something new every day in this game :D

Tom


His method was to compare pairs of stars in two telescopes, covering part of one lens until the brighter star appeared the same as the fainter one, then calculating the brightness difference from the change in aperture. The two stars could then be assigned more accurate magnitude values. The scale developed after Herschel's measurements is logarithmic.

Cheers, I didn't know that was how it was done! Where did the "baseline" magnitude come from? It just seems funny that a star should come out at exactly 0.0.
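On the "baseline" question: the zero of the scale is a convention rather than anything physical. You pick a reference brightness, call it magnitude 0.0, and every other magnitude follows from the ratio rule above. Historically that reference was fixed by the North Polar Sequence stars Tom mentions; in the Vega-based systems that came later, Vega itself sits at roughly magnitude 0. A rough sketch (the reference flux here is an arbitrary number, not a real calibration value):

```python
import math

F_ZERO = 1.0  # hypothetical flux assigned to magnitude 0.0 -- made up for illustration

def magnitude(flux, zero_point_flux=F_ZERO):
    """Magnitude relative to whatever flux has been declared to be magnitude 0."""
    return -2.5 * math.log10(flux / zero_point_flux)

print(magnitude(1.0))   # 0.0 -- the reference itself
print(magnitude(0.01))  # 5.0 -- a star 100x fainter than the reference is 5 magnitudes fainter
```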

