Magnitude

In around 150 BC the Greek astronomer Hipparchus divided the stars into six classes of apparent brightness, the brightest stars being ranked as first class and the faintest as sixth. This system is known as apparent magnitude and classifies stars and other celestial objects according to how bright they appear to the observer.

In 1856 the English astronomer Norman Robert Pogson (1829 – 1891) refined Hipparchus’s system by defining a 1st magnitude star as being exactly 100 times as bright as one of 6th magnitude, giving a ratio between successive magnitudes of the fifth root of 100, or about 2.512. In other words, a star of magnitude 1.00 is 2.512 times as bright as one of magnitude 2.00, 6.31 (2.512 x 2.512) times as bright as a star of magnitude 3.00, and so on.
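As a quick illustration of Pogson’s ratio, the short Python sketch below converts a difference in magnitude into a brightness factor using 100^(1/5) ≈ 2.512 per magnitude (the function name brightness_ratio is simply an illustrative choice, not part of any standard library).

```python
# A minimal sketch of Pogson's scale: a difference of one magnitude
# corresponds to a brightness factor of 100 ** (1 / 5), roughly 2.512.

def brightness_ratio(m1: float, m2: float) -> float:
    """How many times brighter an object of magnitude m1 appears
    compared with one of magnitude m2 (smaller magnitude = brighter)."""
    return 100 ** ((m2 - m1) / 5)

print(brightness_ratio(1.0, 2.0))  # ~2.512
print(brightness_ratio(1.0, 3.0))  # ~6.31
print(brightness_ratio(1.0, 6.0))  # 100.0, by definition
```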

The same basic system is used today, although modern instruments enable us to measure magnitudes to within 0.01. Negative values are used for the brightest objects, including the Sun (-26.8), Venus (-4.4 at its brightest) and Sirius (-1.42). Generally speaking, the faintest objects that can be seen with the naked eye under good viewing conditions are around 6th magnitude, whilst binoculars will show stars and other objects down to around 9th magnitude.
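To put those values in perspective, the sketch below (reusing the same illustrative brightness_ratio function, which is my own naming) works out how many times brighter each object quoted above appears than a star at the naked-eye limit of magnitude 6.

```python
# Comparing the objects quoted above with the naked-eye limit of
# roughly magnitude 6, using Pogson's ratio of 100 ** (1 / 5) per magnitude.

def brightness_ratio(m1: float, m2: float) -> float:
    return 100 ** ((m2 - m1) / 5)

for name, mag in [("Sun", -26.8),
                  ("Venus at its brightest", -4.4),
                  ("Sirius", -1.42)]:
    factor = brightness_ratio(mag, 6.0)
    print(f"{name} appears about {factor:,.0f} times brighter than a magnitude 6 star")
```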
