
Astronomers talk about two different kinds of magnitudes: apparent and absolute.  The apparent magnitude, m, of a star expresses how bright it appears, as seen from the earth, ranked on the magnitude scale.  Two factors affect the apparent magnitude:

  1. How luminous the star is.
  2. How far away the star is from the earth.

Absolute magnitude, M, expresses the brightness of a star as it would be ranked on the magnitude scale if it were placed 10 pc (32.6 ly) from the earth.  Since all stars would be placed at the same distance, absolute magnitudes show differences in actual luminosities.

The magnitude scale on which stars are rated has evolved from a convention first established by Ptolemy and is now traditional.  In his catalog of stars he classified their apparent magnitudes by rating the brightest star he could see as magnitude 1 and the faintest as magnitude 6.  As this system evolved, some stars were found to be brighter than magnitude 1; for example, Vega is magnitude 0, and Sirius is magnitude -1.4.  The first peculiarity to note about the magnitude scale is that the more negative the magnitude of a star (or any celestial object), the brighter it is; the larger the positive magnitude, the fainter the object.

How the human eye judges light affects the magnitude scale.  Suppose you tried to estimate the relative brightness of a 100 W and a 200 W light bulb.  The second bulb does not appear twice as bright as the first.  The eye does not sense light in the simple way you might expect; if an object has twice the luminosity, it does not appear twice as bright.

Stars of first magnitude are 100 times brighter than stars of sixth magnitude; a difference of 5 magnitudes corresponds to a brightness ratio of 100.  A difference of 1 magnitude then amounts to a brightness ratio of 2.512.  This strange number pops up because 2.512 x 2.512 x 2.512 x 2.512 x 2.512 = (2.512)^5 = 100, another way of stating that a difference of five magnitudes equals a ratio of 100 in brightness.
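To see where 2.512 comes from, a quick check in Python (shown here purely as an illustrative sketch) confirms that 2.512 is essentially the fifth root of 100, so five steps of 2.512 multiply out to a factor of about 100:

    # One magnitude step corresponds to the fifth root of 100,
    # which is conventionally rounded to 2.512.
    step = 100 ** (1 / 5)
    print(step)          # 2.51188...
    print(2.512 ** 5)    # 100.02... -- five steps of 2.512 give a ratio of about 100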

The following table will help keep the magnitude differences and brightness ratios straight.

A Magnitude Difference of:    Equals a Brightness Ratio of:
          0.0                             1.0
          0.2                             1.2
          1.0                             2.5
          1.5                             4.0
          2.0                             6.3
          2.5                            10.0
          4.0                            40.0
          5.0                           100.0
          7.5                          1000.0
         10.0                        10,000.0
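The table can also be reproduced directly from the rule that a magnitude difference corresponds to a brightness ratio of 2.512 raised to that difference.  The short Python loop below is an illustrative sketch of that rule; the small discrepancies in the last rows come from rounding 2.512.

    # Brightness ratio for each magnitude difference listed in the table above
    for delta_m in (0.0, 0.2, 1.0, 1.5, 2.0, 2.5, 4.0, 5.0, 7.5, 10.0):
        ratio = 2.512 ** delta_m
        print(f"{delta_m:5.1f}   {ratio:10.1f}")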

Both the absolute and apparent magnitudes are given on the same scale.  For example, the apparent magnitude of the Sun is about -26.  The brightest star in the sky, Sirius, has an apparent magnitude of -1.4.  The difference in magnitude is about 25; for every five magnitudes the brightness ratio is 100.  So Sirius is 10^2 x 10^2 x 10^2 x 10^2 x 10^2 = 10^10 times less bright than the Sun in apparent magnitude.  This result tells us nothing about the luminosities of the Sun and Sirius, only how bright they appear in the sky.  In contrast, the absolute magnitude of the Sun is approximately +4.8, and that of Sirius is +1.4.  The difference in absolute magnitudes is 3.4, or roughly 4, corresponding to a brightness ratio of approximately 40.  This comparison of absolute magnitudes tells us that Sirius is roughly 40 times as luminous as the Sun (the more careful calculation below, using the actual difference of 3.4 magnitudes, gives a factor of about 23).  A similar comparison can be made to find the luminosities of other stars.
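The same arithmetic can be written out step by step.  The sketch below simply repeats both comparisons in Python, using the rounded magnitudes quoted above.

    # Apparent magnitudes: Sun about -26, Sirius about -1.4 (a difference of about 25).
    # Each 5-magnitude step is a factor of 100 = 10^2 in brightness.
    apparent_ratio = 100 ** (25 / 5)
    print(f"{apparent_ratio:.0e}")      # 1e+10: Sirius appears 10^10 times fainter than the Sun

    # Absolute magnitudes: Sun about +4.8, Sirius about +1.4 (a difference of 3.4).
    luminosity_ratio = 2.512 ** (4.8 - 1.4)
    print(round(luminosity_ratio))      # about 23; rounding the difference up to 4 gives ~40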

With a little algebra, we can convert magnitude differences to brightness ratios quickly using the formula:

b_1 / b_2 = (2.512)^(m_2 - m_1)

where b_1 and m_1 are the brightness and magnitude of star 1, and b_2 and m_2 are the brightness and magnitude of star 2.
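Written as a small Python function (the name brightness_ratio is only an illustrative choice), the formula looks like this:

    def brightness_ratio(m1, m2):
        """Return b1/b2, the brightness ratio of star 1 to star 2, given their magnitudes."""
        return 2.512 ** (m2 - m1)

    # Example: Sirius (m = -1.4) compared with Vega (m = 0.0)
    print(brightness_ratio(-1.4, 0.0))    # about 3.6: Sirius appears roughly 3.6 times brighter than Vega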

Reworking the Sun-Sirius example in more detail, using the absolute magnitudes:

b_Sirius / b_Sun = (2.512)^(m_Sun - m_Sirius)

                 = (2.512)^(4.8 - 1.4)

                 = (2.512)^3.4

                 ≈ 23

The last step was done on a pocket calculator.
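The same number falls out of the brightness_ratio function sketched above, or from a one-line check like this (using the exact base 100^(1/5) instead of 2.512 changes the answer only slightly):

    print(2.512 ** 3.4)           # about 22.9 -- Sirius is roughly 23 times as luminous as the Sun
    print((100 ** 0.2) ** 3.4)    # about 22.9 with the exact base; essentially the same answer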

 
