
Astronomers talk about two different kinds of magnitudes: apparent and absolute. The apparent magnitude, m, of a star expresses how bright it appears, as seen from the earth, ranked on the magnitude scale. Two factors affect the apparent magnitude: the star's actual luminosity and its distance from the earth.
Absolute magnitude, M, expresses the brightness of a star as it would be ranked on the magnitude scale if it were placed 10 pc (32.6 ly) from the earth. Since all stars would be placed at the same distance, absolute magnitudes reveal differences in actual luminosities.

The magnitude scale on which stars are rated has evolved from a convention first established by Ptolemy and is now traditional. In his catalog of stars he classified their apparent magnitudes by rating the brightest star he could see as magnitude 1 and the faintest as magnitude 6. As this system evolved, some stars were found to be brighter than magnitude 1; for example, Vega is magnitude 0, and Sirius is magnitude -1.4. The first peculiarity to note about the magnitude scale is that the more negative the magnitude of a star (or any celestial object), the brighter it is; the more positive the magnitude, the fainter the object.

How the human eye judges light also shapes the magnitude scale. Suppose you tried to estimate the relative brightness of a 100 W and a 200 W light bulb. The second bulb does not appear twice as bright as the first. The eye does not sense light in the simple way you might expect; if an object has twice the luminosity, it does not appear twice as bright. Stars of first magnitude are 100 times brighter than stars of sixth magnitude; a difference of 5 magnitudes corresponds to a brightness ratio of 100. A difference of 1 magnitude then amounts to a brightness ratio of 2.512. This strange number pops up because 2.512 x 2.512 x 2.512 x 2.512 x 2.512 = 2.512^{5} = 100, another way of stating that a difference of five magnitudes equals a ratio of 100 in brightness. The following table will help keep the magnitude differences and brightness ratios straight.

Magnitude difference     Brightness ratio
        1                     2.512
        2                     6.31
        3                     15.85
        4                     39.8
        5                     100
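The relation between magnitude difference and brightness ratio can be checked numerically; a minimal Python sketch (the function name is ours):

```python
def ratio_from_delta(delta_m):
    """Brightness ratio corresponding to a magnitude difference delta_m."""
    # One magnitude is a factor of 100**(1/5), the fifth root of 100 (about 2.512)
    return 100 ** (delta_m / 5)

print(ratio_from_delta(1))  # ≈ 2.512
print(ratio_from_delta(5))  # 100.0
```

Evaluating the function at delta_m = 1 through 5 reproduces the ratios in the table above.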
Both the absolute and apparent magnitudes are given on the same scale. For example, the apparent magnitude of the Sun is about -26. The brightest star in the sky, Sirius, has an apparent magnitude of -1.4. The difference in magnitude is about 25; for every five magnitudes the brightness ratio is 100. So Sirius is 10^{2} x 10^{2} x 10^{2} x 10^{2} x 10^{2} = 10^{10} times less bright than the Sun in apparent magnitude. This result tells us nothing about the luminosities of the Sun and Sirius, only how bright they appear in the sky.

In contrast, the absolute magnitude of the Sun is approximately +4.8, and that of Sirius is +1.4. The difference in absolute magnitudes is roughly 4, corresponding to a brightness ratio of approximately 40. This comparison of absolute magnitudes tells us that Sirius is roughly 40 times as luminous as the Sun. A similar comparison can be made to find the luminosities of other stars.

Using a little algebra, we can convert from magnitude differences to brightness ratios quickly by using the formula

b_{1} / b_{2} = (2.512)^{m_{2} - m_{1}}

where b_{1} and m_{1} are the brightness and magnitude of star 1 and b_{2} and m_{2} are the brightness and magnitude of star 2. Reworking the Sun-Sirius example in more detail:

b_{Sirius} / b_{Sun} = (2.512)^{m_{Sun} - m_{Sirius}} = (2.512)^{4.8 - 1.4} = (2.512)^{3.4} ≈ 23

The last step was done on a pocket calculator.
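The Sun-Sirius comparison can be redone in code instead of on a pocket calculator; a short Python sketch using the magnitudes quoted above (the function and variable names are ours):

```python
def brightness_ratio(m1, m2):
    """Return b1/b2 for objects of magnitude m1 and m2 (lower m = brighter).

    Uses 100**(1/5), the exact fifth root of 100, in place of the
    rounded value 2.512.
    """
    return 100 ** ((m2 - m1) / 5)

# Absolute magnitudes: Sun +4.8, Sirius +1.4
lum_ratio = brightness_ratio(1.4, 4.8)   # Sirius is ~23 times as luminous

# Apparent magnitudes: Sun about -26, Sirius -1.4
app_ratio = brightness_ratio(-26, -1.4)  # the Sun appears roughly 10^10 times brighter

print(round(lum_ratio))
```

Because the function takes the brighter object as its first argument, the same call works for both the luminosity comparison and the apparent-brightness comparison.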

