The measurement of the brightness of a star is called its magnitude. The apparent magnitude of a star describes how bright it appears from Earth. Hipparchus ranked stars from first magnitude for the brightest stars, to sixth magnitude for those that can just be seen by the naked eye.
The human eye does not respond in a linear way but follows a logarithmic rule. In 1856 Norman Robert Pogson (1829-91) proposed that a difference of five magnitudes should correspond to a factor of 100 in brightness, so that a difference of one magnitude corresponds to a brightness ratio of about 2.512 (since 2.512⁵ ≈ 100). The dimmer the star, the larger its number on the Pogson scale: a magnitude 4 star is 2.512² ≈ 6.3 times dimmer than a magnitude 2 star. Negative numbers are used for stars brighter than those considered by Hipparchus.
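As a rough illustration of the Pogson arithmetic (the function name below is just for this sketch, not a standard one), the brightness ratio between two stars follows directly from their magnitude difference:

```python
# Sketch of the Pogson relation: five magnitudes correspond to a factor of 100
# in brightness, so one magnitude corresponds to 100 ** (1/5) ≈ 2.512.

def brightness_ratio(mag_dimmer: float, mag_brighter: float) -> float:
    """Return how many times brighter the second star is than the first."""
    return 100 ** ((mag_dimmer - mag_brighter) / 5)

# A magnitude 2 star compared with a magnitude 4 star:
print(brightness_ratio(4, 2))   # ≈ 6.31, i.e. 2.512 squared
# The full span of Hipparchus' scale, 1st versus 6th magnitude:
print(brightness_ratio(6, 1))   # 100.0, by definition
```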
The absolute magnitude of a star is a measure of its intrinsic brightness: the apparent magnitude the star would have if it were at a distance of ten parsecs from us. Luminosity is the total amount of radiation a star emits, and is directly related to its absolute magnitude.
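The definition above implies the standard distance-modulus relation m − M = 5 log₁₀(d / 10 pc), which is not spelled out in the text but follows from placing the star at ten parsecs; a minimal sketch:

```python
import math

# Sketch of the distance-modulus relation implied by the definition above:
# the absolute magnitude M is the apparent magnitude m the star would have
# at 10 parsecs, so M = m - 5 * log10(d / 10) for a distance d in parsecs.

def absolute_magnitude(apparent_mag: float, distance_pc: float) -> float:
    """Convert an apparent magnitude and a distance in parsecs to absolute magnitude."""
    return apparent_mag - 5 * math.log10(distance_pc / 10)

# Example: a star of apparent magnitude 0.0 seen from 100 parsecs away
print(absolute_magnitude(0.0, 100))   # -5.0: at 10 pc it would appear 100x brighter
```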