# Apparent magnitude

The apparent magnitude of a star, planet or other heavenly body is a measure of its brightness as seen from Earth.

The scale on which magnitude is measured is a somewhat strange one. It has its roots in the tradition of dividing those stars visible to the naked eye into six magnitudes. The brightest stars are said to be of first magnitude, the next brightest are of second magnitude, and so on down to sixth magnitude, the limit of naked-eye visibility. This crude method of indicating the brightness of stars was popularized by Ptolemy in his *Almagest*, and is generally believed to have originated with Hipparchus.

In 1856, Norman R. Pogson noticed that the traditional system could be approximated by assuming that a difference of one magnitude corresponds to a brightness ratio equal to the fifth root of 100, so that a typical first magnitude star is 100 times brighter than a typical sixth magnitude star. The fifth root of 100 used in this scale is known as Pogson's Ratio, and is approximately equal to 2.51188643150958. Pogson's scale was originally fixed by assigning Polaris a magnitude of exactly 2. Astronomers have since discovered that Polaris is slightly variable so other stars are now used to define the scale, but the principle remains the same.
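
The arithmetic is easy to reproduce. The following Python snippet (purely illustrative) computes Pogson's ratio and confirms that five one-magnitude steps compound to a factor of exactly 100:

```python
# Pogson's ratio: the brightness factor corresponding to one magnitude step.
pogson_ratio = 100 ** (1 / 5)   # the fifth root of 100

print(pogson_ratio)             # 2.51188643150958...

# Five steps of one magnitude multiply out to a factor of 100, so a
# first magnitude star is 100 times brighter than a sixth magnitude star.
print(pogson_ratio ** 5)        # 100.0 (up to floating-point rounding)
```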

The first thing to notice about this scale is that higher numbers correspond to dimmer objects. Really bright objects have negative magnitudes. For example, Sirius, the brightest star in the night sky, has an apparent magnitude of -1.46.

The second thing to notice is that the scale is logarithmic: the relative brightness of two objects is determined by the difference of their magnitudes. For example, a difference of 3.2 means that one object is about 19 times as bright as the other, because Pogson's ratio raised to the power 3.2 is 19.054607... The scale is logarithmic because the response of the human eye is itself roughly logarithmic.
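
Written out, a magnitude difference of Δm corresponds to a brightness ratio of 100^(Δm/5), or equivalently 10^(0.4Δm); going the other way, a ratio r corresponds to a difference of 2.5 log₁₀(r). A minimal Python sketch (the function names here are just for illustration) reproduces the figures above:

```python
import math

def brightness_ratio(delta_m: float) -> float:
    """Brightness ratio corresponding to a magnitude difference."""
    return 100 ** (delta_m / 5)      # equivalently 10 ** (0.4 * delta_m)

def magnitude_difference(ratio: float) -> float:
    """Magnitude difference corresponding to a brightness ratio."""
    return 2.5 * math.log10(ratio)

print(brightness_ratio(3.2))          # 19.054607... as quoted above
print(magnitude_difference(100))      # 5.0: the full first-to-sixth span

# Sirius (-1.46) versus a sixth magnitude star at the naked-eye limit:
print(brightness_ratio(6 - (-1.46)))  # ~960: Sirius is almost 1000x brighter
```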

Magnitude is complicated by the fact that light is not monochromatic. The sensitivity of a light detector varies according to the wavelength of the light, and the way in which it varies depends on the type of light detector. For this reason, it is necessary to specify how the magnitude is measured in order for the value to be meaningful. For this purpose the UBV system is widely used, in which the magnitude is measured in three different wavelength bands: U (centred at about 350 nm, in the near ultraviolet), B (about 435 nm, in the blue region) and V (about 555 nm, in the middle of the human visual range). The V band was chosen so that it gives magnitudes closely corresponding to those seen by the human eye, and when an apparent magnitude is given without any further qualification, it is usually the V magnitude that is meant.
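
Because a bare number is ambiguous, it can help to keep the band attached to the value. The sketch below is not any standard library's API, merely one way of making the pairing explicit; the band centres restate the figures from the paragraph above:

```python
from dataclasses import dataclass

# Approximate centres of the UBV bands, in nanometres.
BAND_CENTRES_NM = {"U": 350, "B": 435, "V": 555}

@dataclass(frozen=True)
class Magnitude:
    """An apparent magnitude only means something together with its band."""
    band: str      # "U", "B" or "V"
    value: float

    def __post_init__(self):
        if self.band not in BAND_CENTRES_NM:
            raise ValueError(f"unknown band: {self.band!r}")

# An unqualified "apparent magnitude -1.46" for Sirius conventionally
# means the V magnitude, the band closest to the eye's response.
sirius = Magnitude("V", -1.46)
print(sirius)    # Magnitude(band='V', value=-1.46)
```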

Since cooler stars, such as red giants and red dwarfs, emit little energy in the blue and UV reaches of the spectrum, their output is often under-represented by the UBV scale. Indeed, some L and T class stars would have a UBV magnitude of well over 100, since they emit extremely little visible light but are strongest in the infra-red.
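
To see roughly why, one can treat each star as a blackbody. This is a simplification, and the temperatures below are merely illustrative, but it shows how tiny a fraction of a cool star's output falls in a visible band:

```python
import math

H, C, K_B, SIGMA = 6.626e-34, 2.998e8, 1.381e-23, 5.670e-8

def planck(wl_m: float, temp_k: float) -> float:
    """Planck spectral radiance B(lambda, T) in W / (m^2 sr m)."""
    return (2 * H * C**2 / wl_m**5) / math.expm1(H * C / (wl_m * K_B * temp_k))

def band_fraction(temp_k: float, lo_nm: float, hi_nm: float,
                  steps: int = 4000) -> float:
    """Fraction of a blackbody's total output emitted between lo_nm and hi_nm."""
    lo, hi = lo_nm * 1e-9, hi_nm * 1e-9
    dx = (hi - lo) / steps
    # Trapezoidal integration of the Planck curve across the band.
    nodes = sum(planck(lo + i * dx, temp_k) for i in range(steps + 1))
    band = (nodes - 0.5 * (planck(lo, temp_k) + planck(hi, temp_k))) * dx
    # Total radiance over all wavelengths, from the Stefan-Boltzmann law.
    return band / (SIGMA * temp_k**4 / math.pi)

# A hot blue star versus a cool T dwarf (illustrative temperatures):
print(band_fraction(12000, 500, 600))  # ~0.07: several percent is visible
print(band_fraction(1000, 500, 600))   # ~1e-7: essentially no visible light
```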

Magnitude is a minefield, and it's extremely important to measure like with like. On photographic film, the relative brightnesses of Rigel and Betelgeuse are reversed compared to what our eyes see, since film is more sensitive to blue light than it is to red light: Rigel is a blue supergiant, while Betelgeuse is a red supergiant. It's therefore important to keep apples and oranges separate!