In astronomy, absolute magnitude is a measure of an object's intrinsic brightness, independent of the distance from which it is viewed.
For most objects, absolute magnitude is defined as the apparent magnitude the object would have if it were at a distance of 10 parsecs (about 32.6 light years). In defining absolute magnitude it is also necessary to specify the type of radiation being measured; when referring to total energy output across all wavelengths, the proper term is bolometric magnitude.
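The 10-parsec definition leads to the standard distance-modulus relation, M = m − 5 log₁₀(d / 10 pc), which converts an apparent magnitude m and a distance d into an absolute magnitude M. A minimal sketch in Python (the figures for Sirius are approximate and given only for illustration):

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Absolute magnitude from apparent magnitude and distance in parsecs,
    using the distance-modulus relation M = m - 5*log10(d / 10)."""
    return apparent_mag - 5 * math.log10(distance_pc / 10)

# Sirius: apparent magnitude about -1.46 at a distance of about 2.64 pc,
# which recovers an absolute magnitude of roughly 1.4.
print(round(absolute_magnitude(-1.46, 2.64), 1))
```

Note that an object exactly 10 parsecs away has an absolute magnitude equal to its apparent magnitude, since the logarithm term vanishes.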
Absolute magnitude can make for some mind-boggling comparisons. Many stars visible to the naked eye have an absolute magnitude capable of casting strong shadows if they actually were 10 parsecs away: Rigel (-7.0), Deneb (-7.2) and Naos (-7.3), for a start, and Betelgeuse also qualifies at -5.6, although it is slightly variable. For comparison, the bright star Sirius has an unremarkable absolute magnitude of 1.4, and our Sun a faint 4.5. Nor is ten parsecs an especially close distance: the red giant Arcturus lies at about that distance, and although far more luminous than the Sun, it is not an overly bright star as seen from Earth. Comparing with apparent visual magnitudes, what you see when you look up at night: Sirius is -1.4, Venus reaches -4.3 at best, and a full moon is about -12. The last star-like object with an apparent magnitude comparable to the absolute magnitudes of the three stars named above was the supernova of 1054, whose remnant is the Crab Nebula, M1. Chinese astronomers reported being able to read by its light, see their shadows cast by it, and observe it clearly in broad daylight.
Confusingly, a different definition of absolute magnitude is used for comets and asteroids, because the stellar definition above would be of little use for such objects. In this case, the absolute magnitude is defined as the apparent magnitude the object would have if it were one astronomical unit from both the Sun and the Earth, at a phase angle of zero degrees. This is a physical impossibility, but it is convenient for purposes of calculation.
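At zero phase angle, the small-body definition reduces to the relation m = H + 5 log₁₀(r · Δ), where H is the absolute magnitude, r the object's distance from the Sun and Δ its distance from the Earth, both in astronomical units. A hedged sketch of this zero-phase case (the figures for the example asteroid are rough values in the vicinity of Ceres, used only for illustration, and real predictions would also include a phase-angle correction):

```python
import math

def small_body_apparent_magnitude(H, r_au, delta_au):
    """Apparent magnitude of a comet or asteroid at phase angle zero,
    from its absolute magnitude H, heliocentric distance r and
    geocentric distance delta (both in AU):  m = H + 5*log10(r * delta)."""
    return H + 5 * math.log10(r_au * delta_au)

# Example: a body with H = 3.3 observed at r = 2.77 AU from the Sun
# and delta = 1.77 AU from the Earth.
print(round(small_body_apparent_magnitude(3.3, 2.77, 1.77), 1))
```

With r = Δ = 1 AU the logarithm term vanishes and the apparent magnitude equals H, which is exactly the (physically impossible) reference configuration the definition describes.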