In about 120 B.C., Hipparchus devised the system for quantifying the brightness of stars that is still in use today. He catalogued roughly 1,000 stars visible to the unaided eye into six classes according to their apparent brightness, with each class corresponding to one unit of apparent magnitude.
| Class | Magnitude | Note |
|-------|-----------|------|
| 1st class | magnitude 1 | brightest |
| 2nd class | magnitude 2 | |
| 3rd class | magnitude 3 | |
| 4th class | magnitude 4 | |
| 5th class | magnitude 5 | |
| 6th class | magnitude 6 | faintest |
In 1856, the British astronomer N.R. Pogson noted that 1st-magnitude stars are about 100 times brighter than 6th-magnitude ones, and he formalized the scale by defining a difference of 5 magnitudes as exactly a factor of 100 in brightness. Each magnitude step then corresponds to a constant factor of 2.512, the 5th root of 100: 2.512 × 2.512 × 2.512 × 2.512 × 2.512 = 100.
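The arithmetic behind the constant factor can be checked numerically. A minimal sketch (variable names here are just for illustration):

```python
# One magnitude step spans the 5th root of 100 in brightness,
# because 5 equal steps must multiply out to a factor of 100.
step = 100 ** (1 / 5)
print(round(step, 3))      # 2.512

# Multiplying five steps together recovers the full factor of 100.
total = step ** 5
print(round(total, 6))     # 100.0
```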
Today, we can measure magnitudes to a precision of about 0.01, so the scale is no longer limited to six whole-number classes.
| Magnitude | Times fainter than a 0th-magnitude star |
|-----------|-----------------------------------------|
| 1 | 2.512 |
| 2 | 6.31 |
| 3 | 15.85 |
| 4 | 39.82 |
| 5 | 100 |
NOTICE: Brighter objects have smaller magnitudes!
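The "smaller magnitude = brighter" convention and the factor-of-100 rule can be combined into a single brightness-ratio calculation. A short sketch (the function name `brightness_ratio` is just for illustration):

```python
def brightness_ratio(m1, m2):
    """How many times brighter an object of magnitude m1 is than one
    of magnitude m2. Smaller magnitudes mean brighter objects, so a
    positive difference m2 - m1 gives a ratio greater than 1."""
    return 100 ** ((m2 - m1) / 5)

# A 1st-magnitude star vs. a 6th-magnitude star: 5 magnitudes apart,
# which by definition is a factor of 100 in brightness.
print(round(brightness_ratio(1, 6)))   # 100
```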