Stellar brightnesses are expressed in terms of magnitudes. Magnitudes are not a simple and straightforward measure of the intensity of light, because they are tied to the perception of light by the human eye (a so-called nonlinear detector).
The mathematical definition of magnitudes is as follows. Suppose we have two stars with brightnesses b_1 and b_2 (as measured with a light meter). The magnitudes m_1 and m_2 are related to the brightnesses by

    m_1 - m_2 = -2.5 log10(b_1 / b_2)

or, equivalently,

    b_1 / b_2 = 100^[(m_2 - m_1) / 5]
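The two forms of the relation can be checked numerically; this is a minimal sketch (the function names are mine, chosen for illustration):

```python
import math

def magnitude_difference(b1, b2):
    """Magnitude difference m1 - m2 for two stars with measured
    brightnesses b1 and b2 (any consistent units)."""
    return -2.5 * math.log10(b1 / b2)

def brightness_ratio(m1, m2):
    """Brightness ratio b1/b2 implied by magnitudes m1 and m2."""
    return 100 ** ((m2 - m1) / 5)

# A factor-of-100 brightness ratio corresponds to exactly 5 magnitudes;
# note the brighter star has the *smaller* (more negative) magnitude.
print(magnitude_difference(100.0, 1.0))  # -5.0
print(brightness_ratio(0.0, 5.0))        # 100.0
```

The minus sign and the factor 2.5 are what make a difference of 5 magnitudes equal a factor of exactly 100 in brightness.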
If you don't like these equations, check out Table 16-2 of your textbook.
There are two types of magnitudes used in astronomy. Apparent Magnitude (denoted by a lower case m) is the magnitude of a star as it appears to you in the night sky. Table 16-1 gives the apparent magnitudes of some common astronomical objects. A star can have a bright apparent magnitude either because it is very close, or because it is very luminous.
The Absolute Magnitude (denoted by an upper case M) is a measure of the luminosity, or intrinsic brilliance, of a star. It is defined as the apparent magnitude that a star would have if it were at a distance of 10 parsecs.
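The definition, combined with the inverse-square law for light (brightness falls off as 1/d^2), gives a way to compute M from the apparent magnitude and the distance. A minimal sketch under that assumption (the function name is mine):

```python
import math

def absolute_magnitude(m_apparent, distance_pc):
    """Absolute magnitude M from apparent magnitude m and distance in
    parsecs.  Moving a star from distance d to 10 pc changes its
    brightness by a factor (d/10)**2, which by the magnitude relation
    corresponds to 5*log10(d/10) magnitudes."""
    return m_apparent - 5 * math.log10(distance_pc / 10.0)

# A star of apparent magnitude 10 at 100 pc is 10x farther than 10 pc,
# hence 100x fainter than it would appear at 10 pc, hence M = 10 - 5:
print(absolute_magnitude(10.0, 100.0))  # 5.0
```

A star closer than 10 parsecs has an absolute magnitude fainter (numerically larger) than its apparent magnitude; a star farther than 10 parsecs has an absolute magnitude brighter than its apparent one.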