In this series we are exploring the weird and wonderful world of astronomy jargon! You’ll soon see what we’re talking about this week: luminosity!
Point at a random star in the night sky. Just how bright is that star? Sure, you could measure its brightness, but that's only from your vantage point here on Earth. The brightness you measure depends on many things that have nothing to do with the star itself. The same star placed farther away would appear less bright. The same star with loads more interstellar dust in front of it would also appear less bright. And you're only measuring the brightness in visible light, while the star is also glowing in everything from radio to X-rays.
That’s why astronomers prefer not to use the brightness of a star, but rather its luminosity. Luminosity is, in some sense, the true brightness of an object. It’s a measure of the actual amount of electromagnetic energy emanating from a star. That includes all wavelengths of light, both visible and invisible. It doesn’t matter how much intervening dust there is. It doesn’t matter how far away the star is.
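The relationship between the two is the familiar inverse-square law: the brightness you observe is the luminosity spread over a sphere with a radius equal to your distance from the star. Here's a small sketch of that calculation in Python, using rounded published values for the Sun's luminosity and the Earth-Sun distance purely for illustration:

```python
import math

# Inverse-square law: apparent brightness (flux) is luminosity spread
# over the surface of a sphere of radius d, so F = L / (4 * pi * d^2).
# The constants below are rounded published figures, used for illustration.
SUN_LUMINOSITY = 3.828e26   # watts (nominal solar luminosity)
EARTH_DISTANCE = 1.496e11   # meters (1 astronomical unit)

def apparent_brightness(luminosity_watts, distance_meters):
    """Flux in watts per square meter at the given distance."""
    return luminosity_watts / (4 * math.pi * distance_meters ** 2)

flux = apparent_brightness(SUN_LUMINOSITY, EARTH_DISTANCE)
print(f"{flux:.0f} W/m^2")  # roughly 1361 W/m^2, the 'solar constant'

# Same star, ten times farther away: a hundred times dimmer.
ratio = apparent_brightness(SUN_LUMINOSITY, 10 * EARTH_DISTANCE) / flux
print(f"{ratio:.4f}")  # 0.0100
```

Run the same formula in reverse and you get how astronomers actually work: measure the flux, figure out the distance, and solve for the luminosity.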
It’s an intrinsic, real property of the star itself. But since we can only ever measure a slice of the radiation coming from a star, calculating its luminosity usually involves modeling the total light output.
By default, the word “luminosity” is short for “bolometric luminosity”, which means the total luminosity across the entire electromagnetic spectrum. But sometimes astronomers might refer to the luminosity in a specific band of wavelengths.