
What is the measure of a star’s brightness called?

Answer
Hint: People on Earth have measured the brightness of stars since ancient times, and the methods have improved over the centuries. Because stars are very far away, their light is greatly diminished by the time it reaches our eyes, so judging their brightness is tricky, but science makes it possible.

Complete answer:
In the night sky above the Earth's surface we see many stars. Some of them are bright, while others are dim. This is because the brightness of a star depends on its composition and on how far it is from our planet.

Astronomers define star brightness in terms of:
Apparent magnitude: the brightness of a star as seen from the Earth's surface.
Absolute magnitude: the brightness of a star as it would appear from a standard distance of 10 parsecs (32.6 light-years).
Luminosity: the amount of energy (light) that a star emits from its surface.
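These three quantities are tied together by the inverse-square law and the distance modulus. The short Python sketch below is not part of the original answer; the function names and the solar figures used in the example are illustrative assumptions, but the two relations themselves are standard.

import math

def apparent_brightness(luminosity_watts, distance_m):
    # Inverse-square law: the flux received at distance d from a star
    # radiating a total luminosity L is L / (4 * pi * d^2).
    return luminosity_watts / (4 * math.pi * distance_m ** 2)

def absolute_magnitude(apparent_mag, distance_parsecs):
    # Distance modulus: M = m - 5 * log10(d / 10 pc).
    return apparent_mag - 5 * math.log10(distance_parsecs / 10)

# Example: the Sun has apparent magnitude about -26.74 at a distance of
# 1 AU (roughly 1/206265 parsec), which gives an absolute magnitude
# of about +4.83.
print(absolute_magnitude(-26.74, 1 / 206265))

The sketch shows why the two magnitudes differ: the apparent magnitude mixes the star's intrinsic output with its distance, while the absolute magnitude removes the distance effect by placing every star at the same standard distance.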

People have been measuring the brightness of stars since ancient times, which shows how long this question has interested observers. It is an old problem that astronomers are still working on, although today they use far more precise tools and instruments to make the measurements.

Therefore, apparent magnitude, absolute magnitude and luminosity are used to measure the brightness of a star. In addition, about 2000 years ago the ancient astronomer Hipparchus classified stars on a scale from 1 to 6, where 1 was the brightest and 6 the faintest.

Note: Many people take the Hipparchus scale to be the correct scale for measuring a star's brightness, but the scale was later modified: negative magnitudes were added for objects brighter than first magnitude, such as the Sun, the Moon and Venus, and a difference of five magnitudes was fixed as exactly a 100-fold difference in brightness, so a first-magnitude star is 100 times brighter than a sixth-magnitude one. Eventually this visual scale too was superseded by modern measurement methods.
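As a rough illustration of the modern scale mentioned in the note (a minimal sketch with a hypothetical helper, not from the original answer), each step of one magnitude corresponds to a brightness factor of 100**(1/5), about 2.512:

def brightness_ratio(mag_faint, mag_bright):
    # A difference of 5 magnitudes is defined as a factor of 100 in
    # brightness, so the ratio is 100 ** (difference / 5).
    return 100 ** ((mag_faint - mag_bright) / 5)

# A magnitude 1 star versus a magnitude 6 star (Hipparchus' full range):
print(brightness_ratio(6, 1))      # 100.0
# Venus at about magnitude -4.6 versus a first-magnitude star:
print(brightness_ratio(1, -4.6))   # roughly 174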