An astronomer has found two stars (star A and star B) that appear to be the same brightness in the sky. However, he knows that star A is 50 light years away, and star B is 300 light years away. Which star is actually brighter? And how much brighter is that star than the other? Show all your working out.

 

Expert Answers
gsenviro, eNotes Educator | Certified Educator

The apparent brightness of a star is inversely proportional to the square of its distance from Earth.

i.e., `B alpha 1/d^2`

where B is the apparent brightness of the star and d is its distance from Earth.
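For context, this proportionality comes from the standard flux relation `B = L/(4 pi d^2)`, where L is the star's intrinsic luminosity (not given in this problem). The constant `4 pi` cancels in any ratio, so only the `1/d^2` dependence matters when comparing the two stars.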

Here, stars A and B appear to have the same brightness as observed from Earth. However, star A is 50 light years away and star B is 300 light years away. In other words, star B is 6 times (= 300/50) as far away from us as star A.

Comparing the brightness levels of stars A and B (assuming, for the moment, that both have the same intrinsic brightness), we can write

`B_A/B_B = d_B^2/d_A^2`

or, `B_A/B_B = 300^2/50^2 = 36`

This means that, given their distances from Earth and assuming each star had the same actual (intrinsic) brightness, the apparent brightness of star A should be 36 times that of star B. Since the astronomer observes the same brightness for both stars, star B must actually be brighter than star A, and by a factor of 36.

Thus, star B is actually 36 times brighter than star A; however, because of its greater distance from Earth, it appears to have the same brightness as star A.
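For anyone who wants to check the arithmetic, here is a minimal Python sketch of the same calculation (the variable names are just illustrative):

```python
# Inverse-square law: apparent brightness B is proportional to 1/d^2.
# Two stars that look equally bright from Earth must therefore have
# intrinsic brightnesses in the ratio of their squared distances.

d_A = 50    # distance to star A, in light years
d_B = 300   # distance to star B, in light years

# How much intrinsically brighter star B must be than star A,
# given that both appear equally bright from Earth.
ratio = (d_B / d_A) ** 2

print(f"Star B is about {ratio:.0f} times brighter than star A.")  # prints 36
```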

Hope this helps.