Course:MATH110/Archive/2010-2011/003/Teams/Thurgau/Homework 13


For this homework, we chose to investigate the apparent magnitude of stars. For those unfamiliar with this term, the apparent magnitude of a star is basically its brightness as seen by an observer on Earth, normalized to the value it would have if there were no atmosphere. Originally, people measured the apparent brightness of stars through the self-reported perception of observers on a magnitude scale of 1 to 6, with 6 being the faintest and 1 the brightest. On this earliest scale, each level of magnitude was said to be roughly twice as bright as the next, making it, in essence, a logarithmic scale.

A logarithmic scale is a scale that shows values through their logarithms instead of the values themselves. This way of presenting data is particularly useful for quantities that span a very large range of values or that grow multiplicatively (such as apparent magnitude, the Richter scale, etc.).
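
As an illustration of this idea (not part of the original assignment), here is a minimal Python sketch, with made-up brightness values, showing how taking log10 compresses a huge range of numbers into a small one:

```python
import math

# Made-up brightness values spanning many orders of magnitude
# (arbitrary linear units, chosen only for illustration).
brightnesses = [1, 100, 10_000, 1_000_000]

for b in brightnesses:
    # On a logarithmic scale we record log10(b) instead of b itself,
    # so a factor-of-100 jump becomes a step of just 2 units.
    print(f"brightness {b:>9} -> log10 value {math.log10(b):.1f}")
```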

As the apparent magnitude increases, stars get dimmer and dimmer. Very bright stars have apparent magnitudes of less than 1, and the brightest objects are assigned negative numbers; these negative values were introduced to accommodate the modern system, which goes beyond the original 1 to 6 scale. For example, the star Sirius, the brightest star in the night sky, has a magnitude of -1.4, while the Moon has a magnitude of -12.74 and our Sun, Sol, has a magnitude of -26.74. The Hubble Space Telescope has even been able to detect objects as faint as magnitude 30. The difference from one magnitude to the next is a factor of 2.512, so moving up the 1 to 6 scale, the brightness of a star decreases by a factor of 2.512 at each step. Hence a star with a magnitude of 1 is 100 times brighter than a star with a magnitude of 6: since a magnitude 6 star is 5 steps higher on the magnitude scale, the ratio is (2.512)^(5), which is 100 (a short sketch of this calculation follows the textbook example below). Logarithms are essential to the study of astrophysics. The following is an example from a physics textbook of the apparent brightness of stars:
(example)
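
Independently of the textbook example, here is a small Python sketch (our own illustration, not taken from the textbook) of how a magnitude difference converts to a brightness ratio:

```python
def brightness_ratio(m1, m2):
    """Brightness ratio of a magnitude-m1 object to a magnitude-m2 object,
    using the rule that 5 magnitude steps correspond to a factor of 100
    (so one step is 100**(1/5), about 2.512)."""
    return 100 ** ((m2 - m1) / 5)

# A magnitude 1 star compared with a magnitude 6 star: 5 steps, a factor of 100.
print(brightness_ratio(1, 6))      # 100.0

# Sirius (-1.4) compared with the faintest naked-eye stars (about magnitude 6).
print(brightness_ratio(-1.4, 6))   # roughly 912
```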

In the case of absolute magnitudes, the relation between apparent and absolute magnitude is written as:
m - M = 5 log(d/10), where:
M = absolute magnitude
m = apparent magnitude
d = distance (measured in parsecs)
To calculate the absolute magnitude of a star (the brightness it would have if seen from 10 parsecs away), the equation can be rearranged to:
M = -5 log(d/10) + m
The following is how this equation is applied:
The star Thurgau is roughly 3.1 parsecs from Earth and has an apparent magnitude of 3.6, so what would its absolute magnitude be?
M = -5 log(3.1/10) + 3.6
M = -5 log(0.31) + 3.6
M = -5(-0.5086) + 3.6
M = 2.5432 + 3.6
M ≈ 6.14
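
As a quick check of this arithmetic, here is a short Python sketch of the same rearranged formula (the star name and numbers are simply those from the worked example above):

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Absolute magnitude from the distance modulus m - M = 5*log10(d/10),
    rearranged to M = m - 5*log10(d/10)."""
    return apparent_mag - 5 * math.log10(distance_pc / 10)

# The star "Thurgau" from the example above: m = 3.6, d = 3.1 parsecs.
M = absolute_magnitude(3.6, 3.1)
print(round(M, 2))   # about 6.14
```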