# 2.7 - Chapter 2 Summary

Chapter 2 defined continuous random variables and investigated some of their properties. We saw several examples of commonly used continuous distributions, including the famous normal distribution.

## Relationship Between CDF and PDF

One of the key features of a random variable is its associated probability distribution, which gives the probability of observing a certain event, or set of values, under that random variable. For continuous random variables, this distribution can take the form of either a cumulative distribution function (CDF) or a probability density function (PDF). These two functions are related by the Fundamental Theorem of Calculus:

$F(x) = \int_{-\infty}^{x} f(t)\,dt$

The integrand is the PDF of our continuous random variable, and the corresponding integral is the CDF.
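This relationship can be checked numerically. The sketch below (using Python with scipy, and the standard normal distribution as an illustrative example) integrates the PDF from $-\infty$ to $x$ and compares the result with the library's closed-form CDF:

```python
from scipy import integrate, stats

# PDF of the standard normal distribution (our example f)
f = stats.norm.pdf

# F(x) = integral of f(t) dt from -infinity to x
x = 1.0
F_numeric, _ = integrate.quad(f, -float("inf"), x)

# Compare with the closed-form CDF provided by scipy
print(F_numeric)          # ~0.8413
print(stats.norm.cdf(x))  # ~0.8413
```

The two values agree to high precision, as the Fundamental Theorem of Calculus guarantees.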

## Calculating Probabilities

Both functions give the probabilities of observing certain events under a random variable X. The CDF has a direct probabilistic interpretation, given by

$F(x) = \text{Pr}(X \leq x)$

Using the relationship between the CDF and the PDF, probabilities of events associated with continuous random variables can be computed in two equivalent ways. Suppose we wish to calculate the probability that a continuous random variable X lies between two values a and b. We could integrate the PDF to find this probability:

$\text{Pr}(a \leq X \leq b) = \int_{a}^{b} f(x)\,dx$

Alternatively, if we wish to use the CDF, F(x), we can evaluate the difference F(b) - F(a) to find this probability.

$\text{Pr}(a \leq X \leq b) = F(b) - F(a)$

Of course we know that both approaches yield the same result. This fact is precisely the statement of the Fundamental Theorem of Calculus.
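The equivalence of the two approaches can be seen directly in code. This sketch (again using the standard normal distribution as an assumed example) computes $\text{Pr}(-1 \leq X \leq 2)$ both by integrating the PDF and by differencing the CDF:

```python
from scipy import integrate, stats

a, b = -1.0, 2.0
f = stats.norm.pdf  # standard normal PDF, as an example
F = stats.norm.cdf  # its CDF

# Approach 1: integrate the PDF over [a, b]
p_pdf, _ = integrate.quad(f, a, b)

# Approach 2: take the difference of CDF values
p_cdf = F(b) - F(a)

print(p_pdf, p_cdf)  # both ~0.8186
```

Either approach is valid; in practice, the CDF difference is preferred whenever F has a closed form or a library implementation, since it avoids numerical integration entirely.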

## Expected Value, Variance and Standard Deviation

Just as with discrete random variables, the expectation represents the "center" of a random variable: the expected value of an experiment, or the average value of the outcomes of an experiment repeated many times. The variance and standard deviation of a random variable are numerical measures of the spread, or dispersion, of the PDF of X. Given the PDF f(x) of a continuous random variable X, we can calculate these quantities.

$\mathbb{E}(X) = \int_{-\infty}^{\infty} x f(x)\,dx$

$\text{Var}(X) = \int_{-\infty}^{\infty} (x - \mathbb{E}(X))^2 f(x)\,dx$

$\sigma(X) = \sqrt{\text{Var}(X)}$
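These integrals can be evaluated numerically as well. The sketch below uses an exponential distribution with rate $\lambda = 2$ as an assumed example, since its mean ($1/\lambda = 0.5$) and variance ($1/\lambda^2 = 0.25$) are known in closed form and make the numerical results easy to verify. The integration starts at 0 because the exponential PDF is zero on the negative reals:

```python
import math
from scipy import integrate, stats

# Exponential distribution with rate lambda = 2 (scipy's scale = 1/lambda)
X = stats.expon(scale=0.5)
f = X.pdf

# E(X) = integral of x * f(x) dx over the support [0, infinity)
mean, _ = integrate.quad(lambda x: x * f(x), 0, float("inf"))

# Var(X) = integral of (x - E(X))^2 * f(x) dx over the support
var, _ = integrate.quad(lambda x: (x - mean) ** 2 * f(x), 0, float("inf"))

# Standard deviation is the square root of the variance
sd = math.sqrt(var)

print(mean, var, sd)  # ~0.5, ~0.25, ~0.5
```

The computed values match the closed-form mean, variance, and standard deviation of the exponential distribution.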