Science:MATH105 Probability/Lesson 1 DRV/1.01 Discrete Random Variables

From UBC Wiki
Revision as of 12:54, 12 February 2012

In many areas of science we are interested in the probability that a certain outcome of an experiment occurs. To quantify this probability, we use a number between 0 and 1. A probability of 0 implies that the outcome cannot occur, whereas a probability of 1 implies that the outcome must occur. Any value in the interval (0, 1) means that the outcome only occurs some of the time. Equivalently, if an event occurs with probability p, then there is a (100p)% chance of observing that event.

When an experiment has a discrete list of possible events, we use the notation pk to denote the probability that event k occurs.

Discrete Probability Rules
In discrete probability,
  1. Probabilities are numbers between 0 and 1: 0 ≤ pk ≤ 1 for all k
  2. The sum of all probabilities for a given experiment is equal to one: ∑k pk = 1
  3. The probability of any one event is 1 minus the sum of the probabilities of all other events: pj = 1 - ∑k≠j pk
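
These three rules can be checked mechanically for any candidate list of probabilities. A minimal sketch in Python (the example distribution here is hypothetical, chosen only to illustrate the rules):

```python
from fractions import Fraction

# A hypothetical discrete distribution over four events (k = 0, 1, 2, 3).
p = [Fraction(1, 2), Fraction(1, 4), Fraction(1, 8), Fraction(1, 8)]

# Rule 1: every probability lies between 0 and 1.
assert all(0 <= pk <= 1 for pk in p)

# Rule 2: the probabilities sum to 1.
assert sum(p) == 1

# Rule 3: each probability is 1 minus the sum of all the others.
for j, pj in enumerate(p):
    assert pj == 1 - sum(pk for k, pk in enumerate(p) if k != j)

print("all three rules hold")
```

Using exact fractions rather than floating-point numbers avoids spurious rounding when testing the sum-to-one rule.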


Example: Tossing a Fair Coin Once

If we toss a coin into the air, there are only two possible outcomes: it will land as either "heads" (H) or "tails" (T). If the tossed coin is a "fair" coin, it is equally likely that the coin will land as tails or as heads. In other words, there is a 50% chance that the coin will land heads, and a 50% chance that the coin will land tails.

Using our notation for the probability of a discrete event, we can assign

  • p0 to be the probability that the tossed coin will land as heads
  • p1 to be the probability that the tossed coin will land as tails

Because there are two outcomes that are equally likely, we assign the probability of 0.5 to each of them.

  • p0 = 1/2
  • p1 = 1/2

As required, the sum of the probabilities equals 1, and each probability is a number in the interval [0, 1]. Notice that p0 = 1 - p1.

We can define the random variable X to represent this coin tossing experiment. That is, X is the discrete random variable that takes the value 0 with probability p0 and takes the value 1 with probability p1. Notice that with this notation, saying that we "toss a coin and observe heads" is the same as saying "the random variable X is observed to take the value 0". We say that X is a Bernoulli random variable with parameter p0 = 1/2 and can write X ~ Ber(1/2).
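
One way to see the Bernoulli model in action is to simulate it. The sketch below (Python; the helper name toss_fair_coin is illustrative, not from the lesson) estimates p0 and p1 from repeated tosses:

```python
import random

random.seed(0)  # fix the seed so runs are reproducible

def toss_fair_coin():
    """Return one draw of X ~ Ber(1/2): 0 for heads, 1 for tails."""
    return random.randint(0, 1)

n = 100_000
tosses = [toss_fair_coin() for _ in range(n)]

# The observed frequencies should be close to p0 = p1 = 1/2.
freq_heads = tosses.count(0) / n
freq_tails = tosses.count(1) / n
print(freq_heads, freq_tails)
```

With 100,000 tosses the observed frequencies typically land within about 0.005 of the true value 1/2, illustrating that probabilities describe long-run frequencies.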

Example: Tossing a Fair Coin Twice

Similarly, if we toss a fair coin two times, there are four possible outcomes. Each outcome is a sequence of heads (H) or tails (T):

  • HH
  • HT
  • TH
  • TT

Using our notation for probability, we can assign

  • p0 to be the probability that the outcome will be HH
  • p1 to be the probability that the outcome will be HT
  • p2 to be the probability that the outcome will be TH
  • p3 to be the probability that the outcome will be TT

Because the coin is fair, each outcome is equally likely to occur. There are 4 possible outcomes, so we assign each outcome a probability of 1/4. That is, p0 = p1 = p2 = p3 = 1/4. Equivalently, notice that for any of the four possible outcomes to occur, we must observe the results of two separate flips of a fair coin. For example, to observe the sequence HH, we must flip a fair coin once and observe H, then flip a fair coin again and observe H once more. (We say that these two events are independent, since the outcome of one has no effect on the outcome of the other.) Since the probability of observing H on a flip of a fair coin is 1/2, the probability of observing the sequence HH should be (1/2)×(1/2) = 1/4.
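
The multiplication of probabilities for independent flips can be confirmed by listing every equally likely sequence. A short sketch, assuming Python:

```python
from itertools import product

# Enumerate every equally likely sequence of two fair-coin tosses.
outcomes = [''.join(seq) for seq in product('HT', repeat=2)]
print(outcomes)  # ['HH', 'HT', 'TH', 'TT']

# By counting, HH is 1 of 4 equally likely sequences...
p_HH = outcomes.count('HH') / len(outcomes)

# ...which matches the product of the per-toss probabilities,
# because the two tosses are independent.
p_H = 0.5
assert p_HH == p_H * p_H  # both equal 1/4
```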

Again, all of our probabilities sum to 1, and each probability is a number in the interval [0, 1]. Naturally, we can define a new random variable Y to represent this new coin tossing experiment, i.e. the experiment of independently tossing a fair coin twice and observing the sequential outcome. So Y is the discrete random variable that takes the value 0 with probability p0, the value 1 with probability p1, etc. Notice that with this notation, saying that we "toss two coins in a row and observe the sequence tails then heads" is the same as saying "the random variable Y is observed to take the value 2". Note that Y simply labels the four equally likely outcomes. If instead we count the number of heads observed and call this count Z, then Z takes the value 0 with probability 1/4 (outcome TT), the value 1 with probability 1/2 (outcomes HT and TH), and the value 2 with probability 1/4 (outcome HH). We say that Z is a binomial random variable with parameters 2 (the number of times we flip the fair coin) and 1/2 (the probability of heads on each flip), and we can write Z ~ Bin(2, 1/2). In general, if we toss a fair coin n times and count the number of heads, this count is again a binomial random variable, Z ~ Bin(n, 1/2).
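
The binomial probabilities for the number of heads in n tosses can be checked by brute-force enumeration of all 2^n equally likely sequences. A minimal sketch in Python (variable names are illustrative, not from the lesson):

```python
from fractions import Fraction
from itertools import product
from collections import Counter

n = 2  # number of fair-coin tosses

# Enumerate all 2**n equally likely sequences and count heads in each.
counts = Counter(seq.count('H') for seq in
                 (''.join(t) for t in product('HT', repeat=n)))

# Probability of observing exactly k heads in n tosses.
pmf = {k: Fraction(c, 2 ** n) for k, c in sorted(counts.items())}

for k, prob in pmf.items():
    print(k, "heads:", prob)  # 0 -> 1/4, 1 -> 1/2, 2 -> 1/4
```

Raising n reproduces the general binomial distribution of the head count; for n = 2 the enumeration recovers the probabilities 1/4, 1/2, 1/4 worked out above.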