Science:MATH105 Probability/Lesson 1 DRV/1.01 Discrete Random Variables
Revision as of 11:43, 15 February 2012
In many areas of science we are interested in quantifying the probability that a certain outcome of an experiment occurs. We can use a random variable to identify numerical events that are of interest in an experiment. In this way, a random variable is a theoretical representation of the physical or experimental process we wish to study. More precisely, a random variable is a quantity that does not have a single fixed value; instead, it can assume any of several values, each with some likelihood of being observed. These likelihoods are probabilities.
To quantify the probability that a particular event occurs, we use a number between 0 and 1. A probability of 0 implies that the event cannot occur, whereas a probability of 1 implies that the event must occur. Any value in the interval (0, 1) means that the event will only occur some of the time. Equivalently, if an event occurs with probability p, then there is a (100 × p)% chance of observing this event. For example, an event with probability 0.25 has a 25% chance of being observed.
Conventionally, we denote random variables by capital letters, and particular values that they can assume by lowercase letters. So we can say that X is a random variable that can assume certain particular values x with certain probabilities.
When there is a discrete list of events that can occur, we can use the notation Pr(X = x_{k}) to denote the probability that the random variable X assumes the particular value x_{k}. Note that a discrete list of events means that the random variable X can assume only finitely many or countably many values, meaning we should be able to list the values that X can take, even if this list is infinite (as with a list of all positive integers).
This is in contrast to a continuous random variable, where the values the random variable can assume are given by a continuum of values (for example, we could define a random variable that can take any value in the interval [1,2]). We will discuss continuous random variables in detail in the second part of this module. For now, we deal strictly with discrete random variables.
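As an illustrative sketch (not part of the original lesson), a discrete random variable can be represented in code as a mapping from each value x_{k} to the probability Pr(X = x_{k}). The dictionary name `pmf` and its example values are assumptions chosen to mirror the fair-coin example discussed later:

```python
# Hypothetical representation of a discrete random variable X as a
# mapping from each value x_k to Pr(X = x_k).
# Here X models a fair coin: Pr(X = 0) = Pr(X = 1) = 1/2.
pmf = {0: 0.5, 1: 0.5}

# Every probability must be a number in the interval [0, 1] ...
assert all(0 <= p <= 1 for p in pmf.values())

# ... and the probabilities over all possible values must sum to 1.
assert abs(sum(pmf.values()) - 1.0) < 1e-9

# Looking up Pr(X = 0) is then a dictionary access.
print(pmf[0])  # → 0.5
```

Because the list of values is discrete (finite or countable), such a value-to-probability table fully describes the random variable.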
Discrete Probability Rules
 Each probability must be a number in the interval [0, 1].
 The probabilities assigned to all possible values must sum to 1.
Example: Tossing a Fair Coin Once
If we toss a coin into the air, there are only two possible outcomes: it will land as either "heads" (H) or "tails" (T). If the tossed coin is a "fair" coin, it is equally likely that the coin will land as tails or as heads. In other words, there is a 50% chance that the coin will land heads, and a 50% chance that the coin will land tails.
Using our notation for the probability of a discrete event, we can assign
 p_{0} to be the probability that the tossed coin will land as heads
 p_{1} to be the probability that the tossed coin will land as tails
Because there are two outcomes that are equally likely, we assign the probability of 0.5 to each of them.
 p_{0} = 1/2
 p_{1} = 1/2
As required, the sum of the probabilities equals 1, and each probability is a number in the interval [0, 1]. Notice that p_{0} = 1 − p_{1}.
We can define the random variable X to represent this coin tossing experiment. That is, X is the discrete random variable that takes the value 0 with probability p_{0} and takes the value 1 with probability p_{1}. Notice that with this notation, the experimental event that "we toss a fair coin and observe heads" is the same as the theoretical event that "the random variable X is observed to take the value 0". We say that X is a Bernoulli random variable with parameter p_{0} = 1/2 and can write X ~ Ber(1/2).
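A quick simulation, offered here as a sketch beyond the lesson text, makes the Bernoulli model concrete: if we repeat the toss many times, the fraction of heads should be close to p_{0} = 1/2. The sample size, seed, and variable names below are illustrative assumptions:

```python
# Simulate X ~ Ber(1/2): X takes the value 0 (heads) or 1 (tails),
# each with probability 1/2.
import random

random.seed(42)  # fixed seed so the run is reproducible

n_tosses = 100_000
tosses = [random.randint(0, 1) for _ in range(n_tosses)]

# Empirical estimate of p0 = Pr(X = 0); it should be close to 0.5.
p0_hat = tosses.count(0) / n_tosses
print(p0_hat)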
Example: Tossing a Fair Coin Twice
Similarly, if we toss a fair coin two times, there are four possible outcomes. Each outcome is a sequence of heads (H) or tails (T):
 HH
 HT
 TH
 TT
Using our notation for probability, we can assign
 p_{0} to be the probability that the outcome will be HH
 p_{1} to be the probability that the outcome will be HT
 p_{2} to be the probability that the outcome will be TH
 p_{3} to be the probability that the outcome will be TT
Because the coin is fair, each outcome is equally likely to occur. There are 4 possible outcomes, so we assign each outcome a probability of 1/4. That is, p_{0} = p_{1} = p_{2} = p_{3} = 1/4.
Equivalently, we notice that for any of the four possible events to occur, we must observe two distinct events from two separate flips of a fair coin. So for example, to observe the sequence HH, we must flip a fair coin once and observe H, then flip a fair coin again and observe H once again. (We say that these two events are independent since the outcome of one event has no effect on the outcome of the other.) Since the probability of observing H after a flip of a fair coin is 1/2, we see that the probability of observing the sequence HH should be (1/2)×(1/2) = 1/4.
Observe that again, all of our probabilities sum to 1, and each probability is a number in the interval [0, 1]. If we define the random variable Y to represent this new coin tossing experiment, we see that Y takes the value 0 with probability p_{0} = 1/4, 1 with probability p_{1} = 1/4, 2 with probability p_{2} = 1/4, and 3 with probability p_{3} = 1/4. Notice that with this notation, the experimental event that "we toss two fair coins and observe first tails, then heads" is the same as the theoretical event that "the random variable Y is observed to take the value 2". We say that Y is a uniform discrete random variable with parameter 4 since Y takes each of its four possible values with equal, or uniform, probability. To denote this distributional relationship, we can write Y ~ Uniform(4).
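The two-toss example above can be checked by enumeration, as in this sketch (the names `outcomes` and `pmf` are illustrative, not from the lesson): each of the four sequences HH, HT, TH, TT gets probability (1/2) × (1/2) = 1/4 by independence, so Y is uniform over its four values.

```python
# Enumerate the two-toss experiment and verify that Y ~ Uniform(4).
from itertools import product
from fractions import Fraction

# The four possible sequences, in the order HH, HT, TH, TT.
outcomes = list(product("HT", repeat=2))

# By independence, each sequence has probability (1/2) * (1/2) = 1/4.
p_seq = Fraction(1, 2) * Fraction(1, 2)

# Map each value of Y (0 for HH, 1 for HT, 2 for TH, 3 for TT)
# to its probability.
pmf = {k: p_seq for k, seq in enumerate(outcomes)}

assert sum(pmf.values()) == 1                          # probabilities sum to 1
assert all(p == Fraction(1, 4) for p in pmf.values())  # uniform over 4 values
```

Using exact fractions rather than floats keeps the check free of rounding error; the same enumeration idea extends to three or more tosses with `repeat=3` and 1/8 per sequence.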