1.01 Discrete Random Variables
In many areas of science we are interested in quantifying the probability that a certain outcome of an experiment occurs. To quantify the probability that an event occurs, we use a number between 0 and 1. A probability of 0 means that the outcome cannot occur, whereas a probability of 1 means that the outcome must occur. Any value in the interval (0, 1) means that the outcome occurs only some of the time. Equivalently, if an event occurs with probability p, then there is a 100p% chance of observing this event.
When there is a discrete list of possible outcomes, we use the notation p_{k} to denote the probability that outcome k will occur.
Discrete Probability Rules

In discrete probability,
 each probability p_{k} is a number in the interval [0, 1], and
 the probabilities of all possible outcomes sum to 1; that is, p_{0} + p_{1} + p_{2} + ... = 1.

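These two rules can be checked mechanically. The following sketch (a hypothetical helper, assuming Python; it is not part of the lesson) tests whether a list of numbers is a valid discrete probability distribution:

```python
def is_valid_discrete_distribution(probs, tol=1e-9):
    """Check the two discrete probability rules:
    (1) each p_k lies in the interval [0, 1];
    (2) the p_k sum to 1 (up to floating-point tolerance)."""
    each_in_range = all(0.0 <= p <= 1.0 for p in probs)
    sums_to_one = abs(sum(probs) - 1.0) <= tol
    return each_in_range and sums_to_one

print(is_valid_discrete_distribution([0.5, 0.5]))        # fair coin: True
print(is_valid_discrete_distribution([0.25] * 4))        # two fair tosses: True
print(is_valid_discrete_distribution([0.7, 0.7, -0.4]))  # invalid: False
```

The tolerance parameter accounts for rounding when the probabilities are computed rather than written exactly.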
Example: Tossing a Fair Coin Once
If we toss a coin into the air, there are only two possible outcomes: it will land as either "heads" (H) or "tails" (T). If the tossed coin is a "fair" coin, it is equally likely that the coin will land as tails or as heads. In other words, there is a 50% chance that the coin will land heads, and a 50% chance that the coin will land tails.
Using our notation for the probability of a discrete event, we can assign
 p_{0} to be the probability that the tossed coin will land as heads
 p_{1} to be the probability that the tossed coin will land as tails
Because there are two outcomes that are equally likely, we assign the probability of 0.5 to each of them.
 p_{0} = 1/2
 p_{1} = 1/2
As required, the sum of the probabilities equals 1, and each probability is a number in the interval [0, 1]. Notice that p_{0} = 1 - p_{1}.
We can define the random variable X to represent this coin tossing experiment. That is, X is the discrete random variable that takes the value 0 with probability p_{0} and takes the value 1 with probability p_{1}. Notice that with this notation, saying that we "toss a coin and observe heads" is the same as saying "the random variable X is observed to take the value 0". We say that X is a Bernoulli random variable with parameter p_{0} = 1/2 and can write X ~ Ber(1/2).
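As an illustrative sketch (the function name bernoulli and the use of Python's random module are our own, not part of the lesson), we can simulate many draws of X, following the lesson's convention that X = 0 means heads, and check that heads appears about half the time:

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

def bernoulli(p0):
    """Return 0 (heads) with probability p0, else 1 (tails)."""
    return 0 if random.random() < p0 else 1

tosses = [bernoulli(0.5) for _ in range(100_000)]
frequency_of_heads = tosses.count(0) / len(tosses)
print(frequency_of_heads)  # should be close to 0.5
```

Over many repetitions the observed frequency of heads approaches p_{0} = 1/2, which is the long-run interpretation of probability used throughout this lesson.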
Example: Tossing a Fair Coin Twice
Similarly, if we toss a fair coin two times, there are four possible outcomes. Each outcome is a sequence of heads (H) or tails (T):
 HH
 HT
 TH
 TT
Using our notation for probability, we can assign
 p_{0} to be the probability that the outcome will be HH
 p_{1} to be the probability that the outcome will be HT
 p_{2} to be the probability that the outcome will be TH
 p_{3} to be the probability that the outcome will be TT
Because the coin is fair, each outcome is equally likely to occur. There are 4 possible outcomes, so we assign each outcome a probability of 1/4. That is, p_{0} = p_{1} = p_{2} = p_{3} = 1/4.
Equivalently, we notice that each of the four possible outcomes is built from two separate flips of a fair coin. For example, to observe the sequence HH, we must flip a fair coin once and observe H, then flip it again and observe H once more. (We say that these two flips are independent, since the outcome of one has no effect on the outcome of the other.) Since the probability of observing H on a single flip of a fair coin is 1/2, the probability of observing the sequence HH is (1/2)×(1/2) = 1/4.
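This product rule for independent flips can be sketched in Python (the dictionary p and the enumeration via itertools.product are our own illustration, not part of the lesson):

```python
from itertools import product

p_heads = 0.5
p = {"H": p_heads, "T": 1 - p_heads}

# By independence, the probability of a two-flip sequence is the
# product of the per-flip probabilities.
probs = {a + b: p[a] * p[b] for a, b in product("HT", repeat=2)}

for seq, prob in probs.items():
    print(seq, prob)  # each sequence has probability 0.25
```

Since all four products equal 1/4, this reproduces the equally-likely assignment above, and the four probabilities sum to 1 as required.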
Again, all of our probabilities sum to 1, and each probability is a number in the interval [0, 1]. Naturally, we can define a new random variable Y to represent this new coin tossing experiment, i.e. the experiment of independently tossing a fair coin twice and observing the sequential outcome. So Y is the discrete random variable that takes the value 0 with probability p_{0}, the value 1 with probability p_{1}, etc.
Notice that with this notation, saying that we "toss two coins in a row and observe the sequence tails then heads" is the same as saying "the random variable Y is observed to take the value 2". Note that Y, which simply labels the four equally likely sequences, is uniformly distributed on {0, 1, 2, 3}. If instead we count the number of heads in the two tosses, we obtain a binomial random variable with parameters n = 2 (the number of flips) and p = 1/2 (the probability of heads on each flip), written Bin(2, 1/2): the count is 0 with probability 1/4 (the sequence TT), 1 with probability 1/2 (HT or TH), and 2 with probability 1/4 (HH). In general, if we toss a fair coin n times, the number of heads observed is a binomial random variable Bin(n, 1/2).
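As a short sketch (assuming Python's standard library and the Bin(n, p) convention, where n is the number of flips and p the probability of heads on each flip), the binomial probabilities for the number of heads in two fair tosses can be computed with math.comb:

```python
from math import comb

def binomial_pmf(n, p, k):
    """P(exactly k heads in n independent flips, each heads with probability p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 2, 0.5
for k in range(n + 1):
    print(k, binomial_pmf(n, p, k))
```

The factor comb(n, k) counts how many of the equally likely sequences contain exactly k heads; for n = 2 this recovers the probabilities 1/4, 1/2, 1/4 for 0, 1, and 2 heads.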