MATH105 Probability, Lesson 1: 1.01 Discrete Random Variables
In many areas of science we are interested in quantifying the probability that a certain outcome of an experiment occurs. To quantify the probability that an event occurs, we use a number between 0 and 1. A probability of 0 means that the outcome cannot occur, whereas a probability of 1 means that the outcome must occur. Any value in the interval (0, 1) means that the outcome occurs only some of the time. Equivalently, if an event occurs with probability p, there is a 100p% chance of observing that event.
When there is a discrete list of probabilities that can occur, we use the notation pk to denote the probability that event k will occur.
|Discrete Probability Rules|
|In discrete probability, each probability pk must be a number in the interval [0, 1], and the probabilities of all possible outcomes must sum to 1: p0 + p1 + p2 + ⋯ = 1.|
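As a quick sketch of the two rules above (the function name here is our own, purely for illustration), a proposed list of probabilities can be checked programmatically:

```python
import math

def is_valid_discrete_distribution(probs):
    """Check the two discrete probability rules:
    each pk lies in [0, 1], and all pk sum to 1."""
    in_range = all(0.0 <= p <= 1.0 for p in probs)
    sums_to_one = math.isclose(sum(probs), 1.0)
    return in_range and sums_to_one

print(is_valid_discrete_distribution([0.5, 0.5]))        # fair coin: True
print(is_valid_discrete_distribution([0.25] * 4))        # two fair tosses: True
print(is_valid_discrete_distribution([0.7, 0.7, -0.4]))  # not a distribution: False
```

The third list sums to 1, but it still fails the check because one of its entries lies outside [0, 1]; both rules must hold.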
Example: Tossing a Fair Coin Once
If we toss a coin into the air, there are only two possible outcomes: it will land as either "heads" (H) or "tails" (T). If the tossed coin is a "fair" coin, it is equally likely that the coin will land as tails or as heads. In other words, there is a 50% chance that the coin will land heads, and a 50% chance that the coin will land tails.
Using our notation for the probability of a discrete event, we can assign
- p0 to be the probability that the tossed coin will land as heads
- p1 to be the probability that the tossed coin will land as tails
Because there are two outcomes that are equally likely, we assign the probability of 0.5 to each of them.
- p0 = 1/2
- p1 = 1/2
As required, the sum of the probabilities equals 1, and each probability is a number in the interval [0, 1]. Notice that p0 = 1 - p1.
We can define the random variable X to represent this coin tossing experiment. That is, X is the discrete random variable that takes the value 0 with probability p0 and takes the value 1 with probability p1. Notice that with this notation, saying that we "toss a coin and observe heads" is the same as saying "the random variable X is observed to take the value 0". We say that X is a Bernoulli random variable with parameter p0 = 1/2 and can write X ~ Ber(1/2).
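A Bernoulli random variable like this one is easy to simulate. The following sketch (using Python's standard `random` module; the function name is ours) draws many realizations of X ~ Ber(1/2) and checks that the empirical frequency of heads is close to p0 = 1/2:

```python
import random

def sample_bernoulli(p0, n_trials, seed=0):
    """Simulate n_trials draws of a Bernoulli random variable X,
    where X = 0 (heads) with probability p0 and X = 1 (tails) otherwise."""
    rng = random.Random(seed)
    return [0 if rng.random() < p0 else 1 for _ in range(n_trials)]

draws = sample_bernoulli(p0=0.5, n_trials=10_000)
print(draws.count(0) / len(draws))  # empirical P(X = 0), close to 0.5
```

The printed frequency will not be exactly 0.5 for any finite number of trials, but it approaches 1/2 as the number of trials grows.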
Example: Tossing a Fair Coin Twice
Similarly, if we toss a fair coin two times, there are four possible outcomes. Each outcome is a sequence of heads (H) or tails (T): HH, HT, TH, or TT.
Using our notation for probability, we can assign
- p0 to be the probability that the outcome will be HH
- p1 to be the probability that the outcome will be HT
- p2 to be the probability that the outcome will be TH
- p3 to be the probability that the outcome will be TT
Because the coin is fair, each outcome is equally likely to occur. There are 4 possible outcomes, so we assign each outcome a probability of 1/4. That is, p0 = p1 = p2 = p3 = 1/4.
Equivalently, we notice that for any of the four possible outcomes to occur, we must observe the results of two separate flips of a fair coin. For example, to observe the sequence HH, we must flip a fair coin once and observe H, then flip the coin again and observe H once more. (We say that the two flips are independent, since the outcome of one flip has no effect on the outcome of the other.) Since the probability of observing H on a single flip of a fair coin is 1/2, the probability of observing the sequence HH should be (1/2)×(1/2) = 1/4.
Observe that, again, all of our probabilities sum to 1, and each probability is a number in the interval [0, 1].
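The product-rule reasoning above can be sketched in a few lines of Python: enumerate the four two-flip sequences and compute each sequence's probability as the product of the single-flip probabilities (this enumeration approach is our own illustration, not a prescribed method):

```python
from itertools import product

# Probability of each face on a single flip of a fair coin.
flip = {"H": 0.5, "T": 0.5}

# By independence, the probability of a two-flip sequence is the
# product of the two single-flip probabilities.
outcomes = {a + b: flip[a] * flip[b] for a, b in product("HT", repeat=2)}

print(outcomes)                # {'HH': 0.25, 'HT': 0.25, 'TH': 0.25, 'TT': 0.25}
print(sum(outcomes.values()))  # the four probabilities sum to 1
```

The same construction extends to any number of flips: with `repeat=3` there are eight equally likely sequences, each with probability (1/2)³ = 1/8.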