Our treatment of discrete random variables has been brief, but the concepts we have introduced are fundamental to any random process. These fundamentals will be explored again in the next chapter when we apply them to continuous random variables. We will see that many similarities exist between the discrete and continuous cases, but we will also notice many important differences between the two.
We summarize some of the important concepts that were introduced in Chapter 1.
The probability mass function (PMF) of a random variable X is the function that assigns probabilities to the possible outcomes of X. We write
P_X(x_k) = Pr(X = x_k),  k = 1, 2, ...
to denote this function of the possible values x_{k} of X.
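As an illustration (the example, a fair six-sided die, is our own and not from the text), a PMF can be sketched as a simple mapping from each possible value x_k to its probability:

```python
# Sketch: the PMF of a fair six-sided die, assigning Pr(X = x_k) = 1/6
# to each of the possible values x_k = 1, ..., 6.
pmf = {k: 1 / 6 for k in range(1, 7)}

# A valid PMF is non-negative and its probabilities sum to 1.
assert all(p >= 0 for p in pmf.values())
assert abs(sum(pmf.values()) - 1.0) < 1e-9
```

Any PMF must satisfy these two checks: each probability is non-negative, and the probabilities over all possible outcomes sum to one.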
The cumulative distribution function (CDF) of a random variable X is the function that accumulates the probabilities of all outcomes up to a specified value. We define the CDF to be F(x) = Pr(X ≤ x) and note that the CDF is intimately related to the PMF via our identity for the probability of a union of disjoint events; i.e., the CDF is given by a sum over values of the PMF.
F(x) = Pr(X ≤ x) = Σ_{x_k ≤ x} P_X(x_k)
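Continuing the fair-die example (our own illustration, not from the text), the CDF is just this sum of PMF values carried out numerically:

```python
# Sketch: the CDF F(x) = Pr(X <= x) for a fair six-sided die,
# obtained by summing the PMF over all outcomes x_k <= x.
pmf = {k: 1 / 6 for k in range(1, 7)}

def cdf(x):
    # The events {X = x_k} are disjoint, so their probabilities add.
    return sum(p for xk, p in pmf.items() if xk <= x)

# F is a nondecreasing step function: 0 below the smallest outcome,
# and 1 (up to floating-point rounding) at and above the largest.
```

For example, cdf(0) is 0 and cdf(3) is 1/2, since three of the six equally likely outcomes are at most 3.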
The concepts of expectation, variance and standard deviation are crucial and will be revisited when we explore continuous random variables. Students should know that
E[X] = Σ_k x_k P_X(x_k),
Var(X) = E[(X − E[X])^2] = E[X^2] − (E[X])^2,
σ_X = √Var(X).
The expectation represents the "center" of a random variable: the expected value of an experiment, or the average of the outcomes when the experiment is repeated many times. The variance and standard deviation of a random variable are numerical measures of the spread, or dispersion, of its distribution.
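These three quantities can be computed directly from a PMF. As a sketch (again using the fair six-sided die, an example of our own choosing):

```python
import math

# Sketch: expectation, variance, and standard deviation of a fair
# six-sided die, computed directly from its PMF.
pmf = {k: 1 / 6 for k in range(1, 7)}

# E[X] = sum over k of x_k * P_X(x_k); for a fair die this is 3.5.
mean = sum(x * p for x, p in pmf.items())

# Var(X) = E[(X - E[X])^2]; for a fair die this is 35/12.
var = sum((x - mean) ** 2 * p for x, p in pmf.items())

# The standard deviation is the square root of the variance.
std = math.sqrt(var)
```

Note that the mean, 3.5, is not itself a possible outcome of the die; the expectation is a long-run average, not a value the experiment must produce.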