Statistics Dictionary

Cumulative Probability

A cumulative probability refers to the probability that the value of a random variable falls within a specified range. Frequently, cumulative probabilities refer to the probability that a random variable is less than or equal to a specified value.

Consider a coin flip experiment. If we flip a coin two times, we might ask: What is the probability that the two flips result in one or fewer heads? The answer is a cumulative probability: the probability that the flips produce zero heads plus the probability that they produce exactly one head. Thus, the cumulative probability would equal:

P(X ≤ 1) = P(X = 0) + P(X = 1) = 0.25 + 0.50 = 0.75

The table below shows both the probabilities and the cumulative probabilities associated with this experiment.

Number of heads | Probability | Cumulative Probability
0               | 0.25        | 0.25
1               | 0.50        | 0.75
2               | 0.25        | 1.00
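As a quick check, the sketch below reproduces these values in Python. It uses SciPy's binomial distribution, where X = number of heads in two flips of a fair coin; the choice of SciPy is an illustrative assumption, not part of the original entry.

```python
# Minimal sketch: probabilities and cumulative probabilities for
# X = number of heads in two flips of a fair coin.
# Assumes SciPy is installed.
from scipy.stats import binom

n, p = 2, 0.5  # two flips, fair coin

for k in range(n + 1):
    prob = binom.pmf(k, n, p)   # P(X = k)
    cum = binom.cdf(k, n, p)    # P(X <= k), the cumulative probability
    print(f"{k} heads: probability = {prob:.2f}, cumulative = {cum:.2f}")

# The cumulative probability from the example above:
print(binom.cdf(1, n, p))  # P(X <= 1) = 0.75
```

Running this prints the three rows of the table and confirms that P(X ≤ 1) = 0.75.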

See also: Statistics Tutorial: Probability Distributions