Combinations of Random Variables

Sometimes, it is necessary to add or subtract random variables. When this occurs, it is useful to know the mean and variance of the result.

Recommendation: Read the sample problems at the end of the lesson. This lesson introduces some useful equations, and the sample problems show how to apply those equations.

Sums and Differences of Random Variables: Effect on the Mean

Suppose you have two variables: X with a mean of μx and Y with a mean of μy. Then, the mean of the sum of these variables (μx+y) and the mean of the difference between these variables (μx-y) are given by the following equations.

μx+y = μx + μy       and       μx-y = μx - μy

The above equations for general variables also apply to random variables. If X and Y are random variables, then

E(X + Y) = E(X) + E(Y)
and
E(X - Y) = E(X) - E(Y)

where E(X) is the expected value (mean) of X, E(Y) is the expected value of Y, E(X + Y) is the expected value of X plus Y, and E(X - Y) is the expected value of X minus Y.
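
For readers who want to check this numerically, the short Python sketch below (an illustration, not part of the lesson; the distributions and sample size are arbitrary choices) simulates two random variables and confirms that the mean of a sum or difference matches the sum or difference of the means.

import numpy as np

rng = np.random.default_rng(seed=0)

# Two simulated random variables (arbitrary example distributions)
x = rng.normal(loc=2.0, scale=1.0, size=200_000)   # E(X) is about 2.0
y = rng.normal(loc=5.0, scale=3.0, size=200_000)   # E(Y) is about 5.0

# E(X + Y) = E(X) + E(Y)   and   E(X - Y) = E(X) - E(Y)
print(np.mean(x + y), np.mean(x) + np.mean(y))   # both close to 7.0
print(np.mean(x - y), np.mean(x) - np.mean(y))   # both close to -3.0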

Sums and Differences of Independent Random Variables: Effect on Variance

Suppose X and Y are independent random variables. Then, the variance of (X + Y) and the variance of (X - Y) are described by the following equations.

Var(X + Y) = Var(X - Y) = Var(X) + Var(Y)

where Var(X + Y) is the variance of the sum of X and Y, Var(X - Y) is the variance of the difference between X and Y, Var(X) is the variance of X, and Var(Y) is the variance of Y.

Note: The standard deviation (SD) is always equal to the square root of the variance (Var). Thus,

SD(X + Y) = sqrt[ Var(X + Y) ]
and
SD(X - Y) = sqrt[ Var(X - Y) ]
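
A similar simulation (again an illustrative sketch with arbitrary distributions, not part of the lesson) suggests how the variance rule and the square-root relationship play out in practice for independent X and Y.

import numpy as np

rng = np.random.default_rng(seed=1)

# Independent simulated random variables (arbitrary example distributions)
x = rng.normal(loc=0.0, scale=4.0, size=200_000)   # Var(X) is about 16
y = rng.normal(loc=0.0, scale=3.0, size=200_000)   # Var(Y) is about 9

# Var(X + Y) = Var(X - Y) = Var(X) + Var(Y) for independent X and Y
print(np.var(x + y), np.var(x - y), np.var(x) + np.var(y))   # all close to 25
print(np.std(x + y), np.std(x - y))                          # both close to 5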

Test Your Understanding

Problem 1

The table below shows the joint probability distribution for two independent random variables, X and Y. (In a joint probability distribution table, numbers in the cells of the table represent the probability that particular values of X and Y occur together.)

            X
        0     1     2
Y   3   0.1   0.2   0.2
    4   0.1   0.2   0.2

What is the mean of the sum of X and Y?

(A) 1.2
(B) 3.5
(C) 4.5
(D) 4.7
(E) None of the above.

Solution

The correct answer is D. The solution requires three computations: (1) find the mean (expected value) of X, (2) find the mean (expected value) of Y, and (3) find the sum of the means. Those computations are shown below, beginning with the mean of X.

E(X) = Σ [ xi * P(xi) ]
E(X) = 0 * (0.1 + 0.1) + 1 * (0.2 + 0.2) + 2 * (0.2 + 0.2)
E(X) = 0 + 0.4 + 0.8 = 1.2

Next, we find the mean of Y.

E(Y) = Σ [ yi * P(yi) ]
E(Y) = 3 * (0.1 + 0.2 + 0.2) + 4 * (0.1 + 0.2 + 0.2)
E(Y) = (3 * 0.5) + (4 * 0.5) = 1.5 + 2 = 3.5

And finally, the mean of the sum of X and Y is equal to the sum of the means. Therefore,

E(X + Y) = E(X) + E(Y) = 1.2 + 3.5 = 4.7

Note: A similar approach is used to find differences between means. The difference between X and Y is E(X - Y) = E(X) - E(Y) = 1.2 - 3.5 = -2.3; and the difference between Y and X is E(Y - X) = E(Y) - E(X) = 3.5 - 1.2 = 2.3.
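
The same computations can be carried out directly from the joint probability table. The Python sketch below is an illustration rather than part of the original solution; it stores the table as an array, recovers the marginal probabilities of X and Y, and then computes the expected values.

import numpy as np

# Joint probability table: rows are Y = 3 and Y = 4; columns are X = 0, 1, 2
joint = np.array([[0.1, 0.2, 0.2],
                  [0.1, 0.2, 0.2]])
x_vals = np.array([0, 1, 2])
y_vals = np.array([3, 4])

p_x = joint.sum(axis=0)        # marginal of X: [0.2, 0.4, 0.4]
p_y = joint.sum(axis=1)        # marginal of Y: [0.5, 0.5]

e_x = np.sum(x_vals * p_x)     # 1.2
e_y = np.sum(y_vals * p_y)     # 3.5

print(e_x + e_y)               # 4.7  = E(X + Y)
print(e_x - e_y, e_y - e_x)    # -2.3 and 2.3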

Problem 2

Suppose X and Y are independent random variables. The variance of X is equal to 16, and the variance of Y is equal to 9. Let Z = X - Y.

What is the standard deviation of Z?

(A) 2.65
(B) 5.00
(C) 7.00
(D) 25.0
(E) It is not possible to answer this question, based on the information given.

Solution

The correct answer is B. The solution requires us to recognize that Z is the difference between two independent random variables. For independent random variables, the variance of a difference equals the sum of the variances, so the variance of Z is equal to the variance of X plus the variance of Y.

Var(Z) = Var(X) + Var(Y) = 16 + 9 = 25

The standard deviation of Z is equal to the square root of the variance. Therefore, the standard deviation is equal to the square root of 25, which is 5.
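
As a quick check, the same arithmetic in Python (illustrative only, not part of the original solution):

import math

var_x = 16
var_y = 9

# Z = X - Y, with X and Y independent, so the variances add
var_z = var_x + var_y      # 25
sd_z = math.sqrt(var_z)    # 5.0

print(var_z, sd_z)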