Minimum variance for sum of three random variables

Hi all,

I have been working on the following problem:

Given Var(X) = 1, Var(Y) = 4, and Var(Z) = 25, what is the minimum possible variance of the random variable W = X + Y + Z, i.e. min Var(X+Y+Z)?

My first thought is to complete the variance-covariance expansion as follows:
Var(X + Y + Z) = Var(X) + Var(Y) + Var(Z) + 2[Cov(X,Y) + Cov(Y,Z) + Cov(X,Z)]

Then I use the Cauchy-Schwarz inequality to bound each covariance term from below (i.e. |Cov(X,Y)| <= sqrt(Var(X)Var(Y))). However, I am obtaining a negative candidate minimum, which leads me to think that the lower bound might just be zero?
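For concreteness, here is the arithmetic behind that negative candidate minimum (a quick Python check; the variable names are mine):

```python
import math

var_x, var_y, var_z = 1.0, 4.0, 25.0

# Pairwise Cauchy-Schwarz lower bounds: Cov(U,V) >= -sqrt(Var(U)*Var(V)),
# so the naive "minimum" plugs in the most negative covariance for each pair.
naive_bound = (var_x + var_y + var_z
               - 2 * (math.sqrt(var_x * var_y)    # sqrt(1*4)  = 2
                      + math.sqrt(var_y * var_z)  # sqrt(4*25) = 10
                      + math.sqrt(var_x * var_z)))  # sqrt(1*25) = 5

print(naive_bound)  # 30 - 2*(2 + 10 + 5) = -4.0
```

Since a variance can never be negative, this bound of -4 is vacuous; it only shows that the three pairwise bounds cannot all be attained simultaneously.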

For a square matrix to be a valid variance-covariance matrix, it has to be symmetric and positive semi-definite. Symmetry is automatically satisfied if we write the covariance matrix of the random vector (X, Y, Z)^T

in the form

Sigma = [ 1  a  b ]
        [ a  4  c ]
        [ b  c  25 ]

where a = Cov(X,Y), b = Cov(X,Z) and c = Cov(Y,Z).

To check positive semi-definiteness, you may apply Sylvester's criterion (note that for semi-definiteness all principal minors, not just the leading ones, must be nonnegative):

1 >= 0, 4 >= 0, 25 >= 0 (automatic),
4 - a^2 >= 0, 25 - b^2 >= 0, 100 - c^2 >= 0,
det(Sigma) = 100 - 25a^2 - 4b^2 - c^2 + 2abc >= 0.

Then minimize Var(X+Y+Z) = 30 + 2(a + b + c) subject to these constraints.
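As a numerical sanity check of this approach (a sketch of mine, not part of the original argument): grid-search over a = Cov(X,Y), b = Cov(X,Z), c = Cov(Y,Z), keep only the positive semi-definite matrices, and take the smallest resulting variance. The PSD test uses eigenvalues rather than minors, which is equivalent and easier to code.

```python
import numpy as np

best_var = np.inf
best_abc = None
for a in np.arange(-2, 2.01, 0.5):          # |a| <= sqrt(1*4)  = 2
    for b in np.arange(-5, 5.01, 0.5):      # |b| <= sqrt(1*25) = 5
        for c in np.arange(-10, 10.01, 0.5):  # |c| <= sqrt(4*25) = 10
            sigma = np.array([[1, a, b], [a, 4, c], [b, c, 25]])
            # PSD iff smallest eigenvalue >= 0 (small tolerance for rounding)
            if np.linalg.eigvalsh(sigma).min() >= -1e-9:
                var = 30 + 2 * (a + b + c)
                if var < best_var:
                    best_var, best_abc = var, (a, b, c)

print(best_var)  # 4.0, attained at a = 2, b = -5, c = -10
```

The minimizing matrix is the rank-one matrix (1, 2, -5)(1, 2, -5)^T, i.e. Y = 2X and Z = -5X up to constants, which makes sd(X+Y+Z) = |1 + 2 - 5| * sd(X) = 2 and hence Var(X+Y+Z) = 4.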

P.S. One additional thing you may check before trying the above method: a random variable has zero variance if and only if it is (almost surely) a constant. Therefore, you may try to assume X + Y + Z = k for some constant k. Then, by moving one of them to the RHS, say, X + Y = k - Z.

Now check whether it is possible for the variance of the LHS to equal the variance of the RHS. If not, you know by the contrapositive that zero variance is not attainable.
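With the given numbers this check is immediate (a small sketch, assuming X + Y = k - Z as above):

```python
sd_x, sd_y, sd_z = 1.0, 2.0, 5.0  # square roots of the given variances

# Var(X+Y) ranges over [(sd_x - sd_y)^2, (sd_x + sd_y)^2] = [1, 9],
# while Var(k - Z) = Var(Z) = 25 for any constant k.
max_var_lhs = (sd_x + sd_y) ** 2   # 9.0
var_rhs = sd_z ** 2                # 25.0

print(var_rhs > max_var_lhs)  # True: the two variances can never match,
                              # so zero total variance is unattainable
```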

Re: Minimum variance for sum of three random variables

Thank you! I figured that I had to take into account restrictions on the variance-covariance matrix that cannot be captured by applying the Cauchy-Schwarz inequality pairwise when there are more than two random variables.