Minimum variance for sum of three random variables

#1
Hi all,

I have been working on the following problem:

Given Var(X) = 1, Var(Y) = 4, and Var(Z) = 25, what is the minimum possible variance of the random variable W = X + Y + Z, i.e. min Var(X+Y+Z)?

My first thought is to write out the variance-covariance expansion as follows:
Var(X + Y + Z) = Var(X) + Var(Y) + Var(Z) + 2[Cov(X,Y) + Cov(Y,Z) + Cov(X,Z)]

Then I use the Cauchy-Schwarz inequality to determine the most negative possible value of each covariance term (i.e. |Cov(X,Y)| <= sqrt(Var(X)Var(Y))). However, I obtain a negative potential minimum, which leads me to think that the lower bound might just be zero?

Var(X+Y+Z) = 1 + 4 + 25 + 2[-2 - 5 - 10] = 30 - 34 = -4 ???
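
To make the arithmetic explicit, here is a quick numpy version of the same computation (the variable names are just mine):

[code]
import numpy as np

var_x, var_y, var_z = 1.0, 4.0, 25.0

# most negative value each pairwise covariance could take under Cauchy-Schwarz alone
cov_xy = -np.sqrt(var_x * var_y)   # -2
cov_xz = -np.sqrt(var_x * var_z)   # -5
cov_yz = -np.sqrt(var_y * var_z)   # -10

naive_bound = var_x + var_y + var_z + 2 * (cov_xy + cov_xz + cov_yz)
print(naive_bound)   # -4.0, which cannot be a variance
[/code]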

The other thought is that using Cauchy-Schwarz in this way is not correct and my approach is wrong.

My next thought is to consider the expansion of Var[(X+Y) + Z], treating (X+Y) as a single random variable, but I was not sure how to proceed from the sum of the two variables (X+Y) and Z.

Any thoughts on how to proceed are appreciated.
 

BGM

TS Contributor
#2
Actually this is a very good question.

http://en.wikipedia.org/wiki/Covariance_matrix#Properties

In order for a square matrix to be a valid variance-covariance matrix, it has to be positive semi-definite and symmetric. The symmetry property is automatically satisfied if we write the covariance matrix [math] \Sigma [/math] of the random vector

[math] \begin{bmatrix} X \\ Y \\ Z \end{bmatrix} [/math]

in the form

[math] \Sigma = \begin{bmatrix} 1 & \sigma_{XY} & \sigma_{XZ} \\
\sigma_{XY} & 4 & \sigma_{YZ} \\ \sigma_{XZ} & \sigma_{YZ} & 25 \end{bmatrix} [/math]

To check positive semi-definiteness, you may apply Sylvester's criterion:

http://en.wikipedia.org/wiki/Sylvester's_criterion

which, applied to semi-definiteness, requires every principal minor of [math] \Sigma [/math] to be nonnegative. Besides the (positive) diagonal entries, that gives the three pairwise Cauchy-Schwarz bounds

[math] 4 - \sigma_{XY}^2 \geq 0, \qquad 25 - \sigma_{XZ}^2 \geq 0, \qquad 100 - \sigma_{YZ}^2 \geq 0 [/math]

together with the determinant condition

[math] 100 + 2\sigma_{XY}\sigma_{XZ}\sigma_{YZ} - 25\sigma_{XY}^2 - 4\sigma_{XZ}^2 - \sigma_{YZ}^2 \geq 0 [/math]
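
For instance, plugging the Cauchy-Schwarz-extreme values from post #1 ([math] \sigma_{XY} = -2, \sigma_{XZ} = -5, \sigma_{YZ} = -10 [/math]) into the determinant condition gives

[math] 100 + 2(-2)(-5)(-10) - 25(4) - 4(25) - 100 = -400 < 0, [/math]

so that choice does not correspond to any valid covariance matrix, which is why the bound of -4 from post #1 cannot be attained.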

So any covariances satisfying the above inequalities will be valid. The remaining optimization can be done with Karush-Kuhn-Tucker (KKT) multipliers; see

http://en.wikipedia.org/wiki/Karush–Kuhn–Tucker_conditions
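
If you want a numerical sanity check on whatever the KKT conditions give you, a rough sketch along these lines should work (this is just my own illustration, not part of the original problem; it enforces positive semi-definiteness through the smallest eigenvalue of [math] \Sigma [/math]):

[code]
import numpy as np
from scipy.optimize import minimize

def sigma(c):
    # candidate covariance matrix with the variances from the problem fixed
    cxy, cxz, cyz = c
    return np.array([[1.0, cxy, cxz],
                     [cxy, 4.0, cyz],
                     [cxz, cyz, 25.0]])

def var_w(c):
    # Var(X + Y + Z) = 1' Sigma 1, i.e. the sum of all entries of Sigma
    return float(np.sum(sigma(c)))

# feasibility: Sigma must be positive semi-definite,
# i.e. its smallest eigenvalue must be nonnegative
psd_constraint = {"type": "ineq",
                  "fun": lambda c: np.linalg.eigvalsh(sigma(c))[0]}

result = minimize(var_w, x0=np.zeros(3),
                  constraints=[psd_constraint], method="SLSQP")

print(result.x)    # covariances at the numerical minimum
print(result.fun)  # minimized Var(X + Y + Z)
[/code]

The eigenvalue constraint is just a compact way of imposing the same principal-minor conditions listed above.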



P.S. One additional thing you may check before trying the above method: a random variable has zero variance if and only if it is a constant. Therefore, you may try assuming [math] X + Y + Z = c [/math] and then moving one of the variables to the RHS, say

[math] Y + Z = c - X [/math]

Now check whether it is possible for the variance of the LHS to equal the variance of the RHS. If not, you know by contraposition that zero variance is not attainable.
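
(One way that check can go: by Cauchy-Schwarz,

[math] \mathrm{Var}(Y + Z) \geq \left(\sqrt{\mathrm{Var}(Z)} - \sqrt{\mathrm{Var}(Y)}\right)^2 = (5 - 2)^2 = 9, [/math]

while [math] \mathrm{Var}(c - X) = \mathrm{Var}(X) = 1 [/math], so the two sides can never have equal variance, and zero variance for [math] X + Y + Z [/math] is indeed not attainable.)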
 
#3
Thank you! I suspected that I had to take into account restrictions on the variance-covariance matrix that cannot be captured by applying the Cauchy-Schwarz inequality pairwise when there are more than 2 random variables.