Difference between 2X1 and X1 + X2 for normal random variables

#1
The question is as follows:

1. If X1 ~ N(μ, σ²) and X2 ~ N(μ, σ²) are normal random variables, what is the difference between 2X1 and X1 + X2? Discuss both their distributions and their parameters.

I tried the following solution, but I am unable to identify any difference. Please help.

Assume that X1 and X2 have the same μ = 1, σ = 2.

2X1 = 2(1, 2²) = 2(1, 4) = (2, 8)

X1 + X2 = (1 + 1, 2² + 2²) = (2, 8)
 

spunky

#2
You're almost there, but you're forgetting how variances/standard deviations change under linear transformations.

Just go back to the definition of variance and, using the algebra of expectations, see how Var(X1 + X2) differs from Var(k*X1) (where k is a constant).
 
#3
I looked it up and got the following rules:

1) Var(k*X1) = k^2 * Var(X1)

2) If X and Y are independent,
Var(X + Y) = Var(X) + Var(Y)
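Working these out from the definition of variance, as suggested above (a sketch, writing µ for the mean and using independence so the covariance term drops out):

```latex
% Sketch: both rules follow from the definition Var(X) = E[(X - mu)^2].
\begin{align*}
\operatorname{Var}(kX)
  &= E\big[(kX - k\mu)^2\big]
   = k^2\, E\big[(X - \mu)^2\big]
   = k^2\,\operatorname{Var}(X),\\
\operatorname{Var}(X + Y)
  &= E\big[\big((X - \mu_X) + (Y - \mu_Y)\big)^2\big]\\
  &= \operatorname{Var}(X) + \operatorname{Var}(Y) + 2\,\operatorname{Cov}(X, Y)
   = \operatorname{Var}(X) + \operatorname{Var}(Y)
   \quad\text{(independence gives } \operatorname{Cov}(X, Y) = 0\text{).}
\end{align*}
```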

I tried to apply these to the question.
Given that X1 ~ N(µ, σ²) and X2 ~ N(µ, σ²),
assume that X1 and X2 have the same µ = 1, σ = 2.


Var(2*X1) = 2^2 * Var(X1)
= 4 * 4
= 16

Var(X1 + X2) = Var(X1) + Var(X2)
= 4 + 4
= 8
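
A quick simulation sketch to check these two numbers (assuming µ = 1, σ = 2 as above, independent X1 and X2, and NumPy for the sampling):

```python
import numpy as np

# Check the hand calculation above by simulation (assumed parameters: mu = 1, sigma = 2).
rng = np.random.default_rng(0)
n = 1_000_000

x1 = rng.normal(loc=1, scale=2, size=n)  # draws from X1 ~ N(1, 4)
x2 = rng.normal(loc=1, scale=2, size=n)  # independent draws from X2 ~ N(1, 4)

double_x1 = 2 * x1       # the random variable 2*X1
sum_x1_x2 = x1 + x2      # the random variable X1 + X2

print("2*X1:     mean = %.3f, var = %.3f" % (double_x1.mean(), double_x1.var()))
print("X1 + X2:  mean = %.3f, var = %.3f" % (sum_x1_x2.mean(), sum_x1_x2.var()))
# Expected output: both means close to 2, variances close to 16 and 8 respectively.
```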

So 2*X1 ~ N(2, 16) while X1 + X2 ~ N(2, 8): they have the same mean, but Var(2*X1) is twice Var(X1 + X2). Is that correct?