Finding Mean and Variance for X^2 Given Mean and Variance for X

I am working on a question that has me stumped. X1, X2, ..., Xn are normally distributed and independent. Each has a mean of 0 and variance sigma^2. I am asked to find E(Xi^2) and V(Xi^2). There seem to be two ways to approach this:

First, since the Xi are independent, we could say E(Xi^2) = E(Xi)*E(Xi) = 0*0 = 0. Second, we could also say V(Xi) = E(Xi^2) - [E(Xi)]^2 = E(Xi^2) - 0 = E(Xi^2), in which case E(Xi^2) = V(Xi) = sigma^2.

From these two, we are led to conclude that sigma must be 0 (in which case this would be a degenerate distribution, not a normal one). Have I made some error in thinking, or is this problem flawed?


Dark Knight
The second approach is right. The first one fails because E(Xi^2) = E(Xi)*E(Xi) would require Xi to be independent of itself, which no non-constant random variable is; the independence assumption only applies between different variables Xi and Xj. In a similar way you can calculate Var(Xi^2):

[math]Var(X_i^2)= E[(X_i^2)^2] -(E[X_i^2])^2 [/math]
[math]Var(X_i^2)= E[X_i^4] -(E[X_i^2])^2 [/math]
[math]E[X_i^4][/math] is the 4th moment. For a normal distribution with mean 0 it equals [math]3\sigma^4[/math], so [math]Var(X_i^2) = 3\sigma^4 - \sigma^4 = 2\sigma^4[/math].
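If it helps to convince yourself, here is a quick Monte Carlo sketch of the two results, E[Xi^2] = sigma^2 and Var(Xi^2) = 2*sigma^4 (the choice sigma = 2 and the sample size are arbitrary, just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0

# Draw a large sample of N(0, sigma^2) variates.
x = rng.normal(loc=0.0, scale=sigma, size=1_000_000)

mean_x2 = (x**2).mean()  # estimate of E[X^2]; theory says sigma^2 = 4
var_x2 = (x**2).var()    # estimate of Var(X^2); theory says 2*sigma^4 = 32

print(mean_x2, var_x2)
```

The sample values should land close to 4 and 32, not close to 0, which is consistent with the second approach and rules out the first.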