# SST = SSE + SSR, proof

#### stephen123456

##### New Member
Hi,

I've been trying for a long time to figure out how to derive the starting equation of the proof, and I have no idea how to come up with it. It's the third equation in the picture in the attachment.

Thank you very much for your help!

#### Buckeye

##### Member
I read into this a bit last night. I believe you can add and subtract $\hat{Y}_i$. So, the right side of line 3 becomes: $\sum_{i = 1}^n (\hat{Y}_i-\bar{Y}+Y_i-\hat{Y}_i)^2$. I think you have a $\hat{Y}$ in place of $\bar{Y}$. Then expand from here?

This is the first line of the proof. I referenced https://web.njit.edu/~wguo/Math644_2012/Math644_Chapter 1_part4.pdf
Maybe we can finish the proof after fixing this small mistake?
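To sketch where that goes (assuming the standard OLS setup with an intercept), adding and subtracting $\hat{Y}_i$ and expanding the square gives

$\sum_{i=1}^n (Y_i-\bar{Y})^2=\sum_{i=1}^n \left[(\hat{Y}_i-\bar{Y})+(Y_i-\hat{Y}_i)\right]^2$

$=\sum_{i=1}^n (\hat{Y}_i-\bar{Y})^2+\sum_{i=1}^n (Y_i-\hat{Y}_i)^2+2\sum_{i=1}^n (\hat{Y}_i-\bar{Y})(Y_i-\hat{Y}_i)$,

and the cross term is zero because, with an intercept in the model, the OLS residuals sum to zero and are orthogonal to the fitted values. That leaves SST = SSR + SSE.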


#### Dragan

##### Super Moderator
It is perhaps best to use the deviation form of the regression function i.e.,

$y_{i}=\hat{y_{i}}+e_{i}$.

Squaring both sides and summing over the sample yields,

$\sum y_{i}^{2}=\sum \hat{y_{i}}^{2}+\sum e_{i}^{2}+2\sum \hat{y_{i}}e_{i}$

$=\sum \hat{y_{i}}^{2}+\sum e_{i}^{2}$

$=\hat{\beta}_{1}^{2}\sum x_{i}^{2}+\sum e_{i}^{2}$

because

$\sum \hat{y_{i}}e_{i}=0$.
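
A one-line justification of that step, assuming the OLS normal equation $\sum x_{i}e_{i}=0$ (in deviation form):

$\sum \hat{y_{i}}e_{i}=\hat{\beta}_{1}\sum x_{i}e_{i}=0$.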

As such, the various sums of squares above are:

$\sum y_{i}^{2}$ is the total variation (TSS),

$\sum \hat{y_{i}}^{2}=\hat{\beta}_{1}^{2}\sum x_{i}^{2}$ is the explained sum of squares from the regression (ESS),

$\sum e_{i}^{2}$ is the residual sum of squares (RSS).
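As a numeric sanity check, here is a small Python sketch (the simulated data and variable names are my own, not from the thread) that fits a simple OLS line with an intercept and confirms the decomposition holds up to rounding:

```python
import numpy as np

# Hypothetical sample data: y depends linearly on x plus noise
rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 2.0 + 3.0 * x + rng.normal(size=50)

# OLS fit with an intercept, via least squares on [1, x]
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta
e = y - y_hat

sst = np.sum((y - y.mean()) ** 2)      # total sum of squares
ssr = np.sum((y_hat - y.mean()) ** 2)  # explained (regression) sum of squares
sse = np.sum(e ** 2)                   # residual sum of squares

# The decomposition and the orthogonality that drives it
assert np.isclose(sst, ssr + sse)
assert np.isclose(np.sum(y_hat * e), 0.0, atol=1e-6)
```

One caveat when reading sources on this: the abbreviations are not standardized. Some textbooks (as in the thread title) use SSR for the regression sum of squares and SSE for the error sum of squares, while others (as in Dragan's post) write ESS for the explained part and RSS for the residual part, so the same letters can mean opposite things in different books.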