It is perhaps best to use the deviation form of the regression function, with both variables measured as deviations from their sample means, i.e.,

[math]y_{i}=\hat{y}_{i}+e_{i}[/math].

Squaring both sides and summing over the sample yields

[math]\sum y_{i}^{2}=\sum \hat{y}_{i}^{2}+\sum e_{i}^{2}+2\sum \hat{y}_{i}e_{i}[/math]

[math]=\sum \hat{y}_{i}^{2}+\sum e_{i}^{2}[/math]

[math]=\hat{\beta}_{1}^{2}\sum x_{i}^{2}+\sum e_{i}^{2}[/math]

because the fitted values and the residuals are orthogonal:

[math]\sum \hat{y}_{i}e_{i}=\hat{\beta}_{1}\sum x_{i}e_{i}=0[/math],

which follows from the least-squares normal equation [math]\sum x_{i}e_{i}=0[/math].

As such, the various sums of squares above are:

[math]\sum y_{i}^{2}[/math] is the total sum of squares (TSS),

[math]\sum \hat{y}_{i}^{2}=\hat{\beta}_{1}^{2}\sum x_{i}^{2}[/math] is the explained sum of squares from the regression (ESS),

[math]\sum e_{i}^{2}[/math] is the residual sum of squares (RSS).

Thus, it follows that TSS = ESS + RSS.
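The decomposition can be checked numerically. The sketch below (using simulated data; the sample, slope, and variable names are illustrative assumptions, not part of the text) fits the slope in deviation form and verifies that the cross-product term vanishes, so that TSS = ESS + RSS:

```python
import numpy as np

# Hypothetical sample, assumed for illustration only.
rng = np.random.default_rng(0)
x_raw = rng.normal(size=50)
y_raw = 2.0 * x_raw + rng.normal(size=50)

# Deviation form: measure both variables as deviations from their means,
# so the regression has no intercept term.
x = x_raw - x_raw.mean()
y = y_raw - y_raw.mean()

# OLS slope in deviation form: beta1_hat = sum(x*y) / sum(x^2).
beta1_hat = np.sum(x * y) / np.sum(x ** 2)

y_hat = beta1_hat * x   # fitted values
e = y - y_hat           # residuals

tss = np.sum(y ** 2)      # total sum of squares
ess = np.sum(y_hat ** 2)  # explained sum of squares
rss = np.sum(e ** 2)      # residual sum of squares

# The cross-product term is zero, so the decomposition holds exactly.
assert np.isclose(np.sum(y_hat * e), 0.0)
assert np.isclose(tss, ess + rss)
assert np.isclose(ess, beta1_hat ** 2 * np.sum(x ** 2))
```

The two assertions on the sums of squares mirror the derivation above: the orthogonality of fitted values and residuals is exactly what makes the cross-product term drop out.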