Sum of the residuals

val92

New Member
#1
Hi,
My question may sound trivial but here it goes:
When doing a sample regression by the ordinary least squares method, does the sum (non-squared!) of the residuals have to equal zero?

Here's what I know. If we have numerous "y observations" per x, one important assumption is that the residuals conditional on a given X follow an identical distribution, usually with mean 0 (which also suggests that the sum of those residuals is 0),

i.e.

Σ_j e_ij = 0, where j indexes the observations at a given X_i and
e_ij = Y_ij - Y_i(estimated)


However, when we do a sample regression we usually have one Y observation per X. With the ordinary least squares method we try to:

min Σ (e_i)^2

Does this however mean that the sum of the residuals will be equal to 0?

i.e.

Σ e_i = 0 ??


Thanks
 

Dragan

Super Moderator
#2
Hi,
My question may sound trivial but here it goes:
When doing a sample regression by the ordinary least squares method, does the sum (non-squared!) of the residuals have to equal zero?

Σ e_i = 0 ??


Thanks
Yes, the sum of the residuals will be zero, provided the model includes an intercept (as simple regression does). You can see this here in the context of simple regression. Note that I am omitting the subscript i.

\(\sum e=\sum\left ( y-\hat{y} \right )=\sum \left ( y-\left ( \bar{y}+r\frac{s_{y}}{s_{x}}\left ( x-\bar{x} \right ) \right ) \right )\)

\(=\sum y-\sum \bar{y}-r\frac{s_{y}}{s_{x}}\sum \left ( x-\bar{x} \right )=n\bar{y}-n\bar{y}-0=0\),

since \(\sum \left ( x-\bar{x} \right )=0\) by the definition of the mean.
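For a quick numerical sanity check (a sketch I'm adding here, with made-up data, not from the original posts), you can fit a simple regression by hand and confirm that the residuals sum to zero up to floating-point rounding:

```python
def ols_fit(xs, ys):
    """Fit y = intercept + slope * x by ordinary least squares."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    sxx = sum((x - xbar) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = ybar - slope * xbar  # intercept forces the fit through (xbar, ybar)
    return intercept, slope

# Arbitrary example data
xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 8.0, 9.7]

b0, b1 = ols_fit(xs, ys)
residuals = [y - (b0 + b1 * x) for x, y in zip(xs, ys)]

print(sum(residuals))  # ~0, up to floating-point rounding
```

The key point is the intercept: because b0 is chosen so the line passes through the mean point (x̄, ȳ), the residuals are forced to average (and hence sum) to zero. Fit a line through the origin instead and the sum is generally nonzero.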