## Show that the sum of e(i) = 0 and the sum of e(i)*Yhat(i) = 0

Consider the usual multiple linear regression model: Y = X β + ε,
where Y is an (n x 1) response vector, X is an (n x p) design matrix whose first column consists of 1's, β is a (p x 1) parameter vector, and ε is an (n x 1) error vector. Let b denote the least squares estimator of β, and let Yhat = Xb denote the fitted response vector.
Let e = Y - Yhat be the residual vector.
a) Show that the sum of the residuals is zero, i.e., that the sum of e(i) = 0.

Note: You may use the following well-known results about the hat matrix H = X(X'X)^-1 X':
1) H^2 = H (H is idempotent).
2) HX = X.
3) (I - H)^2 = (I - H).
4) H is symmetric, i.e., H' = H.
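As a quick numerical sanity check (not a substitute for the algebraic proof), the two identities can be verified on simulated data with NumPy. The dimensions (n = 50, p = 4) and the random seed below are arbitrary choices for illustration; the residuals come out orthogonal to every column of X, including the column of 1's, which is exactly why both sums vanish.

```python
import numpy as np

# Simulate data from a linear model whose design matrix has a first
# column of 1's (an intercept), as in the problem statement.
rng = np.random.default_rng(0)
n, p = 50, 4
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta = rng.normal(size=p)
Y = X @ beta + rng.normal(size=n)

# Least squares estimator b = (X'X)^-1 X'Y via the normal equations.
b = np.linalg.solve(X.T @ X, X.T @ Y)
Yhat = X @ b        # fitted values: Yhat = Xb = HY
e = Y - Yhat        # residual vector: e = (I - H)Y

# Both sums are zero up to floating-point error.
print(np.sum(e))
print(np.sum(e * Yhat))
```

Both printed values are on the order of machine precision (roughly 1e-14), consistent with sum e(i) = 0 and sum e(i)*Yhat(i) = 0.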