Multiple Linear Regression Proof

Hi everyone,

I am enrolled in a regression analysis course at university, and the prof really loves to ask for proofs on his assignments. Unfortunately, he never does any in class, and no one at the help centres on campus can ever figure out his problems either.

Here is the one I am currently struggling with:
Suppose we fit a model
y = Xβ + ε
with k predictors to n cases of data (y_i, x_i), i = 1, ..., n and obtain the minimum sum of squares SSRes(n, k). By working with the sum of squares function S(β), show that if an additional case (x_{n+1}, y_{n+1}) is added to the data the new minimum sum of squares must be as large or larger, i.e.
SSRes(n + 1, k) ≥ SSRes(n, k).
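I did try checking the claim numerically in Python (just NumPy's least-squares solver on toy data I made up, so nothing from the actual assignment), and the inequality does seem to hold every time, I just can't prove it:

```python
import numpy as np

# Quick numeric sanity check (not a proof): fit OLS on n cases,
# then on n + 1 cases, and compare the minimized sums of squares.
rng = np.random.default_rng(0)
n, k = 30, 3
# Design matrix: intercept column plus k - 1 random predictors.
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(size=n)

def ss_res(X, y):
    """Minimum sum of squared residuals from a least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

ss_n = ss_res(X, y)

# Append one extra case (x_{n+1}, y_{n+1}); the values are arbitrary.
x_new = np.array([1.0, 0.7, -1.2])
y_new = 4.0
X1 = np.vstack([X, x_new])
y1 = np.append(y, y_new)
ss_n1 = ss_res(X1, y1)

print(ss_n1 >= ss_n)  # SSRes(n+1, k) >= SSRes(n, k) -> True
```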

I am not even sure which sum of squares function to use... do you think he means the matrix form or the original messy scalar summation form? Any help would be really appreciated. Thanks!!
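In case it helps anyone answer, here are the two forms of S(β) I think he could mean (if I understand my notes correctly, they are the same function written two ways):

```latex
% Matrix form and summation form of the sum-of-squares function:
S(\beta) = (y - X\beta)'(y - X\beta)
         = \sum_{i=1}^{n} \left( y_i - x_i'\beta \right)^2
```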