My original model has 9 variables, which should give the best predictions. I believe this is the right selection because I'm following a widely cited scientific paper that justifies exactly those 9 variables and no others.

However, when I add variables to the model, the leave-one-out cross-validation error (both RMSE and MAE) always decreases, and the pseudo R-squared always increases. As far as I'm aware, this shouldn't happen: it would imply that the models with more variables predict the left-out data points more accurately, and therefore would predict out-of-sample data more accurately too. Does anyone know why this might be happening?
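For reference, here is a minimal sketch of the kind of check I'm describing, on hypothetical synthetic data (the data, sample size, and coefficients are made up for illustration): leave-one-out CV for OLS, recomputing RMSE and MAE as extra pure-noise predictors are appended to a base model.

```python
import numpy as np

def loocv_errors(X, y):
    """Leave-one-out CV for OLS: refit with each point held out,
    predict it, and return (RMSE, MAE) over the held-out predictions."""
    n = len(y)
    preds = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        beta, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
        preds[i] = X[i] @ beta
    resid = y - preds
    return np.sqrt(np.mean(resid**2)), np.mean(np.abs(resid))

rng = np.random.default_rng(0)
n = 60
X9 = rng.normal(size=(n, 9))                  # stand-in for the 9 justified predictors
y = X9 @ rng.normal(size=9) + rng.normal(size=n)
base = np.column_stack([np.ones(n), X9])      # intercept + 9 predictors

# Append pure-noise columns one at a time and recompute LOO errors
for extra in range(4):
    Xe = np.column_stack([base, rng.normal(size=(n, extra))]) if extra else base
    rmse, mae = loocv_errors(Xe, y)
    print(f"{9 + extra} predictors: LOO RMSE={rmse:.3f}, MAE={mae:.3f}")
```

On data like this I would expect the noise columns to leave the LOO errors flat or make them worse, which is exactly why the pattern I'm seeing in my real data surprises me.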