Variance of a sum of predictions

#1
Hi,

I predict the spatial distribution of a species on a regular grid using a regression model.
Species numbers vary with environmental covariates. Now I want to predict the overall size of the population within a given area, which technically means summing up all the predicted values. But how can I calculate the standard error / confidence interval for this sum? Just by summing up all the pointwise standard errors?

Thank you in advance
 

Dragan

Super Moderator
#2
If you sum up the predicted values (Yhats), then this sum will be equal to the sum of the actual values of the dependent variable (Y). In short, the mean of the predicted values will be equal to the mean of the actual values. (That is the purpose of the intercept term in a regression model.)
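(A quick numerical illustration of that point, with simulated data; the model and the numbers below are made up purely for the example.)

```python
# With an intercept, OLS forces the residuals to sum to zero, so the
# fitted values sum to the same total as the observed responses.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 10, n)
y = 2.0 + 0.5 * x + rng.normal(0.0, 1.0, n)      # simulated response values

X = np.column_stack([np.ones(n), x])             # design matrix with intercept
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta_hat

print(y.sum(), y_hat.sum())                      # equal up to rounding error
```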
 
#3
Sorry, I can't relate this answer to my question. The question is: does it make sense to sum up the pointwise standard deviations of the prediction in order to get the standard deviation of the sum? In other words: my predict function gives me predicted values and their standard deviations for different values of the covariates. How do I get the standard error of the sum of these values?
 

ondansetron

TS Contributor
#4
The question is: does it make sense to sum up the pointwise standard deviations of the prediction in order to get the standard deviation of the sum?
To help you answer your question, first answer this one: are standard deviations additive? The answer can point you towards the right direction, or away from the wrong one.
 

Dragan

Super Moderator
#5
But how can I calculate the standard error / confidence interval for this sum? Just by summing up all the pointwise standard errors?

How do I get the standard error of the sum of these values?

Your overall query is rather vague. That said, I do know this: you cannot get the standard error by simply summing the standard errors. The way to approach this is to sum the variance estimates and then take the square root of that sum (that gives the standard error you're looking for).
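As a minimal sketch of that rule (the pointwise standard errors below are hypothetical numbers, and the sketch treats the pointwise prediction errors as uncorrelated, an assumption the later posts return to):

```python
import numpy as np

# Hypothetical pointwise standard errors returned by a predict() routine.
se = np.array([0.8, 1.1, 0.6, 0.9])

var_sum = np.sum(se**2)        # sum the variance estimates, not the SEs
se_sum = np.sqrt(var_sum)      # standard error of the summed prediction
print(se_sum)                  # ~1.74, versus 3.4 from (wrongly) adding the SEs
```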
 

ondansetron

TS Contributor
#6
Your overall query is rather vague. That said, I do know this: you cannot get the standard error by simply summing the standard errors. The way to approach this is to sum the variance estimates and then take the square root of that sum (that gives the standard error you're looking for).
Well, you spoiled my surprise for the OP! I was hoping he or she would see that the variances are additive, rather than the standard deviations/standard errors.
 
#9
Hi, sorry for formulating it badly; it was clear to me that I have to sum up variances instead of standard errors, so that was not my concern. My main concern is this: summing up the pointwise variances to get the variance of the sum is (as far as I know) only valid if the covariances are zero. I am used to the concept of covariance between different random variables. However, in the case of predictions from a regression model (where I only have the predicted values and their standard errors for each covariate value), can I always assume that the covariances between different predicted points are zero if I specified the regression model correctly? It is not even clear to me whether, and how, covariance is defined in this context.

Thanks in advance
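For reference, here is a sketch of how those covariances arise in an ordinary least squares model: every prediction re-uses the same estimated coefficients, so Cov(Yhat_i, Yhat_j) = x_i' Var(beta_hat) x_j is generally non-zero even when the model is correctly specified. The data and prediction grid below are simulated purely for illustration; for other model classes the same idea goes through the coefficient covariance matrix, which most software can return.

```python
# Sketch: under OLS the predictions at different covariate values covary,
# because they share the same estimated coefficients. All data simulated.
import numpy as np

rng = np.random.default_rng(1)
n = 100
x = rng.uniform(0, 10, n)
y = 2.0 + 0.5 * x + rng.normal(0.0, 1.0, n)

X = np.column_stack([np.ones(n), x])                    # training design matrix
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat
sigma2 = resid @ resid / (n - X.shape[1])               # residual variance estimate
cov_beta = sigma2 * np.linalg.inv(X.T @ X)              # Var(beta_hat)

# Hypothetical grid cells at which the population is predicted.
x_new = np.linspace(1.0, 9.0, 5)
X_new = np.column_stack([np.ones(x_new.size), x_new])

cov_pred = X_new @ cov_beta @ X_new.T                   # full covariance matrix of the mean predictions
ones = np.ones(x_new.size)

se_naive = np.sqrt(np.trace(cov_pred))                  # uses only the pointwise variances
se_exact = np.sqrt(ones @ cov_pred @ ones)              # includes the off-diagonal covariances
print(se_naive, se_exact)
```

Note that this is the uncertainty of the sum of the fitted mean values; if the total should also reflect observation-level scatter around those means, the residual (or count) variance would have to be added to the diagonal as well.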