I am in the process of calculating prediction intervals for a linear regression model fitted to time series data.

The independent variable in my model is time, measured as 1, 2, 3, 4, ..., 40. The dependent variable is continuous. When I calculate the prediction intervals using the standard formula described in the literature, the formula contains the term

(X - Xm)^2, where X is the value of the independent variable at the point being predicted and Xm is the mean of the independent variable.
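For reference, the full formula I believe I am working from is the usual one for a new observation at $X = x_0$:

$$\hat{y}_0 \;\pm\; t_{\alpha/2,\,n-2}\; s \sqrt{1 + \frac{1}{n} + \frac{(x_0 - \bar{X})^2}{\sum_{i=1}^{n}(X_i - \bar{X})^2}}$$

where $s$ is the residual standard error, $n$ is the number of observations used in the fit, and $\bar{X}$ is the sample mean of the independent variable.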

When calculating the prediction interval for the 41st point, should X be 41 and Xm be the mean of 1 to 40? Is it logical to take the mean over 1 to 40 when the independent variable is a time scale?
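For concreteness, here is a rough Python sketch of what I am computing, with made-up data standing in for my series (the variable names and the simulated y are purely illustrative); it uses X = 41 and Xm = mean of 1 to 40, which is the interpretation I am asking about:

```python
import numpy as np
from scipy import stats

# Illustrative data: x is time 1..40, y is a made-up continuous response
x = np.arange(1, 41)
rng = np.random.default_rng(0)
y = 2.0 + 0.5 * x + rng.normal(scale=3.0, size=x.size)

n = x.size
x_mean = x.mean()          # mean of 1..40, i.e. 20.5

# Fit simple linear regression by ordinary least squares
slope, intercept = np.polyfit(x, y, 1)
resid = y - (intercept + slope * x)
s = np.sqrt(np.sum(resid**2) / (n - 2))   # residual standard error

# 95% prediction interval for the new time point x0 = 41
x0 = 41
se_pred = s * np.sqrt(1 + 1/n + (x0 - x_mean)**2 / np.sum((x - x_mean)**2))
t_crit = stats.t.ppf(0.975, df=n - 2)

y0_hat = intercept + slope * x0
lower, upper = y0_hat - t_crit * se_pred, y0_hat + t_crit * se_pred
print(f"predicted {y0_hat:.2f}, 95% PI [{lower:.2f}, {upper:.2f}]")
```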

I would appreciate your views on this.