I have fitted a dynamic regression model, i.e. a model of the type
y(t) = k*x(t) + N(t)
x(t) is the input time series, while N(t) is the time series of residuals, which turns out to be seasonally autocorrelated.
To remove the seasonal autocorrelation from the residuals, I tried fitting a SARIMA model to the residual series (as suggested in Pankratz (1991): Forecasting with Dynamic Regression Models). However, the problem with the final model is that the forecast does not converge (exactly) to the historical mean. Since y(t) is a river-flow series, which is clearly stationary within each season, the forecast should equal the historical mean in the long run.
I know that a SARIMA model is not the best choice for a seasonally stationary series, and that deseasonalized ARIMA and periodic ARIMA models explicitly constrain the mean within each season to equal the historical mean. However, if I deseasonalize both series (x(t) and y(t)) and then fit an ordinary ARIMA to the residuals, the strong correlation structure between x and y disappears.
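By deseasonalizing I mean subtracting the seasonal (e.g. monthly) means from each series. The following sketch on made-up data shows the effect I am describing: when the cross-correlation between x and y is carried mainly by the shared seasonal pattern, it largely vanishes after deseasonalizing:

```python
import numpy as np

def deseasonalize(z, period=12):
    """Subtract the mean of each season (assumes len(z) is a
    whole number of periods)."""
    z = np.asarray(z, dtype=float)
    seasonal_means = z.reshape(-1, period).mean(axis=0)
    return z - np.tile(seasonal_means, len(z) // period)

rng = np.random.default_rng(1)
n, s = 120, 12
season = np.sin(2 * np.pi * np.arange(n) / s)

# x and y are correlated only through the shared seasonal cycle
x = season + rng.normal(scale=0.2, size=n)
y = 2.0 * season + rng.normal(scale=0.2, size=n)

r_raw = np.corrcoef(x, y)[0, 1]  # strong raw correlation
x_d = deseasonalize(x, s)
y_d = deseasonalize(y, s)
r_des = np.corrcoef(x_d, y_d)[0, 1]  # much weaker after deseasonalizing
```

In my real data the correlation is presumably not purely seasonal, but this is the mechanism by which the strong x–y correlation gets lost in the deseasonalized fit.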
Does anyone have a suggestion as to how I can achieve a forecast that converges to the historical mean without "losing" the strong correlation between x and y?