Simple Question: Seasonal Autoregressive Model

Hi, I hope I'm posting this in an appropriate place. It's not exactly linear regression.

If you have a seasonal moving average model like \(\displaystyle ARIMA(0,0,0)(0,0,1)_{12}\) and you want to find the expectation of \(\displaystyle y_t\), then you just use the fact that expectation is a linear operator and that the expected value of any error term is zero. However, if you have a seasonal autoregressive model like \(\displaystyle ARIMA(0,0,0)(2,0,0)_{12}\), for example, it will look like \(\displaystyle y_t=cy_{t-24} + \varepsilon_t\). So trying to compute \(\displaystyle E(y_t)\), you get \(\displaystyle E(y_t)=cE(y_{t-24})+\displaystyle E(\varepsilon_t)=cE(y_{t-24})\), but where do you go from there?
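For example, writing the \(ARIMA(0,0,0)(0,0,1)_{12}\) model as \(y_t=\mu+\varepsilon_t+\Theta\varepsilon_{t-12}\) (\(\Theta\) being the seasonal MA coefficient, in my notation), the calculation is just:

```latex
E(y_t) = E\bigl(\mu + \varepsilon_t + \Theta\varepsilon_{t-12}\bigr)
       = \mu + E(\varepsilon_t) + \Theta\,E(\varepsilon_{t-12})
       = \mu .
```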


Dark Knight
There is a small correction: \(ARIMA(0,0,0)(2,0,0)_{12}\) is

\( y_t = c_0 + c_1 y_{t-12} + c_2 y_{t-24} +\epsilon_t.\)
The constant part \(c_0\) is zero in a pure SARIMA model.
The derivation of the expectation is similar to the derivation of the expectation of an AR(2).
Use repeated substitution or the stationarity property.
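A sketch of the AR(2) case, assuming stationarity so that \(E(y_t)=\mu\) for all \(t\):

```latex
y_t = c_0 + c_1 y_{t-1} + c_2 y_{t-2} + \varepsilon_t
\;\Longrightarrow\;
\mu = c_0 + (c_1+c_2)\mu
\;\Longrightarrow\;
\mu = \frac{c_0}{1-c_1-c_2}, \qquad c_1+c_2 \neq 1.
```

With \(c_0=0\) this gives \(\mu=0\); the seasonal model works the same way, with lags 12 and 24 in place of 1 and 2.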
OK, I have two attempts.

\( E(y_t)=c_1E(y_{t-12})+ c_2E(y_{t-24})=\) \(c_1^2E(y_{t-24})+2c_1c_2E(y_{t-36})+ c_2^2E(y_{t-48})=\ldots\)
Unfortunately I don't know when to stop. Can you tell me where I'd find a derivation for AR(2)?

However, perhaps my second attempt will work:
\(\displaystyle E(y_t)=c_1E(y_{t-12})+ c_2E(y_{t-24})=c_1E(y_t)+ c_2E(y_t)\) so that
\(\displaystyle E(y_t)(1-c_1-c_2)=0\) so that \(\displaystyle E(y_t)=0\) unless \(\displaystyle c_1 + c_2=1\)? Is this right?
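As a quick sanity check on the \(E(y_t)=0\) conclusion, here is a small simulation sketch (the coefficient values, series length, and burn-in are my own arbitrary choices, picked so that \(c_1+c_2<1\)):

```python
# Monte Carlo check of E(y_t) = 0 for the pure seasonal AR model
# y_t = c1*y_{t-12} + c2*y_{t-24} + eps_t  (no constant term).
import numpy as np

rng = np.random.default_rng(0)
c1, c2 = 0.5, 0.3          # illustrative values with c1 + c2 < 1
n, burn = 50_000, 2_000    # long series plus burn-in to forget the start-up

y = np.zeros(n + burn)
eps = rng.normal(0.0, 1.0, size=n + burn)
for t in range(24, n + burn):
    y[t] = c1 * y[t - 12] + c2 * y[t - 24] + eps[t]

sample_mean = y[burn:].mean()
print(sample_mean)  # should be close to 0
```

The sample mean comes out near zero, consistent with the stationarity argument above.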


Dark Knight
White noise is [tex]\epsilon_t [/tex] in your model. The assumptions on [tex]\epsilon_t [/tex] are that the errors are uncorrelated with mean 0 and constant variance.

You have considered this in your derivation ([tex]E[\epsilon_t] = 0[/tex]). You need to use the property [tex]0<c_1,c_2 <1 [/tex] and [tex] c_1^n, c_2^n \rightarrow 0 [/tex] as [tex]n[/tex] grows large.
Right, how about this:
\(\displaystyle E(y_t)=(c_1+c_2)^nE(y_t)\ \forall n,\)
so that, since \(\displaystyle (c_1+c_2)^n \rightarrow 0\) when \(\displaystyle |c_1+c_2|<1\), we get \(\displaystyle E(y_t)=0\).

Is there something else I should do?