# Simple Question: Seasonal Autoregressive Model

#### Time Serious

##### New Member
Hi, I hope I'm posting this in an appropriate place. It's not exactly linear regression.

If you have a seasonal moving average model like $$\displaystyle ARIMA(0,0,0)(0,0,1)_{12}$$ and you want to find the expectation of $$\displaystyle y_t$$, you just use the fact that expectation is a linear operator and that the expected value of any error term is zero. However, if you have a seasonal autoregressive model like $$\displaystyle ARIMA(0,0,0)(2,0,0)_{12}$$, for example, it will look like $$\displaystyle y_t=cy_{t-24} + \varepsilon_t$$. So trying to compute $$\displaystyle E(y_t)$$, you get $$\displaystyle E(y_t)=cE(y_{t-24})+E(\varepsilon_t)=cE(y_{t-24})$$, but where do you go from there?


#### vinux

##### Dark Knight
There is a small correction: $$ARIMA(0,0,0)(2,0,0)_{12}$$ is

$$y_t = c_0 + c_1 y_{t-12} + c_2 y_{t-24} +\epsilon_t$$
The constant part is zero in a pure SARIMA model.
Hint:
The derivation of the expectation is similar to the derivation of the expectation of an AR(2) model.
Use repeated substitution or the stationarity property.
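To see where the hint leads, here is a minimal simulation sketch of the corrected model with $$c_0 = 0$$; the coefficient values, series length, and burn-in are illustrative choices, not from the thread. It checks that the sample mean of a stationary pure SARIMA series comes out near zero.

```python
import random
import statistics

random.seed(0)
c1, c2 = 0.5, 0.3          # illustrative coefficients (assumed), with c1 + c2 < 1
n, burn = 100_000, 1_000   # series length plus burn-in to wash out the zero start

y = [0.0] * (n + burn)
for t in range(24, n + burn):
    # pure SARIMA case: y_t = c1 * y_{t-12} + c2 * y_{t-24} + eps_t, with c0 = 0
    y[t] = c1 * y[t - 12] + c2 * y[t - 24] + random.gauss(0.0, 1.0)

print(round(statistics.mean(y[burn:]), 2))  # sample mean comes out close to 0
```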

#### Time Serious

##### New Member
OK, I have two attempts.

$$\displaystyle E(y_t)=c_1E(y_{t-12})+c_2E(y_{t-24})=c_1^2E(y_{t-24})+c_2^2E(y_{t-36})=c_1^3E(y_{t-36})+c_2^3E(y_{t-48})=\ldots$$
Unfortunately I don't know when to stop. Can you tell me where I'd find a derivation for AR(2)?

However, perhaps this will work.
$$\displaystyle E(y_t)=c_1E(y_{t-12})+c_2E(y_{t-24})=c_1E(y_t)+c_2E(y_t)$$ so that
$$\displaystyle E(y_t)(1-c_1-c_2)=0$$ so that $$\displaystyle E(y_t)=0$$ unless $$\displaystyle c_1+c_2=1$$? Is this right?

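As a side check on this stationarity argument: if the constant $$c_0$$ that vinux mentioned is not zero, the same steps give $$E(y_t)(1-c_1-c_2)=c_0$$, i.e. $$E(y_t)=c_0/(1-c_1-c_2)$$. A quick simulation sketch (all parameter values are illustrative assumptions, not from the thread):

```python
import random
import statistics

random.seed(1)
c0, c1, c2 = 1.0, 0.5, 0.3   # illustrative values (assumed), with c1 + c2 < 1
n, burn = 100_000, 1_000

y = [0.0] * (n + burn)
for t in range(24, n + burn):
    # y_t = c0 + c1 * y_{t-12} + c2 * y_{t-24} + eps_t
    y[t] = c0 + c1 * y[t - 12] + c2 * y[t - 24] + random.gauss(0.0, 1.0)

# stationarity argument predicts E(y_t) = c0 / (1 - c1 - c2) = 1.0 / 0.2 = 5.0
print(round(statistics.mean(y[burn:]), 1))  # sample mean comes out close to 5.0
```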

#### vinux

##### Dark Knight
You are right. If $$c_0 = 0$$, the expectation of $$y_t$$ is zero.

> Unfortunately I don't know when to stop. Can you tell me where I'd find a derivation for AR(2)?

By repeated substitution you can write $$y_t$$ in terms of the white noise.

#### Time Serious

##### New Member
> By repeated substitution you can write $$y_t$$ in terms of the white noise.

Thanks for your help. I'm afraid I don't quite see how to carry out that substitution.

#### vinux

##### Dark Knight
The white noise is the $$\epsilon_t$$ in your model. The assumptions on $$\epsilon_t$$ are that the terms are uncorrelated, with mean 0 and constant variance.

You have already used this in your derivation ($$E[\epsilon_t] = 0$$). You also need the property $$0<c_1,c_2 <1$$, so that $$c_1^n, c_2^n \rightarrow 0$$ as $$n$$ grows large.
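The white-noise assumptions can be eyeballed numerically. A minimal sketch using Gaussian draws as a stand-in for $$\epsilon_t$$ (the seed and sample size are arbitrary choices):

```python
import random
import statistics

random.seed(2)
eps = [random.gauss(0.0, 1.0) for _ in range(100_000)]  # stand-in white noise

mean = statistics.mean(eps)
var = statistics.pvariance(eps)
# lag-1 sample autocorrelation of the noise
lag1 = sum(a * b for a, b in zip(eps, eps[1:])) / (len(eps) * var)

print(round(mean, 2))   # close to 0: zero mean
print(round(var, 2))    # close to 1: constant variance
print(round(lag1, 2))   # close to 0: uncorrelated terms
```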

#### Time Serious

##### New Member
$$E(y_t)=c_1^nE(y_t)+c_2^nE(y_t)\ \forall n$$
$$E(y_t)=\lim_{n\to\infty}E(y_t)=E(y_t)\lim_{n\to\infty}(c_1^n+c_2^n)=E(y_t)\cdot 0=0$$?
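The limit step in this argument can be checked numerically; a minimal sketch with illustrative values of $$c_1, c_2$$ in $$(0,1)$$ (the thread never fixes them):

```python
c1, c2 = 0.5, 0.3   # illustrative values in (0, 1), as in vinux's hint

for n in (1, 5, 25, 100):
    print(n, c1 ** n + c2 ** n)   # the sum shrinks toward 0 as n grows
```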