Hi,

as far as I know (and as stated in various pieces of time-series literature), homoscedasticity is also an assumption in time-series models. And why shouldn't it be? Time is just another continuous variable. The only difference from other continuous variables (such as space) is that correlation only makes sense "in the backward direction", but I don't see why that should lead us to drop homogeneity of variance over time.

Regarding linearity: violating linearity produces patterns in the residuals, and we can deal with those by incorporating an appropriate autoregressive structure. However, I think that is only the second-best choice; if we can describe the nonlinearity directly via nonlinear terms, we should do that instead.