I have heard the term LAG used for time series and have only a very basic grasp of it. Could someone please explain it (in a very basic way) - perhaps with an example? Also, what happens if LAG changes from small to large, or vice versa?
So LAG=x is basically seeing how correlated time T is with time T-x? How does one choose which LAG to use (or what happens when LAG increases or decreases)? Is using a higher LAG better than a lower one?
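To make my question concrete, here is a small sketch of what I understand "how correlated time T is with time T-x" to mean - just the Pearson correlation between the series and a copy of itself shifted by x steps (the function name `autocorr` and the toy series are my own, not from anywhere in particular):

```python
import numpy as np

def autocorr(x, lag):
    """Correlation between the series and a copy of itself shifted by `lag` steps."""
    x = np.asarray(x, dtype=float)
    return np.corrcoef(x[:-lag], x[lag:])[0, 1]

# Toy series that repeats every 4 steps, so lag=4 should line up almost perfectly.
series = np.tile([1.0, 3.0, 2.0, 5.0], 20)

print(autocorr(series, 1))  # low (here negative): adjacent points don't track each other
print(autocorr(series, 4))  # near 1: lag matches the period, so the series overlaps itself
```

Is this the right mental model - that a "good" lag is one where this correlation is strong (like lag 4 above), rather than simply the largest lag available?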