What is LAG (time series question)

I have heard the term LAG used for time series and have only a very basic grasp of it. Could someone please try explaining it (in a very basic way), perhaps with an example? Also, what happens if LAG changes from small to large, or vice versa?



Fortran must die
A lag is simply an earlier point in the data series. For example, if the current observation is at time T, then the observation at time T-1 is one lag back (a lag of 1). More generally, the lag-k series at time T is the value the series took at time T-k.
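To make that concrete, here is a minimal sketch in plain Python (the data and the `lagged` helper are made up for illustration): the lag-k view of a series just pairs each observation with the one k steps earlier.

```python
series = [10, 12, 15, 14, 18]          # hypothetical observations at T = 0..4

def lagged(values, k):
    """Return (current, lag-k) pairs: value at time T alongside value at T-k."""
    return [(values[t], values[t - k]) for t in range(k, len(values))]

pairs = lagged(series, 1)
# At T=1 the lag-1 value is the observation at T=0, and so on:
print(pairs)   # → [(12, 10), (15, 12), (14, 15), (18, 14)]
```

Note that the first k time points have no lag-k partner, which is why lagging a series always shortens it by k observations.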
So LAG=x is basically seeing how correlated time T is with time T-x? How does one choose which LAG to use (or what happens when LAG increases or decreases)? Is using a higher LAG better than a lower one?
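Yes, that is essentially the idea behind autocorrelation. A hedged sketch of it (made-up data, and a hand-rolled `lag_correlation` rather than any particular library's function): the lag-x autocorrelation is, roughly, the Pearson correlation between the series at time T and at T-x. In practice you don't pick one lag as "better"; you look at the correlation across many lags (an ACF plot) and see where it is strong.

```python
from statistics import mean

def lag_correlation(values, x):
    """Pearson correlation between the series at time T and at time T-x."""
    a = values[x:]            # values at time T
    b = values[:-x]           # values at time T-x
    ma, mb = mean(a), mean(b)
    num = sum((p - ma) * (q - mb) for p, q in zip(a, b))
    den = (sum((p - ma) ** 2 for p in a)
           * sum((q - mb) ** 2 for q in b)) ** 0.5
    return num / den

# A series that repeats every 8 steps: correlation is perfect at lag 8
# and weaker at lags that don't line up with the pattern.
data = [1, 2, 3, 4, 5, 4, 3, 2, 1, 2, 3, 4, 5, 4, 3, 2]
print(round(lag_correlation(data, 8), 3))   # → 1.0
```

For a seasonal series like this, the large spike at lag 8 is what tells you the period; for other data, the lags with meaningful correlation are what you would feed into a model, rather than simply preferring larger or smaller lags.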