Is randomness a necessary condition for mean reversion?

#1
I am using the following definition for mean reversion I found on the web:

"Reversion to the mean is the statistical phenomenon stating that the greater the deviation of a random variate from its mean, the greater the probability that the next measured variate will deviate less far. In other words, an extreme event is likely to be followed by a less extreme event."

My friend insists there is no requirement for randomness and uses a similar definition of mean reversion, just without the word "random". The application is the use of mean reversion in pricing commodities.

So, is randomness a necessary condition for mean reversion?

Is there a better definition of mean reversion?

How can I convince my friend that mean reversion only applies to random variables?
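
To make that last question concrete, here is a minimal simulation sketch of the quoted definition, assuming i.i.d. standard normal draws (the cutoff of 2 standard deviations is just an arbitrary choice for "extreme"):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1_000_000)            # i.i.d. standard normal variates

extreme = np.abs(x[:-1]) > 2.0            # draws more than 2 SDs from the mean
print("mean |deviation| of the extreme draws:  ", np.abs(x[:-1][extreme]).mean())
print("mean |deviation| of the very next draws:", np.abs(x[1:][extreme]).mean())
```

Because each draw is random and independent of the last, the draw following an extreme one is just a typical draw, so its expected deviation is smaller. My friend's position, as I understand it, is that the same statement should hold even without that random component.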

Thanks in advance.

hlsmith

Not a robit
#2
Sorry, I have to start off by asking: you mean "regression" to the mean, right?

Yeah, I don't know the in-depth theoretical process behind the phenomenon, but to me it definitely seems like a randomness issue. So if I do well on Exam #1, I mean really well, I will likely do worse on Exam #2, since the randomness went my way the first time (I'm assuming multiple sources in the data-generating process: I happened to know the content, my guesses were right by chance, etc.), especially if it was, say, a multiple-choice exam.
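
Something like this quick sketch is what I have in mind, with purely made-up numbers: each score is a latent "true ability" plus exam-specific luck.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

ability = rng.normal(75, 5, size=n)           # each student's underlying knowledge
exam1 = ability + rng.normal(0, 5, size=n)    # observed score = ability + luck on Exam #1
exam2 = ability + rng.normal(0, 5, size=n)    # same ability, fresh luck on Exam #2

aced_it = exam1 > 90                          # students who did really well on Exam #1
print("Exam #1 mean for that group:", exam1[aced_it].mean())
print("Exam #2 mean for that group:", exam2[aced_it].mean())
print("Overall mean:               ", exam1.mean())
```

The group that aced Exam #1 still scores above the overall average on Exam #2, but noticeably lower than their Exam #1 scores, because part of what pushed them over 90 was luck that doesn't repeat.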

Another example: tall people having shorter progeny. There is a stochastic component that bumped the parents up that little bit beyond most tall people; I'm not saying the next generation won't be tall, just that on average they're likely to be shorter, regressed toward the mean. The above are just fleeting thoughts on the topic, so take them with a grain of salt!!