To accomplish this, I calculate the first differences of a time series, delta(t) = ts(t) - ts(t-1). Now I am trying to calculate an upper limit that about 90% of those deltas fall under — in other words, the 90th percentile of the deltas: a line such that only about 10% of the differences exceed it. I will use that 90% limit as the maximum to normalize the differences into [0, 1]. Is this the best way to do this?
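To make this concrete, here is a rough sketch of what I mean, using NumPy's `quantile`. The series values are made up, and clipping negatives to 0 is just one choice — that part is exactly what I'm unsure about:

```python
import numpy as np

# Stand-in for one of my data sets
ts = np.array([10.0, 12.5, 11.0, 15.0, 14.0, 20.0, 19.5, 25.0])

# First differences: delta(t) = ts(t) - ts(t-1)
deltas = np.diff(ts)

# Upper limit that ~90% of the deltas fall below (90th percentile)
limit = np.quantile(deltas, 0.90)

# Normalize by that limit and clip into [0, 1], so only the
# largest ~10% of deltas saturate at 1
normalized = np.clip(deltas / limit, 0.0, 1.0)
```

This works per data set, but I don't know whether a plain percentile is the right "smooth line," or whether there is a standard technique for this.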

I’ve been working on this for months, mostly with ad-hoc programmatic methods, with no success, and getting it to work across 50 data sets is killing me. I’m sure there has to be an elegant mathematical way to do this — I can’t be the first person trying to normalize the relative degree of change of a time series.

Any help or directions for research are greatly appreciated! My math skills are admittedly weak, so examples would be most helpful. Thank you everyone for your time and brain power!