# Probability that a normally distributed variable exceeds a threshold within a set period of time.

#### x703jko

##### New Member
I have measurements from a sensor (the type doesn't matter). I sample the input once per second over a period of time T0. The noise is normally distributed, and over T0 it has mean mu0 and standard deviation sigma0. What I want to be able to determine is the value Y such that the signal will exceed Y only once in a period of time T greater than T0, with 99.9% confidence.

I know this can be calculated if the measured noise is Gaussian and we assume the sensor's behavior doesn't change between T0 and the end of T, but I don't remember how it's done.

Can anyone provide some insight?
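One standard approach, assuming the samples really are independent and Gaussian as described: with n samples in the period T, the probability that all of them stay below Y is Phi((Y - mu)/sigma)^n. Setting that equal to the desired confidence and inverting gives Y. A minimal sketch (the function name and the example numbers are illustrative, not from the thread):

```python
from statistics import NormalDist

def exceedance_threshold(mu, sigma, n_samples, confidence=0.999):
    # Per-sample non-exceedance probability such that all n_samples
    # independent N(mu, sigma^2) draws stay below Y with the requested
    # overall confidence:
    #   P(max <= Y) = Phi((Y - mu)/sigma) ** n_samples = confidence
    per_sample = confidence ** (1.0 / n_samples)
    return NormalDist(mu, sigma).inv_cdf(per_sample)

# Illustrative numbers: mu0 = 0, sigma0 = 1, one sample per second
# over T = 86400 s (one day); Y comes out roughly 5.6 sigma above the mean
Y = exceedance_threshold(0.0, 1.0, 86400)
```

Note this rests entirely on the independence assumption questioned later in the thread; with autocorrelated noise the effective number of independent samples is smaller and this formula is no longer valid.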

#### x703jko

##### New Member
Let me clarify: my question should read, "What I want to be able to determine is (to 99.9% confidence) the value Y such that the signal will exceed Y only once in a period of time T greater than T0."

Thanks

#### Dason

##### Ambassador to the humans
Is there autocorrelation in the data?

#### x703jko

##### New Member
No. The samples are not correlated. There is no periodic structure in the data.

#### GretaGarbo

##### Human
> No. The samples are not correlated. There is no periodic structure in the data.

Show us the evidence.

Show us the autocovariance function.
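For reference, a sample autocorrelation function normalized by the lag-0 autocovariance is guaranteed (by Cauchy-Schwarz) to lie in [-1, 1], so values outside that range indicate a normalization problem in whatever tool produced them. A minimal sketch of the standard estimator (function name is illustrative):

```python
import random

def sample_acf(x, max_lag):
    # Sample autocorrelation function, normalized by the lag-0
    # autocovariance so every value lies in [-1, 1].
    n = len(x)
    mean = sum(x) / n
    d = [v - mean for v in x]
    c0 = sum(v * v for v in d) / n  # lag-0 autocovariance (variance)
    acf = []
    for k in range(max_lag + 1):
        ck = sum(d[t] * d[t + k] for t in range(n - k)) / n
        acf.append(ck / c0)
    return acf

# White noise should give acf[0] == 1 and values near zero at other lags
random.seed(0)
noise = [random.gauss(0, 1) for _ in range(1000)]
r = sample_acf(noise, 5)
```

If the sensor data's ACF at nonzero lags stays within roughly +/- 2/sqrt(n) of zero, the no-autocorrelation claim is plausible.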

#### GretaGarbo

##### Human
It seems worrying that the correlation is not restricted to be between -1 and +1.

#### x703jko

##### New Member
> It seems worrying that the correlation is not restricted to be between -1 and +1.

It's what Origin calculates. Sorry I can't provide more insight.

#### GretaGarbo

##### Human
There seem to be some cycles in the corrY2 (whatever that really is).

Most time series have some autocorrelation.

#### x703jko

##### New Member
Understood, but can we get back to my original question?