Probability that a normally distributed variable exceeds a threshold within a set period of time.

#1
I have measurements from a sensor (the type doesn't matter). I measure the input to the sensor once per second over a period of time T0. The noise is normally distributed, and over the period T0 it has a mean mu0 and a standard deviation sigma0. What I want to be able to determine is the value Y such that the signal will exceed Y only once in a period of time T (which is greater than T0) with 99.9% confidence.

I know this can be calculated if the measured noise is Gaussian and we assume the sensor's behavior does not change between T0 and the end of T, but I don't remember how it's done.

Can anyone provide some insight?

Thanks for reading.
 
#2
Let me clarify... my question should read: "What I want to be able to determine is (to 99.9% confidence) the value Y such that the signal will exceed Y only once in a period of time T which is greater than T0."
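
To make the intended meaning concrete, here is a rough numerical sketch. The placeholder numbers, and the reading of "exceed Y only once with 99.9% confidence" as "at most one exceedance among all the one-per-second samples in T, with probability 0.999, assuming independent samples," are just my interpretation of what I'm asking for:

Code:
# Rough sketch under the assumptions above (placeholder numbers, binomial
# reading of "only once", independent Gaussian samples).
from scipy.stats import norm, binom
from scipy.optimize import brentq

mu0, sigma0 = 0.0, 1.0   # mean and standard deviation estimated over T0 (placeholder values)
N = 24 * 3600            # number of one-per-second samples in the longer period T (placeholder: one day)
confidence = 0.999

# Find the per-sample exceedance probability p such that
# P(at most one exceedance in N samples) = confidence.
p = brentq(lambda q: binom.cdf(1, N, q) - confidence, 1e-15, 0.5)

# Translate the per-sample tail probability into a threshold on the Gaussian noise.
Y = mu0 + sigma0 * norm.ppf(1.0 - p)
print(f"per-sample exceedance probability: {p:.3e}")
print(f"threshold Y: {Y:.3f}")

The root-find just inverts the binomial condition numerically; what I'm really after is the analytic way of getting to Y from mu0, sigma0, and the length of T.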

Thanks