Probability that one random variable is greater than another

b_d

New Member
#1
I need to find the probability that a normally-distributed random variable is greater than a uniformly-distributed random variable. That is:

\(X \sim \mathcal{N}(\mu, \sigma^2)\)
\(Y \sim \mathcal{U}(0, 1)\)

What I'm looking for is:
\(Pr(X \ge Y)\).

I have gotten this far, but I'm not sure it's right, and it doesn't make it easy to see how changing the mean and standard deviation affects the probability.
\(Pr(X \ge Y) = 1 - \int_0^1 \Phi\left(\frac{y-\mu}{\sigma}\right) dy\)

Is my solution on the right track? And is there an easier way to compute this?
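
As a quick sanity check on that expression, here is a minimal numerical sketch (assuming \(X\) and \(Y\) are independent, and that NumPy and SciPy are available; the particular values of \(\mu\) and \(\sigma\) below are just examples):

import numpy as np
from scipy import stats, integrate

def prob_x_ge_y(mu, sigma):
    # Pr(X >= Y) = 1 - integral_0^1 Phi((y - mu)/sigma) dy
    integral, _ = integrate.quad(lambda y: stats.norm.cdf((y - mu) / sigma), 0.0, 1.0)
    return 1.0 - integral

# Cross-check one parameter choice against a Monte Carlo estimate
mu, sigma = 0.5, 2.0
rng = np.random.default_rng(0)
x = rng.normal(mu, sigma, size=1_000_000)
y = rng.uniform(0.0, 1.0, size=1_000_000)
print(prob_x_ge_y(mu, sigma))  # numerical integration
print(np.mean(x >= y))         # simulation; should agree to a few decimals

# See how the probability moves as mu and sigma change
for m in (0.0, 0.5, 1.0):
    for s in (0.5, 1.0, 2.0):
        print(m, s, round(prob_x_ge_y(m, s), 4))

If the two printed values agree, that supports the integral expression; the loop at the end tabulates the probability over a small grid of \(\mu\) and \(\sigma\) to show how each parameter affects it.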
 

BGM

TS Contributor
#2
You may try conditioning on the values of \( X \) instead:

\( \Pr\{Y \leq X\} = \left(\int_{-\infty}^0 + \int_0^1 + \int_1^{+\infty}\right)
\Pr\{Y \leq x\} \frac {1} {\sqrt{2\pi\sigma^2}}
\exp\left\{-\frac {(x - \mu)^2} {2\sigma^2}\right\} dx \)

You should be able to simplify the integrals; one of them may require integration by parts.
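
For what it's worth, here is a sketch of where that leads (please double-check the algebra). Over \( (-\infty, 0) \) the factor \( \Pr\{Y \leq x\} \) is zero and over \( (1, +\infty) \) it equals \( 1 \); for the middle piece, instead of integration by parts you can also substitute \( z = (x-\mu)/\sigma \) and use \( \int z\,\varphi(z)\,dz = -\varphi(z) \), which gives

\( \Pr\{Y \leq X\} = \mu\left[\Phi\left(\tfrac{1-\mu}{\sigma}\right) - \Phi\left(\tfrac{-\mu}{\sigma}\right)\right] + \sigma\left[\varphi\left(\tfrac{-\mu}{\sigma}\right) - \varphi\left(\tfrac{1-\mu}{\sigma}\right)\right] + 1 - \Phi\left(\tfrac{1-\mu}{\sigma}\right) \)

where \( \Phi \) and \( \varphi \) denote the standard normal CDF and density.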
 

b_d

New Member
#3
I'm sorry, but my knowledge of statistics is limited to a high-school education. Could you explain that a little, or maybe link to some relevant Wikipedia articles?