maximizing probability that one variable is greater than another

hi,
There is a random variable X with a known distribution on its support interval, say [0,a].

I need to find the distribution function of another, independent random variable Y, supported on the same interval [0,a] and with a given expected value E(Y), such that P(Y > X) is maximal.

In my particular example X has a point mass at zero and is uniformly distributed on (0,a]. My candidate solution Y has a point mass at a and is uniformly distributed on [0,a), but I'm unable to show that this Y indeed maximizes P(Y > X). Also, as I said before, I'm looking for a general method to determine Y.
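A quick Monte Carlo sketch can test a candidate numerically. The parameters below are illustrative assumptions (a = 1, mass 0.3 of X at 0, mass 0.4 of Y at a); it estimates P(Y > X) for the candidate and, for comparison, for a constant Y with the same mean:

```python
import random

random.seed(0)
A, Q, P = 1.0, 0.3, 0.4  # assumed: interval [0, A], mass Q of X at 0, mass P of Y at A

def sample_X():
    # X: point mass Q at 0, otherwise uniform on (0, A]
    return 0.0 if random.random() < Q else random.uniform(0.0, A)

def sample_Y():
    # Candidate Y: point mass P at A, otherwise uniform on [0, A)
    return A if random.random() < P else random.uniform(0.0, A)

N = 200_000
est_candidate = sum(sample_Y() > sample_X() for _ in range(N)) / N

# A constant Y with the same mean, E(Y) = P*A + (1-P)*A/2
mean_Y = P * A + (1 - P) * A / 2
est_constant = sum(mean_Y > sample_X() for _ in range(N)) / N

print(est_candidate, est_constant)
```

If the candidate were strictly better than a constant with the same mean, the first estimate would be visibly larger; for these numbers the two agree to within Monte Carlo error.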

Re: maximizing probability that one variable is greater than another

General formulation:

Given a random variable X with CDF F, find a random variable Y (equivalently, find its CDF G) to

Maximize P(Y > X) = ∫ P(X < y) dG(y)

subject to E(Y) = ∫ y dG(y) = m,

where m is a given constant.

In a general mathematical context this is a functional (infinite-dimensional) optimization problem, which I can't go into in detail here, as I don't know much about it either.

For your particular example, I can try to give a heuristic answer.

Let the point mass of X at 0 be q, so that X is uniform on (0,a] with the remaining probability 1 - q; then P(X < y) = q + (1-q)y/a for 0 < y <= a.

Let's test some trivial cases first:

If you set Y = c to be a constant with 0 < c <= a, then

P(Y > X) = P(X < c) = q + (1-q)c/a = q + (1-q)E(Y)/a.

If you set Y to be a discrete random variable with 2 support points 0 < y1 < y2 <= a such that P(Y = y1) = p, P(Y = y2) = 1 - p and

p·y1 + (1-p)·y2 = E(Y),

then

P(Y > X) = p·[q + (1-q)y1/a] + (1-p)·[q + (1-q)y2/a] = q + (1-q)E(Y)/a,

which is the same. Inductively, for any discrete random variable without support at 0, the probability is the same. And it is easy to see that any discrete random variable with support at 0 must be worse, since a point mass at 0 contributes P(X < 0) = 0 instead of the q + (1-q)y/a that points y > 0 contribute.
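A quick closed-form check of the two-point case, with assumed illustrative numbers a = 1 and q = 0.3 (the point mass of X at 0); both laws below have mean 0.5:

```python
a, q = 1.0, 0.3  # assumed: support [0, a], point mass q of X at 0

def p_x_less_than(y):
    # P(X < y) = q + (1-q)*y/a, valid for 0 < y <= a
    return q + (1 - q) * y / a

def p_y_gt_x_two_point(p1, y1, y2):
    # P(Y > X) when Y takes y1 with prob p1 and y2 with prob 1 - p1
    return p1 * p_x_less_than(y1) + (1 - p1) * p_x_less_than(y2)

print(p_y_gt_x_two_point(0.5, 0.2, 0.8))   # mean 0.5*0.2 + 0.5*0.8 = 0.5
print(p_y_gt_x_two_point(0.25, 0.8, 0.4))  # mean 0.25*0.8 + 0.75*0.4 = 0.5
print(q + (1 - q) * 0.5 / a)               # q + (1-q)*E(Y)/a
```

All three values agree, as the calculation above predicts: only the mean of Y matters.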

If Y is a purely continuous random variable with density g on (0,a], so that

∫ y g(y) dy = E(Y),

then

P(Y > X) = ∫ [q + (1-q)y/a] g(y) dy = q + (1-q)E(Y)/a.

Again it has the same probability.
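The continuous case can be checked by numerical integration. The density below is an assumed example, g(y) = 2y/a² on (0,a] (a triangular density with E(Y) = 2a/3), with the same illustrative a = 1 and q = 0.3 as before:

```python
a, q = 1.0, 0.3  # assumed: support [0, a], point mass q of X at 0
n = 100_000
h = a / n

# Midpoint rule for the integrals over (0, a]
total = 0.0  # integral of [q + (1-q)y/a] * g(y)
ey = 0.0     # integral of y * g(y), i.e. E(Y)
for i in range(n):
    y = (i + 0.5) * h
    g = 2 * y / a**2  # assumed triangular density
    total += (q + (1 - q) * y / a) * g * h
    ey += y * g * h

print(total)                 # P(Y > X) by direct integration
print(q + (1 - q) * ey / a)  # q + (1-q)*E(Y)/a
```

The two printed values match, confirming that the integral collapses to the same affine expression in E(Y).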

The mixed case can be worked out the same way; I'll try to do it tomorrow. (In fact, since P(X < y) = q + (1-q)y/a is affine in y on (0,a], we get P(Y > X) = E[q + (1-q)Y/a] = q + (1-q)E(Y)/a for every Y with no mass at 0, mixed or otherwise.)

Re: maximizing probability that one variable is greater than another

Thank you very much for your answer!

So, as I understand it, the described X produces the same P(Y > X) for any Y (with no mass at 0) given E(Y).
And this is probably true in the opposite direction as well, i.e. given the Y I described earlier, every possible X with a given E(X) produces the same P(Y > X).
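This conjecture can be checked the same way. With Y fixed as a point mass p at a plus a uniform part on [0,a), P(Y > x) = p + (1-p)(a-x)/a is affine in x on [0,a), so P(Y > X) should depend only on E(X) for any X with no mass at a. A sketch with assumed numbers a = 1, p = 0.4, comparing two X laws with the same mean 0.3:

```python
a, p = 1.0, 0.4  # assumed: support [0, a], point mass p of Y at a

def p_y_greater_than(x):
    # P(Y > x) for x in [0, a): mass p at a plus the uniform tail above x
    return p + (1 - p) * (a - x) / a

def p_y_gt_x_two_point(p1, x1, x2):
    # P(Y > X) when X takes x1 with prob p1 and x2 with prob 1 - p1 (both < a)
    return p1 * p_y_greater_than(x1) + (1 - p1) * p_y_greater_than(x2)

print(p_y_gt_x_two_point(0.5, 0.0, 0.6))   # E(X) = 0.5*0 + 0.5*0.6 = 0.3
print(p_y_gt_x_two_point(0.75, 0.4, 0.0))  # E(X) = 0.75*0.4 + 0.25*0 = 0.3
print(1 - (1 - p) * 0.3 / a)               # 1 - (1-p)*E(X)/a
```

All three agree, supporting the conjecture (note it needs X to have no mass at a, since P(Y > x) jumps to 0 at x = a).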