
Thread: maximizing probability that one variable is greater than another

  1. #1

    maximizing probability that one variable is greater than another




Hi,
There is a random variable X with a known distribution on its support interval, say [0,a].

I need to find the distribution function of another, independent random variable Y with a given expected value E(Y), supported on the same interval [0,a], such that P(Y>X) is maximal.

Here http://math.stackexchange.com/questi...e-is-less-than it is explained how to calculate the probability that one variable is greater than another, but how can one maximize it?

In my particular example X has a point mass at zero and is uniformly distributed on (0,a]. My candidate solution Y has a point mass at a and is uniformly distributed on [0,a), but I'm unable to show that this Y indeed maximizes P(Y>X). Also, as I said above, I'm looking for a general method to determine Y.
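
To check the candidate numerically, here is a minimal Monte Carlo sketch in Python/NumPy. The numbers are purely illustrative and not part of the actual problem: a = 1, a point mass of 0.3 at zero for X, and a point mass of 0.4 at a for the candidate Y (so E(Y) = 0.7); for comparison it also estimates P(Y>X) for a constant Y equal to the same E(Y).

Code:
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

a = 1.0   # right end of the support (illustrative value)
p = 0.3   # point mass of X at 0 (illustrative value)
q = 0.4   # point mass of the candidate Y at a, so E(Y) = a * (1 + q) / 2 = 0.7

# X: mass p at 0, otherwise uniform on (0, a]
x = np.where(rng.random(n) < p, 0.0, rng.uniform(0.0, a, n))

# candidate Y: mass q at a, otherwise uniform on [0, a)
y = np.where(rng.random(n) < q, a, rng.uniform(0.0, a, n))

# comparison: a degenerate Y that is simply the constant E(Y)
y_const = np.full(n, a * (1 + q) / 2)

print("candidate Y: estimated P(Y > X) =", (y > x).mean())
print("constant  Y: estimated P(Y > X) =", (y_const > x).mean())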

    Thanks in advance.

  2. #2
    TS Contributor

    Re: maximizing probability that one variable is greater than another

    General formulation:

Given a random variable X with CDF F_X, find a random variable Y (equivalently, find its CDF F_Y) to

    Maximize
\Pr\{Y > X\} = \int_{-\infty}^{+\infty} \int_{(-\infty,\,y)} dF_X(x)\,dF_Y(y) = \int_{-\infty}^{+\infty} \Pr\{X < y\}\,dF_Y(y)

    subject to

    \int_{-\infty}^{+\infty} ydF_Y(y) = \mu

    where \mu < +\infty is a given constant.

In a general mathematical context this is a functional optimization problem (optimizing over distribution functions rather than over a finite set of parameters); I cannot go into the details here, as I do not know much about that theory either.
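
One practical observation (a rough numerical sketch, not a full solution): both the objective and the constraint above are linear in F_Y, so if you discretize the support of Y you get an ordinary linear program over the probability weights. Below is a minimal Python/scipy sketch with made-up values of a, p and \mu, using your particular X (mass p at 0 plus uniform on (0, a]).

Code:
import numpy as np
from scipy.optimize import linprog

a, p, mu = 1.0, 0.3, 0.7           # illustrative values only
grid = np.linspace(0.0, a, 201)    # candidate support points for Y

# objective coefficients: P(X < y_i); linprog minimizes, so negate them
prob_x_less = np.where(grid > 0, p + (1 - p) * grid / a, 0.0)

# constraints: weights sum to 1 and give mean mu
A_eq = np.vstack([np.ones_like(grid), grid])
b_eq = np.array([1.0, mu])

res = linprog(-prob_x_less, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0.0, None)] * len(grid))

print("maximal P(Y > X) on the grid:", -res.fun)
print("support points of an optimal Y:", grid[res.x > 1e-8])
print("p + (1 - p) * mu / a =", p + (1 - p) * mu / a)

The solver returns just one optimal vertex, but the optimal value matches p + (1 - p)\mu/a, which fits the case-by-case computations below.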


For your particular example, I can try to give a heuristic argument.

Let the point mass of X at 0 be p.

Let's test some simple cases first:


    If you set Y = \mu to be a constant, then

    \Pr\{Y > X\} = \Pr\{X < \mu\} = p + (1 - p)\frac {\mu} {a}



    If you set Y to be a discrete random variable with 2 support points \{y_1, y_2\} such that 0 < y_1 < \mu < y_2 < a and

p_Y y_1 + (1 - p_Y) y_2 = \mu

    then

    \Pr\{Y > X\} = \Pr\{X < y_1\}p_Y + \Pr\{X < y_2\}(1 - p_Y)

    = \left[p + (1 - p)\frac {y_1} {a}\right]p_Y + 
\left[p + (1 - p)\frac {y_2} {a}\right](1 - p_Y)

    = p + (1 - p)\frac {\mu} {a}

which is the same value. The same calculation shows that any discrete random variable with no mass at 0 (and mean \mu) gives the same probability. And any discrete random variable that puts mass q_0 > 0 at 0 must do strictly worse, because that mass wins with probability \Pr\{X < 0\} = 0, which drops the total to p + (1 - p)\frac {\mu} {a} - q_0 p.
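
Here is a tiny numerical illustration of that invariance (again with made-up values of a, p and \mu): several two-point distributions with the same mean all give the same P(Y > X).

Code:
a, p, mu = 1.0, 0.3, 0.7   # illustrative values only

def prob_x_less(y):
    # P(X < y) for X = mass p at 0 plus uniform on (0, a]
    return 0.0 if y <= 0 else p + (1 - p) * min(y, a) / a

# several two-point laws {y1 w.p. pY, y2 w.p. 1 - pY}, all with mean mu
for y1, y2 in [(0.2, 0.8), (0.4, 0.9), (0.6, 0.75)]:
    pY = (y2 - mu) / (y2 - y1)   # solves pY*y1 + (1 - pY)*y2 = mu
    val = pY * prob_x_less(y1) + (1 - pY) * prob_x_less(y2)
    print(f"y1 = {y1}, y2 = {y2}:  P(Y > X) = {val:.4f}")

print("p + (1 - p) * mu / a =", p + (1 - p) * mu / a)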


If Y is a purely continuous random variable with density f_Y on [0, a] satisfying

\int_0^a yf_Y(y)dy = \mu

then

    \Pr\{Y > X\}

    = \int_0^a \Pr\{X < y\}f_Y(y)dy

    = \int_0^a \left[p + (1 - p)\frac {y} {a}\right] f_Y(y)dy

    = p + (1 - p) \frac {\mu} {a}

    Again it has the same probability.
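
For completeness, the same check for one concrete continuous choice of Y (my own pick for illustration: a Beta density rescaled to [0, a] so that its mean is \mu; any other density with mean \mu would do):

Code:
from scipy.integrate import quad
from scipy.stats import beta

a, p, mu = 1.0, 0.3, 0.7   # illustrative values only

# Y = a * Beta(alpha, b), whose mean is a * alpha / (alpha + b) = 0.7 = mu
alpha, b = 7.0, 3.0

def integrand(y):
    prob_x_less = p + (1 - p) * y / a     # P(X < y) for y in (0, a]
    f_y = beta.pdf(y / a, alpha, b) / a   # density of Y on [0, a]
    return prob_x_less * f_y

val, _err = quad(integrand, 0.0, a)
print("P(Y > X) =", val)
print("p + (1 - p) * mu / a =", p + (1 - p) * mu / a)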


I still need to work out the mixed (discrete plus continuous) case; I will try to do that tomorrow.

  3. The Following User Says Thank You to BGM For This Useful Post:

    Vova (07-29-2015)

  4. #3

    Re: maximizing probability that one variable is greater than another


    Thank you very much for your answer!

So, as I understand it, with X as described, every Y with the given E(Y) (and no mass at 0) produces the same P(Y>X).
And this is probably true in the opposite direction as well, i.e. with Y as described in my first post, every possible X with a given E(X) produces the same P(Y>X).
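
A quick simulation seems consistent with that (again just a sketch with made-up numbers: a = 1, Y with mass 0.4 at a plus uniform on [0, a), and two quite different X's with the same mean 0.4, neither putting mass at a):

Code:
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

a, q = 1.0, 0.4   # illustrative values; E(Y) = a * (1 + q) / 2 = 0.7
ex = 0.4          # common mean of the two X's below

# Y as in my first post: mass q at a, otherwise uniform on [0, a)
y = np.where(rng.random(n) < q, a, rng.uniform(0.0, a, n))

# two different X's with mean ex = 0.4 and no mass at a
x1 = np.where(rng.random(n) < 0.2, 0.0, rng.uniform(0.0, a, n))  # mass 0.2 at 0 + uniform
x2 = a * rng.beta(2.0, 3.0, n)                                   # scaled Beta(2, 3)

print("mixed X:     estimated P(Y > X) =", (y > x1).mean())
print("Beta(2,3) X: estimated P(Y > X) =", (y > x2).mean())
# value predicted by the same kind of calculation as in the reply above
print("q + (1 - q) * (1 - ex / a) =", q + (1 - q) * (1 - ex / a))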

    Best,
    Vova.
