I'm working on a game-theoretic model of incomplete information, in which players observe certain attributes via noisy signals. I'm looking to solve for two different probability functions, though I think the math should be very similar for both:
- Say there is some random variable [math]X \sim U(a - \epsilon, a + \epsilon)[/math]. You observe a draw from this distribution, call it [math]S_1[/math], but you do not know [math]a[/math]. Given [math]S_1[/math], what is the probability that another draw, [math]S_2[/math], from the same distribution, will be greater than or equal to an arbitrary number [math]b[/math]? That is, what is:
[math]P(S_2 \geq b \mid S_1)[/math]
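To make it concrete, here's a quick Monte Carlo sketch of part 1 as I understand it. The flat (improper uniform) prior on [math]a[/math] is my own assumption, and the function name is just for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def prob_s2_geq_b(s1, b, eps, n=1_000_000):
    """Monte Carlo estimate of P(S2 >= b | S1 = s1).

    Assumes a flat (improper uniform) prior on a, under which the
    posterior is a | S1 ~ U(s1 - eps, s1 + eps).
    """
    a = rng.uniform(s1 - eps, s1 + eps, n)   # posterior draws of a
    s2 = rng.uniform(a - eps, a + eps)       # new draw from U(a - eps, a + eps)
    return (s2 >= b).mean()

# Sanity check: S2 | S1 is symmetric around s1 under this prior,
# so at b = s1 the probability should be 1/2.
print(prob_s2_geq_b(s1=0.3, b=0.3, eps=0.1))
```

Under this prior, S2 | S1 should be triangular on (S1 - 2ε, S1 + 2ε), so I'm hoping there's a closed form for the probability above.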
- (Note: the two problems are separate, so [math]b[/math] here is not the same [math]b[/math] as in part 1.) Say that [math]b[/math] and [math]c[/math] are two independent draws from [math]U(0,1)[/math], the standard uniform distribution. Now let [math]B \sim U(b - \epsilon, b + \epsilon)[/math] and [math]C \sim U(c - \epsilon, c + \epsilon)[/math]. You observe one draw from [math]B[/math] and one draw from [math]C[/math], but you do not know the true values of [math]b[/math] or [math]c[/math]. What is the probability that a new draw from [math]B[/math] will be greater than or equal to a new draw from [math]C[/math]? That is, if [math]S_1^b[/math] is your observed draw from [math]B[/math] and [math]S_1^c[/math] is your observed draw from [math]C[/math], then what is:
[math]P(S_2^b \geq S_2^c \mid S_1^b, S_1^c)[/math]
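And here's a Monte Carlo sketch of part 2 as I understand it. The truncated-uniform posterior step is my own reasoning (posterior ∝ U(0,1) prior × uniform likelihood, i.e. uniform on the intersection of the two intervals), so corrections are welcome:

```python
import numpy as np

rng = np.random.default_rng(1)

def prob_b_beats_c(s1b, s1c, eps, n=1_000_000):
    """Monte Carlo estimate of P(S2^b >= S2^c | S1^b = s1b, S1^c = s1c).

    With a U(0,1) prior, the posterior b | S1^b is (I believe) uniform on
    [max(0, s1b - eps), min(1, s1b + eps)], and likewise for c.
    """
    b = rng.uniform(max(0.0, s1b - eps), min(1.0, s1b + eps), n)
    c = rng.uniform(max(0.0, s1c - eps), min(1.0, s1c + eps), n)
    s2b = rng.uniform(b - eps, b + eps)  # new draw from B
    s2c = rng.uniform(c - eps, c + eps)  # new draw from C
    return (s2b >= s2c).mean()

# Symmetry check: identical signals should give probability 1/2.
print(prob_b_beats_c(0.5, 0.5, 0.1))
```

What I'd ultimately like is a closed-form expression for this probability as a function of the two observed signals and ε, rather than a simulation.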
Any help with this would be much appreciated! I've read a bit about convolutions and posterior predictive distributions, but I don't have the grasp on them that I need to solve for these functions.
Thanks!