
Thread: correct update of bayesian posteriors

  1. #1

    correct update of bayesian posteriors

The question may look a bit artificial, but I have simply tried to strip away irrelevant details.

There are two persons, A and B. Each is given a number, a and b respectively, and their task is to guess the number c = (a+b)/2. Each of them knows his or her own number but not the other's.

If B's prior beliefs are distributed normally (or by any distribution known to B) with mean b, how should B update his posterior probabilities if we inform him that a < K (where K is some given number)?

In other words, we just tell B that c < (K+b)/2.

I guess he should 'cut off' the tail of his distribution to the right of K and then redistribute that probability mass to the left of K. Is that correct? What is the right way to do it?
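The 'cut and renormalize' idea above can be sketched numerically. This is only an illustration, not the thread's settled answer: it assumes B's belief is a plain normal density and that the information is a hard upper bound `upper` (in the question that would be (K+b)/2 for c). Truncation removes the mass above the bound and divides the rest by the prior probability of the bound holding, so the density integrates to 1 again.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def normal_cdf(x, mu, sigma):
    """P(X <= x) for X ~ N(mu, sigma^2), via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def truncated_pdf(x, mu, sigma, upper):
    """Prior density cut off above `upper` and renormalized.

    All mass to the right of `upper` is discarded; the remaining
    density is divided by P(X < upper) so it integrates to 1 again.
    """
    if x >= upper:
        return 0.0
    return normal_pdf(x, mu, sigma) / normal_cdf(upper, mu, sigma)
```

For example, with b = 10, a prior on c of N(10, 2^2), and a bound (K+b)/2 = 12, the updated density would be `truncated_pdf(x, 10, 2, 12)`: zero above 12, and inflated by the factor 1/P(c < 12) below it.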

  2. #2
    TS Contributor

    Re: correct update of bayesian posteriors

I am not a Bayes expert; this is only my guess:

By Bayes' theorem, we have the following formula

    p(\theta|x) = \frac {p(x|\theta)p(\theta)} {p(x)}

which is used to update the posterior distribution when we have actually observed the realization of the data, X = x.

Now you have observed censored data instead: X < k.

    So we have

    p(\theta|X<k) = \frac {p(X<k|\theta)p(\theta)} {p(X<k)}

The LHS is still your desired posterior; on the RHS, the numerator involves the conditional CDF and the denominator the marginal CDF (obtained by integrating \theta out).
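A grid approximation makes the formula above concrete. This is a sketch under illustrative assumptions not fixed by the thread: a normal prior theta ~ N(mu0, sigma0^2) and a normal likelihood X | theta ~ N(theta, sigma_x^2), so that the conditional CDF p(X < k | theta) is just the normal CDF evaluated at k.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def normal_cdf(x, mu, sigma):
    """P(X <= x) for X ~ N(mu, sigma^2)."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def posterior_given_censoring(mu0, sigma0, sigma_x, k, n=2001, width=8.0):
    """Grid approximation of p(theta | X < k).

    Prior:       theta ~ N(mu0, sigma0^2)           (illustrative choice)
    Likelihood:  X | theta ~ N(theta, sigma_x^2)    (illustrative choice)
    Observation: the censored event X < k, so the likelihood term is
    the conditional CDF p(X < k | theta).
    Returns (grid, posterior densities on the grid).
    """
    lo, hi = mu0 - width * sigma0, mu0 + width * sigma0
    h = (hi - lo) / (n - 1)
    grid = [lo + i * h for i in range(n)]
    # Numerator of Bayes' theorem: p(X < k | theta) * p(theta)
    unnorm = [normal_cdf(k, t, sigma_x) * normal_pdf(t, mu0, sigma0)
              for t in grid]
    # Denominator p(X < k): integrate theta out (Riemann sum on the grid)
    z = sum(unnorm) * h
    return grid, [u / z for u in unnorm]
```

Learning that X < k with k below the prior mean multiplies the prior by a factor that decreases in theta, so the posterior mass shifts to the left, which matches the 'cut and redistribute' intuition from the first post, except that with a noisy likelihood the cut is soft rather than a hard truncation.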
