# Posterior Distribution

#### MrAnon9

##### New Member
Suppose I have $$X_{1},\dots,X_{n}$$ which are i.i.d. Bernoulli random variables with $$P(X_{i} = 1) = \frac{e^\theta}{1+e^\theta}$$, and the prior on $$\theta$$ is normal with mean 0 and variance 100. How can I find the posterior distribution if the number of $$X_{i}$$ equal to one is 5 and n is 16?

What do I use for the likelihood here? Do I differentiate $$\frac{e^\theta}{1+e^\theta}$$, since it's the CDF of the logistic distribution, or do I use the PMF of a Bernoulli?

The question confuses me.


#### Dason

If I told you P(Xi = 1) = p then what would the likelihood be? Ok now substitute $$\frac{e^\theta}{1 + e^\theta}$$ in for p and you have your likelihood.

#### MrAnon9

##### New Member
So I just have to find the product of the $$\frac{1}{1+e^{\theta}}$$ terms?

#### Dason

What? The likelihood would be a binomial distribution: $$\binom{16}{5} p^5 (1-p)^{11}$$, with $$p$$ replaced by the quantity above.
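That substitution is easy to check numerically. A minimal sketch (the function name and arguments are mine, not from the thread):

```python
import math

def likelihood(theta, n=16, k=5):
    """Binomial likelihood with p = e^theta / (1 + e^theta) substituted in."""
    p = math.exp(theta) / (1 + math.exp(theta))
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# At theta = 0 we get p = 1/2, so this reduces to C(16, 5) / 2^16.
print(likelihood(0.0))
```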

#### Dason

Also are you sure the prior has a variance of 0?

#### BGM

##### TS Contributor
Just want to say that a prior with variance 0 is very stubborn - it will not change.

#### Dason

> Just want to say that a prior with variance 0 is very stubborn - it will not change.
I myself am a fan of point mass priors. It makes analyzing the posterior very easy.

#### BGM

##### TS Contributor
If you have a point mass prior, you are "not Bayesian"

Anyway, back to the OP's problem: since the prior and likelihood are not a conjugate pair, after using Bayes' theorem to write down the posterior, not much can be simplified.

#### Dason

> If you have a point mass prior, you are "not Bayesian"
Why not? You're just a very confident Bayesian. We do it all the time when we actually set what we think some of the hyperparameters are - this is equivalent to giving them a point mass prior.

#### MrAnon9

##### New Member
Why is it a binomial, by the way? Also, can I not drop the $$\binom{16}{5}$$ as a constant of proportionality?

#### BGM

##### TS Contributor
The combinatorial coefficient in front is not important - you will cancel it anyway. You can also think of the sample, conditional on the parameter, as just Bernoulli.

#### MrAnon9

##### New Member
If a Bernoulli random variable has PMF $$\theta^{X_{i}} (1-\theta)^{1-X_{i}}$$ and I take the product of these to get the likelihood function of theta, then where does the $$n$$ choose $$k$$ factor come into it?

#### Dason

Because you don't know the order that the successes came. There is only 1 way to order having no successes. There are 16 ways to have 1 success... it's the same argument as in the development of the binomial distribution. Mainly because it is a binomial distribution.

#### MrAnon9

##### New Member
Hi again, I have found the posterior to be $$e^{5\theta - \frac{\theta^2}{200}} (1+e^{\theta})^{-16}$$

Is this correct? Also, I now need to use a normal prior as the proposal density and describe a rejection sampler for sampling from this posterior density - any ideas on how to do that?


#### BGM

##### TS Contributor
Yes, you are correct to say the posterior pdf is proportional to the function you have found.

Have you learned rejection sampling?
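The unnormalized posterior above is straightforward to code up (a sketch - the function name is mine, and the constant of proportionality is deliberately dropped):

```python
import math

def post_unnorm(theta):
    """Unnormalized posterior: e^{5*theta - theta^2/200} * (1 + e^theta)^{-16}."""
    return math.exp(5 * theta - theta**2 / 200) / (1 + math.exp(theta))**16

# The mode sits near logit(5/16) = log(5/11), roughly -0.79, since the
# N(0, 100) prior is nearly flat over that region.
```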

#### MrAnon9

##### New Member
Erm, yeah - accept if $$V < \frac{f(\theta)}{M g(\theta)}$$, where $$f$$ is the target and $$g$$ the proposal? But can that be used here?
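That accept rule can be sketched with the N(0, 100) prior itself as the proposal $$g$$, which makes the ratio $$f/g$$ proportional to $$e^{5\theta}(1+e^\theta)^{-16}$$ and gives an exact envelope constant $$M$$. This is one possible construction, not the only valid one; names and the choice of proposal are mine:

```python
import math
import random

def log_f(theta):
    """Log of the unnormalized posterior e^{5*theta - theta^2/200} (1+e^theta)^{-16}."""
    return 5 * theta - theta**2 / 200 - 16 * math.log1p(math.exp(theta))

def rejection_sample(n_samples, seed=0):
    """Rejection sampler using the N(0, 100) prior as the proposal g.

    With this g, f/g = sqrt(200*pi) * e^{5*theta} / (1 + e^theta)^16,
    which is maximized where e^theta/(1 + e^theta) = 5/16. That gives
    the exact envelope constant M used below.
    """
    rng = random.Random(seed)
    log_M = 0.5 * math.log(200 * math.pi) + 5 * math.log(5 / 16) + 11 * math.log(11 / 16)
    samples = []
    while len(samples) < n_samples:
        theta = rng.gauss(0.0, 10.0)  # draw from the proposal N(0, 100)
        log_g = -theta**2 / 200 - 0.5 * math.log(200 * math.pi)
        # accept with probability f(theta) / (M * g(theta)), done in log space
        if math.log(rng.random()) < log_f(theta) - log_M - log_g:
            samples.append(theta)
    return samples
```

Working in log space avoids overflow in $$(1+e^\theta)^{16}$$ for large proposed $$\theta$$, which the wide proposal does occasionally produce.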

#### MrAnon9

##### New Member
Or do I have to use Markov Chain Monte Carlo simulation?
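MCMC is an alternative here rather than a requirement - rejection sampling works fine since an envelope constant exists. For comparison, a minimal random-walk Metropolis sketch targeting the same unnormalized posterior (the step size of 1.0 is an arbitrary choice, not from the thread):

```python
import math
import random

def log_post(theta):
    """Log of the unnormalized posterior e^{5*theta - theta^2/200} (1+e^theta)^{-16}."""
    return 5 * theta - theta**2 / 200 - 16 * math.log1p(math.exp(theta))

def metropolis(n_iter, step=1.0, seed=0):
    """Random-walk Metropolis chain targeting the posterior above."""
    rng = random.Random(seed)
    theta, chain = 0.0, []
    for _ in range(n_iter):
        proposal = theta + rng.gauss(0.0, step)
        # symmetric proposal, so accept with prob min(1, f(proposal)/f(theta))
        if math.log(rng.random()) < log_post(proposal) - log_post(theta):
            theta = proposal
        chain.append(theta)
    return chain
```

After discarding an initial burn-in, the chain should concentrate near the posterior mode around $$\log(5/11) \approx -0.79$$.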