One nice way to get a proposal variance that gives you a decent acceptance rate (roughly 25-40%) is to work out a normal approximation to the posterior distribution. It's not hard to do in practice, and I know a couple of VERY easy ways to accomplish it in R. What are you using to program your algorithm?
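One standard version of this is a Laplace approximation: numerically maximize the log posterior and use the inverse of the negative second derivative (the curvature at the mode) as the proposal variance. Here is a minimal sketch in Python (in R the same idea is `optim(..., hessian=TRUE)`); `log_post` is a hypothetical stand-in for your unnormalized log posterior, here a toy N(3, 4) target:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical unnormalized log posterior for a scalar parameter a.
# Toy stand-in: the log density of N(3, 4), up to a constant.
def log_post(a):
    return -0.5 * (a - 3.0) ** 2 / 4.0

# Laplace approximation: find the posterior mode numerically.
res = minimize(lambda x: -log_post(x[0]), x0=np.array([0.0]))
mode = res.x[0]

# Proposal variance = inverse of the negative second derivative at the mode
# (numerical central difference for the curvature).
h = 1e-5
curv = (log_post(mode + h) - 2 * log_post(mode) + log_post(mode - h)) / h**2
prop_var = -1.0 / curv

print(mode, prop_var)  # roughly 3 and 4 for this toy target
```

A normal proposal with variance `prop_var` (or a modest multiple of it) centered at the current value then tends to land in the acceptance-rate range above.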
Well, what you're doing is a Metropolis-within-Gibbs step, so what you really need is the conditional posterior distribution, which is easy because it's just proportional to the joint posterior. So really you just need something that is proportional to the joint posterior, which is itself proportional to the likelihood times the priors, and that's easy to get.
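Concretely, the a-update only ever evaluates the log of (likelihood × priors) at the proposed and current a, with b and the g_i's held at their current values. A minimal sketch in Python, where `log_joint_a` is a hypothetical stand-in for your unnormalized log joint as a function of a:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical unnormalized log of (likelihood x priors) as a function
# of a, with b and the g_i's fixed at their current values.
# Toy stand-in: conditional proportional to N(2, 1).
def log_joint_a(a):
    return -0.5 * (a - 2.0) ** 2

a_cur = 0.0
prop_sd = 1.0  # e.g. from a normal approximation to the conditional

# One Metropolis-within-Gibbs update for a (random-walk proposal):
a_prop = rng.normal(a_cur, prop_sd)
log_R = log_joint_a(a_prop) - log_joint_a(a_cur)
if np.log(rng.uniform()) < log_R:
    a_cur = a_prop  # accept; otherwise keep the current value
```

Any normalizing constant of the conditional cancels in `log_R`, which is why "proportional to" is all you need.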
Ok, I am confused.
I have already calculated p(g_i|a,b,x_1,...,x_n,y_1,...,y_n,g_1,...,g_(i-1),g_(i+1),...,g_n)
I know that with this I can implement the Gibbs sampler to sample new values of the g_i's.
What I am stuck on is implementing the Metropolis-Hastings to sample new values of a and b.
I know that I need a proposal distribution, call it G, centered at the previous value a^(t-1). Then we sample a proposal a* from G and calculate the acceptance ratio R. Then we sample u from a Uniform(0,1), and if u < R we accept a* as a^t; otherwise we set a^t to the previous value.
So my problem is in calculating R. The formula for R is [p(a* | y) / p(a^(t-1) | y)] [G(a^(t-1) | a*) / G(a* | a^(t-1))].
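Two things make R computable in practice: p(a | y) only needs to be known up to a constant (the constant cancels in the ratio), and if G is symmetric the G-ratio is 1. When G is not symmetric you must keep the Hastings correction. A sketch in Python, assuming a hypothetical positive parameter with a random walk on log(a), where the G-ratio works out to a*/a^(t-1):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical unnormalized log posterior for a positive parameter a.
# Toy stand-in: a ~ Gamma(2, 1), so log density = log(a) - a + const.
def log_post(a):
    return np.log(a) - a

a_prev = 1.5
prop_sd = 0.5

# Random walk on log(a): a* = a_prev * exp(N(0, prop_sd^2)).
# This G is not symmetric in a itself; the Hastings correction
# G(a_prev | a*) / G(a* | a_prev) reduces to a* / a_prev here.
a_star = a_prev * np.exp(rng.normal(0.0, prop_sd))
log_R = (log_post(a_star) - log_post(a_prev)
         + np.log(a_star) - np.log(a_prev))  # Hastings correction
a_t = a_star if np.log(rng.uniform()) < log_R else a_prev
```

Working on the log scale, as above, also avoids underflow when the likelihood terms are tiny.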
So my problem is in calculating p(a|y).
Or, based on the information given to me, since we have g_i | a, b and the priors on a and b,
would we just find a | g_i and b | g_i and then use these for our acceptance ratio?