Bayes Estimator under quadratic loss function

#1
Let \(X,\Theta\) be two random variables with a joint distribution, and let \(\mu(\Theta)=E[X\mid\Theta]\). Suppose we have \(n\) observations of \(X\) given \(\Theta\), with the further condition that \(X_1,\dots,X_n\mid\Theta\) are independent. Show that the Bayes estimator minimizing the quadratic loss function is given by \(\widetilde{\mu(\Theta)}=E[\mu(\Theta)\mid X_1,\dots,X_n]\). Any pointers to start?
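One standard way in: condition on the data and minimize the posterior expected loss pointwise. A sketch of the key step, under quadratic loss \(L(\theta,d)=(\mu(\theta)-d)^2\), where \(d=\delta(X_1,\dots,X_n)\) is any estimator and \(\hat m=E[\mu(\Theta)\mid X_1,\dots,X_n]\) is notation introduced here for the posterior mean:
\[
E\!\left[(\mu(\Theta)-d)^2 \mid X_1,\dots,X_n\right]
= E\!\left[(\mu(\Theta)-\hat m)^2 \mid X_1,\dots,X_n\right] + (\hat m - d)^2,
\]
since the cross term \(2(\hat m-d)\,E[\mu(\Theta)-\hat m\mid X_1,\dots,X_n]\) vanishes by the definition of \(\hat m\) (note \(d\) is a function of the data, so it comes outside the conditional expectation). The right-hand side is minimized by taking \(d=\hat m\), and minimizing the posterior expected loss for every realization of the data minimizes the overall Bayes risk by the tower property.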
 
#2
All solved.