# Bayes Estimator under quadratic loss function

#### MathJack

##### New Member
Let $$X,\theta$$ be two random variables with a joint distribution, and let $$\mu(\theta)=E[X|\theta]$$. Suppose we have $$n$$ observations on $$X$$ given $$\theta$$, with the further condition that $$X_1,...,X_n$$ are conditionally independent given $$\theta$$. Show that the Bayes estimator minimizing the quadratic loss function is given by $$\tilde{\mu}(\Theta)=E[\mu(\Theta)|X_1,...,X_n]$$. Any pointers to start?


#### MathJack

##### New Member
All solved
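For anyone landing on this thread later, the standard argument is a sketch along these lines (assuming the posterior variance is finite): write $$\tilde{\mu}=E[\mu(\Theta)|X_1,...,X_n]$$ and, for any other estimator $$\delta=\delta(X_1,...,X_n)$$, expand the posterior expected loss around $$\tilde{\mu}$$:

```latex
\begin{aligned}
E\left[(\mu(\Theta)-\delta)^2 \mid X_1,\dots,X_n\right]
&= E\left[(\mu(\Theta)-\tilde{\mu})^2 \mid X_1,\dots,X_n\right] \\
&\quad + 2(\tilde{\mu}-\delta)\,E\left[\mu(\Theta)-\tilde{\mu} \mid X_1,\dots,X_n\right]
      + (\tilde{\mu}-\delta)^2 \\
&= E\left[(\mu(\Theta)-\tilde{\mu})^2 \mid X_1,\dots,X_n\right] + (\tilde{\mu}-\delta)^2.
\end{aligned}
```

The cross term vanishes because $$\tilde{\mu}$$ and $$\delta$$ are functions of $$X_1,...,X_n$$ and $$E[\mu(\Theta)-\tilde{\mu}\mid X_1,...,X_n]=0$$ by definition of $$\tilde{\mu}$$. The remaining expression is minimized exactly when $$\delta=\tilde{\mu}$$, so the posterior mean is the Bayes estimator under quadratic loss. Taking a further expectation over $$X_1,...,X_n$$ shows it also minimizes the unconditional Bayes risk.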