So then how would I conclude it?
Could I say that T is an unbiased estimator of σ²? I'm confused because normally you have to work out E(T) and then your answer should equal σ².
Looks fine to me.
k[2σ² + 2σ²] = σ²
k[4σ²] = σ²
k = 1/4
Is this correct, or have I completely lost it?
Once you've found the value of k, how should I prove it's an unbiased estimator of sigma squared?
Could I say:
k[2σ² + 2σ²] = σ²
k[4σ²] = σ²
k = 1/4
therefore T = 1/4[2σ² + 2σ²] = σ²
T = σ²
therefore T is an unbiased estimator of σ²?
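A quick way to convince yourself of the k = 1/4 answer is a small simulation. This is my own sketch, not part of the exercise; mu and sigma below are arbitrary choices:

```python
import random

# Monte Carlo check: with k = 1/4, the average of
# T = (1/4)[(X1 - X2)^2 + (X3 - X4)^2] over many samples
# should be close to sigma^2. mu and sigma are arbitrary.
random.seed(0)
mu, sigma = 5.0, 2.0   # so sigma^2 = 4
trials = 200_000

total = 0.0
for _ in range(trials):
    x1, x2, x3, x4 = (random.gauss(mu, sigma) for _ in range(4))
    total += 0.25 * ((x1 - x2) ** 2 + (x3 - x4) ** 2)

avg = total / trials
print(avg)  # close to sigma^2 = 4
```

The running average settles near σ² = 4, which is consistent with E(T) = σ² when k = 1/4.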
Also, I just want to say thank you for the help, I really appreciate it. If only I could be of help to you.
Originally, from the definition of T given in your question,
k is not known yet; however, once we fix k,
we can determine whether T is an unbiased estimator for σ² or not.
Similarly, we can solve for the k such that T is an unbiased estimator.
The thing I want to say is that the question wants you to show: E(T) = σ² implies k = 1/4.
Of course the converse is also true: k = 1/4 implies E(T) = σ².
However, the question gives you the condition "T is an unbiased estimator".
It would be pretty strange and weird to present it as if we guess a value of k
and then check whether T is an unbiased estimator or not.
Your argument is fine. But please note that the equation is E(T) = σ²,
not T = σ².
T is an estimator, which is random before you take the
realization (the estimates).
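To illustrate the point about E(T) versus T, here is a small sketch of my own (not from the thread): each realization of T differs from σ², and only the average over many samples equals σ². mu and sigma are arbitrary choices.

```python
import random

# T is a random variable: individual realizations of T are almost never
# equal to sigma^2, but their long-run average, i.e. E(T), is sigma^2.
random.seed(1)
mu, sigma = 0.0, 3.0   # so sigma^2 = 9

def one_t():
    x1, x2, x3, x4 = (random.gauss(mu, sigma) for _ in range(4))
    return 0.25 * ((x1 - x2) ** 2 + (x3 - x4) ** 2)

samples = [one_t() for _ in range(5)]
print(samples)  # five different realizations, none exactly 9

avg = sum(one_t() for _ in range(100_000)) / 100_000
print(avg)  # close to sigma^2 = 9
```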
OK, thanks. I will just write the whole solution out, and I just want you to confirm whether it's right or not.
My answer:
X1, X2, ..., X4 ~ N(µ, σ²)
E(Xi) = µ
V(Xi) = σ²
(X1 − X2)² = X1² − 2X1X2 + X2²
E[X1² − 2X1X2 + X2²] = E[X1²] − 2E[X1]E[X2] + E[X2²]
E[X1] = µ, E[X1²] = Var(X1) + (E[X1])² = σ² + µ²
E[X1²] − 2E[X1]E[X2] + E[X2²] = σ² + µ² − 2µµ + σ² + µ² = 2σ²
Similarly, E[(X3 − X4)²] = 2σ²
T is an unbiased estimator for σ²
if and only if E(T) = σ².
E(T) = k[2σ² + 2σ²]
σ² = k[4σ²]
σ²/4σ² = k
k = 1/4
Then I concluded by writing that T is an unbiased estimator of σ².
I hope all my questions are answered and there is nothing left.
Many thanks
I got it
X1, X2, ..., Xn is a random sample from a uniform distribution U(0, α). Suppose that we consider the following three statistics as estimators of the parameter α:
Tn = sample maximum, Un = aTn, Vn = bXbar,
where a, b are constants. You are given the following results:
E(Tn) = nα/(n+1)
V(Tn) = nα²/[(n+1)²(n+2)]
Also given is that Un and Vn are both unbiased estimators of the parameter α.
(i) Find the values of the constants a and b.
(ii) Show that Un and Vn are consistent estimators of α.
I've tried this question and uploaded my solution; please can anyone check whether it is correct, and if not, please guide me. Thanks.
X1, ..., Xn ~ U(0, α)
E(Xi) = α/2
V(Xi) = α²/12
(i)
Un= aTn
Vn = bXbar
E(Un) = E(aTn) = aE(Tn)
= a · nα/(n+1)
Setting this equal to α gives a = (n+1)/n.
E(Vn) = E(bXbar) = bE(Xbar)
= b · α/2
Setting this equal to α gives b = 2.
Please check whether part (i) is correct.
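Part (i) can also be sanity-checked numerically. This is my own sketch, not from the thread; alpha, n, and the number of trials are arbitrary simulation choices:

```python
import random

# With a = (n+1)/n and b = 2, both Un = a*max(X) and Vn = b*mean(X)
# should average out to alpha over many samples.
random.seed(2)
alpha, n, trials = 10.0, 5, 100_000
a, b = (n + 1) / n, 2.0

sum_un = sum_vn = 0.0
for _ in range(trials):
    xs = [random.uniform(0, alpha) for _ in range(n)]
    sum_un += a * max(xs)
    sum_vn += b * (sum(xs) / n)

un_avg, vn_avg = sum_un / trials, sum_vn / trials
print(un_avg, vn_avg)  # both close to alpha = 10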
Could somebody continue with the second part?
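For part (ii), a standard route is: since Un and Vn are unbiased, it suffices to show their variances tend to 0 as n → ∞ (then Chebyshev's inequality gives consistency). Here V(Un) = a²·V(Tn) = α²/[n(n+2)] → 0 and V(Vn) = 4·V(Xbar) = α²/(3n) → 0. A small simulation of my own (alpha, trials, and the n values are arbitrary) shows the empirical MSEs shrinking with n:

```python
import random

# For an unbiased estimator, MSE = variance, so the MSEs of Un and Vn
# should both shrink toward 0 as the sample size n grows.
random.seed(3)
alpha, trials = 10.0, 20_000

mses = {}
for n in (5, 50, 500):
    a, b = (n + 1) / n, 2.0
    mse_un = mse_vn = 0.0
    for _ in range(trials):
        xs = [random.uniform(0, alpha) for _ in range(n)]
        mse_un += (a * max(xs) - alpha) ** 2
        mse_vn += (b * sum(xs) / n - alpha) ** 2
    mses[n] = (mse_un / trials, mse_vn / trials)
    print(n, mses[n])  # both MSEs decrease as n grows
```

Note how much faster Un's MSE shrinks (order 1/n²) than Vn's (order 1/n), which is why the scaled maximum is the preferred estimator here.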
Show that the mean squared error is minimized at c = (n+1)^(-1). This estimator is known as the Pitman estimator for σ² in the Gaussian model. Show that the unbiased estimator has c = (n-1)^(-1) and compare its mean squared error to that of the Pitman estimator.
Thanks! It was so easy that I didn't see it.