Question about normally distributed estimator with a constant

#1
Hi all, I'm having serious difficulty with the following problem; any help would be much appreciated.

_________________________________________________

Assume that some random variable x is normally distributed with mean μ and variance σ². Consider the following estimator, where the ci are constants:

x-star = ∑cixi

1. What is the expected value of x-star?

2. What is the least restrictive assumption that you can make about the ci such that x-star is an unbiased estimator of the population mean μ?

3. Assuming the covariances between all the different values of x in the sample are zero, what is the variance of x-star?

4. Assume that your condition of part B holds. Can we choose between x-bar (defined in the usual way) and x-star on the basis of unbiasedness? Why?

5. Use Chebyshev's inequality to show why we prefer x-bar to x-star.
__________________________________________________

1. E(x-star) =E(∑cixi)
=E(ci∑xi)
=ci∑E(xi)
=ci∑μ
=cinμ

2. I do not know what to do here. If x-star is an unbiased estimator of μ,
then E(x-star) = μ.
If ci = 1/n for every i, then E(x-star) = μ, but that would make x-star equal to x-bar defined in the usual way: x-bar = (1/n)∑xi.


3. var(x-star)=var(∑cixi)
=ci²var(∑xi)
=ci²(∑var(xi)+∑∑cov(xi,xj))
=ci²∑var(xi)
 

Dason

Ambassador to the humans
#2
1. E(x-star) =E(∑cixi)
=E(ci∑xi)
=ci∑E(xi)
=ci∑μ
=cinμ
The \(c_i\) are not a single constant, so you can't pull them outside the summation (the value of \(c_i\) can change with i).

Start instead by doing this...
\(E(x^*) = E(\sum_{i=1}^nc_ix_i) = \sum_{i=1}^nE(c_ix_i)\). Now see what you can do with that.

Note that it's fine to pull the \(c_i\) outside of the expectation but you can't pull it outside of the summation.
 
#3
Thanks for the reply, I think I've made some progress:

1. What is the expected value of x*?

E(x*)=E(∑cixi)
=∑E(cixi)
=∑(ciE(xi))
=∑(ciμ)

2. What is the least restrictive assumption that you can make about the ci such that x* is an unbiased estimator of the population mean μ?

E(x*)=∑(ciμ)
=μ∑ci
So the least restrictive assumption is that the constants sum to 1: ∑ci = 1.
then:
E(x*)=μ
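As a quick numerical sanity check (not part of the thread itself; the seed, the sample size n, and the particular weights below are illustrative), any set of ci summing to 1 should give a simulated mean of x* close to μ:

```python
import numpy as np

# Check that E(x*) = mu whenever sum(c_i) = 1.
# The weights c are arbitrary illustrative values with sum(c) == 1.
rng = np.random.default_rng(0)
mu, sigma, n = 5.0, 2.0, 4
c = np.array([0.4, 0.3, 0.2, 0.1])

reps = 200_000
x = rng.normal(mu, sigma, size=(reps, n))  # each row: one sample x_1..x_n
x_star = x @ c                             # x* = sum_i c_i x_i, per sample

print(x_star.mean())  # should be close to mu = 5
```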
 

Dason

Ambassador to the humans
#4
That all looks good to me. Although I would say you could use the first set of equalities you have in (2) to reduce your answer in (1) a little bit more as well.
 
#5
Thanks once again Dason. I've attempted the other questions, but I'm still unsure about part 5.

3. Assuming the covariances between all the different values of x in the sample are zero, what is the variance of x-star?

var(x*) = var(∑cixi)
= ∑ci²var(xi) + 2∑∑(i<j) cicjcov(xi,xj)
= ∑ci²var(xi) (assuming cov(xi,xj) = 0 for all i ≠ j)
= σ²∑ci² (since each var(xi) = σ²)
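Under the problem's setup (the xi are uncorrelated and all share variance σ²), the variance works out to var(x*) = σ²∑ci². A quick simulation check, where the seed, n, and the weights are illustrative rather than from the thread:

```python
import numpy as np

# Sanity check that var(x*) = sigma^2 * sum(c_i^2) when the x_i are
# independent draws from N(mu, sigma^2). The weights c are arbitrary
# illustrative values (they happen to sum to 1, matching part 2).
rng = np.random.default_rng(1)
mu, sigma, n = 5.0, 2.0, 4
c = np.array([0.4, 0.3, 0.2, 0.1])

reps = 500_000
x = rng.normal(mu, sigma, size=(reps, n))  # each row: one sample x_1..x_n
x_star = x @ c                             # x* = sum_i c_i x_i, per sample

theory = sigma**2 * np.sum(c**2)           # 4 * 0.30 = 1.2
print(x_star.var(), theory)                # the two should be close
```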

4. Assume that your condition of part B holds. Can we choose between x-bar (defined in the usual way) and x-star on the basis of unbiasedness? Why?

No

In part 2, under the condition ∑ci = 1, we proved x* is an unbiased estimator of μ:
E(x*)=μ

xbar (defined in the usual way) is also an unbiased estimator of μ:
E(xbar) = E((1/n)∑xi)
= (1/n)∑E(xi)
= (1/n)∑μ
= (1/n)(nμ)
= μ


We cannot choose between x* and xbar on the basis of unbiasedness, since both are unbiased.

5. Use Chebyshev's inequality to show why we prefer x-bar to x-star.

Not sure what to do here.

The inequality I'm looking at is: for every random variable y with mean μ and variance σ²,
P(|y − μ| ≥ cσ) ≤ 1/c²

any ideas?
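Not an answer from the original thread, but a sketch of the standard argument: with ∑ci = 1, the Cauchy-Schwarz inequality gives ∑ci² ≥ 1/n (equality only when every ci = 1/n), so var(x*) = σ²∑ci² ≥ σ²/n = var(xbar). Substituting c = ε/σy into the inequality above rewrites Chebyshev as P(|y − E(y)| ≥ ε) ≤ var(y)/ε², so the tail bound for xbar is never looser than the one for x*. A small simulation illustrating this (seed, n, ε, and the weights are all illustrative):

```python
import numpy as np

# Chebyshev: P(|y - E(y)| >= eps) <= var(y) / eps^2.
# With sum(c) = 1, Cauchy-Schwarz gives sum(c^2) >= 1/n, so
# var(x*) = sigma^2 * sum(c^2) >= sigma^2 / n = var(xbar),
# and the Chebyshev bound for xbar is never looser than for x*.
rng = np.random.default_rng(2)
mu, sigma, n = 5.0, 2.0, 4
c = np.array([0.4, 0.3, 0.2, 0.1])  # illustrative weights with sum(c) == 1
eps = 1.0

reps = 500_000
x = rng.normal(mu, sigma, size=(reps, n))
x_star = x @ c          # weighted estimator x*
x_bar = x.mean(axis=1)  # the usual sample mean

bound_star = sigma**2 * np.sum(c**2) / eps**2  # Chebyshev bound for x*: 1.2
bound_bar = sigma**2 / (n * eps**2)            # Chebyshev bound for xbar: 1.0

print(bound_bar <= bound_star)                          # True
print(np.mean(np.abs(x_bar - mu) >= eps) <= bound_bar)  # True: bound holds
```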