View Full Version : Expectation Operator

08-04-2010, 09:06 PM
I understand expectation (and the expectation operator) in a crude sense: for example, applying it to discrete or continuous distributions to find various summary statistics for a data set.

What I don't get is how it's used in a more theoretical context. I'm just not bridging the gap correctly, or maybe I don't fully understand expectation.

For example.

Say I want to find the asymptotic variance of pi_hat, the MLE of pi from a binomial distribution.

Many texts say to take the negative expectation of the second derivative of the log-likelihood function with respect to pi.

I've got the second derivative: -y/pi^2 - (n-y)/(1-pi)^2

How do you take the expectation of this function? That is, what does it amount to actually doing in the computation? What does it look like? Can you show it step by step?

Many results, including the one above, aren't fully derived in some of my sources; they just mention what to do or how it was done without showing the steps explicitly. Unsurprisingly, the expectation operator is involved.

I know it's a weird question, but I was really hoping for some insights!


08-04-2010, 09:26 PM
Note that the only random variable in your expression is y, and that expectation has certain properties that make this easier, mainly linearity: E[Ay + B] = A*E[y] + B. This simplifies the expectation quite a bit.
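If it helps to see linearity of expectation in action, here's a quick numerical sketch (the values of n, pi, A, and B are arbitrary, just for illustration): for a binomial sample, the sample mean of A*y + B equals A times the sample mean of y plus B, matching the identity E[Ay + B] = A*E[y] + B.

```python
import numpy as np

# Empirical check of linearity of expectation for a Binomial(n, pi) RV.
# n, pi, A, B below are arbitrary illustrative values.
rng = np.random.default_rng(0)
n, pi = 50, 0.3
A, B = 2.5, -4.0

y = rng.binomial(n, pi, size=1_000_000)

lhs = np.mean(A * y + B)   # sample estimate of E[A*y + B]
rhs = A * np.mean(y) + B   # A*E[y] + B, using the same sample mean

print(lhs, rhs)
# The two agree up to floating-point rounding, since the sample mean
# is itself a linear operation. Both should also be close to the
# theoretical value A*n*pi + B by the law of large numbers.
```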

08-05-2010, 11:49 AM
Doh. My question had a very obvious answer; I just couldn't see it.

As you put it, Y is the only RV in the second derivative of the log-likelihood (the Fisher information). So we can use the linearity of expectation to isolate it, and then use the fact that for a binomial RV the expectation is E[Y] = n*pi.

Substitute that in and we're golden.
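For anyone finding this thread later, here's the substitution written out step by step (a sketch using the standard binomial log-likelihood):

```latex
\begin{align*}
\ell(\pi) &= y \log \pi + (n - y)\log(1 - \pi) \\
\frac{\partial^2 \ell}{\partial \pi^2}
  &= -\frac{y}{\pi^2} - \frac{n - y}{(1-\pi)^2} \\
I(\pi) = -E\!\left[\frac{\partial^2 \ell}{\partial \pi^2}\right]
  &= \frac{E[Y]}{\pi^2} + \frac{n - E[Y]}{(1-\pi)^2}
   = \frac{n\pi}{\pi^2} + \frac{n - n\pi}{(1-\pi)^2} \\
  &= \frac{n}{\pi} + \frac{n}{1-\pi}
   = \frac{n}{\pi(1-\pi)} \\
\operatorname{Var}(\hat{\pi}) &\approx I(\pi)^{-1} = \frac{\pi(1-\pi)}{n}
\end{align*}
```

So "taking the expectation" here just means replacing the random quantity Y with E[Y] = n*pi, which linearity allows because Y enters the expression linearly; everything else (n, pi) is a constant.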

Thanks a lot, Dason. It wasn't immediately obvious to me. :)