Binomial Distribution as N goes to Infinity

messianic

07-15-2010, 04:38 PM

I know that the Central Limit Theorem says that all types of distributions go to normal distribution as N goes to infinity. However, is there some way to prove this using the mean and variance of a binomial distribution?

Masteras

07-15-2010, 04:49 PM

What do you mean exactly? If you open a textbook, it's there. What are you looking for, exactly: the mathematical proof? I think that goes through a Taylor series expansion.

Dason

07-15-2010, 05:04 PM

That's not exactly what the CLT says. You need to be a little more careful with the terminology. Why did you start a new thread about this? I'm going to post what I posted in the previous thread you made...

How much math do you know?

You're basically just going to prove the central limit theorem for a specific case. But I think the easiest way for this particular case is to look at the moment generating function for Y/n and then take the limit as n goes to infinity. If you can show that it goes to the MGF of some normal distribution, then you've proved it.

I'll note that Masteras' tip was pretty spot on. The thing is that you want to look at the Taylor series expansion of the MGF; then, using Taylor's theorem, you can basically argue that the higher-order terms can be thrown away as n goes to infinity, and then you get to take the limit.
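To make that concrete, here is a sketch of the argument using the standardized sum rather than Y/n: for $X \sim \mathrm{Binomial}(n, p)$, put $\sigma_n = \sqrt{np(1-p)}$ and $Z_n = (X - np)/\sigma_n$. Then

```latex
M_X(t) = \bigl(1 - p + p e^{t}\bigr)^{n}, \qquad
M_{Z_n}(t) = e^{-npt/\sigma_n}\bigl(1 - p + p e^{t/\sigma_n}\bigr)^{n},

\log M_{Z_n}(t)
  = -\frac{npt}{\sigma_n}
    + n \log\!\Bigl(1 + p\bigl(e^{t/\sigma_n} - 1\bigr)\Bigr)
  = \frac{t^2}{2} + O\bigl(n^{-1/2}\bigr)
  \;\longrightarrow\; \frac{t^2}{2},
```

so $M_{Z_n}(t) \to e^{t^2/2}$, the MGF of $N(0,1)$. The middle equality is exactly the Taylor step: expand with $e^{u} - 1 = u + u^2/2 + O(u^3)$ and $\log(1+v) = v - v^2/2 + O(v^3)$, note that $\sigma_n^2 = np(1-p)$ makes the linear terms cancel and the quadratic term equal $t^2/2$, and the leftover terms are $n \cdot O(\sigma_n^{-3}) = O(n^{-1/2})$.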

Masteras

07-15-2010, 05:43 PM

of the mean of course

Dason

07-15-2010, 06:20 PM

of the mean of course

Of course. I guess I didn't explicitly state that in the previous post, but I do mention that you're looking at the MGF of Y/n when I quote myself. And there's a lot more going on: you really should be subtracting the mean and dividing by the standard deviation over sqrt(n), yadda yadda yadda, insert central limit details here.
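A quick numerical sanity check of that standardization, sketched with only the Python standard library (the pmf is evaluated in log space so large n doesn't overflow; function names are my own):

```python
from math import erf, exp, lgamma, log, sqrt

def binom_pmf(n, k, p):
    """Binomial pmf, computed in log space to avoid overflow for large n."""
    return exp(lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
               + k * log(p) + (n - k) * log(1 - p))

def standardized_binom_cdf(n, p, z):
    """P((X - np)/sqrt(np(1-p)) <= z) for X ~ Binomial(n, p), summed exactly."""
    mu, sigma = n * p, sqrt(n * p * (1 - p))
    kmax = int(mu + z * sigma)  # largest k with (k - mu)/sigma <= z (z > 0 here)
    return sum(binom_pmf(n, k, p) for k in range(kmax + 1))

def normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

# The gap to the standard normal CDF shrinks as n grows
for n in (20, 200, 2000):
    gap = abs(standardized_binom_cdf(n, 0.3, 1.0) - normal_cdf(1.0))
    print(n, round(gap, 4))
```

For p = 0.3 and z = 1, the gap is a few percent at n = 20 and well under one percent by n = 2000.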

Masteras

07-15-2010, 07:19 PM

What you're describing is one way, but I was talking about the log-likelihood and the Taylor expansion there. Anyway, it works either way.

Dragan

07-15-2010, 07:36 PM

I know that the Central Limit Theorem says that all types of distributions go to normal distribution...

Ahem, be careful: a basic prerequisite is that the distribution must have a finite mean and finite (positive) variance.
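A quick stdlib-only sketch of why that prerequisite matters: sample means of a standard Cauchy (no finite mean or variance) never concentrate, while means of a uniform do. The function names and parameter choices are illustrative, not canonical.

```python
import random
from math import pi, tan

random.seed(0)  # fixed seed so the demo is reproducible

def cauchy():
    # standard Cauchy via inverse-CDF sampling; it has no finite mean or variance
    return tan(pi * (random.random() - 0.5))

def iqr(xs):
    # interquartile range: a spread measure that exists even without moments
    xs = sorted(xs)
    return xs[(3 * len(xs)) // 4] - xs[len(xs) // 4]

reps, n = 400, 2000
cauchy_means = [sum(cauchy() for _ in range(n)) / n for _ in range(reps)]
uniform_means = [sum(random.random() for _ in range(n)) / n for _ in range(reps)]

# The mean of n iid Cauchy variables is itself standard Cauchy, so its spread
# never shrinks with n and the CLT cannot apply; uniform means concentrate.
print(round(iqr(cauchy_means), 3), round(iqr(uniform_means), 5))
```

The Cauchy means keep an interquartile range near 2 no matter how large n gets, while the uniform means' spread collapses like 1/sqrt(n).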
