Prometheus

New Member
Just want to check whether I understand something about SD correctly.

Am I right in thinking that, generally, as a sample size increases, its SD will get smaller? If true, does the SD stop getting smaller as it converges to the SD of the theoretical distribution from which it is sampled?

Thanks as always.

Dason

I think you have the concepts of standard error and standard deviation somewhat mixed up (although that's understandable since the standard error is a standard deviation - but of a different distribution).

The standard deviation of the data won't necessarily get bigger or smaller as the sample size increases. It might fluctuate a little if you're doing something like looking at the rolling standard deviation as you keep adding observations, but it's not going to automatically get smaller as the sample size increases.

I think what you're thinking of is the standard deviation of the sample mean (also referred to as the standard error). Basically, the sample mean becomes more precise (has less variability) for large sample sizes. This sort of makes sense: you might not expect a sample with 3 observations to have a mean that's really, really close to the true population mean, but if you have 2345023402304230420 observations then you'd expect the sample mean to be really, really close to the population mean. That's the standard deviation that decreases - the standard deviation of the sample mean (i.e. the standard error). But if you looked at the actual standard deviation of those 2345023402304230420 observations, there is no reason to expect it to be tiny - it will be really close to whatever the true standard deviation in the population is.
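A quick sketch of this point in Python (the population here is an assumed normal with mean 100 and SD 15, chosen just for illustration): as n grows, the sample SD hovers around the true population SD, while the standard error of the mean, SD/sqrt(n), keeps shrinking.

```python
import random
import statistics

random.seed(0)

# Assumed population for illustration: normal, mean 100, SD 15.
POP_MEAN, POP_SD = 100.0, 15.0

for n in (10, 1_000, 100_000):
    sample = [random.gauss(POP_MEAN, POP_SD) for _ in range(n)]
    sd = statistics.stdev(sample)   # sample standard deviation - stays near 15
    se = sd / n ** 0.5              # standard error of the sample mean - shrinks
    print(f"n={n:>7}  SD={sd:6.2f}  SE={se:7.4f}")
```

The SD column wobbles around 15 regardless of n; only the SE column heads toward zero.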

Prometheus

New Member
That makes sense. I think I had the pieces, but thanks for putting them together clearly.

hlsmith

Omega Contributor
This would be nice to see in a simulation. I think you could say the sample SD would get more consistent (less variable) as n grows, even though it doesn't shrink.
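That simulation is easy to sketch. Here's a minimal version (assuming a normal population with mean 0 and SD 5, values picked arbitrarily) that tracks the rolling SD as observations accumulate: it fluctuates early on, settles near the population SD, and never heads toward zero, while the SE of the mean does.

```python
import random
import statistics

random.seed(1)

POP_SD = 5.0  # assumed population SD (mean 0), for illustration only

data = []
print("    n  rolling SD  SE of mean")
for n in range(1, 5001):
    data.append(random.gauss(0.0, POP_SD))
    if n in (10, 100, 1000, 5000):
        sd = statistics.stdev(data)     # stabilizes near POP_SD
        print(f"{n:>5}  {sd:10.3f}  {sd / n ** 0.5:10.4f}")
```

The rolling SD column demonstrates hlsmith's "more consistent" point: its wobble shrinks with n even though its level doesn't.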