SD of an average


I have a statistics question, relevant to an engineering project.

A quantity was measured N times, with an average of A and a standard deviation of s.

If the set of N measurements were repeated many times, each time calculating the average, how would I calculate the expected standard deviation of those averages?

Appreciate any help anyone out there can give! Thanks.
Standard error

It sounds like you are asking about the standard error - the standard deviation of a sampling distribution. (In this case, you are interested in the standard deviation of the sampling distribution of the mean.)

If you know the standard deviation of the population (sd) and the number of observations in the sample (n), you can compute the standard error (se) as follows:

se = sd/sqrt(n) where sqrt refers to the square root

Since you usually don't know the population standard deviation, the common practice is to plug in the sample standard deviation (your s) as an estimate of sd, so se is approximately s/sqrt(N).
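To make this concrete, here is a minimal Python sketch that estimates the standard error directly from a list of repeated measurements, using the sample standard deviation in place of the (unknown) population sd. The function name and the example data are illustrative, not from the original post:

```python
import math
import statistics

def standard_error(measurements):
    """Estimate the standard error of the mean from repeated measurements.

    Uses the sample standard deviation s as an estimate of the
    population standard deviation, so se = s / sqrt(n).
    """
    n = len(measurements)
    s = statistics.stdev(measurements)  # sample standard deviation (n - 1 in the denominator)
    return s / math.sqrt(n)

# Example: five repeated measurements of the same quantity (made-up numbers)
data = [10.1, 9.8, 10.3, 10.0, 9.9]
print(standard_error(data))
```

The printed value tells you roughly how much the average of a fresh set of N measurements would be expected to vary from one repetition of the experiment to the next.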


Good luck.