Standard deviation of sampling distribution when n=1

#1
Hi everyone,

I'm reading the following about the theory of sampling distributions: "The standard deviation of the sample means (known as the standard error of the mean) will be smaller than the population standard deviation and will be equal to the standard deviation of the population divided by the square root of the sample size."

But what if we have a sample size of 1? Then the standard deviation of the sampling distribution (the standard error of the mean) would be the same as the standard deviation of the population, since σ/√1 = σ. (In other words, it can't always be smaller than the population standard deviation if it's equal to it when n = 1.)

Or is this illogical in this case, because a sampling distribution of all possible samples of size 1 is basically the population itself? Can you even have a sample size of 1? Is this considered a "sample"?
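To make what I mean concrete, here is a quick simulation sketch (Python; the population, seed, and sizes are made up for illustration) comparing the standard deviation of simulated sample means against σ/√n for a few sample sizes, including n = 1:

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary "population"; any distribution works, exponential is just an example.
population = rng.exponential(scale=2.0, size=100_000)
sigma = population.std()  # population standard deviation (ddof=0)

for n in (1, 2, 5, 25):
    # Draw many samples of size n (with replacement) and take each sample's mean.
    idx = rng.integers(0, population.size, size=(50_000, n))
    means = population[idx].mean(axis=1)
    print(f"n={n:2d}: SD of sample means = {means.std():.4f}, "
          f"sigma/sqrt(n) = {sigma / np.sqrt(n):.4f}")
```

For n = 1 the two numbers agree, which is what prompted my question.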

Thank you very much in advance for your help.

Frodo
 

obh

Well-Known Member
#2
Hi Frodo,

Clearly, the standard deviation statistic has practical meaning only when the sample size is greater than 1.
So why are you interested in this question?

For your question ...
If the entire population contains only one subject, then the population standard deviation is 0, as there is no deviation from the average.

If your sample size is 1, then the sample standard deviation is not defined, because the n − 1 in the denominator is 0 (so it is clearly not 0).
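For example (a minimal Python sketch; the values are arbitrary), the usual n − 1 formula breaks down with a single observation:

```python
import statistics
import numpy as np

print(statistics.stdev([5.0, 7.0]))  # fine with two points: 1.414...
# statistics.stdev([5.0])            # raises StatisticsError: needs at least two points

# numpy returns nan (with a warning) because the ddof=1 denominator is n - 1 = 0:
print(np.std([5.0], ddof=1))         # nan
```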
 
#3
Hi obh,

Thank you for your response. This makes sense to me. However, the question I have is about the standard deviation of the sampling distribution.

Even if the standard deviation of the sample is not defined for a sample size of 1, wouldn't the standard deviation of the sampling distribution end up being equal to the standard deviation of the population when the sample size is 1?

In the quote I provided above, it says that the standard deviation of the population is always greater than the standard deviation of the sampling distribution, because the latter is a function of n (it equals the population standard deviation divided by the square root of the sample size). But how can that be if our sample size is 1? Since the square root of 1 is 1, the standard deviation of the sampling distribution would then be the same as the standard deviation of the population, not less than it.
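For a finite population, this degenerate case can even be checked exactly: the collection of all possible size-1 sample means is just the population itself, so the two standard deviations match. A tiny sketch (Python; the population values are made up):

```python
import numpy as np

population = np.array([2.0, 3.0, 7.0, 11.0, 12.0])  # made-up finite population

# Every possible sample of size 1 has a mean equal to the drawn value itself,
# so the sampling distribution of the mean IS the population distribution.
size1_means = population.copy()

print(population.std(ddof=0))   # population SD
print(size1_means.std(ddof=0))  # SD of the sampling distribution for n = 1: identical
```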

I hope you can understand what I'm saying. Thanks again for your help.

Frodo
 

katxt

Active Member
#4
You are right. They should say "the standard deviation of the population is always greater than the standard deviation of the sampling distribution for any sensible sample."
 

spunky

Doesn't actually exist
#5
Frodo said: "The standard deviation of the sample means (known as the standard error of the mean) will be smaller than the population standard deviation and will be equal to the standard deviation of the population divided by the square root of the sample size."
Was this quote taken from a textbook in intro stats/methodology aimed at social scientists?
 

katxt

Active Member
#6
Also, these are only averages: the SEM estimated from any particular sample can be larger than the population SD. With samples of n = 2, the calculated SEM is greater than the population SD about 15% of the time.
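That 15% figure is easy to check by simulation, assuming a normal population (a rough Python sketch; the seed and parameters are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 1.0

# Many samples of size n = 2 from a normal population with SD sigma.
samples = rng.normal(loc=0.0, scale=sigma, size=(200_000, 2))
sem = samples.std(axis=1, ddof=1) / np.sqrt(2)  # estimated SEM for each sample

# Fraction of samples whose estimated SEM exceeds the population SD;
# analytically this is P(|Z| > sqrt(2)) ≈ 0.157 for normal data.
print((sem > sigma).mean())
```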