#### jcl3

##### New Member
Here is the question

Given that you were administering a standardized test with a mean of 100 and a standard deviation of 10, and you knew these statistics were computed on a sample of 10,000 people, would you feel that two people to whom you administered the test had scores that differed beyond the 0.05 level of confidence, given that one scored 101 and the other 99.5? Explain your answer.

- The sample mean is a number that gives information about the "central tendency" of a set of numerical data collected on a sample, a.k.a. the average. So the average standardized test score is 100, correct?

- The standard deviation of a data set is an indication of the "dispersion" or "spread" of the measurements in the data set. So scores within one standard deviation of the mean would run from 90 to 110, correct?
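As a quick numeric check of that 90–110 band (a sketch, assuming the scores are roughly normally distributed, which the question does not state), mean ± one standard deviation covers about 68% of scores rather than the full spread:

```python
from statistics import NormalDist

# Hypothetical score distribution from the question: mean 100, SD 10
scores = NormalDist(mu=100, sigma=10)

# Expected fraction of test-takers scoring between 90 and 110 (mean ± 1 SD)
within_one_sd = scores.cdf(110) - scores.cdf(90)
print(f"P(90 <= X <= 110) ~ {within_one_sd:.3f}")  # ~ 0.683
```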

According to the question, I need to be 95% confident before saying the two scores really differ, correct? If that is the case, test scores of 101 and 99.5 have not differed beyond my confidence level, since they are only 1.5 points apart while the standard deviation is 10.
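One way to check the 0.05 question directly (a sketch, not from the thread; it assumes the two scores are independent draws from a normal distribution with SD 10, so the standard error of their difference is SD·√2):

```python
from math import sqrt
from statistics import NormalDist

sd = 10.0                # population standard deviation given in the question
diff = 101 - 99.5        # observed difference between the two people's scores
se_diff = sd * sqrt(2)   # standard error of a difference of two independent scores

z = diff / se_diff                    # z ~ 0.106, far inside +/-1.96
p = 2 * (1 - NormalDist().cdf(z))     # two-tailed p-value
print(f"z = {z:.3f}, p = {p:.3f}")    # p well above 0.05: not significant
```

With a difference of only 1.5 points against a standard error of about 14, there is no basis for calling the two scores significantly different at the 0.05 level.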

Am I on track?

John

#### ohms_law

##### New Member
> So the average standardized test score is 100, correct?
Basically, yes.

> So the spread would be from 90 – 110, correct?

With a standard deviation of 10, yes, 90 to 110 is the mean ± one standard deviation — though note that band covers only about 68% of scores, not the full spread.

> Am I on track?
>
> John

Looks that way to me.