
Thread: Standard error v standard deviation

  1. #1

    What is the difference between the standard error and the standard deviation, please?

    Thank you

  2. #2
    Standard deviation is a measure of spread or variability for a given set of scores. The standard error quantifies how much variability exists between your sample statistic and the population parameter.
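
    To make that concrete, here is a quick sketch in Python (the scores are made-up illustrative data, not from any real study): the standard deviation describes the spread of the individual scores, while the standard error of the mean describes how much the sample mean itself is expected to vary.

    ```python
    import math

    # Hypothetical sample of scores (illustrative data only).
    scores = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
    n = len(scores)
    mean = sum(scores) / n

    # Sample standard deviation: spread of the individual scores
    # (n - 1 in the denominator, the usual Bessel-corrected version).
    sd = math.sqrt(sum((x - mean) ** 2 for x in scores) / (n - 1))

    # Standard error of the mean: expected spread of the sample mean itself.
    se = sd / math.sqrt(n)

    print(mean, sd, se)
    ```

    The standard error is always smaller than the standard deviation here, because it is the standard deviation shrunk by a factor of sqrt(n).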


  4. #4
    this is assuming you mean the standard error of the mean:

    if X ~ N(mu, sigma^2)
    (that is, if X follows a normal distribution with mean mu and variance sigma^2)

    then Xbar ~ N(mu, sigma^2 / n)
    (the sample mean Xbar follows a normal distribution with mean mu and variance sigma^2 / n)

    'the' standard deviation usually refers to the square root of the variance of X's distribution,
    sqrt(sigma^2) = sigma

    the standard error (of the mean) refers to the square root of the variance of Xbar's distribution,
    sqrt(sigma^2 / n) = sigma/sqrt(n)

    standard deviation = square root of variance of X's distribution
    standard error (of the mean) = square root of variance of Xbar's distribution
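
    The fact that Xbar has standard deviation sigma/sqrt(n) is easy to check by simulation. A sketch in Python (mu, sigma, and n are arbitrary illustrative choices): draw many samples of size n from N(mu, sigma^2), and the standard deviation of the resulting sample means comes out close to sigma/sqrt(n).

    ```python
    import random
    import statistics

    random.seed(0)  # fixed seed so the run is reproducible
    mu, sigma, n = 10.0, 2.0, 25  # illustrative choices

    # Draw many samples of size n and record each sample mean.
    sample_means = [
        statistics.mean(random.gauss(mu, sigma) for _ in range(n))
        for _ in range(20_000)
    ]

    # Empirical spread of the sample means vs. the theoretical sigma / sqrt(n).
    empirical_se = statistics.stdev(sample_means)
    theoretical_se = sigma / n ** 0.5  # = 2 / 5 = 0.4 here
    print(empirical_se, theoretical_se)
    ```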

  5. #5
    Let's get away from stats-speak for a minute and talk about what the difference between these terms is for the average person.

    Standard deviation is a measure of dispersion. When you are looking at individual datapoints, standard deviation gives you a measuring tool to put a probability value on the difference between the datapoint and the mean of the population.

    Standard error is EXACTLY the same thing... a measure of dispersion... only not for individual datapoints but for sample means. When you take a sample of a certain size "n", the mean of that sample is measured against the population mean using standard error as the measuring stick instead of standard deviation. Remember... if you took EVERY POSSIBLE sample combination of a certain size "n" from a population "N", the distribution of the sample means would be (at least approximately) NORMAL!!! That is why standard error is applicable to samples.

    The underlying logical reason for this is that the mean of a sample would be expected to be more representative of the population mean than an individual datapoint. And, the larger the sample the more accurate we expect it to be in representing the population. That's why standard error gets smaller as the sample size gets larger... we are 'judging' the sample means by a tougher and tougher standard as the sample size grows. Look at the formula for standard error. If the sample size is 1... you get the standard deviation formula. And, if the sample size is the entire population, you would get... what??
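
    A tiny illustration of that "tougher and tougher standard" (sigma here is a hypothetical population standard deviation, chosen just for the example):

    ```python
    import math

    sigma = 10.0  # hypothetical population standard deviation

    # Standard error of the mean for increasing sample sizes.
    ses = {n: sigma / math.sqrt(n) for n in [1, 4, 25, 100]}
    for n, se in ses.items():
        print(f"n={n}: SE={se}")
    # n = 1 recovers sigma itself; quadrupling n halves the standard error.
    ```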

    Hope this helps.

  6. #6
    In trying to find a good explanation for this I went through this thread. After reviewing the formulas I see mathematically that when the sample size is 1 you get the standard deviation, and when the sample size is the entire population the standard error is 0.

    What I don't understand is how the standard error is a measure of dispersion of the sample means. The formula for the standard error, as mentioned above, is the sample standard deviation divided by sqrt(n). In the sample standard deviation formula we square the differences between the x_i's and xbar, divide by n - 1, and take the square root of the whole thing. I don't understand how using the x_i's in our sample gives us information about the dispersion of all the sample means. Are we assuming here that the x_i's represent where all the possible sample means might be, and that taking the differences between the x_i's and xbar is a good approximation to taking the differences between all the possible sample means and the true mu?
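
    One way to see what s / sqrt(n) is estimating is to compare it with the spread you would observe if you really could repeat the sampling many times. A sketch in Python (mu, sigma, and n are illustrative choices, not from the thread): the estimate computed from a single observed sample lands close to the actual dispersion of the sample means.

    ```python
    import random
    import statistics

    random.seed(1)  # fixed seed for reproducibility
    mu, sigma, n = 0.0, 3.0, 30  # illustrative population and sample size

    # The one sample we actually observe: estimate the SE as s / sqrt(n).
    one_sample = [random.gauss(mu, sigma) for _ in range(n)]
    estimated_se = statistics.stdev(one_sample) / n ** 0.5

    # Many hypothetical repeated samples: the actual dispersion of sample means.
    many_means = [
        statistics.mean(random.gauss(mu, sigma) for _ in range(n))
        for _ in range(20_000)
    ]
    actual_spread = statistics.stdev(many_means)

    print(estimated_se, actual_spread)  # the two numbers should be close
    ```

    The single-sample estimate works because s estimates sigma, and the theory says the sample means spread out like sigma / sqrt(n); the x_i's themselves are not stand-ins for the sample means.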

