Difference between RMS & Standard Deviation

#1
Hi there,

I am trying to figure out the difference between RMS and Standard Deviation. Are they two ways of saying the same thing? I understand that the variance is calculated with the following formula (excuse the notation!)

s² = Σ(xᵢ − µ)² / N

where
µ is the population mean
N is the population size.
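
For concreteness, here is roughly how I would compute that in Python/NumPy (the data are just made up for illustration):

```python
import numpy as np

x = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])  # made-up data, treated as the whole population

mu = x.mean()                          # population mean µ
var = np.sum((x - mu) ** 2) / len(x)   # s² = Σ(xᵢ − µ)² / N
sd = np.sqrt(var)                      # standard deviation s

# should agree with NumPy's population (ddof=0) versions
print(var, np.var(x))
print(sd, np.std(x))
```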

I then found the definition of RMS to be

RMS (Root Mean Squared) Error. To calculate the RMS (root mean squared) error, the individual errors are squared, added together, divided by the number of individual errors, and then square-rooted. This gives a single number which summarises the overall error.
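
In the same NumPy terms, I read that definition as something like the following (the predicted and observed values are hypothetical, purely to show the calculation):

```python
import numpy as np

# hypothetical predictions and observations, just for illustration
predicted = np.array([1.0, 2.0, 3.0, 4.0])
observed  = np.array([1.1, 1.9, 3.2, 3.7])

errors = predicted - observed                  # individual errors
rms_error = np.sqrt(np.mean(errors ** 2))      # square, average, then take the square root
print(rms_error)                               # single number summarising the overall error
```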

However, I thought that (xᵢ − µ) would be the error. The variance is therefore the individual errors squared, added together, and divided by the number of individual errors; its square root would then be the RMS, which would also be the standard deviation.
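
A quick numerical check of that reasoning, using the same made-up data as above (np.std defaults to the population formula, dividing by N rather than N − 1, so if I'm right the two numbers should coincide):

```python
import numpy as np

x = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])  # same made-up data as above

deviations = x - x.mean()                                # treat (xᵢ − µ) as the "errors"
rms_of_deviations = np.sqrt(np.mean(deviations ** 2))    # RMS of those errors

print(np.isclose(rms_of_deviations, np.std(x)))          # True: identical to the population SD
```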

Is this correct? Sorry for being a bit dumb!

Thanks

Tim

JohnM

TS Contributor
#2
From what I am able to gather, the term is used with slightly different meanings in different contexts, but when the errors are deviations from the mean it's basically the same thing as a standard deviation.