This may be a naïve question, but I have read in several sources (example) that the relative standard deviation (RSD) becomes unreliable when the mean is close to zero.
My questions are:
1.) How close should the mean be to zero before RSD (%) becomes unreliable for expressing statistical spread?
2.) Would RSD (%) be suitable for a data set such as the one shown below?
Suppose my replicate values are as follows (n = 12):
Data points: 0.1223, 0.1251, 0.1302, 0.1287, 0.1211, 0.1195, 0.1338, 0.1240, 0.1263, 0.1252, 0.1279, 0.1216
Sample mean: 0.1255
Sample standard deviation: 0.00416
Skewness: 0.5134
Kurtosis: -0.1534
D'Agostino & Pearson omnibus normality test (alpha = 0.05): passed
Relative standard deviation: 3.32%
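In case it helps, here is a minimal sketch of how these summary statistics can be reproduced in Python (assuming NumPy and SciPy are available). The skewness, kurtosis, and normality-test results may differ slightly from the values above depending on the bias corrections the original software applied.

```python
import numpy as np
from scipy import stats

# Replicate measurements (n = 12)
x = np.array([0.1223, 0.1251, 0.1302, 0.1287, 0.1211, 0.1195,
              0.1338, 0.1240, 0.1263, 0.1252, 0.1279, 0.1216])

mean = x.mean()
sd = x.std(ddof=1)        # sample standard deviation (n - 1 denominator)
rsd = 100 * sd / mean     # relative standard deviation, %

skew = stats.skew(x, bias=False)      # bias-corrected sample skewness
kurt = stats.kurtosis(x, bias=False)  # bias-corrected excess kurtosis

# D'Agostino & Pearson omnibus test; note SciPy warns that the kurtosis
# component is only strictly valid for n >= 20.
k2, p = stats.normaltest(x)

print(f"mean = {mean:.4f}, SD = {sd:.5f}, RSD = {rsd:.2f}%")
print(f"skewness = {skew:.4f}, excess kurtosis = {kurt:.4f}")
print(f"normality p-value = {p:.3f}, pass at alpha = 0.05: {p > 0.05}")
```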
Thank you!