Apologies if this is in the wrong forum; please move it if so (I have also posted this in the Statistical Research forum).

Just a query that might be intractable.
When calculating the Standard Error of Measurement (SEM), we take the square root of the residual Mean Square Error (the MSE from the ANOVA table), then multiply by 1.96 and by root 2 to get the Smallest Detectable Change (SDC); that is, SDC = 1.96 × √2 × SEM = 1.96 × √2 × √MSE.
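In case it helps, here is a minimal Python sketch of the calculation as I understand it (the function names are just mine, for illustration):

```python
import math

def sem_from_mse(mse: float) -> float:
    """Standard Error of Measurement: the square root of the residual MSE."""
    return math.sqrt(mse)

def sdc_from_mse(mse: float) -> float:
    """Smallest Detectable Change: 1.96 * sqrt(2) * SEM."""
    return 1.96 * math.sqrt(2) * sem_from_mse(mse)
```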
My query is this: when the MSE is less than 1 and the mean of the measured value is around 1, we get some anomalies.
Say we have a mean of 1 and a residual MSE of 0.05. The SDC then comes out at 0.62, or 62% of the mean, which looks lousy. But if we multiply everything by 100 before we start (say we arbitrarily choose to work in Newton-centimetres instead of Newton-metres), then with the same values (a mean now of 100, and an MSE now of 5) our SDC is 6.2, or only 6.2% of the mean, and everything is sunshine and roses again.
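Plugging the two scenarios into the same formula (these are just the numbers from my example above, nothing re-derived):

```python
import math

k = 1.96 * math.sqrt(2)      # the SDC multiplier from the formula above

# Newton-metres: mean = 1, residual MSE = 0.05
print(k * math.sqrt(0.05))   # ~0.62, i.e. 62% of a mean of 1

# Newton-centimetres: mean = 100, MSE of 5 as above
print(k * math.sqrt(5))      # ~6.2, i.e. 6.2% of a mean of 100
```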
I am inclined to think this is a bug in how SDCs are generated, but I want to make sure I'm not committing a howler here.
Any thoughts?