When can I consider a beta (standardized coefficient) big?

#1
Hello Guys,

I need your help once again (unfortunately). Below you'll find my regression model, but I don't know how to interpret the beta (β) (standardized coefficients). I know it is meant to compare the variables, but how do I do that correctly? When can I consider something big? I get that a value of 0.01 is small, but can I call 0.07 small? Can I call 0.20 big? I know the beta is measured on a -1 to 1 scale, and on that scale a 0.20 seems big.

What is the rule of thumb for this? I can't find it anywhere in my statistics manual.

[Attached image: regression model output]

Karabiner

TS Contributor
#2
You could consider the b's (such as: for each 5-year increase in age, the dependent variable increases by 5 * 0.36 points).
Whether such observations are interesting, or theoretically or practically important, you'll probably have to judge
for yourself, based on the theoretical and/or practical context of the study. Mind that you cannot straightforwardly
interpret noisy estimates from a small or medium-sized sample as "effect sizes". The true effect sizes (those in the
population) may be larger or smaller than your estimates from the sample data.
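For concreteness, here is a toy sketch (simulated data and made-up variable names, in Python rather than SPSS) of how the two kinds of coefficients relate: the standardized beta is just b re-expressed in standard-deviation units, beta = b * sd(x) / sd(y), so refitting the same model on z-scored variables reproduces it.

```
# Toy illustration only -- simulated data, not the model from the attachment.
import numpy as np

rng = np.random.default_rng(42)
n = 200
age = rng.uniform(20, 70, n)                         # hypothetical predictor (years)
income = rng.normal(30, 10, n)                       # hypothetical second predictor
y = 0.36 * age + 0.5 * income + rng.normal(0, 8, n)  # hypothetical outcome

# Unstandardized fit (raw units); drop the intercept from the result
X = np.column_stack([np.ones(n), age, income])
b = np.linalg.lstsq(X, y, rcond=None)[0][1:]

# Standardized fit: z-score every variable and refit
def z(v):
    return (v - v.mean()) / v.std(ddof=1)

beta = np.linalg.lstsq(np.column_stack([z(age), z(income)]), z(y), rcond=None)[0]

# The same betas via the conversion b * sd(x) / sd(y)
beta_check = b * np.array([age.std(ddof=1), income.std(ddof=1)]) / y.std(ddof=1)

print("b (raw units):      ", b.round(3))
print("beta (z-scored fit):", beta.round(3))
print("b * sd(x)/sd(y):    ", beta_check.round(3))
# Reading b for age: a 5-year increase in age goes with roughly 5 * b_age
# more points on the outcome, holding income constant.
```

The conversion only changes the scale, not the information, so whether a particular beta is big enough to matter still comes down to the substantive context, as above.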

There has been some discussion of betas as effect size measures; see, for example:
https://www.researchgate.net/post/Do-effect-size-heuristics-for-standardised-betas-exist
https://www.semanticscholar.org/pap...iemi/461c004f11b92a6331b7aa0f24bc44c4da83b6ab

Just my 2pence

Karabiner
 
#3
Thank you for your response. I found the following on one of the websites you recommended:

"Limitations of Standardized regression coefficients
The standardized coefficients are misleading if the variables in the model have different standard deviations means all variables are having different distributions."

Maybe a stupid question, but I could not find anything about it online. How can I calculate the mean of a standard deviation? I also can't find that option in SPSS.
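The quoted passage refers to each variable in the model having its own standard deviation (one per variable), rather than to a "mean of a standard deviation". A minimal sketch (made-up data; the DataFrame name df and its column names are assumptions, in Python rather than SPSS) of lining up each variable's mean and standard deviation:

```
# Minimal sketch -- simulated data; 'df' and its column names are made up.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "age": rng.uniform(20, 70, 100),
    "income": rng.normal(30, 10, 100),
    "score": rng.normal(50, 15, 100),
})

# One mean and one standard deviation per variable (column)
print(df.agg(["mean", "std"]).round(2))
```

In SPSS, the Descriptives procedure (Analyze > Descriptive Statistics > Descriptives) should list the same per-variable means and standard deviations.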