Linear Regression (Age Interval, 10 years)

jle

New Member
#1
Hi guys,

I initially ran a linear regression, but we found that age (measured in years) produced a negligible B coefficient.

In response, we ran the same model using age in 10-year intervals.

To achieve this, I simply divided the age variable by 10, which increased the B coefficient.

Is this the correct approach? I worry that I have done it incorrectly, although the change in the coefficient suggests to me that it is appropriate.
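[A minimal sketch of what this rescaling does, in Python with NumPy (the thread never names a package, and the ages and effect sizes below are invented): dividing the predictor by 10 multiplies the slope by exactly 10 but leaves the fitted model unchanged.]

```python
import numpy as np

rng = np.random.default_rng(0)
age = rng.uniform(20, 80, 200)                 # invented ages in years
y = 0.02 * age + 5 + rng.normal(0, 1, 200)     # tiny per-year slope

# Design matrices with age in years vs. age in decades
X_years = np.column_stack([np.ones_like(age), age])
X_decades = np.column_stack([np.ones_like(age), age / 10])

b_years, *_ = np.linalg.lstsq(X_years, y, rcond=None)
b_decades, *_ = np.linalg.lstsq(X_decades, y, rcond=None)

# The slope is exactly 10x larger, but the fitted values are identical:
print(np.isclose(b_decades[1], 10 * b_years[1]))
print(np.allclose(X_years @ b_years, X_decades @ b_decades))
```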
 

Dason

Ambassador to the humans
#2
Did the B coefficient increase tenfold? Did you notice that the standard error increased tenfold as well?
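[Dason's point can be checked numerically. A hedged sketch with NumPy on simulated data: both the slope and its standard error scale by 10, so the t statistic, and hence the significance, is exactly the same.]

```python
import numpy as np

def slope_se_t(x, y):
    # OLS slope, its standard error, and the t statistic.
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - 2)
    se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
    return beta[1], se, beta[1] / se

rng = np.random.default_rng(1)
age = rng.uniform(20, 80, 200)                # made-up ages in years
y = 0.02 * age + 5 + rng.normal(0, 1, 200)    # small per-year effect

b1, se1, t1 = slope_se_t(age, y)
b10, se10, t10 = slope_se_t(age / 10, y)

print(round(b10 / b1), round(se10 / se1))   # both scale by 10
print(np.isclose(t1, t10))                  # significance is unchanged
```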
 

maartenbuis

TS Contributor
#4
Don't feel silly. I do this all the time. Unless we are talking about small children, a one-year increase in age is a small change, and I would not expect that to have a big impact. So I often measure age in decades rather than years, using the exact same trick as you used. This won't change the model, as you just noticed, but in my applications it often makes the coefficients easier to interpret. The bigger issue with age is that its effect is often non-linear, especially if age covers a large range.
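[One common way to handle the non-linearity maartenbuis mentions is to add a squared age term. A sketch in Python/NumPy on simulated data (the curved effect below is invented for illustration): when the true effect is curved, the quadratic model fits far better than the linear one.]

```python
import numpy as np

rng = np.random.default_rng(2)
age = rng.uniform(20, 80, 300)
# Made-up outcome with a curved (inverted-U) age effect:
y = -0.01 * (age - 50) ** 2 + rng.normal(0, 1, 300)

decades = age / 10
X_lin = np.column_stack([np.ones_like(decades), decades])
X_quad = np.column_stack([np.ones_like(decades), decades, decades ** 2])

def rss(X, y):
    # Residual sum of squares of the least-squares fit.
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return r @ r

print(rss(X_quad, y) < rss(X_lin, y))   # the quadratic captures the curve
```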
 

hlsmith

Omega Contributor
#5
Probably a moot point, but given your program you can also leave age in as one-year increments and then just run model estimates to get the 10-year or other unit changes.
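[hlsmith's post-hoc route can be sketched in Python/NumPy on simulated data (the thread doesn't say what software the poster uses): fit the per-year model once, then get the per-decade estimate and standard error by multiplying both by 10, which matches refitting with a rescaled predictor exactly.]

```python
import numpy as np

def ols_slope_se(x, y):
    # Slope and standard error of a simple linear regression.
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - 2)
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return beta[1], np.sqrt(cov[1, 1])

rng = np.random.default_rng(0)
age = rng.uniform(20, 80, 200)               # hypothetical ages in years
y = 0.02 * age + 5 + rng.normal(0, 1, 200)

b_year, se_year = ols_slope_se(age, y)

# Per-decade estimate without refitting: scale estimate and SE by 10.
b_decade, se_decade = 10 * b_year, 10 * se_year

# Identical to refitting with age/10 as the predictor:
b_refit, se_refit = ols_slope_se(age / 10, y)
print(np.isclose(b_decade, b_refit), np.isclose(se_decade, se_refit))
```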
 

Dason

Ambassador to the humans
#6
Is age the only predictor? If so, what does the scatterplot look like? Even if it isn't, what does the residual-by-age plot look like?
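[A rough numeric stand-in for eyeballing the residual-by-age plot Dason asks about, sketched in Python/NumPy on simulated data: if the true age effect is curved, a linear fit leaves residuals that are systematically positive in one age range and negative in another.]

```python
import numpy as np

rng = np.random.default_rng(3)
age = rng.uniform(20, 80, 500)
# Invented outcome with a curved age effect:
y = -0.01 * (age - 50) ** 2 + rng.normal(0, 1, 500)

X = np.column_stack([np.ones_like(age), age])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Compare mean residuals for middle ages vs. the extremes:
mid = (age > 40) & (age < 60)
print(resid[mid].mean() > 0)    # fit underestimates the middle
print(resid[~mid].mean() < 0)   # and overestimates the extremes
```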