# Thread: [Logistic regression: Stepwise] Why does the model keep a non-significant predictor?

1. ## [Logistic regression: Stepwise] Why does the model keep a non-significant predictor?

Hello there,

I'm running a logistic regression with four independent variables, using the Backward: LR stepwise method.

SPSS stops the elimination process at step 3 and thus keeps two variables in the model. One of them (A) is significant at the 95% level (p = .046), the other (B) is non-significant (p = .081). The constant for the model at step 3 is significant (p = .003). The omnibus test at step 3 is also significant, so I guess the model fits the data.

How should I interpret this? Can I conclude that the model (A and B together) has an effect on the outcome variable? And if so, is the combined effect of A and B significant even though variable B is non-significant on its own?

Thank you

2. ## Re: [Logistic regression: Stepwise] Why does the model keep a non-significant predictor?

How many variables do you have? Forget stepwise, just do it yourself. Back in the day there weren't TV remotes, people had to get up and turn the station themselves. If the models are all nested within each other, you can compare the differences in -2 log-likelihood values against the chi-square distribution with df = the difference in the number of terms between the models.
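The -2LL comparison described above can be sketched in plain Python (not SPSS). The function names and the example -2LL values below are hypothetical; the sketch uses the closed-form chi-square survival function, which exists for even degrees of freedom:

```python
import math

def chi2_sf(x, df):
    """P(X > x) for a chi-square variable with even df.
    Closed form: exp(-x/2) * sum_{k=0}^{df/2-1} (x/2)^k / k!"""
    assert df % 2 == 0, "this closed form only covers even df"
    half = x / 2.0
    return math.exp(-half) * sum(half**k / math.factorial(k) for k in range(df // 2))

def lr_test(neg2ll_reduced, neg2ll_full, df_diff):
    """Likelihood-ratio test between two nested models,
    using the -2 log-likelihood values SPSS reports."""
    chi_sq = neg2ll_reduced - neg2ll_full
    return chi_sq, chi2_sf(chi_sq, df_diff)

# Hypothetical -2LL values whose drop matches the chi-square later in the thread:
chi_sq, p = lr_test(100.0, 88.818, 2)
print(round(chi_sq, 3), round(p, 3))
```

A smaller -2LL means a better fit, so the drop from the reduced to the full model is the test statistic; a small p-value says the extra terms improve fit beyond chance.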

3. ## Re: [Logistic regression: Stepwise] Why does the model keep a non-significant predictor?

Originally Posted by hlsmith
How many variables do you have? Forget stepwise, just do it yourself. Back in the day there weren't TV remotes, people had to get up and turn the station themselves. If the models are all nested within each other, you can compare the differences in -2 log-likelihood values against the chi-square distribution with df = the difference in the number of terms between the models.
Thank you for answering, and sorry for my delayed response.

I have 4 predictors in my model, all measured at the ordinal level, and one outcome variable which is nominal. My thesis question is whether any of these four independent variables can predict the outcome in my dependent variable. SPSS keeps two variables (A and B) in the model and eliminates the others (C and D). C and D are eliminated because they do not raise the predictive value of the model enough.

I looked at the differences in -2 log-likelihood values, and the change is significant for the model with A+B (chi-square = 11.182; p = .004). A and B together explain 24% of the variance in the outcome variable.
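As a sanity check on those numbers (assuming df = 2, i.e. the A+B model compared against the intercept-only model), the reported p-value can be recovered from the chi-square statistic alone, since for df = 2 the chi-square survival function reduces to exp(-x/2):

```python
import math

# df = 2 is an assumption: two added terms (A and B) versus the null model.
p = math.exp(-11.182 / 2)
print(round(p, 3))  # matches the reported p = .004
```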

Based on the significant change in -2 log-likelihood, my question is:
Can I conclude that the model (A and B combined) is statistically significant at the 95% level, even though one of the predictors (B) is itself non-significant (p = .08) according to that same criterion? That is, that A and B together have a significant effect even though B alone does not?

These statistical analysis methods are beyond my degree...

