# How to interpret inconsistent Beta values in different steps of hierarchical regression?

#### Cyrs

##### New Member
I ran a hierarchical regression analysis on my data because my research model includes moderation effects.
View attachment 2527
R² increased from .695 in model 1 (main effects only) to .734 in model 2 (main and interaction effects) (sig. F change = .000). All the assumptions for the regression analysis have been met. I have two problems with the "coefficients" table:
View attachment 2528
1. As you can see in the table, the non-significant beta value of ZSC in model 1 became significant in model 2! Is that OK? I'm confused! Which value should I use to reject/accept the related hypothesis: the B of model 1 (which rejects the hypothesis) or of model 2 (which supports it)?

2. Although the beta value for ZSC_X_CS is significant, its positive sign contradicts the hypothesis! It is supposed to be negative according to both the literature and logic. How should I treat this hypothesis? Accept, reject, or partially accept?
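The reported "sig. F change" can be reproduced by hand from the two R² values. A minimal sketch follows; note that the thread does not report the sample size or the number of predictors in each model, so `n`, `k1`, and `k2` below are hypothetical placeholders chosen purely for illustration.

```python
from scipy import stats

# R-squared values reported in the thread (main effects vs. main + interaction).
r2_model1 = 0.695
r2_model2 = 0.734

# Hypothetical values -- the thread does not report these, so they are
# placeholders for illustration only.
n = 100   # sample size (assumed)
k1 = 2    # predictors in model 1 (assumed)
k2 = 3    # predictors in model 2 (assumed: one interaction term added)

# F change for adding (k2 - k1) predictors to the reduced model.
f_change = ((r2_model2 - r2_model1) / (k2 - k1)) / ((1 - r2_model2) / (n - k2 - 1))
p_value = stats.f.sf(f_change, k2 - k1, n - k2 - 1)
print(f"F change = {f_change:.2f}, p = {p_value:.4f}")
```

With these assumed values the F change is clearly significant, which matches the reported "sig. F change = .000"; the real numbers depend on the actual n and predictor counts.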

#### noetsi

##### No cake for spunky
Re: How to interpret inconsistent Beta values in different steps of hierarchical regression

A question about the method you call hierarchical regression. The way I have seen that term used before, the analysis has multiple levels (with parameters at lower levels explaining variation in parameters at higher levels); this is often called multilevel analysis. I don't see that in your model, so I wanted to be sure that is what you are talking about. Or are you simply doing OLS and adding variables at each step?

Many factors can cause a variable to change its coefficient or significance from model to model. Multicollinearity is one, although I would not expect that simply from adding an interaction effect. Are you adding new main effects in the second model, or just an interaction effect? You should not use R-squared to decide which model is better when you are adding variables; use adjusted R-squared instead, or simply the F-change test. These assess the overall model. If you want to know whether to accept the slope coefficients (rather than the overall model), you should use the model that makes the most theoretical sense.
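The adjusted R-squared mentioned above penalizes a model for the number of predictors it uses, so it can fall when an added variable contributes too little. A small sketch of the standard formula, applied to the R² values from the thread; the sample size and predictor counts are assumptions, since the thread does not report them.

```python
def adjusted_r2(r2, n, k):
    """Adjusted R-squared for a model with k predictors and n observations."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# Hypothetical sample size and predictor counts (not reported in the thread).
n = 100
adj1 = adjusted_r2(0.695, n, k=2)  # model 1: main effects only (assumed k)
adj2 = adjusted_r2(0.734, n, k=3)  # model 2: main + interaction (assumed k)
print(f"adjusted R2: model 1 = {adj1:.3f}, model 2 = {adj2:.3f}")
```

Under these assumptions the adjusted R² still rises from model 1 to model 2, which is the pattern noetsi says to look for.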

It would help if you listed which variables are in each model (I cannot see the table). A reversal of signs may reflect a moderator effect (where a variable has an indirect effect on the DV through another IV) or multicollinearity. Normally this would not be an issue if you are only adding an interaction term, but I am not sure that is what is occurring since I don't know the two models.
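One reason an interaction term can introduce multicollinearity is that a raw product x*z tends to be highly correlated with x and z themselves; mean-centering the predictors before forming the product is a common remedy. A sketch on simulated data (the variables and their distributions are entirely made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated positive-valued predictors (illustrative only).
x = 5 + rng.standard_normal(500)
z = 5 + rng.standard_normal(500)

# The raw product term is strongly correlated with its component...
r_raw = np.corrcoef(x, x * z)[0, 1]

# ...while centering before multiplying largely removes that overlap.
xc, zc = x - x.mean(), z - z.mean()
r_centered = np.corrcoef(xc, xc * zc)[0, 1]

print(f"corr(x, x*z): raw = {r_raw:.2f}, centered = {r_centered:.2f}")
```

Centering changes neither the interaction coefficient's test nor the model fit; it only makes the main-effect coefficients interpretable at the mean and reduces the collinearity between the product and its components.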

#### trinker

##### ggplot2orBust
Re: How to interpret inconsistent Beta values in different steps of hierarchical regression

My understanding of the hierarchical regression you're doing is that you are adding predictors to the model in a stepwise/blockwise fashion and looking at significance. The terminology has shifted since the introduction of HLM (hierarchical linear modeling; multilevel modeling), which means something very different, so some call this procedure forward stepwise/blockwise regression (not to be confused with the analysis in which the computer decides what to include in the model, also called stepwise selection). Generally I see the term for the procedure you're using given as hierarchical multiple regression. The F change tells you the significance of a predictor while holding the previously entered variables constant; i.e., does x3 account for change in y above and beyond x1 and x2?

This info is from what I learned in my Linear Regression course so please check what I'm telling you:
The important outputs are the F change and R change stats. The F change tells you whether the inclusion of a predictor in the model is a significant factor in Y. The R change (delta R) is a strength of effect measure. As far as betas you discuss them in terms of significance at each step/block of the model after you discuss the delta F and delta R^2. This is your omnibus test. Some will say discussion of the betas after determining delta F to be non significant is not necessary. Often these betas are standardized when the scale of the measurement is unfamiliar. Betas are not affected by order of entry in the model where as the delta R squared is. BEtas are affected by number of predictors in the model where as delta R^2 is not.

As far as which coefficients table to include: some report only the coefficients from the full model, while others report the coefficients at each step/block. Generally, for rejection of H0 in hierarchical multiple regression, you are primarily concerned with the delta F, its p-value, and the strength of effect (the delta R², which can be interpreted as the change in the percentage of variation in Y explained by Xn). Hopefully this gives direction rather than confusion.
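The blockwise procedure described above can be sketched directly: fit the main-effects block, then the block with the interaction added, and take the difference in R². The data below are simulated purely for illustration (with a genuine interaction built in), not the poster's data.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200

# Simulated data: y depends on x1, x2, and a genuine x1*x2 interaction.
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)
y = 1.0 + 0.8 * x1 + 0.5 * x2 + 0.4 * x1 * x2 + rng.standard_normal(n)

def r_squared(X, y):
    """R-squared from an OLS fit, with an intercept column added."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

r2_block1 = r_squared(np.column_stack([x1, x2]), y)           # main effects
r2_block2 = r_squared(np.column_stack([x1, x2, x1 * x2]), y)  # + interaction
print(f"delta R2 = {r2_block2 - r2_block1:.3f}")
```

Because block 2 nests block 1, the in-sample R² can only go up; the F-change test is what tells you whether that increase is more than chance.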

Maybe other more knowledgeable people will provide more clarification.

#### noetsi

##### No cake for spunky
Re: How to interpret inconsistent Beta values in different steps of hierarchical regression

If you are comparing different blocks in hierarchical regression, then the F-change statistic and the adjusted R-squared are the key indicators of whether adding new variables has improved the model. You want the adjusted R-squared to be higher and the F-change test to be significant.