
Thread: Multicollinearity

  1. #1

    Multicollinearity




    Hello everyone, I'm new to the site, and I'd appreciate your suggestions.

    I have a multiple regression model through the origin. Working in SPSS, the F test is highly significant, all t-values are significant, and the model assumptions hold. Despite this, I get strong evidence of multicollinearity: the VIFs are all above 10 and the condition index is above 25.
    I suspect that having no intercept distorts the collinearity diagnostics, because when I run auxiliary regressions among the independent variables (including intercepts there) I get very low R-squared values in every case. SPSS computes the auxiliary regressions without an intercept, matching the original model, and reports high multicollinearity.

    Theory suggests excluding the intercept from the original model, because it would make no sense there. However, when I include the intercept in the original model, the VIFs are quite acceptable (all below 2.00) and the condition index is low (though with high variance proportions on some variables).
    Also, for the auxiliary regressions among the independent variables, the intercept seems necessary, because there it is theoretically acceptable.

    My question is: should I fit the model through the origin but run the auxiliary regressions separately with an intercept?
    I'm worried because when I exclude the intercept from the auxiliary regressions I get a high level of collinearity. Should I just ignore that?
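    The gap you describe can be reproduced directly. A minimal sketch in numpy (synthetic data; the centering convention for the no-intercept case is my reading of how a through-the-origin R-squared is usually defined, not SPSS documentation): with an intercept, the auxiliary R-squared is centered, so VIF measures correlation among the predictors' fluctuations; without an intercept, R-squared is uncentered, so two predictors that merely share a large nonzero mean already look "collinear".

    ```python
    import numpy as np

    def vif(X, intercept=True):
        """VIF for each column of X via auxiliary regressions of that
        column on the remaining columns.

        intercept=True  -> auxiliary regressions include a constant and
                           R^2 is centered (the usual textbook VIF).
        intercept=False -> regressions go through the origin and R^2 is
                           uncentered (assumed analogue of SPSS's
                           no-intercept diagnostics)."""
        n, k = X.shape
        out = []
        for j in range(k):
            y = X[:, j]
            Z = np.delete(X, j, axis=1)
            if intercept:
                Z = np.column_stack([np.ones(n), Z])
            beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
            resid = y - Z @ beta
            # centered vs. uncentered total sum of squares
            tss = np.sum((y - y.mean()) ** 2) if intercept else np.sum(y ** 2)
            r2 = 1.0 - (resid @ resid) / tss
            out.append(1.0 / (1.0 - r2))
        return np.array(out)

    # Toy data: two predictors with independent fluctuations but a
    # shared nonzero mean -- no "real" collinearity among deviations.
    rng = np.random.default_rng(0)
    x1 = 10 + rng.normal(size=200)
    x2 = 10 + rng.normal(size=200)
    X = np.column_stack([x1, x2])

    print(vif(X, intercept=True))   # modest, near 1
    print(vif(X, intercept=False))  # inflated by the common mean
    ```

    This is exactly your pattern: the uncentered diagnostics flag the shared mean level of the regressors, which an intercept would otherwise absorb, rather than genuine linear dependence among their movements.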

  2. #2
    You get statistically significant results, right? Why worry about multicollinearity at all? I thought the only problem multicollinearity creates is that it makes it harder to get statistically significant results.

  3. #3

    The problem is that I'm very interested in interpreting the parameters.
    Damodar Gujarati's book indicates that when the results are significant (t-tests and the F test) yet multicollinearity persists, it is perhaps not a serious problem. He also cites a note from John Johnston's book (Econometric Methods, 1984 edition) saying this can happen when the true parameters are overestimated (or underestimated) while the t-values remain significant. I have no access to that book.
    But my problem is that my first interest is parameter interpretation, and the results of the auxiliary regressions without an intercept suggest an extreme degree of multicollinearity.
    Do the auxiliary regressions have to include the intercept to test the dependence among the independent variables? Or is this a matter of choice?

