- Thread starter janhen2

You don't need a test if you just want a basic idea. If you look at the plots of citation flow vs. trust flow, you can decide for yourself. In most of those plots, as citation flow increases, so does trust flow, so it would appear, at least at first sight, that the two variables are dependent.


R-squared values show how much of the variation in the DV the IV explains (in a bivariate model). As far as I know, there is no formal definition of independence here other than an R-squared of exactly zero (which is unlikely in a real-world problem, even by chance). So you have to judge whether the R-squared is small enough to suggest independence in substantive terms.
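To make that concrete, here is a minimal sketch (not from the thread, which uses R and SPSS) of computing R-squared for a bivariate regression by hand with numpy; the data and variable names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100)            # independent variable (IV)
y = 2.0 * x + rng.normal(size=100)  # dependent variable (DV) plus noise

# Fit y = b0 + b1*x by least squares
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta

# R^2 = 1 - SS_res / SS_tot: the share of DV variance the IV explains
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(round(r2, 3))
```

With a true slope of 2 and unit noise, the R-squared lands well above zero; with pure noise in place of the signal it would hover near zero, and you'd still have to decide substantively whether "near zero" counts as independence.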

(I'm an SPSS user.) As far as I know, multicollinearity checking is only possible in linear regression. After defining your dependent and independent variables, click "Statistics" and check "Collinearity diagnostics".

The regression output will then contain two additional columns, "Tolerance" and "VIF" (Variance Inflation Factor). How do you interpret the result? You can use both or just one of them. For example, with VIF: a value close to 1 means there is no correlation between that independent variable and the others, while a larger VIF means that variable is (linearly) dependent on one or more of the other independent variables.
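If you want to see what SPSS is computing there, here is a hedged sketch in Python/numpy (synthetic data, illustrative names): VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing IV j on the remaining IVs.

```python
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.normal(size=200)
x2 = rng.normal(size=200)              # independent of x1 -> VIF near 1
x3 = x1 + 0.1 * rng.normal(size=200)   # nearly collinear with x1 -> large VIF
X = np.column_stack([x1, x2, x3])

def vif(X, j):
    """VIF of column j: regress it on the other columns plus an intercept."""
    others = np.delete(X, j, axis=1)
    A = np.column_stack([np.ones(len(X)), others])
    beta, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
    resid = X[:, j] - A @ beta
    r2 = 1 - resid.var() / X[:, j].var()
    return 1.0 / (1.0 - r2)

for j in range(X.shape[1]):
    print(f"VIF x{j+1}: {vif(X, j):.2f}")
```

Here x2 comes out near 1 while x1 and x3 blow up, which matches the interpretation above: a big VIF flags a variable that the other IVs can almost reproduce.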

If r=0 and we have a dependent relationship, the simple linear model is an incorrect model.

```
> x <- seq(-10, 10)
> y <- x^2
> o <- lm(y ~ x)
> o

Call:
lm(formula = y ~ x)

Coefficients:
(Intercept)            x
  3.667e+01   -5.121e-16

> summary(o)

Call:
lm(formula = y ~ x)

Residuals:
   Min     1Q Median     3Q    Max
-36.67 -27.67 -11.67  27.33  63.33

Coefficients:
              Estimate Std. Error t value Pr(>|t|)
(Intercept)  3.667e+01  7.498e+00    4.89 0.000102 ***
x           -5.121e-16  1.238e+00    0.00 1.000000
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 34.36 on 19 degrees of freedom
Multiple R-squared:  2.926e-32,  Adjusted R-squared:  -0.05263
F-statistic: 5.559e-31 on 1 and 19 DF,  p-value: 1
```

I have shown in this case that there can be an exact dependence and still we get R^2 = 0.

My point is that R^2 is, by definition, a measure of covariation between the IVs and the DV, as far as I know. If there is a real relationship but R^2 = 0, then the model from which we calculate R^2 is a bad model.
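A way to see this concretely (a sketch in Python/numpy rather than the R shown above; the setup is the same y = x^2 data): the straight-line fit gives R^2 essentially 0, but adding a quadratic term makes the model correct and R^2 jumps to 1.

```python
import numpy as np

x = np.arange(-10, 11, dtype=float)
y = x ** 2

def r_squared(design, y):
    """R^2 = 1 - SS_res / SS_tot for a least-squares fit on `design`."""
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    resid = y - design @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

linear = np.column_stack([np.ones_like(x), x])           # y ~ x
quadratic = np.column_stack([np.ones_like(x), x, x**2])  # y ~ x + x^2

r2_lin = r_squared(linear, y)    # essentially 0 (floating-point noise)
r2_quad = r_squared(quadratic, y)  # essentially 1: exact dependence recovered
print(r2_lin, r2_quad)
```

So the exact dependence was there all along; the R^2 of 0 was only telling us the linear model couldn't see it.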