Binary Logistic Regression - SPSS Output

Dear TalkStats Forum,

I am currently looking into a dataset which measures previous cybercrime victimization as well as whether the participants have changed their behavior by making fewer online purchases or making less use of online banking. Since the behavior change is measured dichotomously (either they made fewer online purchases or they did not), I have to rely on binary logistic regression. The SPSS output tells me that 15463 respondents did not change their online purchase behavior and 3248 did reduce their online purchases. I entered the previous cybercrime victimization items as the variables in the equation, and 4 out of the 7 variables are significant. Nevertheless, the Nagelkerke R Square gives me a value of .002, which I find strikingly low. It would basically tell me that previous cybercrime victimization has almost no explanatory power. I attached the SPSS output and wanted to ask whether this output is realistic or whether, and if so where, I made a mistake.
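For context, my understanding is that SPSS derives Nagelkerke's R² as a rescaling of the Cox-Snell R², both computed from the null and fitted model log-likelihoods. A rough Python sketch of that calculation (the function name and the fitted log-likelihood figure below are my own illustration, not SPSS syntax; only the counts are taken from my output):

```python
import math

def pseudo_r2(ll_null, ll_full, n):
    """Cox-Snell and Nagelkerke pseudo R^2 from the intercept-only
    (ll_null) and fitted (ll_full) model log-likelihoods."""
    cox_snell = 1.0 - math.exp((2.0 / n) * (ll_null - ll_full))
    max_cox_snell = 1.0 - math.exp((2.0 / n) * ll_null)  # upper bound of Cox-Snell
    nagelkerke = cox_snell / max_cox_snell
    return cox_snell, nagelkerke

# Null (intercept-only) log-likelihood from the marginal counts above
n0, n1 = 15463, 3248          # did not change / did change behavior
n = n0 + n1
p = n1 / n
ll_null = n1 * math.log(p) + n0 * math.log(1.0 - p)

# Illustration: a model that improves the log-likelihood by ~11 points
# (a hypothetical figure) already gives a Nagelkerke of roughly .002
print(pseudo_r2(ll_null, ll_null + 11.3, n))
```

So with a sample this large, even a model whose predictors are clearly significant can still land at a Nagelkerke of .002.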

Best regards and thanks a lot for your help,



Fortran must die
There is a lot of dispute about pseudo R-squares, whose meaning is not well defined, unlike the linear R-square, and which, I am told, tend to run low in general. In addition, there are many possible pseudo R-squares, and software rarely reports more than two. Goodness-of-fit tests such as Hosmer-Lemeshow and global tests such as the -2 log-likelihood are actually stressed more in the literature I have seen. For very complex phenomena you might get a low R-square even in linear regression, because many variables influence the outcome.
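The global -2LL test I mean is just the likelihood-ratio chi-square between the null and fitted models. A quick sketch (the function name and log-likelihood figures are mine, purely for illustration; 14.07 is the usual chi-square critical value at .05 for 7 df):

```python
def lr_chi_square(ll_null, ll_full):
    """Change in -2 log-likelihood between nested models; under H0 it is
    chi-square distributed with df = number of added predictors."""
    return -2.0 * (ll_null - ll_full)

# With 7 predictors, compare against chi2(.95, df=7) ~= 14.07.
# Hypothetical log-likelihoods: a small improvement in fit can clear
# that bar easily in a sample of ~18,700, whatever the pseudo R-square says.
stat = lr_chi_square(-8635.5, -8624.2)
print(round(stat, 1))
```

That is why I would lean on the model chi-square and goodness-of-fit tests rather than on Nagelkerke alone.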

What have others in this area encountered in terms of their Nagelkerke values?