How to adjust Pseudo-R2 in logistic regression?

I was looking for a pseudo-R² for logistic regression, with the precise aim of having a good measure of the variability explained by the model. I found that the best pseudo-R² for this purpose is the McKelvey & Zavoina pseudo-R². However, I want to use this measure in multiple logistic regression as well, and I have to consider that, like the standard R² of linear models, it can tend to increase in a biased way as the number of predictors grows. So: is there an adjusted version of this pseudo-R², analogous to the adjusted R² for classical linear models, that would give me a reliable estimate of the variance explained by the model? To be clear, my precise aim is to quantify explained variation.
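For reference, the McKelvey & Zavoina pseudo-R² is computed on the latent scale as Var(Xβ̂) / (Var(Xβ̂) + π²/3) for a logit model. Below is a minimal numpy-only sketch that fits a logit by Newton–Raphson and computes this statistic; the "adjusted" version at the end simply borrows the linear-model adjustment formula as an ad-hoc heuristic (my assumption, not an established statistic), and the data are simulated:

```python
import numpy as np

def fit_logit(X, y, n_iter=25):
    """Fit an unregularized logistic regression by Newton-Raphson."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))   # predicted probabilities
        W = p * (1.0 - p)                      # IRLS weights
        grad = X.T @ (y - p)
        hess = (X.T * W) @ X
        beta += np.linalg.solve(hess, grad)
    return beta

def mckelvey_zavoina_r2(X, beta):
    """M&Z pseudo-R^2: Var(Xb) / (Var(Xb) + pi^2/3) on the latent scale."""
    eta = X @ beta                             # fitted latent index y*
    return eta.var() / (eta.var() + np.pi**2 / 3)

# Simulated example (hypothetical data, two predictors plus intercept)
rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
eta_true = 1.2 * X[:, 1] - 0.7 * X[:, 2]
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-eta_true)))

beta = fit_logit(X, y)
r2 = mckelvey_zavoina_r2(X, beta)

# Heuristic "adjusted" variant, applying the classical linear-model
# penalty 1 - (1-R^2)(n-1)/(n-k-1); this is an assumption, not a
# statistic with known properties for the M&Z measure.
k = X.shape[1] - 1
r2_adj = 1 - (1 - r2) * (n - 1) / (n - k - 1)
print(r2, r2_adj)
```

With few predictors and a large n, the heuristic adjustment barely moves the value, which illustrates why the question of a principled adjusted M&Z R² matters mostly when k is large relative to n.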


Less is more. Stay pure. Stay poor.
I will note that most people don't recommend using R² with logistic models, since the data structure is different from linear regression. Better approaches usually include looking at model accuracy and calibration curve plots.
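A calibration check can be done with plain numpy by binning predicted probabilities and comparing the mean prediction to the observed event rate in each bin. A minimal sketch, assuming you already have predicted probabilities from a fitted model (the data here are simulated so the predictions are well calibrated by construction):

```python
import numpy as np

def calibration_table(y_true, p_hat, n_bins=10):
    """Bin predictions; compare mean predicted prob. to observed rate per bin."""
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    # bin index per prediction (clip so p == 1.0 falls in the last bin)
    idx = np.clip(np.digitize(p_hat, edges) - 1, 0, n_bins - 1)
    rows = []
    for b in range(n_bins):
        mask = idx == b
        if mask.any():
            # (mean predicted, observed rate, count) for each non-empty bin
            rows.append((p_hat[mask].mean(), y_true[mask].mean(), int(mask.sum())))
    return rows

# Hypothetical example: perfectly calibrated predictions by construction
rng = np.random.default_rng(1)
p = rng.uniform(size=5000)
y = rng.binomial(1, p)
for pred, obs, cnt in calibration_table(y, p):
    print(f"pred={pred:.2f}  obs={obs:.2f}  n={cnt}")
```

For a well-calibrated model the two columns track each other closely; systematic gaps indicate the predicted probabilities are too extreme or too conservative, which an R²-style summary would not reveal.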


Fortran must die
There are many, many pseudo-R² measures for logistic regression (20 or more, I think, probably more). I don't think there is any agreement on which is best. Not everyone would agree they are even valid to use.

Personally, I don't think R² is that useful a measure even in linear regression. I pay it almost no attention.