# Search results

1. ### Series of doubled numbers

I'm telling you to use math. Hint: logarithms. You see, there's this rule here about having to show effort toward solving your question... ;)
2. ### Series of doubled numbers

The series is 3 * 2^n for n = 0, 1, 2, ... To find the value of n that yields 3M, set the expression above equal to 3M and solve for n.
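The solve-for-n step can be sketched in Python, assuming "3M" means a target of 3,000,000 (the variable names here are illustrative):

```python
import math

# The series is 3 * 2^n. Setting 3 * 2^n = T and solving gives
# n = log2(T / 3). Here T = 3_000_000 is the assumed target ("3M").
T = 3_000_000
n_exact = math.log2(T / 3)   # log2(1_000_000), about 19.93
n = math.ceil(n_exact)       # first integer n where the series reaches T
print(n, 3 * 2**n)           # 20 3145728
```

Since the exact solution is rarely an integer, `ceil` picks the first term of the series that is at least the target.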
3. ### Fleeting/Random Thoughts

Sounds good, gotta try that

In my opinion these residual plots look fine enough, I wouldn't be too worried.
5. ### Factor analysis

Another, similar way to approach this would be partial least squares regression.
6. ### R to Python List manipulation

If (and only if) you're working with a dictionary where you know that every value is a list, then you could cast the dict values (discarding the keys) into a list of lists using list(x.values()) and then apply one of the recipes here for flattening a list...
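A minimal sketch of that recipe, using `itertools.chain` as the flattening step and the key names mentioned in the thread (the list contents are made up for illustration):

```python
from itertools import chain

# Assuming a dict whose values are all lists, mimicking an R named list:
x = {'Titles': ['a', 'b'], 'Entities': ['c', 'd', 'e']}

# Cast the values (discarding the keys) into a list of lists...
lists = list(x.values())    # [['a', 'b'], ['c', 'd', 'e']]

# ...then flatten with a standard recipe:
flat = list(chain.from_iterable(lists))
print(flat)                 # ['a', 'b', 'c', 'd', 'e']
```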
7. ### R to Python List manipulation

First of all, that's a dictionary, not a list. You construct dictionaries with curly braces and lists with square brackets. A dictionary maps keys to values. A dictionary's .keys() method will return, well, its keys: in this case the list names ['Titles', 'Entities']. The .values() method will...
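The dict basics described here, in a short sketch (the list contents are illustrative; only the key names come from the thread):

```python
# Dictionaries use curly braces; lists use square brackets.
x = {'Titles': ['a', 'b'], 'Entities': ['c', 'd']}   # a dict, not a list

print(list(x.keys()))     # ['Titles', 'Entities'] -- the "list names"
print(list(x.values()))   # [['a', 'b'], ['c', 'd']] -- the values
print(x['Titles'])        # ['a', 'b'] -- look up one value by its key
```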
8. ### Multilevel regression with two clusters

Like others already mentioned, this is a crossed random effects model, which can easily be fit in most (but not all) stats packages, including lme4 in R, SAS PROC MIXED/GLIMMIX, and others. The syntax is package-specific, of course, but usually it's as simple as just adding separate random effect...
9. ### Overdispersion/unobserved heterogeneity in logistic regression

"Unobserved heterogeneity" in logistic regression is nothing to be afraid of. I address this here, arguing directly against Allison and Mood: http://jakewestfall.org/blog/index.php/2018/03/12/logistic-regression-is-not-fucked/ Overdispersion is a completely different issue. In logistic...
10. ### What's the difference between running an ANCOVA and running an ANOVA with the residuals (of the covariate) as the response?

Yes, I agree. Mainly I think the alternative method is interesting as a way of understanding what ANCOVA is doing "under the hood." But in practice you wouldn't normally literally do it that way.
11. ### What's the difference between running an ANCOVA and running an ANOVA with the residuals (of the covariate) as the response?

This is almost true, but not quite. You're missing one step here: you also need to regress the independent variable (IV) on the covariate and save those residuals too. Then if you regress the DV residuals (which you already mentioned) on the IV residuals (which I just mentioned), the resulting...
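The double-residualization being described is the Frisch-Waugh-Lovell result, and it can be verified numerically in Python. This is a sketch with simulated data; all variable names and data-generating numbers are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
cov = rng.normal(size=n)                # the covariate
iv = 0.5 * cov + rng.normal(size=n)     # IV, correlated with the covariate
dv = 1.0 * iv + 2.0 * cov + rng.normal(size=n)

def resid(y, x):
    # residuals from an OLS regression of y on x (with intercept)
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

# Residualize BOTH the DV and the IV on the covariate...
dv_r = resid(dv, cov)
iv_r = resid(iv, cov)

# ...then the slope of dv_r on iv_r equals the IV coefficient from the
# full model dv ~ iv + cov.
slope_resid = np.polyfit(iv_r, dv_r, 1)[0]
X_full = np.column_stack([np.ones(n), iv, cov])
beta_full, *_ = np.linalg.lstsq(X_full, dv, rcond=None)
print(np.isclose(slope_resid, beta_full[1]))   # True
```

Residualizing only the DV (and not the IV) does not, in general, reproduce the ANCOVA coefficient, which is the point of the answer.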
12. ### R squared and correlation in R

In your own example, r = 0.53 and R^2 = 0.28... so clearly they're not the same.
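For one-predictor regression the relationship is simply R^2 = r^2, which the numbers in the thread already satisfy (0.53^2 is about 0.28). A short check in Python with simulated data (all numbers and names here are illustrative):

```python
import numpy as np

# The thread's numbers: r = 0.53 squares to about 0.28.
r = 0.53
print(round(r**2, 2))   # 0.28

# The same relationship on simulated data:
rng = np.random.default_rng(1)
x = rng.normal(size=100)
y = x + rng.normal(size=100)
r_xy = np.corrcoef(x, y)[0, 1]

# R^2 = 1 - SS_resid / SS_total from the fitted line:
slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)
r2 = 1 - residuals.var() / y.var()
print(np.isclose(r_xy**2, r2))   # True
```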
13. ### SAS v R

Feel free to explain the methodological problems you spotted.
14. ### SAS v R

There is data. There is data. There is data. There is data. There is data. There is data. http://r4stats.com/articles/popularity/ You cannot keep ignoring this and spewing this unfounded **** about SAS being more popular than R in general. Are there specific pockets of industry where SAS is more...

16. ### Unobserved heterogeneity in logistic regression

Yes, I agree. We discussed that in the chatbox starting with the following two chats:
17. ### Unobserved heterogeneity in logistic regression

Thanks, yes, this is an excerpt from Pearl's book, which I own and have read. I've definitely re-read that section several times while working on my blog post.
18. ### Bayesian Posterior: SD = SE?

Technically the answer to this is always no, since these two quantities have totally different conceptual meanings -- one is the SD of the marginal posterior distribution for a parameter, one is the SD of the sampling distribution of the estimate of the parameter. With that said, we might still...
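One hypothetical case where the two numbers do coincide, sketched in Python: normal data with known sigma and a flat prior on the mean, where the posterior for the mean is N(xbar, sigma^2/n). The numbers are made up for illustration:

```python
import math

# With known sigma and a flat prior on the mean, the posterior SD of the
# mean is sigma / sqrt(n) -- numerically identical to the frequentist
# standard error of the mean, even though the two quantities have
# different conceptual meanings.
sigma, n = 2.0, 25
posterior_sd = sigma / math.sqrt(n)   # SD of the posterior for the mean
se = sigma / math.sqrt(n)             # SD of the sampling distribution of xbar
print(posterior_sd == se)             # True (both 0.4)
```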

Nah
20. ### Unobserved heterogeneity in logistic regression

spunky: And this only works when you have like a continuous covariate and a categorical predictor?
spunky: Or is it irrespective of the type of data?
Jake: so now we can imagine that for any logistic regression equation that we estimate, there's almost certainly lots of variables we omitted that...