Is regression more important/broader than ANOVA?

#1
Hello

In my book, which has been used for three courses in probability and statistics, ANOVA and regression get almost the same amount of space. But in the course that is closest to being a "continuation" of the introductory courses in statistics and probability, regression is mentioned a lot and ANOVA never is. The course is called "Introduction to generalized linear models". The word "regression" is mentioned 218 times in the course book; "ANOVA" or "analysis of variance" is never mentioned. Does this mean that regression is a bigger subject than ANOVA and can be generalized further?
And that ANOVA and the F-test are more limited in their use?
 

Dason

Ambassador to the humans
#2
"Regression" and "ANOVA" are both just special cases of the "general linear model". You can think of ANOVA as a special case of regression where you just have categorical variables as your predictors (which get turned into multiple dummy variables).
 
#3
"Regression" and "ANOVA" are both just special cases of the "general linear model". You can think of ANOVA as a special case of regression where you just have categorical variables as your predictors (which get turned into multiple dummy variables).
Ah, I see. That is a good way to see it then, yes. Thanks.

I just can't resist asking a follow-up question, since you mentioned something. I have read a paper showing that a fixed-effects ANOVA model could indeed be transformed into a multiple regression model. The notation was extremely messy, but they showed it could be done for fixed-effects ANOVA. Now comes my question: my book also mentions random-effects ANOVA, where the F-tests are the same as in the fixed-effects case, but where we are actually testing whether a variance is 0 or not. Can this ANOVA model also be transformed into a regression model? The paper didn't mention anything about random-effects models, but are they also, in a way, contained in the usual regression model?
 

noetsi

No cake for spunky
#4
While ANOVA and regression are theoretically the same method, in practice they are commonly run differently and generate results that look very different. I think that outside a few fields, such as agriculture, medicine, and maybe psychology, you would find regression run more often. It is taught more in schools, I believe, looks simpler, and can handle nominal, ordinal, and binary dependent variables. While I assume ANOVA can do this as well, I have never seen an ANOVA equivalent of logistic regression, even in advanced courses on ANOVA.
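
For the binary-outcome case, here is a rough sketch on hypothetical data of what such a regression looks like in code (Python's statsmodels is assumed purely for illustration): a logistic regression is just a GLM with a binomial family, here with a dummy-coded factor as the predictor.

```python
# Sketch: logistic regression with a categorical predictor (hypothetical data)
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
df = pd.DataFrame({"group": np.repeat(["control", "treated"], 50)})
p_true = np.where(df["group"] == "treated", 0.6, 0.3)  # hypothetical success rates
df["y"] = rng.binomial(1, p_true)                      # binary dependent variable

# A GLM with a binomial family; C() dummy-codes the factor
fit = smf.glm("y ~ C(group)", data=df, family=sm.families.Binomial()).fit()
print(fit.summary())  # the C(group) coefficient is on the log-odds scale
```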

I think the "extremely messy" notation of ANOVA and the simplicity of slopes are major reasons for this.

You can do random-effects regression. Multilevel models (some call these hierarchical linear models) are an example.
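
As a minimal sketch on hypothetical data (Python's statsmodels MixedLM, assumed purely for illustration), a random-intercept model is the regression-style counterpart of a random-effects ANOVA: the between-group variance it estimates is the variance component that the random-effects F-test asks about.

```python
# Sketch: random-intercept (multilevel) model on hypothetical grouped data
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_groups, n_per = 10, 15
group = np.repeat(np.arange(n_groups), n_per)
group_effects = rng.normal(0.0, 1.0, n_groups)               # the random intercepts
y = 5.0 + group_effects[group] + rng.normal(0.0, 0.5, group.size)
df = pd.DataFrame({"y": y, "group": group})

# Random intercept per group, fit by REML (the default)
fit = smf.mixedlm("y ~ 1", data=df, groups=df["group"]).fit()
print(fit.summary())  # "Group Var" is the estimated between-group variance component
```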
 

Dason

Ambassador to the humans
#5
While ANOVA and regression are theoretically the same method, in practice they are commonly run differently and generate results that look very different.
Then you're either doing it wrong or not interpreting the results correctly if you're getting results that are different.
 

noetsi

No cake for spunky
#6
If your ANOVA and regression results look the same, well, you are awesome, Dason. :) No ANOVA or regression printout I have ever run or seen in any text or class has looked like the other.

Below is a link to ANOVA output. I will leave it to wiser souls to decide whether that looks like regression output, ignoring that post hoc tests commonly aren't printed out in ANOVA tables.

https://statistics.laerd.com/spss-tutorials/one-way-anova-using-spss-statistics-2.php

I never realized ANOVA results were identical to slopes for continuous predictors (although I realized you could convert back and forth between them).
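
For what it's worth, a minimal sketch on hypothetical data (Python's statsmodels, not anything the thread requires) of how an ANOVA-style table and a regression-style coefficient table can be printed from the very same fitted model:

```python
# Sketch: two printouts of one fitted linear model (hypothetical data)
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(3)
df = pd.DataFrame({
    "group": np.repeat(["a", "b", "c"], 25),
    "y": np.concatenate([rng.normal(m, 1, 25) for m in (10.0, 11.0, 12.5)]),
})

fit = smf.ols("y ~ C(group)", data=df).fit()
print(anova_lm(fit))   # ANOVA-style printout: sums of squares, df, F, p
print(fit.summary())   # regression-style printout: intercept and dummy-variable slopes
```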