Sometimes I find x1, x2 and x3 statistically significant when considered separately. When I put them together in the same model they become statistically non-significant.

Is this a problem of specification?

Thanks.

regards.

N.

Not in general, and therefore: no. But that being said, there could be a problem of misspecification. (And I know this is so imprecise as not to be helpful, but you cannot expect me to lie.)

1) Regarding multicollinearity:

Figure out what happens to the variance of your OLS (beta) estimators in a multiple linear regression as you add variables (this should be explained in any introductory book on multiple linear regression). Even though you "suppose that I have no collinearity between x1, x2 and x3", that presumably means no perfect collinearity; there can still be multicollinearity, i.e. strong but imperfect correlation between the regressors. Read about this problem: it inflates the variance of your estimators, which in turn drives the outcome of your t- and F-tests.
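To see the mechanism concretely, here is a small simulation I made up (not from your data, obviously): x1 and x2 are built from the same underlying signal, so each is highly significant on its own, but in the joint model they fight over the same variance, the standard errors blow up, and both t-statistics shrink.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100

# Highly correlated regressors: each is the same latent signal z plus small noise
z = rng.normal(size=n)
x1 = z + 0.1 * rng.normal(size=n)
x2 = z + 0.1 * rng.normal(size=n)
y = z + rng.normal(size=n)

def t_stats(regressors, y):
    """OLS t-statistics (intercept included, then dropped), computed from scratch."""
    X = np.column_stack([np.ones(len(y)), *regressors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])       # error variance estimate
    se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
    return beta[1:] / se[1:]                             # skip the intercept

print(t_stats([x1], y))      # large |t|: x1 alone is clearly significant
print(t_stats([x1, x2], y))  # both |t| much smaller: shared variance is split
```

The coefficients themselves are still unbiased here; it is only their standard errors that explode, which is exactly the "significant alone, insignificant together" pattern you describe.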

2) Regarding misspecification:

Again, my best advice is probably to pick up a book about multiple linear regression in general. I study economics and have found Wooldridge's Introductory Econometrics to be a fine, practically oriented introduction to multiple linear regression.

Otherwise I would advise you to state more clearly what you are trying to achieve, rather than stating the problem with generic variables x1, x2, x3: what kind of data you are working with, what theoretical hypotheses or simply common-sense expectations you are trying to test, and, often most importantly, assuming this is school-related: what are the teacher's expectations?

That being said, one of the basic assumptions of multiple linear regression is that the model is linear in the parameters, but this does not mean linear in x1, x2 and x3.

There can be interaction effects (as suggested by hlsmith in #2): the effect of x1 might be higher when x2 is higher.

There can be increasing or decreasing marginal effects: the effect of x1 might be higher for high values of x1 than for low values of x1.
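Both of those forms are still estimable by ordinary OLS, precisely because the model remains linear in the parameters. A minimal sketch with made-up data, just to show how an interaction term and a squared term enter the design matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x1, x2 = rng.normal(size=n), rng.normal(size=n)

# True model: still linear in the betas, though non-linear in x1 and x2:
# y = b0 + b1*x1 + b2*x2 + b3*(x1*x2) + b4*x1^2 + error
y = 1 + 2 * x1 + 0.5 * x2 + 1.5 * (x1 * x2) - 0.3 * x1**2 + rng.normal(size=n)

# The interaction and squared terms are just extra columns in X
X = np.column_stack([np.ones(n), x1, x2, x1 * x2, x1**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # plain OLS still applies
print(beta)  # close to (1, 2, 0.5, 1.5, -0.3)
```

So "linear regression" is less restrictive than the name suggests; the restriction is on the parameters, not on the shape of the regressors.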

There could be other important variables missing; if the omitted variables are correlated with the included ones, the error term becomes correlated with the regressors, and, worse, the estimators become biased and inconsistent (omitted-variable bias).
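A quick made-up illustration of that last point: here x2 belongs in the true model and is correlated with x1, so leaving x2 out pushes part of its effect into x1's coefficient, and no amount of extra data fixes it.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
x2 = rng.normal(size=n)
x1 = 0.8 * x2 + rng.normal(size=n)      # x1 is correlated with x2
y = 1.0 * x1 + 1.0 * x2 + rng.normal(size=n)  # true coefficients are both 1

def ols_slope(x, y):
    """Slope from a simple regression of y on x (with intercept)."""
    X = np.column_stack([np.ones(len(x)), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

# Short regression omitting x2: the estimate absorbs x2's effect,
# beta1 + beta2 * Cov(x1, x2) / Var(x1) = 1 + 0.8 / 1.64 ≈ 1.49, not 1.0
print(ols_slope(x1, y))
```

Contrast this with the multicollinearity case above: there the estimates stayed unbiased and only the standard errors suffered; here the estimate itself is wrong.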

and the list goes on...

If your work is school-related and you are not expected to know about this stuff, the solution is simply to forget you ever heard about it.

My reason for not giving you a better answer is simply that, to be able to understand misspecification tests, you need to understand how multiple linear regression works. This cannot be explained in a few lines and is much better explained by people more knowledgeable on the subject than I am, hence the referral to books. And the same goes for the misspecification tests themselves...