Error estimates with R² in linear regression

I have been given the results of some regression analysis, where some variable y is a function of x1 and x2. I have an R² value for the analysis, and I would like to know if I can use that info to estimate a maximum and minimum value for y (say some 95% estimates), given x1 and x2. E.g. y could be as high as 20, and as low as 10.

Is that possible, and if so, how could I do it?


Dark Knight
What I understand from your question is:
can we come up with a CI for y (given x1 and x2) using only the multiple R²?

No, that is not possible on its own.

If you are willing to make some assumptions, or have additional information about the regression coefficients and the spread of y, then some attempt can be made.
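For instance, if you assume the residuals are normal and you also know the sample standard deviation of y, then R² pins down the residual standard error (since R² = 1 − Var(residuals)/Var(y), so s ≈ sd(y)·√(1 − R²)), and a rough 95% band follows as ŷ ± 1.96·s. A minimal sketch under those assumptions (the helper name and all numbers are hypothetical, and it ignores the uncertainty in the estimated coefficients, so it is only an approximation, not an exact prediction interval):

```python
import math

def approx_prediction_interval(y_hat, r2, sd_y, z=1.96):
    """Rough 95% bounds for a fitted value y_hat, assuming
    normal residuals and ignoring coefficient uncertainty."""
    # R^2 = 1 - Var(resid)/Var(y)  =>  s = sd_y * sqrt(1 - R^2)
    s = sd_y * math.sqrt(1.0 - r2)
    return y_hat - z * s, y_hat + z * s

# Hypothetical values: fitted y of 15, R^2 = 0.75, sd(y) = 5
lo, hi = approx_prediction_interval(y_hat=15.0, r2=0.75, sd_y=5.0)
# half-width ≈ 1.96 * 5 * sqrt(0.25) ≈ 4.9
```

So knowing R² alone is not enough, but R² plus sd(y) plus a normality assumption gets you an approximate band.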
What if one were to assume that the residuals were distributed normally about the line of best fit? Would that change things?

Thanks for your reply